A complete demonstration of the Model Context Protocol (MCP) with LangChain integration, featuring:
- MCP Weather Server: FastMCP server exposing weather tools via SSE transport
- MCP Weather Client: Client for connecting to MCP servers and calling tools
- LangChain Integration: Tool wrapper for using MCP tools with LangChain agents
- Ollama Integration: Local LLM integration for natural language interactions
- Automated Tests: Comprehensive test suite for validation
- Create and activate a virtual environment:

  ```bash
  uv venv
  .venv\Scripts\activate     # Windows
  # or
  source .venv/bin/activate  # Linux/Mac
  ```

- Install dependencies:

  ```bash
  uv pip install -r requirements.txt
  ```
- Start the MCP server:

  ```bash
  python mcp_weather_server.py
  ```
The server will start on http://localhost:8000 with:
- MCP SSE endpoint: `/sse` (for MCP client connections)
- Available tools: `get_weather_tool`, `set_weather_tool`, `list_cities_tool`
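The three tools map onto a simple in-memory store. The sketch below shows the logic they might wrap; the tool names come from the server above, but the store layout and field names (`temp`, `humidity`, `condition`, taken from `update_weather`'s parameters) are assumptions, not the server's actual schema:

```python
# Hypothetical in-memory store backing the weather tools. The real
# server exposes these operations through FastMCP; this only sketches
# the underlying get/set/list logic.
WEATHER = {
    "London": {"temp": 15.0, "humidity": 80, "condition": "cloudy"},
}

def get_weather(city: str) -> dict:
    """Return weather for a known city; raise on invalid input."""
    if city not in WEATHER:
        raise ValueError(f"Unknown city: {city}")
    return WEATHER[city]

def set_weather(city: str, temp: float, humidity: int, condition: str) -> None:
    """Create or update a city's weather entry."""
    WEATHER[city] = {"temp": temp, "humidity": humidity, "condition": condition}

def list_cities() -> list[str]:
    """List all cities that have weather data."""
    return sorted(WEATHER)

set_weather("Paris", 18.5, 65, "sunny")
print(list_cities())
print(get_weather("Paris"))
```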
- Start the MCP server (in one terminal):

  ```bash
  python mcp_weather_server.py
  ```

- Run the test suite (in another terminal):

  ```bash
  python run_tests.py
  ```
Test Coverage:
- ✅ Server connectivity and health checks
- ✅ Tool discovery and listing
- ✅ Weather operations (get, set, list cities)
- ✅ Error handling for invalid inputs
- ✅ Multiple operation sequences
- ✅ Data persistence verification
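Checks of this shape could cover the items above. This is a hypothetical sketch against a stubbed client, not the actual `run_tests.py`; the class, methods, and error format are all illustrative assumptions:

```python
# Hypothetical stand-in for the MCP weather client. The real client
# talks to the server over SSE; this stub only illustrates the shape
# of the test suite's checks.
class StubWeatherClient:
    def __init__(self):
        self._data = {"London": {"temp": 15.0}}

    def list_tools(self):
        return ["get_weather_tool", "set_weather_tool", "list_cities_tool"]

    def get_weather(self, city):
        if city not in self._data:
            return {"error": f"Unknown city: {city}"}
        return self._data[city]

    def set_weather(self, city, temp):
        self._data[city] = {"temp": temp}
        return {"ok": True}

    def list_cities(self):
        return sorted(self._data)

client = StubWeatherClient()

# Tool discovery and listing
assert "get_weather_tool" in client.list_tools()
# Weather operations and data persistence across calls
client.set_weather("Oslo", -2.0)
assert client.get_weather("Oslo")["temp"] == -2.0
assert "Oslo" in client.list_cities()
# Error handling for invalid inputs
assert "error" in client.get_weather("Atlantis")
print("all checks passed")
```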
- Start the Ollama server and run a model:

  ```bash
  ollama serve
  ollama run llama3.2:3b
  ```

- In another terminal (with the same virtual environment activated), run the host:

  ```bash
  python mcp_host.py
  ```
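The LangChain integration wraps each MCP tool as a named callable the agent can invoke. The real wrapper would use LangChain's `Tool` class around an MCP round-trip; the stdlib sketch below only shows the name/description/function triple involved, and every identifier in it is hypothetical:

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical adapter illustrating how an MCP tool becomes an
# agent-callable object. The actual integration uses LangChain's
# Tool abstraction instead of this dataclass.
@dataclass
class MCPToolWrapper:
    name: str
    description: str
    func: Callable[[str], str]

    def run(self, tool_input: str) -> str:
        return self.func(tool_input)

def call_get_weather(city: str) -> str:
    # Placeholder for the SSE round-trip to the MCP server.
    return f"weather for {city} (would come from the MCP server)"

weather_tool = MCPToolWrapper(
    name="get_weather_tool",
    description="Get current weather for a city",
    func=call_get_weather,
)
print(weather_tool.run("London"))
```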
- `GET /sse`: Server-Sent Events endpoint for MCP communication
- `get_weather(city)`: Get current weather for a city
- `list_cities()`: List all available cities
- `update_weather(city, temp, humidity, condition)`: Update weather data for a city
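On the wire, MCP tool invocations are JSON-RPC 2.0 messages using the `tools/call` method. A sketch of what a weather lookup might look like as a payload (the client library normally builds this envelope; the `id` and argument values here are illustrative):

```python
import json

# JSON-RPC 2.0 request for an MCP tools/call invocation.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather_tool",
        "arguments": {"city": "London"},
    },
}

payload = json.dumps(request)
print(payload)

decoded = json.loads(payload)
print(decoded["params"]["name"])
```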