/q, /quit, exit Exit the shell
```
### LLM Integration
MCP Tools includes a powerful LLM integration that enables AI models to interact with MCP servers through natural language. The LLM command creates an interactive chat session where AI can discover and use tools from one or more MCP servers on your behalf.

- **Multi-Provider Support**: Works with major LLM providers:
  - OpenAI (default) - Uses API key from `OPENAI_API_KEY`
  - Anthropic - Uses API key from `ANTHROPIC_API_KEY`
- **Multiple Server Integration**: Connect up to 3 MCP servers simultaneously:

  ```bash
  mcp llm -M "server1" -M "server2" -M "server3"
  ```

  Tools are automatically prefixed with server IDs (`s1_`, `s2_`, `s3_`) to avoid naming conflicts.

- **Tool Execution**: LLMs can:
  - Discover available tools across all connected servers
  - Call tools with proper parameters
  - Receive and process tool results
  - Make multiple tool calls in a single turn

- **Server Aliases**: Use server aliases for simpler commands:

  ```bash
  # Using aliases for servers
  mcp llm -M fs-server -M github-server
  ```
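The server-ID prefixing scheme above can be sketched as follows. This is a hypothetical illustration of the naming convention, not the tool's actual internals; `register_tools` and the data shapes are invented for the example:

```python
def register_tools(servers):
    """Map prefixed tool names (s1_, s2_, ...) back to (server, tool) pairs.

    Hypothetical sketch of the prefixing scheme; the real registration
    happens inside the `mcp llm` command.
    """
    registry = {}
    for index, (server_id, tools) in enumerate(servers.items(), start=1):
        for tool in tools:
            registry[f"s{index}_{tool}"] = (server_id, tool)
    return registry

# Two servers exposing the same tool name no longer collide:
registry = register_tools({
    "fs-server": ["list_dir", "read_file"],
    "github-server": ["list_dir"],
})
# registry["s1_list_dir"] -> ("fs-server", "list_dir")
# registry["s2_list_dir"] -> ("github-server", "list_dir")
```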
#### Example Session
```
mcp > MCP LLM Shell (v0.5.0)
mcp > Connecting to server 1: npx -y @modelcontextprotocol/server-filesystem ~
mcp > Server 1: Registered 8 tools
mcp > Using provider: openai, model: gpt-4o
mcp > Total registered tools: 8
mcp > Type 'exit' to quit

user > What files are in my current directory?

agent > I'll check the files in your current directory.

[Calling s1_list_dir]
mcp > [Server 1 running list_dir with params {"path":"."}]
{
  "entries": [
    {
      "name": "README.md",
      "type": "file",
      "size": 12345,
      "modTime": "2023-05-01T12:34:56Z"
    },
    {
      "name": "src",
      "type": "directory"
    }
  ]
}

In your current directory, you have:
1. README.md (file, 12.1 KB)
2. src (directory)

user > Show me the contents of README.md

agent > I'll show you the contents of README.md.

[Calling s1_read_file]
mcp > [Server 1 running read_file with params {"path":"README.md"}]
{
  "content": "# My Project\nThis is a sample README file."
}

Here's the content of README.md:

# My Project
This is a sample README file.
```
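The routing seen in the transcript, where `[Calling s1_list_dir]` becomes `[Server 1 running list_dir ...]`, can be sketched roughly like this. The stub class and function names are hypothetical stand-ins for illustration only:

```python
class StubServer:
    """Stand-in for a connected MCP server (hypothetical, for illustration)."""
    def call_tool(self, name, params):
        if name == "list_dir":
            return {"entries": [{"name": "README.md", "type": "file"}]}
        raise KeyError(f"unknown tool: {name}")

def dispatch(call_name, params, servers):
    """Strip the sN_ prefix and route the call to the matching server."""
    prefix, _, tool = call_name.partition("_")  # "s1_list_dir" -> ("s1", "list_dir")
    index = int(prefix[1:]) - 1                 # "s1" -> server list index 0
    return servers[index].call_tool(tool, params)

result = dispatch("s1_list_dir", {"path": "."}, [StubServer()])
```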
#### Configuration Options

```bash
# Provider selection
mcp llm --provider openai     # Default
mcp llm --provider anthropic

# Model selection
mcp llm --model gpt-4o                  # Default for OpenAI
mcp llm --model claude-3-opus-20240229  # Default for Anthropic

# API key override (otherwise uses environment variables)
mcp llm --api-key "your-api-key-here"

# Display options
mcp llm --no-color  # Disable colored output
```
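The documented precedence between the `--api-key` flag and the provider environment variables can be sketched as follows; `resolve_api_key` is a hypothetical helper, not the actual source:

```python
import os
from typing import Optional

# Hypothetical helper illustrating the documented precedence:
# the --api-key flag wins; otherwise the provider's env var is consulted.
_ENV_VARS = {"openai": "OPENAI_API_KEY", "anthropic": "ANTHROPIC_API_KEY"}

def resolve_api_key(provider: str, cli_override: Optional[str] = None) -> Optional[str]:
    if cli_override:
        return cli_override
    return os.environ.get(_ENV_VARS[provider])
```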
The LLM integration is designed to make AI-driven workflow automation with MCP tools intuitive and powerful, allowing models to perform complex tasks by combining available tools through natural language requests.
### Project Scaffolding
MCP Tools provides a scaffolding feature to quickly create new MCP servers with TypeScript: