Query Lamatic.ai documentation instantly from any AI assistant — powered by RAG and Model Context Protocol.
Add this to your MCP client config:

```json
{
  "mcpServers": {
    "lamatic-docs": {
      "url": "https://docmcp.lamatic.ai/api/mcp"
    }
  }
}
```

No API key or account required.
- Claude Desktop
- Cursor
- Windsurf
- Cline
- Any HTTP MCP client
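Under the hood, MCP over HTTP speaks JSON-RPC 2.0, so any HTTP client can reach the endpoint directly. A minimal sketch of the `tools/call` request body a client would POST (the envelope shape follows the MCP spec; the `id` value and question text here are arbitrary examples):

```python
import json

# JSON-RPC 2.0 envelope for an MCP tools/call request (per the MCP spec).
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "query_docs",
        "arguments": {"text": "How do I set up a RAG node in Lamatic?"},
    },
}

# This is the body a client POSTs to https://docmcp.lamatic.ai/api/mcp
body = json.dumps(request)
print(body)
```

Clients like Claude Desktop build and send this for you; the sketch just shows what travels over the wire.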
Ask any question about the Lamatic.ai documentation. The `query_docs` tool uses RAG to search all indexed docs and returns a precise answer.
Input: text (string) — the question to ask
Example:
```json
{
  "name": "query_docs",
  "arguments": {
    "text": "How do I set up a RAG node in Lamatic?"
  }
}
```

```
Your Question
      ↓
MCP Client (Claude / Cursor / Windsurf)
      ↓
lamatic-mcp-docs.vercel.app/api/mcp
      ↓
Lamatic RAG Flow
      ↓
VectorDB (indexed Lamatic docs)
      ↓
Answer
```
Built entirely on Lamatic:
- Firecrawl scrapes lamatic.ai/docs
- Chunking + Vectorize indexes into VectorDB
- RAG Node answers questions semantically
- Next.js + Vercel exposes the public MCP endpoint
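The chunk-and-index step above can be sketched as fixed-size chunking with overlap before embedding into the VectorDB. The window and overlap sizes below are hypothetical illustrations, not Lamatic's actual Chunking node defaults:

```python
def chunk_text(text: str, size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into overlapping chunks before embedding.

    Hypothetical parameters for illustration; Lamatic's Chunking node
    may use a different strategy (e.g. sentence- or token-aware splits).
    """
    if size <= overlap:
        raise ValueError("chunk size must exceed overlap")
    chunks = []
    step = size - overlap  # each window starts `overlap` chars before the previous one ends
    for start in range(0, len(text), step):
        chunks.append(text[start:start + size])
        if start + size >= len(text):
            break
    return chunks

# Example: a 1200-character document yields three overlapping chunks.
docs = "x" * 1200
chunks = chunk_text(docs)
```

Overlap keeps a sentence that straddles a chunk boundary fully present in at least one chunk, which improves retrieval recall.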
https://docmcp.lamatic.ai/api/mcp
```shell
git clone https://github.com/Lamatic/Lamatic-MCP-Docs
cd Lamatic-MCP-Docs
npm install
cp .env.example .env   # fill in your Lamatic credentials
npm run dev
```