Say this to your Cursor / Windsurf / Claude Code:
"Clone https://github.com/haih-net/agent , set it up, run it and open in my browser"
IMPORTANT: This project is designed for fully autonomous setup and management by the AI agent. You must understand the entire system architecture and proceed independently, without asking the user for clarification. The user only needs guidance on how to use the system effectively (currently, the simplest use case is chatting with the project's AI agent).
First, ask the user which AI integration they prefer:
Options:
- OpenRouter API key (recommended for best results)
- Local LM Studio or similar local service
- Other custom AI endpoint
Recommended models:
- OpenRouter:
  - Best results: opus 4.5/4.6
  - Minimum: haiku 4.5
- Local models:
  - Minimum: qwen3 4B
Required credentials from user:
- For OpenRouter: API key
- For local service: endpoint URL and any authentication details
- For custom endpoints: full connection details
After startup, the simplest way to verify the system is working:
Send a POST request to http://localhost:3000/webhook/chat-agent-webhook with:
```json
{
  "chatInput": "Hello",
  "sessionId": "your-custom-session-id"
}
```

Use your own `sessionId` to maintain conversation context across requests.
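For instance, this check can be run from a terminal with curl (a sketch assuming the default port 3000; the session ID is an arbitrary example value):

```shell
# Arbitrary example session ID; reuse the same value to keep conversation context
SESSION_ID="demo-session-1"
PAYLOAD=$(printf '{"chatInput": "Hello", "sessionId": "%s"}' "$SESSION_ID")

# Fails gracefully if the stack is not up yet
curl -s -X POST http://localhost:3000/webhook/chat-agent-webhook \
  -H "Content-Type: application/json" \
  -d "$PAYLOAD" || echo "Request failed - is the stack running on port 3000?"
```

A reply in the response body confirms the chat workflow is active.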
After successful startup and verification, open the user's browser to http://localhost:3000 to access the application interface.
Provide the user with simple guidance on how to interact with the system:
- Start with basic conversations in the chat interface
- Explain that the agent learns from each interaction
- Suggest trying different types of queries to see how the system responds
Run it locally.
Interact with it.
Watch it develop.
Open your local AI assistant (Cursor / Windsurf / Claude Code) and say:

```
Clone https://github.com/haih-net/agent
Run it
```
Within minutes, you will have a living knowledge system running entirely on your machine.
It starts with structure.
It grows through interaction.
It evolves through feedback.
KMS-Agent is a personal cognitive system.
It:
- Forms structured memory from every interaction
- Builds behavioral patterns over time
- Evaluates its own decisions
- Tracks confidence and contradictions
- Adjusts internal logic continuously
- Develops continuity across sessions
Every interaction modifies its internal state.
Over time, it becomes uniquely shaped by you.
Each interaction passes through a full cycle:
Stimulus → Processing → Action → Result → Conclusion → Evaluation → Adaptation
This loop creates development.
Not just responses — progression.
KMS-Agent constructs experience.
Experience forms patterns.
Patterns form tendencies.
Tendencies shape future behavior.
You influence development with every interaction.
Repeated contexts create internal reflexes.
Reflexes can be:
- Inspected
- Evaluated
- Reinforced
- Corrected
The system stabilizes over time.
Consistency emerges naturally.
KMS-Agent builds:
- Confidence-weighted knowledge
- Temporal awareness
- Explicit contradiction handling
- Self-evaluation history
- Adaptive internal structures
Its knowledge evolves continuously.
The entire cognitive process runs locally.
Its memory lives on your machine.
Its evolution is personal.
Its trajectory reflects your interactions.
You are shaping a system that develops with you.
The change becomes visible.
It reacts more coherently.
It refines its behavior.
It develops recognizable patterns.
It corrects itself.
It accumulates structure.
The longer it runs, the more defined it becomes.
KMS-Agent is for:

- Builders designing long-term AI systems
- Researchers exploring cognitive architectures
- Developers creating persistent AI entities
- Individuals who want continuity in AI interaction
A Knowledge Management System that develops.
Clone it.
Run it.
Interact with it.
Observe the evolution.
Technical details, architecture, and configuration are documented in the wiki.
- Docker & Docker Compose
- Node.js 22+ and npm (only needed for Scenario 2)
The credentials/ directory is fully gitignored. Create these files manually before starting either scenario.
`credentials/bootstrap.env` — n8n owner account:

```env
N8N_BOOTSTRAP_OWNER_EMAIL=admin@example.com
N8N_BOOTSTRAP_OWNER_PASSWORD=AdminPassword123!
N8N_BOOTSTRAP_OWNER_FIRSTNAME=Admin
N8N_BOOTSTRAP_OWNER_LASTNAME=User
```
`credentials/system/openrouter.json` — AI model API key.

Cloud (OpenRouter):

```json
[
  {
    "id": "openrouter-cred",
    "name": "OpenRouter",
    "type": "openRouterApi",
    "data": { "apiKey": "sk-or-v1-xxx" }
  }
]
```

Local API (LM Studio / LocalAI — `172.17.0.1` is the Docker bridge gateway, i.e. the host machine from inside a container):

```json
[
  {
    "id": "openrouter-cred",
    "name": "OpenRouter",
    "type": "openRouterApi",
    "data": { "apiKey": "local", "url": "http://172.17.0.1:1234/v1" }
  }
]
```

To keep multiple options in one file, rename the unused ones to `data_`, `data__` — only `data` is active.
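As a sketch of that convention, a single file carrying both options might look like this, with the cloud key active and the local endpoint parked under `data_` (values are the same placeholders as above):

```json
[
  {
    "id": "openrouter-cred",
    "name": "OpenRouter",
    "type": "openRouterApi",
    "data": { "apiKey": "sk-or-v1-xxx" },
    "data_": { "apiKey": "local", "url": "http://172.17.0.1:1234/v1" }
  }
]
```

Swapping the `data` and `data_` names switches between the two backends without editing any values.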
`credentials/agents/agent-chat.json`:

```json
{
  "agentName": "Chat Agent",
  "username": "agent-chat",
  "password": "<any password>",
  "email": "agent-chat@example.com",
  "fullname": "Chat Agent"
}
```

`credentials/agents/agent-web-search.json`:

```json
{
  "agentName": "Web Search Agent",
  "username": "agent-web-search",
  "password": "<any password>",
  "email": "agent-web-search@example.com",
  "fullname": "Web Search Agent"
}
```

See `credentials/README.md` for full documentation, including SMTP/IMAP and Telegram setup.
No local Node.js setup required — everything builds inside containers.
Why prod mode for the app? Dev mode mounts source files as volumes but does not include generated files (`src/gql/generated/`, `.next/`, etc.). Prod mode builds everything inside the image during `docker build`, so it's self-contained.
```shell
cp docker/.env.sample docker/.env
```

Fill in `docker/.env`:

```env
SUPABASE_DB_PASSWORD=postgres
SUPABASE_DB_NAME=postgres
DATABASE_URL=postgresql://postgres:postgres@supabase:5432/postgres
JWT_SECRET=<openssl rand -hex 32>
N8N_ENCRYPTION_KEY=<openssl rand -hex 16>
N8N_SECURE_COOKIE=false
N8N_BOOTSTRAP_ACTIVATE_WORKFLOWS=true
N8N_PERSONALIZATION_ENABLED=false
NODES_EXCLUDE=[]
N8N_CUSTOM_EXTENSIONS=./.n8n/custom
GRAPHQL_ENDPOINT=http://localhost:4000/api
```

`DATABASE_URL` must use `@supabase:5432` (the Docker service name), not `@localhost:5432`. `localhost` only works when running outside Docker.
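The two secret placeholders can be generated like this (a minimal sketch; any hex strings of the right length work):

```shell
# 64 hex chars for JWT_SECRET, 32 hex chars for N8N_ENCRYPTION_KEY
JWT_SECRET=$(openssl rand -hex 32)
N8N_ENCRYPTION_KEY=$(openssl rand -hex 16)
echo "JWT_SECRET=$JWT_SECRET"
echo "N8N_ENCRYPTION_KEY=$N8N_ENCRYPTION_KEY"
```

Paste the printed values into `docker/.env` in place of the placeholders.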
```shell
cd docker
DOCKER_BUILDKIT=0 docker compose -f docker-compose.yml -f docker-compose.prod.yml up supabase app --build -d
```

On first run this builds the Docker image: it installs dependencies, runs DB migrations, generates types, and builds the app. This takes a few minutes.
```shell
docker compose -f docker-compose.yml -f docker-compose.dev.yml up traefik -d
```

Do not create the `agicms-default` Docker network manually — let Compose create it. A manually created network lacks Compose labels and will cause an error.
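If the network was already created by hand, removing it before starting Compose lets Compose recreate it with the labels it expects (assumes the stack is stopped, so no container is still attached):

```shell
# Remove the stale hand-made network; harmless if it does not exist
docker network rm agicms-default 2>/dev/null || echo "agicms-default not present"
```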
Result:
- http://localhost:2015 — app (via Traefik proxy)
- http://localhost:8080 — Traefik dashboard
Full hot-reload development mode. Requires Node.js 22+ and npm.
```shell
npm install
```

`docker/.env` (for the Supabase container):

```shell
cp docker/.env.sample docker/.env
```

Set `DATABASE_URL` to `@localhost:5432` (the port is mapped to the host):

```env
SUPABASE_DB_PASSWORD=postgres
SUPABASE_DB_NAME=postgres
DATABASE_URL=postgresql://postgres:postgres@localhost:5432/postgres
JWT_SECRET=<openssl rand -hex 32>
N8N_ENCRYPTION_KEY=<openssl rand -hex 16>
N8N_SECURE_COOKIE=false
N8N_BOOTSTRAP_ACTIVATE_WORKFLOWS=true
N8N_PERSONALIZATION_ENABLED=false
NODES_EXCLUDE=[]
N8N_CUSTOM_EXTENSIONS=./.n8n/custom
GRAPHQL_ENDPOINT=http://localhost:4000/api
```

Root `.env` (read by Prisma and the app server):
```shell
cp .env.example .env
```

```env
DATABASE_URL=postgresql://postgres:postgres@localhost:5432/postgres
JWT_SECRET=<same as in docker/.env>
PORT=3000
N8N_ENCRYPTION_KEY=<same as in docker/.env>
N8N_SECURE_COOKIE=false
N8N_BOOTSTRAP_ACTIVATE_WORKFLOWS=true
N8N_PERSONALIZATION_ENABLED=false
NODES_EXCLUDE=[]
N8N_CUSTOM_EXTENSIONS=./.n8n/custom
GRAPHQL_ENDPOINT=http://localhost:4000/api
```

The root `.env` is separate from `docker/.env` — the app server reads env vars from the root, not from the Docker folder.
```shell
cd docker
docker compose -f docker-compose.yml -f docker-compose.dev.yml up supabase -d
```

Check it's healthy (STATUS: `Up (healthy)`, port 5432 mapped):

```shell
docker compose -f docker-compose.yml -f docker-compose.dev.yml ps supabase
```

```shell
npm run prisma:deploy
```

Expected output:

```
Applying migration `20260119193349_initial`
Applying migration `20260122164751_knowledge_base`
Applying migration `20260125054235_experience_system`
All migrations have been successfully applied.
```
```shell
npm run generate
npm run build:custom-nodes
```

- `generate` — generates the Prisma Client and GraphQL TypeScript types into `src/gql/generated/`
- `build:custom-nodes` — compiles the `CUSTOM.agentOrchestrator` node required by the Chat Agent and Web Search Agent
```shell
npm run clean && npm run dev:n8n
```

`clean` is required before `dev:n8n` — it ensures n8n workflows are fully recreated from scratch on every start. Skipping it may result in stale or duplicate workflows.
Expected result:

```
[bootstrap] Workflow 'Chat Agent' activated
[bootstrap] Workflow 'Web Search Agent' activated
...
[bootstrap] Completed
Ready on http://localhost:3000, API at /api
```
Ports:

- http://localhost:3000 — frontend
- http://localhost:4000/api — GraphQL playground
- http://localhost:5678 — n8n workflow editor
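A quick way to confirm all three services answer (assumes the default ports above; `000` means the port is not reachable):

```shell
for url in http://localhost:3000 http://localhost:4000/api http://localhost:5678; do
  # -w prints the HTTP status code; curl reports 000 when the connection fails
  code=$(curl -s -o /dev/null -w '%{http_code}' "$url" || true)
  echo "$url -> $code"
done
```

Any 2xx or 3xx status means the service behind that port is up.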
The `version` attribute warnings from Docker Compose are harmless and can be ignored.