A high-concurrency, event-driven multi-platform LLM bot framework with modern web dashboard and comprehensive platform integrations.
- Event-driven Architecture - High-concurrency, loosely coupled components
- Stage-based Initialization - LifecycleManager with health monitoring
- Platform Adapter System - Extensible platform integrations
- Message Processing Pipeline - Async processing with error handling and retry logic
- Modern Web Dashboard - Professional AstrBot-style interface
- Unified Startup - Single command deployment
- DingTalk - Full WebSocket integration with voice recognition
- WeCom - Enterprise messaging platform (planned)
- WeCom Customer Service - Customer service integration (planned)
- Dify API - Advanced AI workflows and knowledge base
- OpenAI - GPT models with function calling
- LangChain - Tool integration and agent capabilities
- Multiple LLM Providers - Load balancing and failover
- Voice Recognition - Speech-to-text for voice messages
- Real-time Streaming - Progressive response delivery
- Health Monitoring - Component status and performance metrics
- Error Recovery - Automatic retry and fallback mechanisms
- Hot Reload - Dynamic component updates (planned)
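The "Error Recovery - Automatic retry and fallback mechanisms" feature above can be sketched as an async retry wrapper with exponential backoff. This is a minimal illustration, not the framework's actual implementation; `with_retry` and its defaults are hypothetical names and values:

```python
import asyncio

async def with_retry(coro_factory, max_retries=3, base_delay=0.5):
    """Run an async operation, retrying with exponential backoff on failure.
    Sketch only -- names and defaults are illustrative, not MindBot's API."""
    for attempt in range(max_retries):
        try:
            return await coro_factory()
        except Exception:
            if attempt == max_retries - 1:
                raise  # retries exhausted: surface the error to a fallback handler
            await asyncio.sleep(base_delay * (2 ** attempt))

async def demo():
    calls = {"n": 0}

    async def flaky():
        # Fails twice, then succeeds, to exercise the retry path.
        calls["n"] += 1
        if calls["n"] < 3:
            raise ConnectionError("transient failure")
        return "ok"

    return await with_retry(flaky, base_delay=0.01)

print(asyncio.run(demo()))  # → ok
```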
- Python 3.8+
- pip or uv package manager
# Clone the repository
git clone https://github.com/lycosa9527/MindBot.git
cd MindBot
# Install dependencies
pip install -r requirements.txt
# Configure system settings (optional)
cp config/env.example .env
# Edit .env with your system configuration
# Start MindBot
python run.py
# Access web dashboard at http://localhost:9529
# Configure platform adapters via web interface

MindBot uses a hybrid configuration system:

- System Configuration - environment variables (config/.env file):
  - Web dashboard settings (host, port, SSL)
  - Security settings (rate limiting, CORS)
  - Performance settings (concurrency, timeouts)
  - Monitoring settings (logging, health checks)
- Platform Configuration - managed via the web interface:
  - DingTalk adapters (Client ID, Secret, Robot Code)
  - WeCom adapters (Corp ID, Secret, Agent ID)
  - Dify API key and settings
  - All AI provider configurations
  - Adapter-specific configurations
# Web Dashboard
WEB_DASHBOARD_HOST=0.0.0.0
WEB_DASHBOARD_PORT=9529
EXTERNAL_IP=your_server_ip
EXTERNAL_DOMAIN=your_domain.com
# Security
RATE_LIMITING_ENABLED=true
CORS_ORIGINS=http://localhost:9529,https://your_domain.com
# Performance
MAX_CONCURRENT_MESSAGES=50
MESSAGE_TIMEOUT=30.0
# Monitoring
LOG_LEVEL=INFO
HEALTH_CHECK_INTERVAL=30

- Start MindBot: python run.py
- Open the web dashboard: http://localhost:9529
- Add adapters via the "Add Adapter" button
- Configure the Dify API key via "Edit Dify API Key"
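The system-level variables in the example above can be read with nothing but the standard library. The sketch below shows one plausible loader; `load_system_config` is a hypothetical name and the defaults mirror the example values, not guaranteed framework defaults:

```python
import os

def load_system_config():
    """Read system-level settings from environment variables.
    Hypothetical loader: variable names match the .env example above,
    defaults are illustrative only."""
    return {
        "dashboard_host": os.environ.get("WEB_DASHBOARD_HOST", "0.0.0.0"),
        "dashboard_port": int(os.environ.get("WEB_DASHBOARD_PORT", "9529")),
        "rate_limiting": os.environ.get("RATE_LIMITING_ENABLED", "true").lower() == "true",
        "cors_origins": [o for o in os.environ.get("CORS_ORIGINS", "").split(",") if o],
        "max_concurrent": int(os.environ.get("MAX_CONCURRENT_MESSAGES", "50")),
        "message_timeout": float(os.environ.get("MESSAGE_TIMEOUT", "30.0")),
        "log_level": os.environ.get("LOG_LEVEL", "INFO"),
        "health_check_interval": int(os.environ.get("HEALTH_CHECK_INTERVAL", "30")),
    }

cfg = load_system_config()
print(cfg["dashboard_port"])  # 9529 unless overridden in the environment
```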
from mindbot_framework import MindBotApplication

# Create application
config = {
    "logging": {"level": "INFO"},
    "platforms": {
        "dingtalk": {
            "client_id": "your_client_id",
            "client_secret": "your_client_secret",
            "robot_code": "your_robot_code"
        }
    }
}
app = MindBotApplication(config)

# Register platform adapters
app.register_platform_adapter(dingtalk_adapter)

# Start application
await app.start()

Access the modern web dashboard at http://localhost:9529:
- Real-time Monitoring - Bot status and performance metrics
- Message Testing - Send test messages to verify functionality
- Configuration Management - View and update settings
- Logs Viewer - Real-time log streaming and management
- Health Checks - Component status monitoring
# Start with web dashboard
python run.py
# Manage logs
python tools/manage_logs.py list # List all log files
python tools/manage_logs.py stats # Show log statistics
python tools/manage_logs.py tail # Show recent log entries
python tools/manage_logs.py clean --days 30 # Clean old logs
python tools/manage_logs.py compress # Compress log files
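The `clean --days 30` command above can be approximated in a few lines of stdlib Python. This is a hedged sketch of the idea, not the actual implementation in tools/manage_logs.py:

```python
import time
from pathlib import Path

def clean_old_logs(log_dir, days):
    """Delete *.log files older than `days` days; return the removed file names.
    Sketch only -- the real tools/manage_logs.py may behave differently."""
    cutoff = time.time() - days * 86400
    removed = []
    for path in Path(log_dir).glob("*.log"):
        if path.stat().st_mtime < cutoff:  # last modified before the cutoff
            path.unlink()
            removed.append(path.name)
    return removed
```

For example, `clean_old_logs("logs", 30)` would mirror the CLI invocation shown above.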
# Run example usage
python example_usage.py
# Check framework health
python -c "from mindbot_framework import MindBotApplication; print('Framework loaded successfully')"

mindbot_framework/
├── core/
│   ├── application.py        # MindBotApplication orchestrator
│   ├── lifecycle.py          # LifecycleManager with stages
│   ├── event_bus.py          # Event routing and processing
│   └── message_processor.py  # Async message pipeline
└── platforms/
    └── base.py               # PlatformAdapter interface
Platform Message → PlatformAdapter → EventBus → MessageProcessor → LLM → Response → Platform
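The hand-off between pipeline stages can be illustrated with a minimal publish/subscribe event bus. Class and method names here are illustrative only, not the framework's actual EventBus API:

```python
import asyncio
from collections import defaultdict

class SimpleEventBus:
    """Minimal async pub/sub bus (illustrative; not MindBot's real EventBus)."""
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, event_type, handler):
        # Register an async handler for one event type.
        self._handlers[event_type].append(handler)

    async def publish(self, event_type, payload):
        # Fan the event out to every subscribed handler concurrently.
        handlers = self._handlers.get(event_type, [])
        await asyncio.gather(*(h(payload) for h in handlers))

async def demo():
    bus = SimpleEventBus()
    seen = []

    async def on_message(msg):
        seen.append(msg)

    bus.subscribe("platform.message", on_message)
    await bus.publish("platform.message", {"text": "hello"})
    return seen

print(asyncio.run(demo()))  # → [{'text': 'hello'}]
```

A loosely coupled design like this lets adapters and processors evolve independently: publishers never hold references to subscribers.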
- Setup Logging - Initialize logging system
- Load Configuration - Load and validate config
- Initialize Database - Setup data storage
- Start Platform Adapters - Initialize platform connections
- Start Event Processing - Begin message processing
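The stage sequence above can be modeled as an ordered list of named async stages run by a small lifecycle manager. This is a sketch under assumed names, not the real LifecycleManager:

```python
import asyncio

class MiniLifecycle:
    """Run named startup stages in order, recording status (illustrative sketch)."""
    def __init__(self):
        self.stages = []   # list of (name, coroutine function)
        self.status = {}   # stage name -> "ok" or "failed"

    def add_stage(self, name, func):
        self.stages.append((name, func))

    async def start(self):
        for name, func in self.stages:
            try:
                await func()
                self.status[name] = "ok"
            except Exception:
                self.status[name] = "failed"
                raise  # stop startup on the first failing stage

async def demo():
    lc = MiniLifecycle()
    for name in ["setup_logging", "load_configuration", "initialize_database",
                 "start_platform_adapters", "start_event_processing"]:
        async def stage():
            pass  # stand-in for the real stage work
        lc.add_stage(name, stage)
    await lc.start()
    return lc.status

print(asyncio.run(demo()))
```

Recording per-stage status this way is also what makes health monitoring straightforward: a health check can simply report the status map.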
- Configuration Guide - Complete configuration setup guide
- Web Dashboard Guide - Web interface usage and features
- Implementation Plan - Detailed development roadmap
- Comprehensive Framework - Technical architecture
- Wiki - Setup guides and troubleshooting
- Changelog - Version history and updates
mindbot_poc/
├── mindbot_framework/        # Core framework
├── src/                      # Existing DingTalk bot code
├── templates/                # Web dashboard templates
├── tools/                    # Utility scripts
│   └── manage_logs.py        # Log management tool
├── logs/                     # Log files directory
├── config/                   # Configuration files
│   ├── mindbot_config.json   # Main configuration
│   ├── env_example.txt       # Environment template
│   └── .env                  # Environment variables (create from template)
├── docs/                     # Documentation
├── run.py                    # Main startup script
├── example_usage.py          # Framework usage example
└── requirements.txt          # Dependencies
from mindbot_framework.platforms.base import PlatformAdapter

class MyPlatformAdapter(PlatformAdapter):
    async def initialize(self) -> bool:
        # Platform-specific initialization
        pass

    async def start(self) -> None:
        # Start platform connection
        pass

    async def send_message(self, response: Response) -> bool:
        # Send message to platform
        pass

# Run example usage
python example_usage.py
# Test framework components
python -c "from mindbot_framework.core import *; print('All components loaded')"

# Build and run with Docker
docker build -t mindbot .
docker run -p 9529:9529 mindbot

# Use production WSGI server
gunicorn -w 4 -b 0.0.0.0:9529 app:app

- Core framework architecture
- Lifecycle management system
- Platform adapter interface
- Event bus architecture
- Message processing pipeline
- Web dashboard
- Complete platform integrations
- Vector database & RAG
- LLM provider system
- LangChain integration
- Plugin system
- Workflow capabilities
- Alert & monitoring
- Hot-reload features
- Fork the repository
- Create a feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- DingTalk Open Platform for Stream Mode API
- Dify for AI knowledge base integration
- LangChain for LLM tool integration
- AstrBot for dashboard design inspiration
- Python Community for excellent async libraries
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Documentation: Wiki
Maintainer: MindSpring Team
Last Updated: September 7, 2025
Version: v0.5.0
Status: Active Development