🎯 Turn Reactive AI into Proactive Intelligence
Schedule workflows, orchestrate agents, automate everything—
so your AI works while you sleep
Features • Quick Start • Roadmap • Development • Contributing
Automagik Spark is the automation engine that gives your AI agents a sense of time. While LangFlow and Hive create powerful workflows and agents, Spark makes them proactive—running tasks on schedules, triggering actions automatically, and working 24/7 without manual intervention.
Think of Spark as the alarm clock, calendar, and scheduler for your entire AI infrastructure.
The Challenge: Your AI workflows are brilliant but passive. They sit idle until you remember to trigger them:
- Daily reports that you forget to generate
- Monitoring tasks that need constant attention
- Routine automations that require manual kicks
- Agents that only respond when asked
Our Solution: Spark transforms passive workflows into active systems that work autonomously on your schedule.
Before Spark:
- ❌ Manually trigger workflows every time
- ❌ Set reminders to run routine tasks
- ❌ AI that only reacts when prompted
- ❌ Miss opportunities because agents weren't watching
After Spark:
- ✅ Workflows run automatically on schedule
- ✅ Set it once, runs forever
- ✅ AI that takes initiative proactively
- ✅ Never miss a beat with 24/7 automation
- ⏰ Smart Scheduling: Cron expressions, intervals, or one-time runs—schedule anything
- 🔄 Multi-Source Integration: Connect unlimited LangFlow and Hive instances
- 🤖 Workflow Orchestration: Sync and manage flows/agents from a unified control center
- 👷 Distributed Workers: Scale task execution with Celery-powered workers
- 📊 Real-time Monitoring: Track every task execution with detailed logs and status
- 🎯 MCP Native: Control Spark programmatically via Model Context Protocol
- 🔒 Privacy-First Telemetry: Anonymous usage analytics with full opt-out control
- 📡 REST API: Full-featured API with interactive Swagger documentation
- 🛠️ Powerful CLI: Manage everything from the command line
- 🐳 Flexible Deployment: Docker, PM2, or bare metal—your choice
```bash
# Morning briefing agent runs daily at 9 AM
automagik-spark schedule create my-briefing-agent "0 9 * * *"

# Database backup runs every Sunday at 2 AM
automagik-spark schedule create backup-flow "0 2 * * 0"

# Monitor API health every 5 minutes
automagik-spark schedule create health-check "*/5 * * * *"
```
- Customer Support: Auto-respond to tickets during off-hours
- DevOps: Automated health checks and incident response
- Data Teams: Scheduled ETL pipelines and report generation
- Marketing: Automated content generation and social posts
- Compliance: Scheduled audits and compliance checks
- Operations: Proactive monitoring and alerting systems
- Analytics: Automated data processing and insights
- Finance: Scheduled reconciliation and reporting
```mermaid
graph TB
    subgraph "Your AI Infrastructure"
        LF[LangFlow Instances]
        HV[Automagik Hive]
        OM[Automagik Omni]
    end

    subgraph "Spark Engine"
        API[API Server<br/>FastAPI]
        WK[Workers<br/>Celery]
        DB[(PostgreSQL<br/>State & Tasks)]
        RD[(Redis<br/>Queue)]
    end

    subgraph "Interfaces"
        CLI[CLI Tool]
        MCP[MCP Server]
        REST[REST API]
    end

    LF <-->|Sync Flows| API
    HV <-->|Sync Agents| API
    API -->|Queue Tasks| RD
    RD -->|Execute| WK
    WK -->|Trigger| LF
    WK -->|Trigger| HV
    WK -->|Notify| OM
    API <-->|Store State| DB
    CLI -->|Commands| API
    MCP -->|Control| API
    REST -->|HTTP| API
```
1. Connect: Link your LangFlow/Hive instances
2. Sync: Discover all available workflows and agents
3. Schedule: Define when and how often tasks should run
4. Execute: Workers trigger workflows automatically
5. Monitor: Track execution history and results
6. Scale: Add more workers as your automation grows
```bash
# 1. Add your LangFlow instance
automagik-spark source add http://langflow:7860 YOUR_API_KEY

# 2. Sync the flow you want to schedule
automagik-spark workflow sync "daily-sales-report"

# 3. Schedule it to run every weekday at 8 AM
automagik-spark schedule create WORKFLOW_ID "0 8 * * 1-5"

# That's it! Your report now generates automatically every weekday morning
```
- Python 3.12+ (we use the latest async features)
- PostgreSQL 12+ (for persistent state)
- Redis 6+ (for task queuing)
- Optional: Docker & Docker Compose (for containerized setup)
```bash
git clone https://github.com/namastexlabs/automagik-spark.git
cd automagik-spark
./scripts/setup_local.sh
```

```bash
git clone https://github.com/namastexlabs/automagik-spark.git
cd automagik-spark
./scripts/setup_dev.sh
```
After setup completes, you'll have:
- API Server: http://localhost:8883
- Interactive Docs: http://localhost:8883/api/v1/docs
- PostgreSQL: localhost:15432
- Redis: localhost:6379
- CLI Tool: the `automagik-spark` command
- Workers: Running and ready to execute tasks
```bash
# Check API health
curl http://localhost:8883/api/v1/health

# Try CLI commands
source .venv/bin/activate
automagik-spark --help

# View API documentation
open http://localhost:8883/api/v1/docs
```
```bash
# Add a workflow source
automagik-spark source add https://my-langflow.com API_KEY_HERE

# List all available workflows
automagik-spark workflow list

# Sync a specific workflow
automagik-spark workflow sync "email-processor"

# Create a schedule (runs daily at midnight)
automagik-spark schedule create WORKFLOW_ID "0 0 * * *"

# List all schedules
automagik-spark schedule list

# View task execution history
automagik-spark task list
```
```bash
# Add a source
curl -X POST http://localhost:8883/api/v1/sources \
  -H "Content-Type: application/json" \
  -d '{"url": "https://langflow.example.com", "api_key": "..."}'

# Create a schedule
curl -X POST http://localhost:8883/api/v1/schedules \
  -H "Content-Type: application/json" \
  -d '{"workflow_id": "...", "cron": "0 0 * * *"}'
```
Spark is available as an MCP tool in Automagik Tools:
```bash
# Install MCP server
uvx automagik-tools hub

# Now use natural language from Claude Code, Cursor, etc.:
# "Schedule my daily-report workflow to run every morning at 9am"
```
```bash
# Database
DATABASE_URL=postgresql://user:pass@localhost:15432/spark

# Redis
REDIS_URL=redis://localhost:6379

# API
API_HOST=0.0.0.0
API_PORT=8883

# Telemetry (optional)
AUTOMAGIK_SPARK_DISABLE_TELEMETRY=false
```
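For illustration, these variables could be read in Python like so. `load_config` is a hypothetical sketch, not Spark's actual settings module; its defaults simply mirror the example values above:

```python
import os

def load_config(env=None) -> dict:
    """Read Spark-style settings from the environment, with example defaults."""
    env = os.environ if env is None else env
    return {
        "database_url": env.get(
            "DATABASE_URL", "postgresql://user:pass@localhost:15432/spark"
        ),
        "redis_url": env.get("REDIS_URL", "redis://localhost:6379"),
        "api_host": env.get("API_HOST", "0.0.0.0"),
        "api_port": int(env.get("API_PORT", "8883")),
        "telemetry_disabled": env.get(
            "AUTOMAGIK_SPARK_DISABLE_TELEMETRY", "false"
        ).lower() == "true",
    }
```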
"*/5 * * * *" # Every 5 minutes
"0 * * * *" # Every hour
"0 0 * * *" # Daily at midnight
"0 9 * * 1-5" # Weekdays at 9 AM
"0 0 * * 0" # Every Sunday at midnight
"0 0 1 * *" # First day of every month
```bash
# View recent task executions
automagik-spark task list --limit 20

# Check worker status
automagik-spark worker status

# View system health
curl http://localhost:8883/api/v1/health
```
Spark collects anonymous usage metrics to improve the product:
What we collect:
- Command usage frequency
- API endpoint usage patterns
- Workflow execution statistics
- Error rates (no error details)
What we DON'T collect:
- Personal information
- Workflow content or data
- API keys or credentials
- File paths or environment variables
Disable telemetry anytime:
```bash
# Environment variable
export AUTOMAGIK_SPARK_DISABLE_TELEMETRY=true

# CLI command
automagik-spark telemetry disable

# Opt-out file
touch ~/.automagik-no-telemetry
```
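The environment variable and opt-out file could be checked with logic like the sketch below. This is an illustration, not Spark's implementation, and it does not cover the setting persisted by the CLI command; the precedence shown is an assumption:

```python
import os
from pathlib import Path

def telemetry_enabled(env=None, home=None) -> bool:
    """Return False if either opt-out signal (env var or marker file) is set."""
    env = os.environ if env is None else env
    home = Path.home() if home is None else Path(home)
    if env.get("AUTOMAGIK_SPARK_DISABLE_TELEMETRY", "").lower() == "true":
        return False
    if (home / ".automagik-no-telemetry").exists():
        return False
    return True
```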
Spark is the heartbeat of the Automagik ecosystem:
- Automagik Hive: Schedule multi-agent workflows
- Automagik Omni: Send notifications on schedule
- Automagik Forge: Trigger task execution
- Automagik Tools: Control Spark via MCP
- LangFlow: Schedule visual AI workflows
```bash
# Clone and setup
git clone https://github.com/namastexlabs/automagik-spark.git
cd automagik-spark
./scripts/setup_dev.sh

# Activate virtual environment
source .venv/bin/activate

# Run tests
pytest

# Check code quality
ruff format . && ruff check . && mypy .

# Run API server in dev mode
python -m automagik_spark.api
```
```
automagik-spark/
├── automagik_spark/      # Main package
│   ├── api/              # FastAPI application
│   ├── workers/          # Celery workers
│   ├── models/           # SQLAlchemy models
│   ├── services/         # Business logic
│   └── cli/              # CLI commands
├── tests/                # Test suite
├── scripts/              # Setup and utility scripts
└── docs/                 # Documentation
```
```bash
# Run all tests
pytest

# Run with coverage
pytest --cov=automagik_spark --cov-report=html

# Run specific test file
pytest tests/test_scheduler.py

# Run with verbose output
pytest -v
```
- Multi-source workflow management (LangFlow + Hive)
- Cron-based scheduling with interval support
- Distributed worker architecture with Celery
- REST API with Swagger documentation
- Powerful CLI tool for workflow management
- MCP server integration via Automagik Tools
- Privacy-focused telemetry system
- Step-by-step Workflows: Break complex automations into discrete steps
- Step Outputs: Pass data between workflow steps seamlessly
- Natural Language Scheduling: "Run every morning" instead of cron syntax
- /wish System: AI assistant that helps build workflows
- Improved DX: Simpler setup and instance management
- Workflow Dependencies: Trigger workflows based on other completions
- Omni Integration: Deep integration with messaging platform
- Forge Integration: Trigger tasks from Kanban board
- Conditional Execution: Run workflows based on conditions
- Retry Policies: Auto-retry failed tasks with backoff
- Workflow Marketplace: Share and discover automation templates
- Visual Workflow Builder: Drag-and-drop scheduling interface
- Advanced workflow orchestration with branching logic
- Multi-tenant support for enterprise deployments
- Real-time collaboration on workflow management
- AI-powered workflow optimization suggestions
- Integration with major observability platforms
- Cloud-hosted version for zero-setup deployments
We love contributions! However, to maintain project coherence:
- Discuss First: Open an issue before starting work
- Align with Roadmap: Ensure changes fit our vision
- Follow Standards: Match existing code patterns (async/await, type hints)
- Test Thoroughly: Include tests for new features (>70% coverage)
- Document Well: Update docstrings and documentation
- Quality Checks: Run `ruff format . && ruff check . && mypy .` before submitting
- Conventional Commits: Use proper commit messages with co-author:

```
feat: add step-based workflow execution

Co-authored-by: Automagik Genie 🧞 <[email protected]>
```
See CONTRIBUTING.md for detailed guidelines.
Special thanks to:
- The LangFlow team for building an amazing visual workflow platform
- The Celery project for robust distributed task execution
- All our early adopters and contributors who helped shape Spark
- The open-source community for inspiration and support
MIT License - see LICENSE file for details.
- GitHub: github.com/namastexlabs/automagik-spark
- Discord: discord.gg/xcW8c7fF3R
- Twitter: @automagikdev
- DeepWiki Docs: deepwiki.com/namastexlabs/automagik-spark
🚀 Stop manually triggering workflows. Start automating everything.
Spark - The Automation Engine That Never Sleeps ⚡
Star us on GitHub •
Join our Discord
Made with ❤️ by Namastex Labs
AI that elevates human potential, not replaces it