The Universal Job Manager Epic has been broken down into six buildable, testable phases to enable incremental development and validation. Each phase can be developed independently and has clear success criteria.
Build the foundational infrastructure that everything else depends on.
- File: phase-1a-database-schemas.md - Duration: 3-5 days
- Focus: MongoDB collections, job models, basic storage layer
- Testing: Database operations, CRUD functionality, schema migration
- Success: Can create, store, and retrieve job data
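To make the Phase 1a deliverable concrete, here is a minimal sketch of what a job record might look like as a C++ model backing the MongoDB `jobs` collection. Every field name and the status set are illustrative assumptions, not the project's actual schema:

```cpp
#include <cassert>
#include <chrono>
#include <string>

// Hypothetical job record mirroring a document in the "jobs" collection.
// Field names and status values are illustrative only.
enum class JobStatus { Pending, Running, Completed, Failed };

struct Job {
    std::string id;           // MongoDB ObjectId as a hex string
    std::string type;         // e.g. "crawl"
    JobStatus status = JobStatus::Pending;
    int attempts = 0;         // incremented on each retry
    std::chrono::system_clock::time_point createdAt;
    std::string payload;      // job-specific parameters as a JSON string
};

// Only Pending jobs may be claimed by a worker; jobs found in the
// Running state at startup would be treated as crashed and reset.
inline bool isClaimable(const Job& j) {
    return j.status == JobStatus::Pending;
}
```

A storage layer built on this shape maps naturally onto the CRUD tests listed above (create, fetch by id, update status, delete).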
- File: phase-1b-jobqueue-workers.md - Duration: 5-7 days
- Focus: Job processing engine, worker pools, Redis integration
- Testing: Job execution, queue operations, crash recovery
- Success: Can process jobs asynchronously with crash recovery
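The worker-pool pattern Phase 1b builds can be sketched in plain C++: a shared queue, a fixed pool of worker threads, and an orderly shutdown that drains remaining jobs. This is an in-process illustration only; the actual phase backs the queue with Redis so jobs survive process crashes, and all names here are hypothetical:

```cpp
#include <atomic>
#include <cassert>
#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Illustrative worker pool: N threads consume jobs from a shared queue.
// Phase 1b replaces the std::queue with Redis-backed operations.
class WorkerPool {
public:
    explicit WorkerPool(std::size_t n) {
        for (std::size_t i = 0; i < n; ++i)
            workers_.emplace_back([this] { run(); });
    }
    ~WorkerPool() {
        {
            std::lock_guard<std::mutex> lk(m_);
            stopping_ = true;
        }
        cv_.notify_all();
        for (auto& w : workers_) w.join();
    }
    void enqueue(std::function<void()> job) {
        {
            std::lock_guard<std::mutex> lk(m_);
            jobs_.push(std::move(job));
        }
        cv_.notify_one();
    }
private:
    void run() {
        for (;;) {
            std::function<void()> job;
            {
                std::unique_lock<std::mutex> lk(m_);
                cv_.wait(lk, [this] { return stopping_ || !jobs_.empty(); });
                if (stopping_ && jobs_.empty()) return;  // drain, then exit
                job = std::move(jobs_.front());
                jobs_.pop();
            }
            job();  // execute outside the lock so workers run in parallel
        }
    }
    std::mutex m_;
    std::condition_variable cv_;
    std::queue<std::function<void()>> jobs_;
    std::vector<std::thread> workers_;
    bool stopping_ = false;
};
```

Crash recovery falls out of the persistent queue rather than this class: on restart, jobs still marked in-flight in Redis are re-enqueued.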
Integrate job system with existing crawler and create API endpoints.
- File: phase-2a-job-api-controllers.md - Duration: 4-6 days
- Focus: REST API endpoints, job submission, status queries
- Testing: API functionality, authentication, performance
- Success: Can submit and manage jobs via HTTP API
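A first slice of Phase 2a is the validation step behind job submission: reject malformed requests before anything touches the queue. The struct, field names, and limits below are assumptions for illustration, not the project's actual API contract:

```cpp
#include <cassert>
#include <optional>
#include <string>

// Hypothetical deserialized body of POST /api/v2/jobs.
struct JobRequest {
    std::string type;     // e.g. "crawl"
    std::string payload;  // job-specific parameters as a JSON string
};

// Returns an error message, or std::nullopt when the request is valid.
// The 64 KiB payload cap is an illustrative choice, not a project rule.
std::optional<std::string> validateJobRequest(const JobRequest& r) {
    if (r.type.empty())
        return std::string("missing job type");
    if (r.payload.size() > 64 * 1024)
        return std::string("payload too large");
    return std::nullopt;
}
```

Keeping validation as a pure function like this makes it trivially unit-testable, separate from the HTTP layer.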
- File: phase-2b-crawler-integration.md - Duration: 5-7 days
- Focus: Integrate existing Crawler with job system
- Testing: Crawl jobs, progress reporting, backward compatibility
- Success: Existing crawler works within job framework
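One way Phase 2b can preserve backward compatibility is a thin adapter: the crawler keeps its existing interface, and a handler maps a generic job onto it, reporting progress through a callback so the crawler never depends on the job framework. `Crawler` here is a stub stand-in, not the project's real class:

```cpp
#include <cassert>
#include <functional>
#include <string>

// Stand-in for the existing crawler; the real class has its own API.
class Crawler {
public:
    // Stubbed: pretend every requested page was fetched.
    int crawl(const std::string& /*url*/, int maxPages) { return maxPages; }
};

struct CrawlJobResult {
    bool ok;
    int pagesFetched;
};

// Adapter: runs a crawl as a job. The progress callback lets the job
// system record status without the crawler knowing about jobs at all.
CrawlJobResult runCrawlJob(Crawler& c, const std::string& url, int maxPages,
                           const std::function<void(int)>& onProgress) {
    int pages = c.crawl(url, maxPages);
    if (onProgress) onProgress(pages);
    return {pages > 0, pages};
}
```

Existing call sites keep calling `Crawler::crawl` directly, while the job framework goes through the adapter.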
Build real-time monitoring and web dashboard for users.
- File: phase-3a-realtime-status.md - Duration: 6-8 days
- Focus: WebSocket, SSE, polling fallbacks, Redis pub/sub
- Testing: Real-time updates, connection management, scalability
- Success: Live job status updates in web browsers
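The fan-out at the heart of Phase 3a can be modeled in-process: status updates are published on a per-job channel and every subscriber (a WebSocket, SSE stream, or poller's cache) receives them. The real phase does this through Redis pub/sub so multiple server instances stay in sync; all names below are illustrative:

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <unordered_map>
#include <vector>

// In-process sketch of the pub/sub fan-out Phase 3a builds on Redis.
class StatusBus {
public:
    using Handler = std::function<void(const std::string&)>;

    // Register a connection's callback on a per-job channel.
    void subscribe(const std::string& jobId, Handler h) {
        subs_[jobId].push_back(std::move(h));
    }

    // Deliver a status update (JSON string) to every subscriber.
    void publish(const std::string& jobId, const std::string& statusJson) {
        auto it = subs_.find(jobId);
        if (it == subs_.end()) return;
        for (auto& h : it->second) h(statusJson);
    }

private:
    std::unordered_map<std::string, std::vector<Handler>> subs_;
};
```

Polling fallbacks can reuse the same bus by having a subscriber write the latest status into a cache the poll endpoint reads.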
- File: phase-3b-frontend-dashboard.md - Duration: 6-8 days
- Focus: Responsive web interface, job management UI
- Testing: UI functionality, mobile responsiveness, accessibility
- Success: Complete job management dashboard
Phase 1a → Phase 1b → Phase 2a → Phase 2b → Phase 3a → Phase 3b
Each phase builds on the previous one, with clear interfaces between components.
- Unit Tests: Component-level functionality
- Integration Tests: Cross-component interaction
- Performance Tests: Benchmark critical paths
- End-to-End Tests: Complete workflow validation
- Build and compile successfully
- Run test suite with >90% coverage
- Performance benchmarks meet targets
- Manual testing of key scenarios
- Code review and documentation update
```bash
# 1. Start with database schemas
cd /root/search-engine-core
git checkout -b feature/phase-1a-database-schemas

# 2. Create MongoDB collections
# 3. Implement job models
# 4. Write unit tests
# 5. Test with Docker container

# 6. Validate phase completion
./build/test_job_storage --test=connection
./build/test_job_models --test=crud
```

Each phase includes specific commands to validate completion:
```bash
# Phase 1a: database schemas
./build/test_job_storage --test=connection
./build/test_job_models --test=crud
docker exec mongodb_test mongosh --eval "db.jobs.find().limit(1)"

# Phase 1b: job queue and workers
./build/test_job_queue --test=enqueue_dequeue
./build/test_worker_service --test=worker_lifecycle
./build/test_crash_recovery --test=restart_recovery

# Phase 2a: job API
curl -X POST http://localhost:3000/api/v2/jobs -H "Content-Type: application/json"
./tests/api/test_job_endpoints.sh
./tests/performance/load_test_job_api.sh
```

For each phase, ensure:
- All tasks in phase issue completed
- Unit tests passing (>90% coverage)
- Integration tests working
- Performance targets met
- Documentation updated
- Code review approved
- Manual validation successful
- Week 2: Phase 1 Complete (Database + Queue System)
- Week 4: Phase 2 Complete (API + Crawler Integration)
- Week 6: Phase 3 Complete (Real-time Dashboard)
- Week 8: Production Ready (Testing + Optimization)
Each phase must follow these project-specific rules:
- MongoDB Integration: Always use MongoDBInstance::getInstance() before creating clients
- uWebSockets Safety: Always pair res->onData() with res->onAborted()
- Controller Initialization: Use lazy initialization pattern (no service init in constructors)
- Debug Output: Use LOG_DEBUG() instead of std::cout (configurable via LOG_LEVEL)
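The lazy-initialization rule above can be sketched as follows: the controller's constructor does no service work, and the service is created on first use. `JobController` and `JobService` are illustrative names, not the project's actual classes:

```cpp
#include <cassert>
#include <memory>
#include <mutex>

// Stand-in for a service that is expensive or unsafe to build early
// (e.g. one that needs a live MongoDB connection).
class JobService {
public:
    bool ready() const { return true; }
};

class JobController {
public:
    JobController() = default;  // rule: no service construction here

    // The service is built exactly once, on first access.
    JobService& service() {
        std::call_once(once_, [this] {
            service_ = std::make_unique<JobService>();
        });
        return *service_;
    }

private:
    std::once_flag once_;
    std::unique_ptr<JobService> service_;
};
```

`std::call_once` keeps the first access thread-safe, which matters once handlers run on multiple uWebSockets threads.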
- Unit Tests: Test individual components in isolation
- Integration Tests: Test component interactions
- Performance Tests: Validate speed and scalability targets
- Regression Tests: Ensure existing functionality preserved
Before moving to next phase:
- All tests must pass
- Performance benchmarks must be met
- Code review must be approved
- Manual testing scenarios validated
- Documentation must be updated
- API Response Time: < 100ms for job submission
- System Reliability: >99.9% uptime, zero job loss on crashes
- Performance: Support 1000+ concurrent jobs
- Scalability: Handle 10,000+ jobs per hour
- Build Time: Each phase adds <30 seconds to build time
- Test Time: Full test suite completes in <5 minutes
- Development Speed: New job types can be added in <1 day
- Debugging: Clear logs and monitoring for troubleshooting
All phase issues are in: /root/search-engine-core/.github/ISSUE_TEMPLATE/
- API Documentation: Will be generated during Phase 2a
- Architecture Documentation: Updated during each phase
- Deployment Guide: Completed during Phase 3b
- Each issue contains detailed implementation notes
- Critical implementation rules are documented in each phase
- Common pitfalls and solutions are included
- Performance targets and validation commands provided
🚀 Ready to start? Begin with Phase 1a: Core Database Schemas & Models
📈 Total Estimated Timeline: 8-12 weeks for complete implementation