A comprehensive Python web service template using FastAPI with a full-featured, Docker-based stack. It provides both REST API endpoints and a web interface via Jinja2 templating, complete with authentication, file storage, background tasks, and production-ready deployment features.
- FastAPI - Modern, fast web framework for building APIs
- Jinja2 - Template engine for web interface
- PostgreSQL - Primary database with async support
- Redis - Caching and session storage
- Celery - Background task processing
- MinIO - S3-compatible object storage
- Nginx - Reverse proxy with SSL/TLS support
- Certbot - Automated SSL certificate management
- uv - Ultra-fast Python package management
- Ruff - Modern Python linter and formatter
- MyPy - Static type checking
- GitHub Actions - CI/CD pipeline with automated testing
- Docker & Docker Compose - Full containerization
- Makefile - Easy setup and common tasks
- VS Code Debugger - Development debugging support
- SSH Support - Private repository access
- Comprehensive Testing - 51+ automated tests covering all components
- JWT Authentication - Secure token-based auth
- Password Security - Bcrypt hashing with policies
- CORS Support - Cross-origin resource sharing
- Security Middleware - Request validation and protection
- Structured Logging - Enhanced logging with context
- Async Database Operations - Non-blocking database access
- Database Migrations - Alembic for schema management
- File Upload/Management - S3-compatible storage integration
- Multi-Database Support - Multiple database configurations
- Data Validation - Pydantic models with validation
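The template presumably issues its JWTs through a dedicated library (such as PyJWT or python-jose); purely to illustrate the token mechanics behind the authentication feature, here is a minimal HS256 sketch using only the standard library. `SECRET` and the claim names are illustrative, not the template's actual configuration:

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"change-me"  # in the template this would come from .env


def _b64(data: bytes) -> str:
    """URL-safe base64 without padding, as used in JWTs."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def create_token(sub: str, ttl: int = 3600) -> str:
    """Build a minimal HS256 JWT: header.payload.signature."""
    header = _b64(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64(json.dumps({"sub": sub, "exp": int(time.time()) + ttl}).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = _b64(hmac.new(SECRET, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"


def verify_token(token: str) -> dict:
    """Check the signature and expiry, then return the payload claims."""
    header, payload, sig = token.split(".")
    signing_input = f"{header}.{payload}".encode()
    expected = _b64(hmac.new(SECRET, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    padded = payload + "=" * (-len(payload) % 4)
    claims = json.loads(base64.urlsafe_b64decode(padded))
    if claims["exp"] < time.time():
        raise ValueError("token expired")
    return claims
```

A real deployment should prefer a maintained library over hand-rolled crypto; the sketch only shows why a tampered or expired token is rejected before any request handler runs.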
1. Install Docker and Docker Compose.
2. Copy the `.env-copy` file to `.env` and fill in the variables.
3. If the Docker build requires access to a private repository, load your SSH key into your ssh-agent:
   ```bash
   eval "$(ssh-agent -s)"
   ssh-add ~/.ssh/<your-ssh-key>  # set the path to your ssh key
   ```
4. Build the Docker Compose stack:
   ```bash
   docker-compose build
   ```
5. Start the stack:
   ```bash
   docker-compose up
   ```
6. The web server should now be running on localhost on the port defined in the `.env` file.
7. Create a bucket in the MinIO server with the name defined in the `.env` file.
8. The project uses Alembic to manage the database. To create the schema, run:
   ```bash
   alembic upgrade head
   ```
1. Install Docker and Docker Compose.
2. Copy the `.env-copy` file to `.env` and fill in the variables.
3. If the Docker build requires access to a private repository, load your SSH key into your ssh-agent:
   ```bash
   eval "$(ssh-agent -s)"
   ssh-add ~/.ssh/<your-ssh-key>  # set the path to your ssh key
   ```
4. Run the Makefile:
   ```bash
   make init
   ```
5. The web server should now be running on localhost on the port defined in the `.env` file.
The project uses uv for ultra-fast Python dependency management.
- uv (latest version): see its installation guide
- Python 3.10+
- Docker (for full stack development)
- docker-compose (for orchestration)
1. Install uv (if not already installed):
   ```bash
   curl -LsSf https://astral.sh/uv/install.sh | sh
   ```
2. Clone and set up:
   ```bash
   git clone <repository-url>
   cd Web-Service-Python
   cp .env-copy .env  # Edit with your settings
   ```
3. Install dependencies:
   ```bash
   uv sync --dev  # Install all dependencies including dev tools
   ```
4. Activate the virtual environment:
   ```bash
   source .venv/bin/activate  # Or use uv run <command>
   ```
```bash
uv add package-name               # Production dependency
uv add --dev package-name         # Development dependency
uv add package-name==1.2.3        # Specific version
uv remove package-name            # Remove a dependency
uv sync                           # Sync with uv.lock
uv lock --upgrade                 # Update lock file with latest versions
uv add package-name@latest --dev  # Upgrade a specific package
```
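These commands keep `pyproject.toml` and `uv.lock` in sync for you. As a rough sketch, the resulting dependency sections might look like this (package names and version bounds are illustrative, not the template's actual manifest):

```toml
[project]
name = "web-service-python"
requires-python = ">=3.10"
dependencies = [
    "fastapi>=0.110",   # added by `uv add fastapi`
    "jinja2>=3.1",
]

[dependency-groups]
dev = [
    "pytest>=8.0",      # added by `uv add --dev pytest`
    "ruff>=0.4",
]
```

Note that the exact table used for dev dependencies (`[dependency-groups]` vs the older `[tool.uv] dev-dependencies`) depends on the uv version.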
The project includes comprehensive testing with 51+ automated tests:
```bash
# Run all working tests
uv run pytest src/tests/test_basic.py \
    src/tests/test_config.py \
    src/tests/test_utils.py \
    src/tests/test_s3.py \
    src/tests/test_celery.py -v

# Run tests with coverage
uv run pytest --cov=src --cov-report=html

# Run specific test file
uv run pytest src/tests/test_config.py -v

# Run with environment variables
POSTGRES_USER=test_user POSTGRES_PASSWORD=test_pass \
S3_ACCESS_KEY_ID=test_key S3_ACCESS_KEY=test_secret \
uv run pytest src/tests/ -v
```
```bash
uv run ruff check .                # Lint code
uv run ruff format .               # Format code
uv run mypy src/                   # Type checking
uv run pre-commit run --all-files  # Run all hooks
```
The project includes a comprehensive GitHub Actions workflow that automatically:
- Tests across Python versions (3.10, 3.11, 3.12)
- Runs code quality checks (linting, formatting, type checking)
- Executes all test suites with coverage reporting
- Builds and tests Docker images
- Uploads coverage to Codecov
- Runs on pull requests and pushes to main/develop branches
The CI pipeline ensures code quality and prevents regressions before merging changes.
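A workflow of that shape might look roughly like the sketch below. The file name, action versions, and step details are illustrative assumptions, not the template's actual workflow:

```yaml
# .github/workflows/ci.yml (illustrative sketch)
name: CI
on:
  push:
    branches: [main, develop]
  pull_request:

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.10", "3.11", "3.12"]
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v5
        with:
          python-version: ${{ matrix.python-version }}
      - run: uv sync --dev
      - run: uv run ruff check .
      - run: uv run mypy src/
      - run: uv run pytest --cov=src --cov-report=xml
      - uses: codecov/codecov-action@v4
```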
Pre-commit hooks run code quality checks before each commit:
```bash
uv run pre-commit install          # Install git commit hooks
uv run pre-commit run --all-files  # Run on all files manually
```
This will automatically run `ruff` (linting and formatting) and `mypy` (type checking) before each commit.
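A `.pre-commit-config.yaml` wiring up those hooks typically looks like this (the pinned `rev` values are illustrative; the template's own file may differ):

```yaml
repos:
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.4.4
    hooks:
      - id: ruff          # linting
      - id: ruff-format   # formatting
  - repo: https://github.com/pre-commit/mirrors-mypy
    rev: v1.10.0
    hooks:
      - id: mypy          # type checking
```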
To create a new migration:
```bash
alembic revision --autogenerate -m "migration message"
```
To apply migrations:
```bash
alembic upgrade head
```
To roll back the last migration:
```bash
alembic downgrade -1
```
- Install the Python extension for VS Code.
- Use the included `launch.json` file to run the debugger.
- Use the custom Docker Compose file to run the debugger:
  ```bash
  docker-compose -f docker-compose.debug.yml up
  ```
- Set breakpoints in the code and start the debugger in VS Code (wait for debugpy to be installed).
- The debugger should be listening on port 5678.

To run without debugging, comment out the debugpy command in the `start-debug.sh` file and run the Docker Compose file above.
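An attach configuration matching that setup would look roughly like the sketch below; the template ships its own `launch.json`, and the `remoteRoot` path is an assumption about where the code lives inside the container:

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Attach to container",
      "type": "debugpy",
      "request": "attach",
      "connect": { "host": "localhost", "port": 5678 },
      "pathMappings": [
        { "localRoot": "${workspaceFolder}", "remoteRoot": "/app" }
      ]
    }
  ]
}
```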
This project uses semantic versioning (SemVer) with automated releases through GitHub Actions.
- Major: Breaking changes (1.0.0 → 2.0.0)
- Minor: New features, backward compatible (1.0.0 → 1.1.0)
- Patch: Bug fixes, backward compatible (1.0.0 → 1.0.1)
- Prerelease: Alpha/beta versions (1.0.0-alpha.1)
Use the convenient Make targets for version management:
```bash
# Check current version
make version

# Bump version (creates commit + tag + triggers release)
make bump-patch       # 1.0.0 → 1.0.1 (bug fixes)
make bump-minor       # 1.0.0 → 1.1.0 (new features)
make bump-major       # 1.0.0 → 2.0.0 (breaking changes)
make bump-prerelease  # 1.0.0 → 1.0.1-alpha.0 (pre-release)

# Generate release notes
make release-notes    # Show changes since last release
make changelog        # Update CHANGELOG.md
```
The project supports automated releases based on commit messages using conventional commits:
- `feat:` triggers a minor release (new feature)
- `fix:` triggers a patch release (bug fix)
- `feat!:` or `BREAKING CHANGE:` triggers a major release
- Other commit types don't trigger releases
Example commits:
```bash
git commit -m "feat: add user authentication"    # → Minor release
git commit -m "fix: resolve database connection" # → Patch release
git commit -m "feat!: redesign API endpoints"    # → Major release
```
When you push a version tag or use automated releases:
- Automated testing runs (all 84+ tests must pass)
- Python package is built and attached to release
- GitHub release is created with auto-generated notes
- Changelog is updated automatically
Each release includes:
- Source code (tar.gz, zip)
- Python wheel (built package)
- Release notes with changelog
- Installation instructions
To run Nginx and Certbot, use the nginx profile:
```bash
docker-compose --profile nginx up
```
1. Modify the configuration in `nginx/app.conf` and `init_cert.sh` with the appropriate config/credentials.
2. Run the init script (ensure that you have created the appropriate DNS mapping for the server at your domain provider):
   ```bash
   ./init_cert.sh
   ```
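For orientation, a minimal server block of the kind configured in `nginx/app.conf` might look like this; the domain, certificate paths, and upstream service name/port are placeholders, not the template's actual values:

```nginx
server {
    listen 80;
    server_name example.com;
    # Serve Certbot HTTP-01 challenge files, redirect everything else to HTTPS
    location /.well-known/acme-challenge/ { root /var/www/certbot; }
    location / { return 301 https://$host$request_uri; }
}

server {
    listen 443 ssl;
    server_name example.com;
    ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;
    location / {
        proxy_pass http://app:8000;  # FastAPI container; name and port are placeholders
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```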