This repository provides a setup for running a FastAPI application backed by a Llama 3 model served through Ollama, orchestrated with Docker Compose. It simplifies managing multiple services and configurations, making the stack easy to deploy and run locally.
- Docker Compose Setup: Orchestrates the application and its dependencies.
- FastAPI Framework: High-performance API development made simple.
- Easy Deployment: Launch the entire stack with a single command.
- Extensible Design: Add more services or scale as needed.
```
git clone https://github.com/surya-1729/Ollama-server-with-Docker-and-FastAPI.git
cd Ollama-server-with-Docker-and-FastAPI
```

Start the application stack using Docker Compose:

```
docker-compose up --build
```

This command will:
- Build the necessary Docker images.
- Start all services defined in the `compose.yml` file.
The FastAPI application will be accessible at http://localhost:8000.
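For a quick smoke test from code rather than the browser, a minimal client might look like the sketch below. Note that the endpoint path `/generate` and the `prompt` field are assumptions for illustration only; the actual routes are defined in `fastapi/app.py`.

```python
import json
import urllib.request

# Hypothetical endpoint: the real path and payload shape depend on app.py,
# which this README does not show. "/generate" and "prompt" are assumptions.
API_URL = "http://localhost:8000/generate"

def build_request(prompt: str) -> urllib.request.Request:
    """Build a JSON POST request for the (assumed) generation endpoint."""
    payload = json.dumps({"prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    # Requires the stack to be running (docker-compose up --build).
    req = build_request("Why is the sky blue?")
    with urllib.request.urlopen(req) as resp:
        print(resp.read().decode("utf-8"))
```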
Once the application is running, explore the API using:
- Swagger UI: http://localhost:8000/docs
- ReDoc Documentation: http://localhost:8000/redoc
These interfaces provide an interactive way to test and understand the API.
To stop the running containers, use:
```
docker-compose down
```

This will stop and remove all containers defined in the `compose.yml` file.
For local development:
- Edit the FastAPI source code in the repository.
- Restart the stack to apply changes:

  ```
  docker-compose up --build
  ```
- Test your updates via Swagger UI or other tools.
```
Ollama-server-with-Docker-and-FastAPI/
├── fastapi/              # FastAPI application code
│   ├── app.py            # Main entry point for the FastAPI app
│   ├── Dockerfile        # Dockerfile for building the FastAPI service image
│   └── requirements.txt  # Python dependencies for FastAPI
├── ollama/               # Ollama service files
│   ├── Dockerfile        # Dockerfile for building the Ollama image
│   └── pull-model.sh     # Shell script to pull the llama3 model
├── compose.yml           # Docker Compose configuration file
└── README.md             # Project documentation (this file)
```
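The repository's actual `compose.yml` is not reproduced here; as a rough sketch, a two-service stack of this kind often looks like the following. Service names, ports, and the `OLLAMA_HOST` environment variable are assumptions, not taken from the repo.

```yaml
# Hypothetical sketch of a compose.yml for this kind of stack -- the real
# file may differ in service names, ports, and build contexts.
services:
  ollama:
    build: ./ollama            # uses ollama/Dockerfile (pulls the llama3 model)
    ports:
      - "11434:11434"          # default Ollama API port (assumption)
  fastapi:
    build: ./fastapi           # uses fastapi/Dockerfile
    ports:
      - "8000:8000"            # FastAPI served at http://localhost:8000
    depends_on:
      - ollama                 # start Ollama before the API
    environment:
      - OLLAMA_HOST=http://ollama:11434   # hypothetical env var for the app
```

Keeping the two services separate lets Docker Compose rebuild and scale them independently, while the internal network lets the FastAPI container reach Ollama by its service name.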
This project is licensed under the MIT License. See the LICENSE file for details.
Special thanks to the creators of FastAPI, Docker, and Docker Compose for their incredible tools that make this project possible!