A streamlined Streamlit interface for coding with local AI models through Ollama. Powered by Maux.
💻 Your personal AI coding assistant running entirely on your machine
💡 Want cloud-based AI coding? Try our hosted version at ai.maux.space
Ollama Coder is an intuitive, open-source application that provides a modern chat interface for coding assistance using your local Ollama models. It features real-time streaming responses, automatic model detection, and session-based chat history - all running locally on your machine.
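Under the hood, automatic model detection amounts to querying Ollama's local REST API. The sketch below is an illustration of that idea, assuming Ollama's default endpoint (`http://localhost:11434`); it is not the app's exact code:

```python
# Minimal sketch: discover which models are installed in a local Ollama
# instance via its documented GET /api/tags endpoint.
import requests

def list_local_models(base_url: str = "http://localhost:11434") -> list[str]:
    """Return the names of models installed in the local Ollama instance."""
    resp = requests.get(f"{base_url}/api/tags", timeout=5)
    resp.raise_for_status()
    return [m["name"] for m in resp.json().get("models", [])]

if __name__ == "__main__":
    print(list_local_models())  # e.g. ['qwen2.5-coder:latest', 'codellama:latest']
```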
- Python 3.8+
- Ollama installed and running locally
- At least one model installed in Ollama (e.g., codellama, llama2)
- Clone this repository:

  ```bash
  git clone https://github.com/xmannii/ollama-coder.git
  cd ollama-coder
  ```

- Install the required packages:

  ```bash
  pip install -r requirements.txt
  ```

- Make sure Ollama is running and you have models installed:

  ```bash
  ollama run qwen2.5-coder
  ```

- Start the Streamlit app:

  ```bash
  streamlit run app.py
  ```

- Open your browser and go to http://localhost:8501
- 🔍 Automatically detects your local Ollama models
- 💬 Modern chat interface with streaming responses (see the sketch after this list)
- 📝 Maintains chat history during the session
- 🎨 Clean and intuitive UI
- 🔄 Easy model switching
- 🗑️ Clear chat history option
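
Streaming works by reading newline-delimited JSON from Ollama's `/api/chat` endpoint. The following is a minimal sketch of that pattern; the endpoint and payload follow Ollama's documented REST API, while the function name and wiring are illustrative rather than this app's exact code:

```python
# Sketch: stream a chat completion from a local Ollama instance and
# yield the response text chunk by chunk as it arrives.
import json
import requests

def stream_chat(model: str, messages: list[dict],
                base_url: str = "http://localhost:11434"):
    """Yield response text chunks as Ollama streams them back."""
    payload = {"model": model, "messages": messages, "stream": True}
    with requests.post(f"{base_url}/api/chat", json=payload, stream=True) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines():
            if not line:
                continue
            chunk = json.loads(line)
            if chunk.get("done"):  # final status line carries no new text
                break
            yield chunk["message"]["content"]

for token in stream_chat("qwen2.5-coder",
                         [{"role": "user", "content": "Write a Python hello world."}]):
    print(token, end="", flush=True)
```

A generator like this can be rendered incrementally in Streamlit with `st.write_stream`, which is what gives the chat its typewriter effect.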
- Select a model from the dropdown in the sidebar
- Type your coding question in the chat input
- The AI will respond with code examples and explanations
- You can continue the conversation with follow-up questions
- Use the "Clear Chat History" button to start a new conversation (see the sketch below for how session history is typically kept)
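
For reference, here is a minimal sketch of how session-based history and a clear button are commonly wired up in Streamlit using `st.session_state`; the app's actual implementation may differ:

```python
# Sketch: per-session chat history in Streamlit. History lives in
# st.session_state, so it persists across reruns but not across sessions.
import streamlit as st

if "messages" not in st.session_state:
    st.session_state.messages = []  # history for this browser session only

if st.sidebar.button("Clear Chat History"):
    st.session_state.messages = []

# Replay the conversation so far on every rerun.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

if prompt := st.chat_input("Ask a coding question..."):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)
    # ...call the model here and append its reply to st.session_state.messages
```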
Make sure Ollama is running locally at http://localhost:11434 before starting the app.
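A quick way to verify the connection (an illustrative snippet, not part of the app):

```python
# Check that the local Ollama instance is reachable before launching.
import requests

try:
    requests.get("http://localhost:11434/api/tags", timeout=2).raise_for_status()
    print("Ollama is up.")
except requests.RequestException:
    print("Ollama is not reachable; start it with `ollama serve`.")
```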
This is an open-source project powered by Maux. For a hosted solution with additional features and no setup required, visit ai.maux.space.
MIT License - feel free to use this code for your own projects!