Transform your locally-running Ollama models into an intelligent coding agent with capabilities similar to GitHub Copilot or Cursor—all running 100% locally with complete privacy. No cloud, no subscriptions, no data leaks.
- Autonomous Task Execution: Agent can read files, search codebases, create/edit files, and run tasks
- Multi-Step Reasoning: Up to 8-step problem-solving with context awareness
- Smart Tool Usage: Automatically explores your project structure and applies changes
- Read & Agent Modes: Choose between safe read-only or full agent capabilities
- Modern WebView UI: Clean, responsive chat interface with streaming responses
- Multi-Model Support: Chat with multiple Ollama models simultaneously
- Context-Aware: Automatically includes selected code, open files, and project structure
- Conversation History: Keep track of all interactions within sessions
- Inline Code Generation: Generate code directly at cursor position
- Smart Refactoring: Simplify, optimize, or add types to selected code
- Function Extraction: Automatically extract functions with AI assistance
- Code Translation: Convert code between languages
- README Generator: Dedicated mode to generate comprehensive project documentation
- Docstring Completion: AI-generated documentation for functions and classes
- Code Explanation: Understand complex code with AI explanations on hover
- PR Summaries: Generate pull request descriptions and review comments
- Unit Test Generation: Create comprehensive test suites automatically
- Problem Analysis: AI-powered debugging and error resolution
- Code Review: Get suggestions for improvements and best practices
- Project Indexer: Fast semantic search across your entire codebase
- Commit Message Generation: Smart git commit messages from staged changes
- Shell Command Explanation: Understand complex terminal commands
- File Creation with AI: Generate boilerplate files with context
- Configurable Settings: Host, port, system prompts, and behavior modes
- Keyboard Shortcuts: Quick access to all commands
- Context Menu Integration: Right-click access in editor
- Status Bar Integration: Quick model switching and status updates
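All commands can also be bound to custom keys via `keybindings.json`. A minimal sketch; the command ID shown (`ollamaAgent.openChat`) is hypothetical, so check the extension's Feature Contributions tab in VS Code for the actual IDs:

```json
// keybindings.json (VS Code allows comments in this file)
[
  {
    // Hypothetical command ID for illustration; verify the real ID under
    // Extensions → Ollama Agent → Feature Contributions.
    "key": "ctrl+alt+o",
    "command": "ollamaAgent.openChat"
  }
]
```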
- Install Ollama: Download from ollama.ai
- Pull Models: Choose your preferred models:

  ```shell
  ollama pull llama3.1:8b
  ollama pull mistral
  ollama pull codellama
  ollama pull deepseek-coder
  ```

- Start Ollama: Ensure it's running on `localhost:11434`:

  ```shell
  ollama serve
  ```
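Before opening VS Code, you can confirm the server is up with a quick health check against Ollama's `/api/tags` endpoint (the same endpoint used to test the connection in Troubleshooting below). A minimal sketch, assuming the default host and port:

```shell
# Health check: Ollama's /api/tags endpoint lists installed models.
# Adjust HOST/PORT to match your ollamaAgent.host / ollamaAgent.port settings.
HOST="${OLLAMA_HOST:-localhost}"
PORT="${OLLAMA_PORT:-11434}"
if curl -sf "http://${HOST}:${PORT}/api/tags" >/dev/null 2>&1; then
  STATUS="reachable"
else
  STATUS="unreachable"
fi
echo "Ollama at ${HOST}:${PORT} is ${STATUS}"
```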
From VS Code Marketplace (Coming Soon):
- Open VS Code
- Go to Extensions (`Ctrl+Shift+X` / `Cmd+Shift+X`)
- Search for "Ollama Agent"
- Click Install
From VSIX (Manual):
- Download the `.vsix` file from GitHub Releases
- Open VS Code
- Run command: `Extensions: Install from VSIX...`
- Select the downloaded file
- Click the Ollama icon in the Activity Bar (left sidebar)
- Or run command: `Ollama Agent: Open Chat`
- Or open the Command Palette (`Ctrl+Shift+P`) and search for "Ollama Agent"
- Read Mode (Safe): AI can only read and suggest changes
- Agent Mode (Powerful): AI can automatically create/edit files
- Type your question or request
- Select one or more models to query
- Get streaming responses in real-time
- Select code → Right-click → Ollama Agent options
- Ask about selection, generate code, refactor, explain, and more
Access via File > Preferences > Settings → Search "Ollama Agent"
| Setting | Default | Description |
|---|---|---|
| `ollamaAgent.host` | `localhost` | Ollama server host |
| `ollamaAgent.port` | `11434` | Ollama server port |
| `ollamaAgent.systemPrompt` | `You are a helpful coding assistant.` | Default system prompt |
| `ollamaAgent.provider` | `ollama` | AI provider (ollama/openai/anthropic) |
| `ollamaAgent.mode` | `read` | Interaction mode (read/agent) |
| `ollamaAgent.enableInlineCompletions` | `false` | Experimental: inline completions |
| `ollamaAgent.enableExplainOnHover` | `false` | Experimental: hover explanations |
| `ollamaAgent.enableDocstringCompletion` | `false` | Experimental: docstring suggestions |
```json
{
  "ollamaAgent.host": "localhost",
  "ollamaAgent.port": 11434,
  "ollamaAgent.mode": "agent",
  "ollamaAgent.systemPrompt": "You are an expert TypeScript developer focused on clean code and best practices."
}
```

- Place cursor where you want the function
- Run: `Ollama Agent: Generate Code Here`
- Describe what you want: "Create a function to validate email addresses"
- Review and insert the generated code
- Select the code to refactor
- Right-click → `Ollama Agent: Brush – Simplify`
- AI will suggest cleaner, more readable code
- Review changes in diff view and apply
- Select a function or class
- Run: `Ollama Agent: Generate Unit Tests`
- AI generates a comprehensive test suite
- Tests appear in a new file or on the clipboard
- Run: `Ollama Agent: README Generator`
- Choose style (Technical/Product/Library)
- Choose placement (GitHub/Marketplace/Website)
- Add custom notes
- Enable deep scan for comprehensive analysis
- AI generates complete README with all sections
- Click on a problem in Problems panel
- Run: `Ollama Agent: Fix This Problem`
- AI analyzes the error and suggests fixes
- Apply fix with one click
- `Ollama Agent: Open Chat` – Open main chat panel
- `Ollama Agent: README Generator` – Generate project documentation
- `Ollama Agent: Ask About Selection` – Query AI about selected code
- `Ollama Agent: Generate Code Here` – Insert AI-generated code
- `Ollama Agent: Generate Unit Tests` – Create test suite
- `Ollama Agent: Generate Commit Message` – Smart git messages
- `Ollama Agent: Generate README` – Auto-generate documentation
- `Ollama Agent: Edit Selection by Instruction` – Guided refactoring
- `Ollama Agent: Brush – Simplify` – Simplify code
- `Ollama Agent: Brush – Add Types` – Add TypeScript types
- `Ollama Agent: Brush – Optimize` – Performance optimization
- `Ollama Agent: Extract Function` – Extract selected code
- `Ollama Agent: Explain Selection` – Understand code
- `Ollama Agent: Analyze Problems` – Debug assistance
- `Ollama Agent: Explain Command` – Shell command help
- `Ollama Agent: Show Index Stats` – View codebase stats
- `Ollama Agent: PR Summary` – Generate PR descriptions
- `Ollama Agent: PR Review Comments` – AI code review
- `Ollama Agent: Rebuild Project Index` – Refresh semantic search
- `Ollama Agent: Add File (AI)` – Generate boilerplate files
- `Ollama Agent: Translate Selection` – Convert between languages
Problem: "Failed to connect to Ollama"
Solutions:
- Verify Ollama is running: `ollama serve`
- Check settings: host=`localhost`, port=`11434`
- Test connection: `curl http://localhost:11434/api/tags`
- Restart VS Code
Problem: "No models found"
Solutions:
- Pull a model: `ollama pull llama3.1`
- List models: `ollama list`
- Refresh extension: Run `Developer: Reload Window`
Problem: AI responses are very slow
Solutions:
- Use smaller models (e.g., `llama3.1:8b` instead of `llama3.1:70b`)
- Reduce context: Disable deep scan in the README generator
- Close other resource-intensive applications
- Consider using GPU acceleration with Ollama
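When choosing a model size, a rough rule of thumb (an illustrative assumption, not a benchmark) is that a 4-bit quantized model needs about 0.5 bytes per parameter, plus roughly 20% overhead for the KV cache and buffers. A sketch of the arithmetic:

```shell
# Rough memory estimate for 4-bit quantized models: ~0.5 bytes/parameter,
# plus ~20% overhead for KV cache and buffers. Illustrative only; actual
# usage depends on quantization, context length, and runtime.
estimate_gb() {
  awk -v p="$1" 'BEGIN { printf "%.1f", p * 0.5 * 1.2 }'
}
echo "llama3.1:8b  needs roughly $(estimate_gb 8) GB"
echo "llama3.1:70b needs roughly $(estimate_gb 70) GB"
```

If the estimate exceeds your GPU memory, Ollama will offload work to the CPU and responses slow down considerably.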
Problem: Agent mode doesn't apply changes
Solutions:
- Check mode setting: `ollamaAgent.mode` = `agent`
- Verify file permissions (read/write access)
- Check workspace trust: File → Trust Workspace
- Review error messages in Output panel
Problem: Extension doesn't activate
Solutions:
- Check VS Code version (minimum 1.105.0)
- View logs: Output → Ollama Agent
- Disable conflicting extensions
- Reinstall extension
```shell
# Clone repository
git clone https://github.com/IamNishant51/Ollama-Agent-vs-code-extension.git
cd Ollama-Agent-vs-code-extension

# Install dependencies
npm install

# Build webview
npm run build:webview

# Compile TypeScript
npm run compile

# Watch mode (auto-recompile)
npm run watch
```

See PUBLISHING.md for a step-by-step guide to package and publish this extension to the VS Code Marketplace.
```shell
# Run linter
npm run lint

# Run tests
npm test
```

- Open project in VS Code
- Press `F5` to launch the Extension Development Host
- Test features in the new window
- View logs in the Debug Console
Contributions are welcome! Please read our Contributing Guide for details.
- Add support for more AI providers
- Improve agent reasoning capabilities
- Add more code generation templates
- Enhance UI/UX with additional themes
- Write comprehensive tests
- Improve documentation
- Report bugs and suggest features
This project is licensed under the MIT License - see the LICENSE file for details.
- Ollama for providing the local LLM runtime
- VS Code for the excellent extension API
- The open-source community for inspiration and support
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Email: Create an issue for support
- Multi-file editing in single operation
- Voice input support
- Custom prompt templates
- Extension marketplace for agent tools
- Collaborative coding sessions
- Code review workflows
- Integration with popular dev tools
- Mobile companion app
If you find this extension useful, please consider giving it a star on GitHub!
Made with ❤️ by Nishant Unavane
Powered by Ollama • Privacy-First • Open Source