feat(llm): add Ollama provider via OpenAI-compatible API :) #1515

Open
PrinceGautam2106 wants to merge 1 commit into mofa-org:main from PrinceGautam2106:mofa8

Conversation

@PrinceGautam2106

📋 Summary

  • Replace OllamaProvider/OllamaConfig with OpenAIProvider in agent.rs
  • Remove OllamaConfig re-exports from mod.rs and lib.rs
  • Update ollama_from_env() to return OpenAIProvider
  • Add support for OLLAMA_HOST environment variable in addition to OLLAMA_BASE_URL
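The OLLAMA_HOST / OLLAMA_BASE_URL handling described above could look roughly like the following std-only sketch. The function name and the precedence order (OLLAMA_HOST checked first) are assumptions, not necessarily what `ollama_from_env()` does in this PR:

```rust
use std::env;

/// Resolve the Ollama base URL: prefer OLLAMA_HOST, fall back to
/// OLLAMA_BASE_URL, and default to the standard local endpoint.
/// (Precedence order here is an assumption for illustration.)
fn resolve_ollama_base_url() -> String {
    env::var("OLLAMA_HOST")
        .or_else(|_| env::var("OLLAMA_BASE_URL"))
        .unwrap_or_else(|_| "http://localhost:11434".to_string())
}

fn main() {
    // With neither variable set, this prints the default host.
    println!("{}", resolve_ollama_base_url());
}
```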

🔗 Related Issues

Closes #132


🛠️ Changes

  • crates/mofa-llm/src/providers/ollama.rs — new provider (wraps OpenAI client)
  • crates/mofa-llm/src/providers/mod.rs — one-line factory registration
  • Cargo.toml — optional ollama feature flag (off by default)
  • README.md — added Ollama to the provider table
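The "wraps OpenAI client" pattern from the changes list reduces to pointing an OpenAI-style provider at Ollama's OpenAI-compatible `/v1` endpoint with a placeholder key. This is a minimal self-contained illustration; the `OpenAIProvider` struct and `ollama_provider` factory names are assumptions, not the crate's actual API:

```rust
// Stand-in for the crate's OpenAI-compatible client configuration.
struct OpenAIProvider {
    base_url: String,
    api_key: String,
}

impl OpenAIProvider {
    fn new(base_url: &str, api_key: &str) -> Self {
        Self { base_url: base_url.into(), api_key: api_key.into() }
    }
}

/// Ollama speaks the OpenAI-compatible API under /v1, so no separate
/// provider type is needed; only the base URL differs.
fn ollama_provider(host: &str) -> OpenAIProvider {
    // Ollama ignores the key, but OpenAI-style clients expect one.
    OpenAIProvider::new(&format!("{host}/v1"), "ollama")
}

fn main() {
    let p = ollama_provider("http://localhost:11434");
    println!("{} {}", p.base_url, p.api_key);
}
```

This is the design rationale for deleting `OllamaProvider`/`OllamaConfig` in this PR: one client type, two base URLs.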

🧩 Additional Notes for Reviewers

  • Default host is http://localhost:11434 — overridable via OLLAMA_HOST
  • No API key required (Ollama is local-only)
  • The #[ignore] test is intentional — CI doesn't have Ollama, but
    contributors can run it locally with any model
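The `#[ignore]` pattern mentioned above looks roughly like this. The commented-out provider calls are hypothetical (the real constructor and method names may differ); contributors would run the test locally with `cargo test -- --ignored`:

```rust
const DEFAULT_HOST: &str = "http://localhost:11434";

#[cfg(test)]
mod tests {
    use super::DEFAULT_HOST;

    #[test]
    // CI has no Ollama server, so this only runs when explicitly
    // requested via `cargo test -- --ignored`.
    #[ignore = "requires a local Ollama server"]
    fn chats_with_local_ollama() {
        // Hypothetical usage against the real crate:
        // let provider = ollama_from_env().expect("provider from env");
        // let reply = provider.chat("say hi");
        // assert!(!reply.is_empty());
        let _ = DEFAULT_HOST;
    }
}

fn main() {
    println!("{DEFAULT_HOST}");
}
```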

@PrinceGautam2106 changed the title from "feat(llm): add Ollama provider via OpenAI-compatible API (" to "feat(llm): add Ollama provider via OpenAI-compatible API :)" on Mar 28, 2026


Development

Successfully merging this pull request may close these issues.

feat: Add Ollama provider with configuration and integration
