[Bug]: agent provider overriding not working #1097
Comments
Same here.... "agent": {

Is this still a problem?

Did you use different images, from different servers?

Images? Locally served Ollama, internal network, many different models.

Set a 1b model as the default "agent" entry and also set a 14b+ model as Code Monkey.

Yes, works as expected: Default, Importer, and CodeMonkey model overrides; sent 3 START messages and received 3 back.
To override agents in the config, you must use the class name of the agent that is to be overridden:

```python
class CodeMonkey(BaseAgent):
    agent_type = "code-monkey"
    display_name = "Code Monkey"
    ...

class Importer(BaseAgent):
    agent_type = "importer"
    display_name = "Project Analyist"
```

Config:

```json
"agent": {
    "default": {
        "provider": "ollama",
        "model": "hf.co/LoupGarou/deepseek-coder-6.7b-instruct-pythagora-v3-gguf:Q6_K",
        "temperature": 0.5
    },
    "CodeMonkey": {
        "provider": "ollama",
        "model": "llama3.3",
        "temperature": 0.5
    },
    "Importer": {
        "provider": "ollama",
        "model": "codellama",
        "temperature": 0.5
    }
},
```

That is how BaseAgent loads the config, from what I can tell in the code.
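For illustration, here is a minimal sketch of the lookup behaviour described above. The names `LLMConfig`, `AGENT_CONFIG`, and `llm_config()` are hypothetical and not gpt-pilot's actual API; the point is only that the config key must match the agent's class name (e.g. "CodeMonkey"), not its `agent_type` string (e.g. "code-monkey"), or the override silently falls back to the default.

```python
# Sketch only: not gpt-pilot's real implementation. Shows a per-agent LLM
# config keyed by class name, with a fallback to the "default" entry.
from dataclasses import dataclass


@dataclass
class LLMConfig:
    provider: str
    model: str
    temperature: float = 0.5


# Parsed "agent" section of the config file; keys are agent class names.
AGENT_CONFIG = {
    "default": LLMConfig("ollama", "hf.co/LoupGarou/deepseek-coder-6.7b-instruct-pythagora-v3-gguf:Q6_K"),
    "CodeMonkey": LLMConfig("ollama", "llama3.3"),
    "Importer": LLMConfig("ollama", "codellama"),
}


class BaseAgent:
    def llm_config(self) -> LLMConfig:
        # Look up the override by class name ("CodeMonkey", "Importer", ...).
        # A key like "code-monkey" (the agent_type) would NOT match here.
        return AGENT_CONFIG.get(type(self).__name__, AGENT_CONFIG["default"])


class CodeMonkey(BaseAgent):
    agent_type = "code-monkey"
    display_name = "Code Monkey"


if __name__ == "__main__":
    print(CodeMonkey().llm_config().model)  # llama3.3 (override by class name)
    print(BaseAgent().llm_config().model)   # falls back to the "default" entry
```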
Version
Command-line (Python) version
Operating System
macOS
What happened?
Ollama didn't receive a single request.