This is my configuration for Avante (written in Nix, since I am using NixOS), trying to use the local Deepseek model that I have running with Ollama. When I go to http://localhost:11434 in my browser, it shows "Ollama is running", and I can see the Ollama service when I check btop. But when I open the Avante ask UI panel and enter a question, I get a 404 error.
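For reference, here is a minimal sketch of the kind of avante.nvim setup I am aiming for (not my exact config; the provider table layout has changed between avante.nvim releases, and `deepseek-coder:6.7b` below is a placeholder tag, not necessarily the one installed):

```lua
-- Minimal sketch of an avante.nvim Ollama setup (Lua side; under NixOS this
-- would sit inside the plugin's settings in the Nix config).
-- NOTE: field names vary between avante.nvim versions -- check your version's docs.
require("avante").setup({
  provider = "ollama",
  providers = {
    ollama = {
      -- Base URL only; avante appends the API path itself. An extra
      -- path segment here is a common cause of 404 responses.
      endpoint = "http://127.0.0.1:11434",
      -- Must match a tag from `ollama list` exactly; asking Ollama for an
      -- unknown model also produces a 404.
      model = "deepseek-coder:6.7b", -- placeholder tag
    },
  },
})
```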
And here is the output of `ollama list` showing the model is installed:
Has anyone had success getting a locally running Deepseek coder model to work properly in Avante via Ollama? I feel like I must be close, but I am just missing something I don't see. I have been able to get Avante working with online models through an API endpoint, but not with a local model yet.
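In case it helps narrow things down: before changing the Avante config, it may be worth checking from a shell which side is actually returning the 404. These use Ollama's standard HTTP endpoints (`deepseek-coder:6.7b` is again a placeholder; substitute the tag from `ollama list`):

```shell
# List the models Ollama actually serves; the tag here must match the
# `model` value in the Avante config character for character.
curl -s http://localhost:11434/api/tags

# Query the model directly through Ollama's native chat API. A 404 here
# (with a "model not found" body) points at a wrong model tag rather than
# at Avante's endpoint configuration.
curl -s http://localhost:11434/api/chat \
  -d '{"model": "deepseek-coder:6.7b", "messages": [{"role": "user", "content": "hi"}]}'
```

If both of these succeed but Avante still gets a 404, the problem is likely in how the plugin builds the request URL.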