Error when trying to make Ollama work #33
Try testing like this: http://ollama:11434 |
Thank you Cristiano, but it's still giving this error.
I am running the ollama server so that it exposes port 55005 externally; the internal address is the one with 11434. When I query it from outside of docker it gives me the following, and when I ask for localhost:55005/api/tags I get: A nicer view is from Firefox: I can't imagine what I have to do. Do you have a working environment with containers? |
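(For reference, a quick way to sanity-check the mapped port from outside docker — a sketch assuming the 55005 -> 11434 mapping described above:)

```python
import requests

# /api/tags lists the models available on the Ollama server
resp = requests.get("http://localhost:55005/api/tags", timeout=10)
resp.raise_for_status()
print(resp.json())
```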
@peterromao Since you're Portuguese, you can reply in pt-BR, it reads better. Its configuration: My question is: are you on WSL?
So that you can get access and be able to import the models |
Thank you Cristiano. My test servers are Apple Silicon M4 Macs. But, if you don't mind, I'll continue in English for the benefit of other readers passing through here who may want to learn something. So this is on an M4 MacMini where I run only test and staging servers, so no WSL. I do have another MacBook Pro where I do my development and where I am starting to use doodba. Great for development work but quirky for production and staging, at least for me for the time being. I've got a stack for odoo, plain and simple without any proxies, as this is a do-and-undo server we access for demos. My docker run command for ollama is very similar to yours. Mine differs in the port exposed outside of docker: I have 55005 on the outside, but odoo finds the container via "http://Ollama:11434", which is how it should be, I guess. With this setting I am finally communicating from odoo with the ollama container, as can be seen from the information before the error message in the logs:
The error message hints that something fishy is going on with the data passing through the methods.
And that's it really. Thank you again for your answers, |
We are making progress! You are now hitting at least the same error I faced in Issue #23: the name field missing when importing the model. When I first opened that issue, I thought it was just a bug on my end and ended up closing it. However, I believe this error should be reported to the project maintainers. I suspect it might be happening because we are running it via Docker. At the time, I updated the local code to import it correctly. Running Ollama locally worked for me, but with Docker I ended up modifying the code. |
Thank you Cris. Could the code check whether ollama is running inside a docker container or outside of it? That would let it choose one way or the other of fetching the models. What exactly was the code change you made at line 71, or around it, to make it work? |
OK, here's the thing. I deleted the container and installed ollama on the Mac itself (outside of docker). I then went into the code, and the only way I could get it to work was to substitute name for model and to serialize the datetime object returned in the response by self.client.list(). Without cleaning it up too much, here's the rewritten code:
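(The original code block was lost here; below is a minimal sketch of the change as described above, assuming self.client is the ollama Python client and that it returns plain dicts, as older versions did. The method name and surrounding structure are guesses, not the module's actual code.)

```python
from datetime import datetime

def _list_models(self):
    # self.client is assumed to be an ollama.Client instance
    response = self.client.list()
    models = []
    for entry in response.get("models", []):
        m = dict(entry)
        # The client now returns the identifier under "model", not "name"
        m["name"] = m.get("model", m.get("name"))
        # "modified_at" comes back as a datetime, which is not JSON
        # serializable; convert it to an ISO-8601 string
        if isinstance(m.get("modified_at"), datetime):
            m["modified_at"] = m["modified_at"].isoformat()
        models.append(m)
    return models
```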
Now I'll take a look at some tests to see if it works well with the received data. Also, I didn't check whether it works with the containerized ollama; I'll leave that for later. |
At the time I ended up doing this too. As for it even running outside of Docker: since I had modified the code to meet the needs of running in Docker, it also worked there. |
So you made the same changes to the code as I did? Shouldn't you or I open a PR, since you made it work both ways (on and off docker)? PS. In order for it to work outside of docker I had to configure docker to make the host network visible to the containers, and in the API Base I had to enter http://host.docker.internal:11434. Now if I could only make one of the LLMs retrieve records from the database and work on them. |
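(A quick way to confirm that setting from inside a container — a sketch assuming the official ollama Python client:)

```python
from ollama import Client

# host.docker.internal resolves to the Docker host from inside a container
client = Client(host="http://host.docker.internal:11434")
print(client.list())  # should return the models pulled into Ollama
```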
please do! |
OK. I'll test it out on docker and tell you guys if it worked, but I do not have the dev environment set up to make a fork and a PR. Maybe @CristianoMafraJunior could grab the code I changed and put it in a PR? Did any of you guys manage to make any of the Ollama models retrieve data from the odoo database? All models I create simply hallucinate and do not execute the tools. I have been at it for hours on end. Maybe it's the choice of model? |
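(For what it's worth, only models trained for tool use actually emit tool calls; here is one way to test a given model in isolation, sketched with the ollama Python client — the model name and tool schema below are placeholders, not anything from this module:)

```python
import ollama

# Hypothetical tool definition, just to see whether the model emits a call
tools = [{
    "type": "function",
    "function": {
        "name": "count_records",
        "description": "Count records of a given Odoo model",
        "parameters": {
            "type": "object",
            "properties": {"model": {"type": "string"}},
            "required": ["model"],
        },
    },
}]

response = ollama.chat(
    model="llama3.1",  # placeholder; must be a tool-capable model
    messages=[{"role": "user", "content": "How many res.partner records are there?"}],
    tools=tools,
)
# Empty tool_calls means the model is answering from thin air instead of
# asking for the tool to be executed
print(response["message"].get("tool_calls"))
```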
This week I'll open a PR for this fix. I haven't tested the database side much, but I believe it's still under development. |
Did you put the serialization fix inside the PR as well, @CristianoMafraJunior? Just curious, because I only see the line with the name-to-model change; I don't see the one with the datetime. Without that I get a serialization error. Could it be because of the python libraries? |
I opened this issue on Ollama to see if anyone else has experienced this, but they said that since version 0.1.30 the return is "name". However, in our environment it's still returning "model". Now I'm going to analyze the reason for this, as I haven't quite understood it yet. |
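(Until that is understood, a defensive lookup covers both shapes — a sketch, not the module's actual code:)

```python
def model_identifier(entry):
    # Newer clients return the identifier under "model", older ones under "name"
    return entry.get("model") or entry.get("name")
```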
Probably some bug in the python client; I opened a ticket here:
Looks like we're using Ollama-Python lol |
Curious as to what the answer will be for @adar2378's ticket. |
I upgraded the python client and the brew / docker installs to the latest versions, and the bug is gone. Could somebody confirm?
It magically worked for me yesterday, and now it's broken again. We should get it fixed upstream.
Hello,
I would kindly ask for any help setting up one of the LLMs in order to see the module working, but I am hitting a wall here.
I am in the process of setting up a development environment, and I wanted to have Ollama as an LLM provider in an odoo 16 environment.
I have a docker stack with odoo 16 and postgres 15 inside it, and I have an ollama container set up, but I am constantly getting this error when defining the provider:
docker ps gives me this:
I have tried all possible combinations of API base strings but the best I could get was a 404 message.
Any ideas?