Error in trying to make Ollama work #33

Open · peterromao opened this issue Mar 20, 2025 · 19 comments

@peterromao

Hello,

I would kindly ask for help setting up one of the LLMs so I can see the module working, but I am hitting a wall here.

I am setting up a development environment and I want to use Ollama as the LLM provider in an Odoo 16 environment.

I have a Docker stack with Odoo 16 and Postgres 15, and I have an Ollama container set up, but I keep getting this error when defining the provider:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/odoo/http.py", line 1632, in _serve_db
    return service_model.retrying(self._serve_ir_http, self.env)
  File "/usr/lib/python3/dist-packages/odoo/service/model.py", line 133, in retrying
    result = func()
  File "/usr/lib/python3/dist-packages/odoo/http.py", line 1659, in _serve_ir_http
    response = self.dispatcher.dispatch(rule.endpoint, args)
  File "/usr/lib/python3/dist-packages/odoo/http.py", line 1863, in dispatch
    result = self.request.registry['ir.http']._dispatch(endpoint)
  File "/usr/lib/python3/dist-packages/odoo/addons/base/models/ir_http.py", line 154, in _dispatch
    result = endpoint(**request.params)
  File "/usr/lib/python3/dist-packages/odoo/http.py", line 716, in route_wrapper
    result = endpoint(self, *args, **params_ok)
  File "/usr/lib/python3/dist-packages/odoo/addons/web/controllers/dataset.py", line 42, in call_kw
    return self._call_kw(model, method, args, kwargs)
  File "/usr/lib/python3/dist-packages/odoo/addons/web/controllers/dataset.py", line 33, in _call_kw
    return call_kw(request.env[model], method, args, kwargs)
  File "/usr/lib/python3/dist-packages/odoo/api.py", line 468, in call_kw
    result = _call_kw_multi(method, model, args, kwargs)
  File "/usr/lib/python3/dist-packages/odoo/api.py", line 453, in _call_kw_multi
    result = method(recs, *args, **kwargs)
  File "/usr/lib/python3/dist-packages/odoo/models.py", line 6485, in onchange
    defaults = self.default_get(missing_names)
  File "/mnt/extra-addons/odoo-llm/llm/wizards/fetch_models_wizard.py", line 115, in default_get
    for model_data in provider.list_models():
  File "/mnt/extra-addons/odoo-llm/llm_ollama/models/ollama_provider.py", line 66, in ollama_models
    response = self.client.list()
  File "/usr/local/lib/python3.9/dist-packages/ollama/_client.py", line 567, in list
    return self._request(
  File "/usr/local/lib/python3.9/dist-packages/ollama/_client.py", line 178, in _request
    return cls(**self._request_raw(*args, **kwargs).json())
  File "/usr/local/lib/python3.9/dist-packages/ollama/_client.py", line 124, in _request_raw
    raise ConnectionError(CONNECTION_ERROR_MESSAGE) from None
ConnectionError: Failed to connect to Ollama. Please check that Ollama is downloaded, running and accessible. https://ollama.com/download

The above server error caused the following client error:
null

docker ps gives me this:

[screenshot: docker ps output]

I have tried all possible combinations of API base strings but the best I could get was a 404 message.

Any ideas?

@CristianoMafraJunior
Contributor

Try testing like this: http://ollama:11434

[screenshot: provider API base set to http://ollama:11434]
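
A quick way to verify that the Odoo container can actually reach the Ollama service is to probe the /api/tags endpoint from inside the Odoo container. A minimal sketch, assuming the service is named ollama on a shared Docker network and that the requests library is available:

# Connectivity probe, run from a Python shell inside the Odoo container.
# Assumes the Ollama service is named "ollama" on a shared Docker network.
import requests

resp = requests.get("http://ollama:11434/api/tags", timeout=5)
print(resp.status_code)  # 200 means the service is reachable
print(resp.json())       # the models the server knows about

A connection error here points to a networking problem; a 404 usually means the host is reachable but the base URL is wrong.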

@peterromao
Author

Thank you, Cristiano,

but it's still giving this error:

  File "/usr/local/lib/python3.9/dist-packages/ollama/_client.py", line 124, in _request_raw
    raise ConnectionError(CONNECTION_ERROR_MESSAGE) from None
ConnectionError: Failed to connect to Ollama. Please check that Ollama is downloaded, running and accessible. https://ollama.com/download

The above server error caused the following client error:
null

I am running the Ollama server so that it exposes port 55005 externally; the internal port is 11434.

When I query it from outside Docker, I get the following:

[screenshot: response from http://localhost:55005]

and when I request localhost:55005/api/tags I get:

[screenshot: JSON returned by /api/tags]

A nicer view, from Firefox:

[screenshot: /api/tags rendered in Firefox]

I can't figure out what else to do.

Do you have a working environment with containers?

@CristianoMafraJunior
Contributor

@peterromao Since you are Portuguese, you can answer in pt-BR if that works better.
My Docker environment is Doodba: https://github.com/Tecnativa/doodba

Its configuration:

[screenshot: Doodba configuration]
[screenshot: Doodba configuration, continued]

My question is: are you on WSL? You may need to point Docker at your machine's IP, as I did here:

ports:
  - "127.0.0.1:11435:11434"

so that you have access and can import the models.

@peterromao
Author

Thank you, Cristiano. My test servers are Apple Silicon M4 Macs. But, if you don't mind, I'll continue in English for the sake of other readers who come across this and may want to learn something.

So this is on an M4 Mac mini where I run only test and staging servers, so no WSL. I do have another MacBook Pro where I do my development, and I am starting to use Doodba. Great for development work but quirky for production and staging, at least for me for the time being.

I have a stack for Odoo, plain and simple, without any proxies, as this is a do-and-undo server we access for demos. My docker run command for Ollama is very similar to yours; mine only differs in the externally exposed port. I have 55005 on the outside, but Odoo finds the container via "http://ollama:11434", which is how it should be, I guess.

With this setting I am finally communicating from Odoo with the Ollama container, as can be seen from the information right before the error message in the logs:

2025-03-21 00:32:58 2025-03-21 00:32:58,007 1 INFO TesteLLM httpx: HTTP Request: GET http://ollama:11434/api/tags "HTTP/1.1 200 OK"

The error message hints that something fishy is going on with the data as it passes through the methods.

2025-03-21 00:32:58   File "/usr/lib/python3/dist-packages/odoo/models.py", line 6485, in onchange
2025-03-21 00:32:58     defaults = self.default_get(missing_names)
2025-03-21 00:32:58   File "/mnt/extra-addons/odoo-llm/llm/wizards/fetch_models_wizard.py", line 115, in default_get
2025-03-21 00:32:58     for model_data in provider.list_models():
2025-03-21 00:32:58   File "/mnt/extra-addons/odoo-llm/llm_ollama/models/ollama_provider.py", line 72, in ollama_models
2025-03-21 00:32:58     "name": model["name"],
2025-03-21 00:32:58   File "/usr/local/lib/python3.9/dist-packages/ollama/_types.py", line 32, in __getitem__
2025-03-21 00:32:58     raise KeyError(key)
2025-03-21 00:32:58 KeyError: 'name'

And that's it really.

Thank you again for your answers,
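
The KeyError suggests that, in this client version, each entry in the response carries a "model" key rather than a "name" key. A quick way to confirm, mirroring how the module itself reads the response (the host URL is an assumption):

# Sketch: inspect the raw model entries returned by the Ollama client.
from ollama import Client

client = Client(host="http://ollama:11434")  # adjust to your setup
response = client.list()
for m in response.get("models", []):
    print(m)  # shows whether entries carry a "name" or a "model" key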

@CristianoMafraJunior
Contributor

CristianoMafraJunior commented Mar 21, 2025

We are making progress! You are now hitting the same error I faced in issue #23: the name field missing when importing the model.

When I first opened the issue, I thought it was just a bug on my end and ended up closing it. However, I believe this error should be reported to the project maintainers. I suspect it might be happening because we are running it via Docker.

At the time, I updated the local code to import it correctly.

Running Ollama locally worked for me, but with Docker, I ended up modifying the code.

ping @ayushin @adar2378

@peterromao
Author

Thank you Cris,

Can it be checked whether Ollama is running inside a Docker container or outside of it? That would let the code choose between one way or the other of fetching the models.

What exactly was the code change you made at or around line 71 to make it work?

@peterromao
Author

OK, here's the thing. I deleted the container and installed Ollama on the Mac itself (outside of Docker).

I then went into the code, and the only way I could get it to work was to substitute "model" for "name" and serialize the datetime object returned in the response by self.client.list().

Without cleaning it up too much, here's the rewritten code:

# requires: from datetime import datetime (at the top of the module)
for model in response.get("models", []):
    # Basic model info
    # PR: replace the "name" key with "model" and serialize datetime
    model_info = {
        # "name": model["name"],
        "name": model.get("model", ""),
        "details": {
            # "id": model["model"],
            "id": model.get("model", ""),
            "capabilities": ["chat"],  # Default capability
            # "modified_at": model.get("modified_at"),
            "modified_at": model.get("modified_at").strftime("%Y-%m-%dT%H:%M:%S.%fZ")
            if isinstance(model.get("modified_at"), datetime)
            else model.get("modified_at"),
            "size": model.get("size"),
            "digest": model.get("digest"),
        },
    }

    # Add embedding capability if the model name suggests it
    if "embedding" in model.get("model", "").lower():
        model_info["details"]["capabilities"].append("embedding")

Now I'll take a look at some tests to see if it works well with the received data. Also, I haven't checked yet whether it works with the containerized Ollama; I'll leave that for later.
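
The same fix can be folded into a small helper that tolerates both key variants and serializes the timestamp. A sketch only; normalize_model_entry is a hypothetical name, not part of the module:

from datetime import datetime

def normalize_model_entry(model):
    # Hypothetical helper: prefer the newer "model" key, fall back to "name".
    name = model.get("model") or model.get("name") or ""
    modified = model.get("modified_at")
    if isinstance(modified, datetime):
        modified = modified.isoformat()  # make it JSON-serializable
    capabilities = ["chat"]  # default capability
    if "embedding" in name.lower():
        capabilities.append("embedding")
    return {
        "name": name,
        "details": {
            "id": name,
            "capabilities": capabilities,
            "modified_at": modified,
            "size": model.get("size"),
            "digest": model.get("digest"),
        },
    }

Falling back between the two keys keeps the wizard working with whichever shape the installed client returns.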

@CristianoMafraJunior
Contributor

At the time I ended up doing this too. As for it working even outside of Docker: since I had modified the code to meet the needs of running in Docker, it worked there as well.

@peterromao
Author

peterromao commented Mar 22, 2025

So you made the same changes to the code as I did?

Shouldn't you or I open a PR if you made it work both ways (on and off Docker)?

PS: In order to work outside of Docker, I had to configure Docker so that the host network is visible to the containers, and in the API Base I had to enter http://host.docker.internal:11434. Now if I could only make one of the LLMs retrieve records from the database and work on them.
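
For context: on Docker Desktop (Mac and Windows) host.docker.internal resolves out of the box, while on Linux it must be mapped explicitly. A compose sketch, with the service name being an assumption:

services:
  odoo:
    extra_hosts:
      - "host.docker.internal:host-gateway"  # built in on Docker Desktop, required on Linux

With that mapping in place, an API Base of http://host.docker.internal:11434 lets the containerized Odoo reach an Ollama instance running directly on the host.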

@ayushin
Contributor

ayushin commented Mar 22, 2025

Please do!

@peterromao
Author

OK. I'll test it out on Docker and tell you guys if it worked, but I do not have the dev environment set up to make a fork and PR. Maybe @CristianoMafraJunior could grab the code I changed and put it in a PR?

Did any of you guys manage to make any of the Ollama models retrieve data from the Odoo database? All the models I create simply hallucinate and do not execute the tools. I have been at it for hours on end. Maybe it's the choice of model?

@CristianoMafraJunior
Contributor

This week I'll open a PR for this fix. I haven't tested the database part much, but I believe it's still under development.

@peterromao
Author

Did you include the serialization fix in the PR as well, @CristianoMafraJunior? Just curious, because I only see the line with the name-to-model change; I don't see the one with the datetime. Without that I get a serialization error. Could it be because of the Python libraries?

@CristianoMafraJunior
Contributor

ollama/ollama#9985

I opened this issue on Ollama to see if anyone else has experienced this, but they said that since version 0.1.30, the return is "name." However, in our environment, it's still returning "model." Now I’m going to analyze the reason for this, as I haven’t quite understood it yet.
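
Since the behaviour seems to depend on both the server and the Python client versions, it can help to record both when reporting. A minimal sketch (host URL assumed; requests assumed available):

# Print the Ollama server version and the installed Python client version.
import importlib.metadata
import requests

print("server:", requests.get("http://ollama:11434/api/version", timeout=5).json())
print("client:", importlib.metadata.version("ollama"))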

@adar2378
Contributor

Probably some bug in the Python client; I opened a ticket here:
ollama/ollama-python#487

@CristianoMafraJunior
Contributor

Looks like we're using Ollama-Python lol

@peterromao
Author

Curious as to what the answer will be for @adar2378's ticket.

@ayushin
Contributor

ayushin commented Apr 4, 2025

I upgraded the Python client and the brew/Docker installs to the latest versions, and the bug is gone. Could somebody confirm?

@ayushin
Contributor

ayushin commented Apr 5, 2025

It magically worked for me yesterday and now it's broken again; we should get it fixed upstream.
