fix(ollama): Handle non-tool-call JSON response when format=json #9966

Open · wants to merge 4 commits into main
Conversation

arjunprabhulal

@arjunprabhulal arjunprabhulal commented Apr 13, 2025

Issue
fix(ollama): Handle non-tool-call JSON response when format=json
The issue occurs when using `from google.adk.models.lite_llm import LiteLlm` with llama3.2 and gemma3:27b.

Error

venv/lib/python3.12/site-packages/litellm/llms/ollama/completion/transformation.py", line 266, in transform_response
    "name": function_call["name"],
            ~~~~~~~~~~~~~^^^^^^^^
KeyError: 'name'

Changes
This PR addresses a KeyError: 'name' that occurs in litellm/llms/ollama/completion/transformation.py when using certain Ollama models (e.g., ollama/llama3.2:latest, ollama/gemma3:27b) with tool calling enabled (format="json").
Problem:

When LiteLLM expects a tool call response, it requests format="json" from Ollama. However, some models return a valid JSON string in the response field, but the structure of this JSON does not match the expected tool call format ({"name": ..., "arguments": ...}). Instead, it might contain other structures (e.g., Schema.org JSON as seen with llama3.2). When transformation.py parses this JSON and attempts to access function_call["name"], it results in a KeyError.
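The defensive check described above can be sketched as follows. This is an illustrative sketch, not the actual PR diff; the helper name `transform_ollama_response` and the simplified input/output shapes are assumptions for the example.

```python
import json


def transform_ollama_response(response_json: dict) -> dict:
    """Illustrative sketch (hypothetical helper, not the real litellm code):
    parse Ollama's `response` field defensively when format="json" was
    requested. `response` is assumed to be a JSON-encoded string, as in
    Ollama's /api/generate payload.
    """
    raw = response_json.get("response", "")
    try:
        function_call = json.loads(raw)
    except json.JSONDecodeError:
        function_call = None

    # Only treat the payload as a tool call if it actually matches the
    # expected {"name": ..., "arguments": ...} structure.
    if isinstance(function_call, dict) and "name" in function_call:
        return {
            "tool_calls": [{
                "function": {
                    "name": function_call["name"],
                    "arguments": json.dumps(function_call.get("arguments", {})),
                }
            }]
        }

    # Otherwise return the JSON as ordinary assistant content instead of
    # raising KeyError: 'name'.
    return {"content": raw}
```

With this guard, a Schema.org-style JSON response (as seen with llama3.2) falls through to plain content instead of crashing, while a well-formed tool call is still transformed.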


@arjunprabhulal
Author

This PR addresses the infinite-loop issue in ADK that several users have reported against function calling. Although the codebase directly uses google.adk.models.lite_llm.LiteLlm, that ADK module acts as a wrapper around the litellm library. When configured with an Ollama model (e.g., "ollama/..."), the ADK delegates the call to litellm, which internally uses litellm/llms/ollama/completion/transformation.py to process the Ollama API response. Changes in that litellm file therefore directly affect the behavior observed through the ADK with Ollama models, so merging this PR is necessary to resolve the issue.

@arjunprabhulal
Author

Here is the full code repo showing the fix above working with external third-party tools such as ADK: function calling with ollama/gemma3:27b, ollama/llama3.2:latest, and other models.

https://github.com/arjunprabhulal/adk-gemma3-function-calling

@arjunprabhulal
Author

The change above also fixes the behavior in adk web.

Before the change: (screenshot)

After the fix: (screenshot)

@arjunprabhulal
Author

@ishaan-jaff - Please take a look at this; I'm happy to provide more information.
