fix(ollama_api): normalise tool_calls during Ollama→OpenAI conversion#1481
llama.cpp's common_chat_msgs_parse_oaicompat strictly validates the
OpenAI spec when processing message history. Ollama clients violate
the spec in two ways that cause 500 errors:
1. Missing "type":"function" on tool_call objects
The Ollama API spec does not require a "type" field, but llama.cpp
throws "Missing tool call type" if it is absent. Fix by injecting
"type":"function" on any tool_call that lacks it.
2. Non-JSON arguments string (e.g. arguments="{")
Ollama clients may persist incomplete streaming state to conversation
history mid-stream. llama.cpp's func_args_not_string() tries to
parse the arguments string as JSON and throws a parse_error 500.
Fix by skipping tool calls whose arguments are not valid JSON rather
than forwarding them to the backend.
Pull request overview
Normalizes Ollama tool_calls when converting an Ollama chat request into an OpenAI-compatible chat request to avoid llama.cpp 500s caused by strict OpenAI-spec validation.
Changes:
- Injects missing `tool_calls[].type = "function"` during Ollama → OpenAI request conversion.
- Validates `tool_calls[].function.arguments` strings as JSON and drops tool calls with invalid JSON arguments.
- Omits `tool_calls` entirely when all tool calls are filtered out.
```diff
 if (msg.contains("tool_calls")) {
-    openai_msg["tool_calls"] = msg["tool_calls"];
+    json tool_calls = json::array();
+    for (auto tc : msg["tool_calls"]) {
+        if (!tc.contains("type")) {
+            tc["type"] = "function";
+        }
+        // Validate arguments JSON if present
+        if (tc.contains("function") && tc["function"].contains("arguments")) {
```
msg.contains("tool_calls") doesn’t verify the field is an array, and the loop doesn’t guard that each tc is an object. If a malformed Ollama client sends tool_calls as a non-array or includes non-object entries, tc["type"] = "function" will throw a nlohmann::json type_error and still produce a 500. Consider checking msg["tool_calls"].is_array() and skipping any tc that isn’t an object before mutating/inspecting it (and similarly ensure tc["function"] is an object before reading arguments).
@copilot apply changes based on this feedback