fix(models): handle mixed tool responses and text/media in LiteLLM ad… #4109
Changes shown from 2 commits. Commits in this PR: 36dfe40, 1b93ecb, 01c3773, 4768ddf, fcd150d, 827487d.
```diff
@@ -448,20 +448,11 @@ async def _content_to_message_param(
     *,
     provider: str = "",
 ) -> Union[Message, list[Message]]:
-  """Converts a types.Content to a litellm Message or list of Messages.
-
-  Handles multipart function responses by returning a list of
-  ChatCompletionToolMessage objects if multiple function_response parts exist.
-
-  Args:
-    content: The content to convert.
-    provider: The LLM provider name (e.g., "openai", "azure").
-
-  Returns:
-    A litellm Message, a list of litellm Messages.
-  """
+  """Converts a types.Content to a litellm Message or list of Messages."""

   tool_messages = []
+  other_parts = []

   for part in content.parts:
     if part.function_response:
       response = part.function_response.response
```
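The partitioning introduced here (tool-response parts vs. everything else, and the bare-vs-list return convention) can be sketched standalone. The `ToolMessage` and `Part` classes below are hypothetical stand-ins, not the real litellm `ChatCompletionToolMessage` or genai `types.Part`:

```python
from dataclasses import dataclass
from typing import Optional, Union


@dataclass
class ToolMessage:
  """Stand-in for litellm's ChatCompletionToolMessage (hypothetical shape)."""
  tool_call_id: str
  content: str


@dataclass
class Part:
  """Stand-in for genai types.Part; only the field this sketch needs."""
  function_response: Optional[dict] = None


def split_parts(parts: list[Part]) -> Union[ToolMessage, list[ToolMessage], None]:
  tool_messages: list[ToolMessage] = []
  other_parts: list[Part] = []
  for part in parts:
    if part.function_response:
      tool_messages.append(
          ToolMessage(
              tool_call_id=part.function_response["id"],
              content=str(part.function_response["response"]),
          )
      )
    else:
      other_parts.append(part)
  # Short-circuit only when the content is purely tool responses; mixed
  # content falls through so text/media parts are not dropped.
  if tool_messages and not other_parts:
    return tool_messages if len(tool_messages) > 1 else tool_messages[0]
  return None  # mixed/non-tool content is handled separately in the real code


single = split_parts([Part(function_response={"id": "c1", "response": "ok"})])
many = split_parts([
    Part(function_response={"id": "c1", "response": "ok"}),
    Part(function_response={"id": "c2", "response": "fine"}),
])
print(type(single).__name__, len(many))  # ToolMessage 2
```

Returning a bare message for the single-response case preserves the adapter's previous behavior for callers that expect one `Message` rather than a list.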
```diff
@@ -477,11 +468,62 @@ async def _content_to_message_param(
           content=response_content,
         )
       )
-  if tool_messages:
+    else:
+      other_parts.append(part)
+
+  if tool_messages and not other_parts:
     return tool_messages if len(tool_messages) > 1 else tool_messages[0]

-  # Handle user or assistant messages
-  role = _to_litellm_role(content.role)
+  extra_message = None
+
+  if other_parts:
+    role = _to_litellm_role(content.role)
+
+    if role == "user":
+      user_parts = [part for part in other_parts if not part.thought]
+      message_content = await _get_content(user_parts, provider=provider) or None
+      if message_content:
+        extra_message = ChatCompletionUserMessage(role="user", content=message_content)
+
+    else:
+      tool_calls = []
+      content_parts: list[types.Part] = []
+      reasoning_parts: list[types.Part] = []
+
+      for part in other_parts:
+        if part.function_call:
+          tool_calls.append(
+              ChatCompletionAssistantToolCall(
+                  type="function",
+                  id=part.function_call.id,
+                  function=Function(
+                      name=part.function_call.name,
+                      arguments=_safe_json_serialize(part.function_call.args),
+                  ),
+              )
+          )
+        elif part.thought:
+          reasoning_parts.append(part)
+        else:
+          content_parts.append(part)
+
+      message_content = await _get_content(content_parts, provider=provider) or None
+      reasoning_content = "\n".join([p.thought for p in reasoning_parts]) if reasoning_parts else None
```
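The assistant branch above partitions the remaining parts three ways before assembling the message. A minimal sketch, using a stand-in `Part` and plain dicts in place of the real `ChatCompletionAssistantToolCall`/`Function` classes (the dict shapes are assumptions for illustration):

```python
import json
from dataclasses import dataclass
from typing import Optional


@dataclass
class Part:
  """Stand-in for genai types.Part; thought is a boolean marker flag."""
  text: Optional[str] = None
  thought: bool = False
  function_call: Optional[dict] = None


def partition_assistant_parts(parts: list[Part]):
  """Split parts into tool calls, plain content, and reasoning parts."""
  tool_calls, content_parts, reasoning_parts = [], [], []
  for part in parts:
    if part.function_call:
      tool_calls.append({
          "type": "function",
          "id": part.function_call["id"],
          "function": {
              "name": part.function_call["name"],
              "arguments": json.dumps(part.function_call["args"]),
          },
      })
    elif part.thought:
      reasoning_parts.append(part)
    else:
      content_parts.append(part)
  return tool_calls, content_parts, reasoning_parts


parts = [
    Part(text="thinking...", thought=True),
    Part(text="final answer"),
    Part(function_call={"id": "c1", "name": "search", "args": {"q": "adk"}}),
]
calls, content, reasoning = partition_assistant_parts(parts)
print(len(calls), len(content), len(reasoning))  # 1 1 1
```

The three buckets then feed `tool_calls`, `content`, and `reasoning_content` of the assistant message, so a reply mixing text, tool calls, and reasoning no longer loses any of the three.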
Review comment on the `reasoning_content` line (marked outdated), with a suggested change:

```diff
-      reasoning_content = "\n".join([p.thought for p in reasoning_parts]) if reasoning_parts else None
+      reasoning_content = "\n".join([p.text for p in reasoning_parts if p.text]) if reasoning_parts else None
```
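The suggestion can be demonstrated with the same stand-in `Part` (hypothetical, but mirroring the convention the diff itself relies on in `elif part.thought:` that `thought` is a boolean marker while the reasoning text lives in `text`): joining `p.thought` values joins booleans and raises, whereas joining `p.text` yields the intended string.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Part:
  """Stand-in for genai types.Part (assumed shape)."""
  text: Optional[str] = None
  thought: bool = False  # marker flag only; not the reasoning text itself


reasoning_parts = [
    Part(text="step 1", thought=True),
    Part(text="step 2", thought=True),
]

try:
  "\n".join([p.thought for p in reasoning_parts])  # joins booleans, not text
except TypeError as e:
  print("original line raises:", e)

# Suggested form: join the text of the thought parts instead.
good = "\n".join([p.text for p in reasoning_parts if p.text])
print(good)  # step 1 / step 2 on two lines
```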