Tools from Streaming Chat Completion not triggering #313
Comments
I can have gpt-4 make tool calls when using CompleteChatStreamingAsync. But when I send the tool result back to the AI, I always get "Invalid parameter: messages with role 'tool' must be a response to a preceeding message with 'tool_calls'." Did you get tool calling with CompleteChatStreamingAsync working? I'm using the OpenAI 2.1.0 .NET SDK.
I have spotted the issue that stopped my code from triggering the tools. It was 100% a PEBCAK issue: I was not passing the options (which contain the tools) to my abstraction of ChatClient, so under the hood they were never passed to the real ChatClient.
I am now getting the same error you are seeing. When I add the response from the call as a ToolChatMessage and reprocess the chat completion, I get a 400 response with the same error.
Any idea what might be causing this error?
Managed to get the tool calls working. For some reason, when responding to a tool call, you need to add an AssistantChatMessage for the tool call before adding the ToolChatMessage.

Red - Adding the AssistantChatMessage to give context of the tool being called.
Blue - Adding the ToolChatMessage to give the response from the tool.
I will be honest, it does feel a bit strange having to feed back that the tool was called. I would expect the ChatClient to know this, since it made the request for the tool call. Any thoughts on this?
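A minimal sketch of that ordering, assuming the OpenAI .NET 2.x `OpenAI.Chat` types, an existing `messages`/`options` pair, and a hypothetical local `GetToday()` function (this is an illustration, not the code from the screenshot):

```csharp
using OpenAI.Chat;

// `toolCalls` is assumed to be the List<ChatToolCall> assembled from the
// streaming updates for the assistant's tool-call turn.

// 1. Echo the tool calls back as an AssistantChatMessage so that the
//    following ToolChatMessage has a preceding message with 'tool_calls'.
messages.Add(new AssistantChatMessage(toolCalls));

// 2. Add one ToolChatMessage per tool call, keyed by the tool call id.
foreach (ChatToolCall toolCall in toolCalls)
{
    messages.Add(new ToolChatMessage(toolCall.Id, GetToday()));
}

// 3. Request a new completion with the updated message list.
ChatCompletion completion = await client.CompleteChatAsync(messages, options);
```

The key point is step 1: the ToolChatMessage in step 2 is only accepted because the preceding AssistantChatMessage carries the matching tool calls.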
Thanks, Rich. I send back the tool result using the CompleteChat() method, not the streaming version. When the AI receives the result, the streaming continues, because we're still in the "await foreach (StreamingChatCompletionUpdate completionUpdate in completionUpdates)" loop. Do you think it's better to send the whole chat history instead of just these two messages? Maybe the need to feed back the tool call is because of parallel tool calling? Although the toolCall.Id of the ToolChatMessage should be enough to determine which message it is in response to. But maybe it's because of the functionArguments that the tool was called with. Hope this helps other people as well.
Thank you for reaching out, @RichOwenMercury and @Raivr-dev. I wanted to call out that the chat completions API is technically stateless (contrary to the Assistants preview API). In other words, the server does not remember any requests that you have sent previously. Because of this, you need to send all the messages in a conversation with each new request. If you omit any messages, the model has no way to know about them, and it will generate a response as if those messages didn't exist. In that sense, the reason why you need to include the AssistantChatMessage with the tool calls is that the service would otherwise have no record of ever having requested them. We have an example on how to use tool calls with streaming that you might find helpful. You can find it here:
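Concretely, because nothing is stored server side, each follow-up request repeats the entire conversation, including the assistant turn that requested the tools. A sketch of what a single request's message list might look like (the content, `toolCalls`, and `GetToday()` are assumptions):

```csharp
using OpenAI.Chat;

// `toolCalls` is the ChatToolCall list captured from the previous streamed
// response; GetToday() is the hypothetical local tool function.
List<ChatMessage> messages =
[
    new SystemChatMessage("You are a helpful assistant."),
    new UserChatMessage("What is today's date?"),
    new AssistantChatMessage(toolCalls),               // the tool_calls turn
    new ToolChatMessage(toolCalls[0].Id, GetToday()),  // the tool result
];

// The full list above is sent again on every request.
await foreach (StreamingChatCompletionUpdate update
    in client.CompleteChatStreamingAsync(messages, options))
{
    foreach (ChatMessageContentPart part in update.ContentUpdate)
    {
        Console.Write(part.Text);
    }
}
```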
Thank you, @joseharriaga, that makes sense. But when I send just the assistantMessage and toolChatMessage in response to the ChatFinishReason.ToolCalls (as described above), it continues doing what it was prompted to do in the initial user prompt. So there seems to be memory. Thanks for the examples.
The API being stateless makes things a lot clearer and explains the behaviour I was seeing. The streaming example is exactly what I was looking for and is very similar to what I have now implemented. Working as expected now, thank you @joseharriaga.
I have used the example here: I have spotted that the StreamingChatToolCallsBuilder is not available, so I have replaced it with a List and used Select (LINQ) to convert it to a ChatToolCall array to pass to the AssistantChatMessage:
This still triggers the tools as expected; however, the issue I am now getting is that I have a tool with parameters, but when the tool is triggered the arguments are always empty.
You can find the StreamingChatToolCallsBuilder here:
The FunctionArguments are received in chunks over several ToolCallUpdates in multiple StreamingChatCompletionUpdates.
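In other words, the argument fragments have to be stitched together per tool call (keyed by the update's Index) before the ChatToolCall list is built. A rough sketch of that accumulation using plain dictionaries instead of the StreamingChatToolCallsBuilder (property names as in the 2.x SDK; error handling omitted):

```csharp
using System.Linq;
using System.Text;
using OpenAI.Chat;

// Per-index accumulators: the id and name usually arrive in the first chunk,
// while the JSON arguments trickle in across later chunks.
Dictionary<int, string> ids = new();
Dictionary<int, string> names = new();
Dictionary<int, StringBuilder> args = new();

await foreach (StreamingChatCompletionUpdate update
    in client.CompleteChatStreamingAsync(messages, options))
{
    foreach (StreamingChatToolCallUpdate toolUpdate in update.ToolCallUpdates)
    {
        if (!string.IsNullOrEmpty(toolUpdate.ToolCallId))
            ids[toolUpdate.Index] = toolUpdate.ToolCallId;

        if (!string.IsNullOrEmpty(toolUpdate.FunctionName))
            names[toolUpdate.Index] = toolUpdate.FunctionName;

        if (!args.TryGetValue(toolUpdate.Index, out StringBuilder? builder))
            args[toolUpdate.Index] = builder = new StringBuilder();

        builder.Append(toolUpdate.FunctionArgumentsUpdate.ToString());
    }

    if (update.FinishReason == ChatFinishReason.ToolCalls)
    {
        // Only now are the arguments complete for each tool call.
        List<ChatToolCall> toolCalls = ids.Keys
            .Select(i => ChatToolCall.CreateFunctionToolCall(
                ids[i],
                names[i],
                BinaryData.FromString(args[i].ToString())))
            .ToList();
        // ...append the AssistantChatMessage and ToolChatMessages as shown earlier.
    }
}
```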
Amazing, makes a lot more sense now. I thought the StreamingChatToolCallsBuilder was part of the NuGet package, but I have found it now :). Thanks for the help @joseharriaga & @Raivr-dev
Service
Azure OpenAI
Describe the bug
When using the CompleteChatStreamingAsync method on the ChatClient, I have tools configured with ToolChoice set to Auto. However, whenever I send a user message that should trigger the tool, the ToolCallUpdates collection on the StreamingChatCompletionUpdate is always empty.
To test this I have a simple get_today tool; when executed, the function returns the formatted datetime. I have provided the code below.
Not sure if this is something wrong with my implementation or with the SDK itself.
Steps to reproduce
Expected:
A ToolCallUpdate is raised, which then allows me to call the required method.
Actual:
No ToolCallUpdates are raised, so the chat returns the text '{get_today}'.
Code snippets
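A minimal sketch of the setup described in the bug report (the endpoint, deployment name, and prompt are placeholders; this is not the author's original snippet):

```csharp
using Azure.AI.OpenAI;
using Azure.Identity;
using OpenAI.Chat;

// Assumed Azure OpenAI wiring; endpoint and deployment name are placeholders.
AzureOpenAIClient azureClient = new(
    new Uri("https://my-resource.openai.azure.com/"),
    new DefaultAzureCredential());
ChatClient client = azureClient.GetChatClient("gpt-4");

// The get_today tool from the bug description: no parameters, returns the
// formatted current date when executed locally.
ChatCompletionOptions options = new()
{
    ToolChoice = ChatToolChoice.CreateAutoChoice(),
    Tools =
    {
        ChatTool.CreateFunctionTool(
            functionName: "get_today",
            functionDescription: "Returns today's date as a formatted string."),
    },
};

List<ChatMessage> messages = [new UserChatMessage("What is today's date?")];

// Expected: ToolCallUpdates carry a get_today call once FinishReason is ToolCalls.
await foreach (StreamingChatCompletionUpdate update
    in client.CompleteChatStreamingAsync(messages, options))
{
    foreach (StreamingChatToolCallUpdate toolUpdate in update.ToolCallUpdates)
    {
        Console.WriteLine($"Tool call update: {toolUpdate.FunctionName}");
    }
}
```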
OS
Windows
.NET version
8
Library version
2.1.0