
Tools from Streaming Chat Completion not triggering #313

Closed
RichOwenMercury opened this issue Dec 23, 2024 · 11 comments
Assignees
Labels
bug Something isn't working

Comments

@RichOwenMercury

RichOwenMercury commented Dec 23, 2024

Service

Azure OpenAI

Describe the bug

When using the CompleteChatStreamingAsync method on the ChatClient, I have tools configured with ToolChoice set to Auto. However, whenever I send a user message that should trigger a tool, the ToolCallUpdates collection on the StreamingChatCompletionUpdate is always empty.

To test this I have a simple get_today tool whose function, when executed, returns the formatted date and time. I have provided the code below.

Not sure if this is something wrong with the SDK or with my configuration.

Steps to reproduce

  1. Send a User Message "What is today's date?"

Expected:
A ToolCallUpdate is raised, which then allows me to call the required method.

Actual:
No ToolCallUpdates are raised, so the chat returns the text '{get_today}'.

Code snippets

// Main Code

var endpoint = "";
var apiKey = "";
var deployment = "";

var creds = new AzureKeyCredential(apiKey);
var client = new AzureOpenAIClient(new Uri(endpoint), creds);
var chatClient = client.GetChatClient(deployment);

var tools = GetTools(); // Returns Dictionary<string, ITool>

var options = new ChatCompletionOptions();
if (tools.Any())
{
    options.ToolChoice = ChatToolChoice.CreateAutoChoice();
    
    foreach (var toolData in tools)
    {
        var tool = toolData.Value;

        options.Tools.Add(ChatTool.CreateFunctionTool(tool.Name, tool.Description, tool.Parameters));
    }
}

var messages = new List<ChatMessage>
{
    new SystemChatMessage("You are a helpful assistant. Always lookup the current date and/or time using the tool 'get_today' when required."),
    new UserChatMessage("What is today's date?")
};

var completionUpdates = chatClient.CompleteChatStreamingAsync(messages, options);

await foreach (var update in completionUpdates)
{
    Console.WriteLine("Finish Reason: {0}", update.FinishReason);
    if (update.FinishReason == ChatFinishReason.FunctionCall)
    {
        // Would expect here to be hit (1/2)
    }

    foreach (var toolCall in update.ToolCallUpdates)
    {
        // Would expect here to be hit (2/2)
    }
    
    foreach (var contentPart in update.ContentUpdate)
    {
        Console.WriteLine(contentPart.Text);
    }
}

// ITool
public interface ITool
{
    public string Name { get; }
    public string Description { get; }
    public BinaryData Parameters { get; }

    Task<string> RunAsync(string args);
}

// GetTodayTool
public class GetTodayTool : ITool
{
    public string Name => "get_today";
    public string Description => "Know the date and time of today. Will return in format dd/MM/yyyy HH:mm:ss";
    public BinaryData Parameters => BinaryData.FromString("{}");

    public Task<string> RunAsync(string args)
    {
        return Task.FromResult(DateTime.Now.ToString("dd/MM/yyyy HH:mm:ss"));
    }
}

OS

winOS

.NET version

8

Library version

2.1.0

@RichOwenMercury RichOwenMercury added the bug Something isn't working label Dec 23, 2024
@RichOwenMercury RichOwenMercury changed the title Calling Tools from Streaming Chat Completion not triggering Tools from Streaming Chat Completion not triggering Dec 23, 2024
@Raivr-dev

I can get gpt-4 to make tool calls when using CompleteChatStreamingAsync. I do receive completionUpdate.ToolCallUpdates and ChatFinishReason.ToolCalls.

But when I send the tool result back to the AI, I always get: "Invalid parameter: messages with role 'tool' must be a response to a preceding message with 'tool_calls'."

Did you get tool calling with CompleteChatStreamingAsync working?

I'm using the OpenAI 2.1.0 .NET SDK.

@RichOwenMercury
Author

RichOwenMercury commented Jan 27, 2025 via email

@RichOwenMercury
Author

RichOwenMercury commented Jan 27, 2025 via email

@Raivr-dev

Thanks, Rich.
I also got it working when the ToolChatMessage is in response to a 'tool_calls' AssistantChatMessage.
And the empty JSON ("{}") for Parameters is necessary; otherwise it gave me this error: "System.ArgumentNullException: 'Value cannot be null. (Parameter 'bytes')'".

I send back the tool result using the CompleteChat() method, not the streaming version. When the AI receives the result, the streaming continues, because we're still in the "await foreach (StreamingChatCompletionUpdate completionUpdate in completionUpdates)" loop.
I just send back the assistantMessage and toolChatMessage, and the AI seems to respond faster this way:
var toolResponseMessages = new List<ChatMessage> { assistantMessage, toolChatMessage };
chatClient.CompleteChat(toolResponseMessages);

Do you think it's better to send the whole chat history instead of just these two messages?

Maybe the reason for needing to feed back the tool call is parallel tool calling? Although the ToolCallId of the ToolChatMessage should be enough to determine which message it is in response to. But maybe it's because of the function arguments the tool was called with.

Hope this helps other people as well.

@joseharriaga
Collaborator

Thank you for reaching out, @RichOwenMercury and @Raivr-dev. I wanted to call out that the chat completions API is technically stateless (contrary to the Assistants preview API). In other words, the server does not remember any requests that you have sent previously. Because of this, you need to send all the messages in a conversation with each new request. If you omit any messages, the model has no way to know about them, and it will generate a response as if those messages didn't exist. In that sense, the reason why you need to include the AssistantChatMessage asking for the tool call before including the ToolChatMessage with the result of that tool call is because the model does not remember that it had previously asked for a tool call. Instead, it relies on you sending the entire "history" in your request so that it can process the whole conversation from start to finish.

We have an example on how to use tool calls with streaming that you might find helpful. You can find it here:
🔗 https://github.com/openai/openai-dotnet/blob/main/examples/Chat/Example04_FunctionCallingStreamingAsync.cs
🔗 https://github.com/openai/openai-dotnet/blob/main/examples/Chat/Example04_FunctionCallingStreaming.cs
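
The stateless round trip described above can be sketched as follows. This is a non-streaming sketch for brevity, not runnable on its own: `chatClient` and `options` are assumed to be configured as in the original snippet, and `ExecuteToolAsync` is a hypothetical dispatcher over the ITool implementations from this issue.

```csharp
// Sketch of the stateless tool-call round trip: every request carries the
// FULL message history, including the assistant message that requested the
// tool calls and the matching tool results.
var messages = new List<ChatMessage>
{
    new SystemChatMessage("You are a helpful assistant."),
    new UserChatMessage("What is today's date?")
};

ChatCompletion completion = chatClient.CompleteChat(messages, options);

while (completion.FinishReason == ChatFinishReason.ToolCalls)
{
    // 1. Append the assistant message that asked for the tool calls.
    messages.Add(new AssistantChatMessage(completion));

    // 2. Append one ToolChatMessage per requested call, matched by Id.
    foreach (ChatToolCall toolCall in completion.ToolCalls)
    {
        string result = await ExecuteToolAsync(
            toolCall.FunctionName, toolCall.FunctionArguments.ToString());
        messages.Add(new ToolChatMessage(toolCall.Id, result));
    }

    // 3. Resend the ENTIRE history; the server remembers nothing between calls.
    completion = chatClient.CompleteChat(messages, options);
}

Console.WriteLine(completion.Content[0].Text);
```

Dropping step 1 is what produces the "messages with role 'tool' must be a response to a preceding message with 'tool_calls'" error mentioned earlier in this thread.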

@Raivr-dev

Thank you, @joseharriaga, that makes sense.

But, when I send just the assistantMessage and toolChatMessage in response to the ChatFinishReason.ToolCalls (as described above), it continues doing what it was asked in the initial user prompt. So there seems to be memory.
My user prompt was "Ask the user for a number, then write a short story about that number, maximum three sentences". I had one ChatTool 'getNumberFromUserTool'.
It works as expected, without sending the initial system prompt and user prompt again.

Thanks for the examples.

@RichOwenMercury
Author

The API being stateless makes things a lot clearer and explains the behaviour I was seeing.

The streaming example is exactly what I was looking for and is very similar to what I have now implemented.

Working as expected now, thank you @joseharriaga.

@RichOwenMercury
Author

RichOwenMercury commented Jan 28, 2025

Have used the example here:
https://github.com/openai/openai-dotnet/blob/main/examples/Chat/Example04_FunctionCallingStreamingAsync.cs

I have spotted that the StreamingChatToolCallsBuilder is not available. I have replaced it with a List<StreamingChatToolCallUpdate> and used Select (LINQ) to convert to a ChatToolCall collection to pass to AssistantChatMessage:

var toolCalls = toolCallUpdates.Select(toolCall =>
{
    return ChatToolCall.CreateFunctionToolCall(toolCall.ToolCallId, toolCall.FunctionName, toolCall.FunctionArgumentsUpdate); 
});

var assistantMessage = new AssistantChatMessage(toolCalls);

This still triggers the tools as expected; however, the issue I am now getting is that I have a tool with parameters, but when the tool is triggered, the arguments are always empty.

@joseharriaga
Collaborator

You can find the StreamingChatToolCallsBuilder in the sync version of that example here:
🔗 https://github.com/openai/openai-dotnet/blob/main/examples/Chat/Example04_FunctionCallingStreaming.cs

@Raivr-dev

I have a tool with parameters, but when the tool is triggered the arguments are always empty.

The FunctionArguments are received in chunks over several ToolCallUpdates in multiple StreamingChatCompletionUpdates.
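
To illustrate that chunking: each update may carry only a fragment of the arguments JSON, keyed by the tool call's index, so the fragments must be buffered and concatenated before parsing (this is what StreamingChatToolCallsBuilder does). A minimal self-contained sketch of the accumulation pattern; the tuples here are simulated stand-ins for StreamingChatToolCallUpdate values, and get_weather/call_abc123 are made-up example data:

```csharp
using System;
using System.Collections.Generic;
using System.Text;

class ToolCallAccumulatorDemo
{
    static void Main()
    {
        // Simulated stream: (index, toolCallId, functionName, argumentsChunk).
        // In the real SDK, the id and function name typically appear only on
        // the first chunk for a given index; later chunks carry argument text.
        var updates = new (int Index, string Id, string Name, string Args)[]
        {
            (0, "call_abc123", "get_weather", "{\"ci"),
            (0, null,          null,          "ty\":\"Par"),
            (0, null,          null,          "is\"}"),
        };

        var ids = new Dictionary<int, string>();
        var names = new Dictionary<int, string>();
        var args = new Dictionary<int, StringBuilder>();

        foreach (var u in updates)
        {
            if (u.Id != null) ids[u.Index] = u.Id;
            if (u.Name != null) names[u.Index] = u.Name;
            if (!args.TryGetValue(u.Index, out var sb))
                args[u.Index] = sb = new StringBuilder();
            sb.Append(u.Args); // concatenate fragments in arrival order
        }

        foreach (var index in ids.Keys)
            Console.WriteLine($"{names[index]}({ids[index]}): {args[index]}");
        // Prints: get_weather(call_abc123): {"city":"Paris"}
    }
}
```

Passing a single chunk's FunctionArgumentsUpdate straight into ChatToolCall.CreateFunctionToolCall, as in the snippet above, only ever captures one fragment, which is why the arguments came through empty.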

@joseharriaga joseharriaga self-assigned this Jan 29, 2025
@RichOwenMercury
Author

Amazing, that makes a lot more sense now. I thought the StreamingChatToolCallsBuilder was part of the NuGet package, but I have found it now :).

Thanks for the help @joseharriaga & @Raivr-dev
