Can MCP Chunk the files? #76
I am using the Azure.AI.OpenAI client and trying out this MCP C# SDK, and I am getting this error:

How would I go ahead and fix the chunking boundary/chunk construction?
Can you share a standalone repro? How are you constructing the messages?
That was quick :D I am just trying out this SDK today, and I am loving it so far: it is so easy to use and to write tools with! Thank you! So far I wrote these tools:

But I used the stock sample app to do the chat/MCP integration; I know it is my fault, since that is the most basic thing to do:

```csharp
using Azure.AI.OpenAI;
using Microsoft.Extensions.AI;

// "credential" and "tools" are defined elsewhere in the sample.
AzureOpenAIClient azureClient = new(
    new Uri("https://......openai.azure.com"),
    credential);

using IChatClient chatClient = azureClient.AsChatClient("o3-mini")
    .AsBuilder().UseFunctionInvocation().Build();

List<ChatMessage> messages = [];
while (true)
{
    Console.Write("Q: ");
    messages.Add(new(ChatRole.User, Console.ReadLine()));

    // Stream the response, echoing each update as it arrives.
    List<ChatResponseUpdate> updates = [];
    await foreach (var update in chatClient.GetStreamingResponseAsync(messages, new() { Tools = [.. tools] }))
    {
        Console.Write(update);
        updates.Add(update);
    }
    Console.WriteLine();

    // Fold the streamed updates back into the conversation history.
    messages.AddMessages(updates);
}
```
Thanks. I think there are likely two aspects here.

First, there's the limit OpenAI imposes on the length of an individual message. That could be addressed by splitting the single message up into multiple messages, as you say.

The more challenging aspect is the token limit. If you're sending a 1 MB message, that's likely several hundred thousand tokens, which is very likely to exceed your model's context window. No amount of splitting the tool-result message into multiple messages will help with that.

Both of these can be addressed outside the MCP library by plugging a custom IChatClient into the pipeline after the FunctionInvokingChatClient to cull the results coming back from the tool. But it would be really hard for a general-purpose middleware filter to do that for an arbitrary tool result: it would have to remove content from that result, and it's not clear how it could decide what the "right thing" to drop is. It would be better if the tool didn't return 1 MB of content in the first place.
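For illustration, here is a minimal sketch of that middleware idea, assuming a recent Microsoft.Extensions.AI surface (the same GetResponseAsync/GetStreamingResponseAsync methods the sample above uses, plus DelegatingChatClient and FunctionResultContent). The ToolResultTrimmingChatClient name, the 8,000-character budget, and the blunt truncation strategy are all invented for this example; as noted above, a real filter would need tool-specific knowledge to cut content safely.

```csharp
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.AI;

// Sits *after* FunctionInvokingChatClient in the pipeline, so it sees the
// tool-result messages before they reach the model and can cull oversized ones.
public sealed class ToolResultTrimmingChatClient(IChatClient innerClient) : DelegatingChatClient(innerClient)
{
    private const int MaxResultChars = 8_000; // assumed budget; tune for your model's context window

    public override Task<ChatResponse> GetResponseAsync(
        IEnumerable<ChatMessage> messages, ChatOptions? options = null, CancellationToken cancellationToken = default)
        => base.GetResponseAsync(TrimToolResults(messages), options, cancellationToken);

    public override IAsyncEnumerable<ChatResponseUpdate> GetStreamingResponseAsync(
        IEnumerable<ChatMessage> messages, ChatOptions? options = null, CancellationToken cancellationToken = default)
        => base.GetStreamingResponseAsync(TrimToolResults(messages), options, cancellationToken);

    // Truncates any string tool result that exceeds the character budget.
    // Mutates the message contents in place, which is fine for a throwaway sketch.
    private static IEnumerable<ChatMessage> TrimToolResults(IEnumerable<ChatMessage> messages)
    {
        foreach (ChatMessage message in messages)
        {
            foreach (AIContent content in message.Contents)
            {
                if (content is FunctionResultContent { Result: string text } result &&
                    text.Length > MaxResultChars)
                {
                    result.Result = text[..MaxResultChars] + "...[truncated]";
                }
            }

            yield return message;
        }
    }
}
```

If the builder applies middleware outermost-first, wiring it in as `.AsBuilder().UseFunctionInvocation().Use(inner => new ToolResultTrimmingChatClient(inner)).Build()` should place the trimmer between the function invoker and the model client, so every tool result passes through it on the way to the model.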
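And on the "don't return 1 MB" point, the tool itself can expose the chunking. A hypothetical sketch using the SDK's attribute-based tool model ([McpServerToolType]/[McpServerTool]); the ReadFileChunk name, its parameters, and the 4,000-character default are made up for illustration:

```csharp
using System;
using System.ComponentModel;
using System.IO;
using ModelContextProtocol.Server;

[McpServerToolType]
public static class FileTools
{
    // Rather than returning an entire file (potentially hundreds of thousands
    // of tokens), expose an offset/count window so the model can page through
    // the content in bounded chunks and request the next slice as needed.
    [McpServerTool, Description("Reads up to 'count' characters of a text file starting at 'offset'.")]
    public static string ReadFileChunk(string path, int offset = 0, int count = 4_000)
    {
        string text = File.ReadAllText(path);
        if (offset >= text.Length)
        {
            return string.Empty; // past the end: nothing left to read
        }

        return text.Substring(offset, Math.Min(count, text.Length - offset));
    }
}
```

The model can then call the tool repeatedly, advancing offset until it gets an empty result, keeping each individual tool-result message comfortably inside the context window.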