Consider making a number of properties on ChatCompletionOptions public #306
Comments
I think the right way to implement the abstractions is to use the protocol methods. They will not only give you access to all otherwise-internal properties, but also let you avoid the cost of instantiating OpenAI model types and then copying their properties into the abstraction-layer model types. You can read about protocol methods here: https://learn.microsoft.com/en-us/dotnet/azure/sdk/protocol-convenience-methods?tabs=convenience-methods%2Csystem-clientmodel
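To make the suggestion concrete, here is a minimal sketch of calling a protocol method instead of the convenience overload. It assumes the `ChatClient` exposes a protocol overload taking `BinaryContent` (as System.ClientModel-based clients conventionally do); the exact request shape and overload may differ between library versions.

```csharp
using System;
using System.ClientModel;
using OpenAI.Chat;

ChatClient client = new("gpt-4o", Environment.GetEnvironmentVariable("OPENAI_API_KEY"));

// With the protocol method you build the wire-format body yourself,
// so every property — including ones the convenience model keeps
// internal — is under your control.
BinaryContent body = BinaryContent.Create(BinaryData.FromString("""
    {
      "model": "gpt-4o",
      "messages": [{ "role": "user", "content": "Hello" }],
      "stream": false
    }
    """));

ClientResult result = client.CompleteChat(body);
BinaryData rawJson = result.GetRawResponse().Content;
```

The trade-off the thread discusses is visible here: you gain full wire-format control, at the cost of hand-writing the request JSON.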
Correct me if I'm wrong, but it appears protocol methods necessitate working with a client. In the case of agents, though, we need to process the wire format from the perspective of a server. From our perspective, the cost of allocating the intermediate models is not a huge concern; we already do that on the client side of things. What we want to avoid is hardcoding/reimplementing the OpenAI wire format, so being able to use the existing DTOs from the server side would be amazing. cc @stephentoub
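For illustration, a server-side sketch of what "using the existing DTOs" could look like: OpenAI model types in System.ClientModel-based libraries conventionally implement `IJsonModel<T>`, which `ModelReaderWriter` can read. Whether `ChatCompletionOptions` exposes enough of this surface publicly is exactly what this issue is asking about, so treat this as an assumption, not current behavior.

```csharp
using System;
using System.ClientModel.Primitives;
using OpenAI.Chat;

// An incoming request body in the OpenAI wire format,
// as an agent acting as a server would receive it.
BinaryData requestBody = BinaryData.FromString("""
    {
      "model": "gpt-4o",
      "messages": [{ "role": "user", "content": "Hi" }],
      "stream": true
    }
    """);

// Deserialize into the library's own DTO instead of
// reimplementing the wire format by hand.
ChatCompletionOptions? options =
    ModelReaderWriter.Read<ChatCompletionOptions>(requestBody);
```

If this worked end to end, the agent could then map the populated DTO into the Microsoft.Extensions.AI model without touching raw JSON.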
Yeah, I did not mean to imply that the model allocations are something you should worry about. I was just saying that protocol methods give you all the flexibility, including some additional perf benefits. I understand that the cost of implementing these abstractions would be much higher if you were to call the protocol methods, so I am not proposing you do it wholesale. But would it be reasonable to do it just for the APIs where you need to access the additional properties? Also, are you aware that for output models (models returned from client methods), you can access all the "internal" properties using raw content? Take a look at ClientResult.GetRawResponse.
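A short sketch of the raw-content approach mentioned above: `ClientResult.GetRawResponse()` is part of System.ClientModel and returns the `PipelineResponse`, whose `Content` is the raw JSON body. The `system_fingerprint` field is used here only as an illustrative wire-format property.

```csharp
using System;
using System.ClientModel;
using System.ClientModel.Primitives;
using System.Text.Json;
using OpenAI.Chat;

ChatClient client = new("gpt-4o", Environment.GetEnvironmentVariable("OPENAI_API_KEY"));
ClientResult<ChatCompletion> result = client.CompleteChat("Hello");

// The raw response carries the full wire-format payload,
// even for properties the convenience model does not expose.
PipelineResponse response = result.GetRawResponse();
using JsonDocument doc = JsonDocument.Parse(response.Content);

string? fingerprint = doc.RootElement
    .TryGetProperty("system_fingerprint", out JsonElement fp)
    ? fp.GetString()
    : null;
```

Note this only helps for output models; it does not address the input-side DTO scenario the issue describes.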
To add to the list, it would be useful to get access to:
Currently getting around it with:

```csharp
using System.Reflection;
using OpenAI.Chat;

public static class OpenAIHelper
{
    public static ChatCompletionOptions GetOptionsNotIncludingStreamUsage()
    {
        ChatCompletionOptions options = new ChatCompletionOptions();

        // StreamOptions is internal, so it can only be reached via reflection.
        var streamOptionsPropertyInfo = typeof(ChatCompletionOptions)
            .GetProperty("StreamOptions", BindingFlags.NonPublic | BindingFlags.Instance);
        var streamOptionsInstance = streamOptionsPropertyInfo!.GetValue(options);

        // IncludeUsage is a public property on the (internal) stream options type.
        var streamOptionsType = streamOptionsInstance!.GetType();
        var includeUsagePropertyInfo = streamOptionsType
            .GetProperty("IncludeUsage", BindingFlags.Public | BindingFlags.Instance);
        includeUsagePropertyInfo!.SetValue(streamOptionsInstance, false);

        return options;
    }
}
```

Which isn't exactly a very maintainable hack 😅 My use case is quite specific, however: this is for an integration test that ensures our OpenAI proxy correctly appends
Confirm this is a feature request for the .NET library and not the underlying OpenAI API
Describe the feature or improvement you are requesting
I'm currently implementing a feature in Microsoft.Extensions.AI that maps OpenAI chat completion requests (as represented in the `ChatCompletionOptions` type) to the Microsoft.Extensions.AI model. This is to accelerate development of Copilot agents that must be able to accept completion requests in the OpenAI wire format.

One of the issues I've encountered is that a number of important properties in the `ChatCompletionOptions` class are currently marked internal. This includes the `Messages` property, the `Stream` property, and the `Model` property.

While I've been able to work around the issue temporarily using reflection, this isn't a viable solution longer term. Please consider making these (and other properties in the model) public so that your types can be used as DTOs in the context of agents implementing the OpenAI wire format.
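The reflection workaround mentioned above might look roughly like the following sketch. The property names `"Messages"`, `"Model"`, and `"Stream"` are taken from the issue text, but they are internal implementation details and could change between library versions, which is why the issue argues this is not viable long term.

```csharp
using System.Reflection;
using OpenAI.Chat;

// Hypothetical helper: reads an internal property off ChatCompletionOptions.
// Returns null if the property does not exist in this library version.
static object? GetInternalProperty(ChatCompletionOptions options, string name) =>
    typeof(ChatCompletionOptions)
        .GetProperty(name, BindingFlags.NonPublic | BindingFlags.Instance)?
        .GetValue(options);

ChatCompletionOptions options = new();
object? model = GetInternalProperty(options, "Model");
object? stream = GetInternalProperty(options, "Stream");
object? messages = GetInternalProperty(options, "Messages");
```

Making these properties public would eliminate both the fragility and the reflection cost of this pattern.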
Additional context
No response