
Consider making a number of properties on ChatCompletionOptions public #306

Open

eiriktsarpalis opened this issue Dec 10, 2024 · 4 comments

@eiriktsarpalis

Confirm this is a feature request for the .NET library and not the underlying OpenAI API

  • This is a feature request for the .NET library

Describe the feature or improvement you are requesting

I'm currently implementing a feature in Microsoft.Extensions.AI that maps OpenAI chat completion requests (as represented by the ChatCompletionOptions type) to the Microsoft.Extensions.AI model. The goal is to accelerate development of Copilot agents that must be able to accept completion requests in the OpenAI wire format.

One of the issues I've encountered is that a number of important properties on the ChatCompletionOptions class are currently marked internal, including the Messages, Stream, and Model properties.

While I've been able to work around this temporarily using reflection, that isn't a viable long-term solution. Please consider making these (and other properties in the model) public so that your types can be used as DTOs by agents implementing the OpenAI wire format.
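
For illustration, the reflection workaround looks roughly like this (a minimal sketch; the property names come from the list above and are internal implementation details, so they may change between versions):

using System.Reflection;
using OpenAI.Chat;

static class ChatCompletionOptionsReflection
{
    // Reads an internal property of ChatCompletionOptions by name,
    // e.g. "Messages", "Stream", or "Model". Returns null if not found.
    public static object? GetInternalProperty(ChatCompletionOptions options, string propertyName) =>
        typeof(ChatCompletionOptions)
            .GetProperty(propertyName, BindingFlags.NonPublic | BindingFlags.Instance)
            ?.GetValue(options);
}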

Additional context

No response

@KrzysztofCwalina
Collaborator

I think the right way to implement the abstractions is to use the protocol methods. They not only give you access to all otherwise-internal properties, but also let you avoid the cost of instantiating OpenAI model types and then copying their properties into the abstraction-layer model types. You can read about the protocol methods here: https://learn.microsoft.com/en-us/dotnet/azure/sdk/protocol-convenience-methods?tabs=convenience-methods%2Csystem-clientmodel
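
Roughly, the protocol-method path looks like this (a sketch, assuming the CompleteChatAsync overload that takes BinaryContent and RequestOptions; see the linked doc for the exact shapes):

using System;
using System.ClientModel;
using OpenAI.Chat;

ChatClient client = new(model: "gpt-4o", apiKey: Environment.GetEnvironmentVariable("OPENAI_API_KEY"));

// The request body is authored directly in the OpenAI wire format,
// so no convenience model types are instantiated on the way in.
BinaryData requestBody = BinaryData.FromString("""
    {
        "model": "gpt-4o",
        "messages": [ { "role": "user", "content": "Hello!" } ]
    }
    """);

// Protocol method: raw payload in, raw response out.
ClientResult result = await client.CompleteChatAsync(BinaryContent.Create(requestBody), null);

// The raw JSON response, including properties not surfaced on the convenience models.
BinaryData responseJson = result.GetRawResponse().Content;
Console.WriteLine(responseJson.ToString());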

@eiriktsarpalis
Author

Correct me if I'm wrong, but it appears that protocol methods necessitate working with a client. In the case of agents, though, we need to process the wire format from the perspective of a server.

From our perspective, the cost of allocating the intermediate models is not a huge concern; we already do that on the client side of things. What we want to avoid, though, is hardcoding/reimplementing the OpenAI wire format, so being able to use the existing DTOs on the server side would be amazing.

cc @stephentoub

@KrzysztofCwalina
Collaborator

Yeah, I did not mean to imply that the model allocations are something you should worry about. I was just saying that protocol methods give you all the flexibility, including some additional perf benefits. I understand that the cost of implementing these abstractions would be much higher if you were to call the protocol methods, so I am not proposing you do it wholesale. But would it be reasonable to do it just for the APIs where you need to access the additional properties? Also, are you aware that for output models (models returned from client methods), you can access all the "internal" properties via the raw content? Take a look at ClientResult.GetRawResponse.
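
For illustration, reading the raw payload off a convenience-method result looks roughly like this (a sketch, assuming the overload that accepts a plain string message):

using System;
using System.ClientModel;
using OpenAI.Chat;

ChatClient client = new(model: "gpt-4o", apiKey: Environment.GetEnvironmentVariable("OPENAI_API_KEY"));

// Convenience call; the strongly typed ChatCompletion is available as completion.Value.
ClientResult<ChatCompletion> completion = await client.CompleteChatAsync("Hello!");

// The raw HTTP response still carries the full wire-format JSON,
// including properties that are not exposed publicly on the model types.
BinaryData rawJson = completion.GetRawResponse().Content;
Console.WriteLine(rawJson.ToString());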

@bbartels

bbartels commented Jan 9, 2025

To add to the list, it would be useful to get access to:

internal InternalChatCompletionStreamOptions StreamOptions { get; set; }

Currently I'm getting around it with:

using System.Reflection;
using OpenAI.Chat;

public static class OpenAIHelper
{
    public static ChatCompletionOptions GetOptionsNotIncludingStreamUsage()
    {
        ChatCompletionOptions options = new ChatCompletionOptions();

        // StreamOptions is internal, so it has to be located via reflection.
        var streamOptionsPropertyInfo = typeof(ChatCompletionOptions).GetProperty("StreamOptions", BindingFlags.NonPublic | BindingFlags.Instance);
        var streamOptionsInstance = streamOptionsPropertyInfo!.GetValue(options);

        // IncludeUsage is a public property, but it lives on the internal
        // InternalChatCompletionStreamOptions type.
        var streamOptionsType = streamOptionsInstance!.GetType();
        var includeUsagePropertyInfo = streamOptionsType.GetProperty("IncludeUsage", BindingFlags.Public | BindingFlags.Instance);
        includeUsagePropertyInfo!.SetValue(streamOptionsInstance, false);

        return options;
    }
}

Which isn't exactly a maintainable hack 😅

My use case is quite specific, however: this is for an integration test that ensures our OpenAI proxy correctly appends "streamOptions": { "includeUsage": true } when it isn't supplied.
