Confirm this is a feature request for the .NET library and not the underlying OpenAI API
Describe the feature or improvement you are requesting
There currently seems to be no way to use the models separately from the commands/clients; it's all very tightly coupled.

For my use case of sending JSON messages over a WebRTC data channel, I need to be able to serialise something along the lines of:

```csharp
ConversationSessionOptions sessionOptions = new()
{
    Voice = ConversationVoice.Alloy,
    Instructions = "Hi There!"
};
```

to JSON as per the OpenAI specification:

```json
{
  "event_id": "event_123",
  "type": "session.update",
  "session": {
    "modalities": ["text", "audio"],
    "instructions": "You are a helpful assistant.",
    "voice": "sage",
    <snip for brevity>
    "tool_choice": "auto",
    "temperature": 0.8,
    "max_response_output_tokens": "inf"
  }
}
```
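In the meantime, one possible workaround is to declare plain DTOs that mirror the `session.update` wire shape and serialise them with `System.Text.Json`. This is only a sketch: the record names below (`SessionUpdateEvent`, `SessionConfig`) are hypothetical and are not types from the OpenAI .NET library; the `[JsonPropertyName]` attributes produce the snake_case keys the spec uses.

```csharp
using System;
using System.Text.Json;
using System.Text.Json.Serialization;

// Hypothetical DTOs mirroring the "session.update" event shape.
// These are NOT OpenAI .NET library types; they exist only so
// System.Text.Json can emit the wire format directly.
var evt = new SessionUpdateEvent(
    EventId: "event_123",
    Type: "session.update",
    Session: new SessionConfig(
        Modalities: new[] { "text", "audio" },
        Instructions: "Hi There!",
        Voice: "alloy"));

string json = JsonSerializer.Serialize(evt);
Console.WriteLine(json);

// Records with [JsonPropertyName] give snake_case keys matching the spec.
public record SessionUpdateEvent(
    [property: JsonPropertyName("event_id")] string EventId,
    [property: JsonPropertyName("type")] string Type,
    [property: JsonPropertyName("session")] SessionConfig Session);

public record SessionConfig(
    [property: JsonPropertyName("modalities")] string[] Modalities,
    [property: JsonPropertyName("instructions")] string Instructions,
    [property: JsonPropertyName("voice")] string Voice);
```

The resulting string can then be sent over the data channel. If the library's model types implement `IJsonModel<T>`, `System.ClientModel`'s `ModelReaderWriter.Write` might also yield the wire JSON directly, though whether `ConversationSessionOptions` supports that is an assumption worth verifying.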
Additional context
No response