Remove non-AsXx surface area from M.E.AI.OpenAI/AzureAIInference #6138
base: main
Conversation
We expect the AsChatClient/AsEmbeddingGenerator extension methods from M.E.AI.OpenAI and M.E.AI.AzureAIInference to move into the OpenAI / Azure.AI.Inference libraries, respectively. To prepare for that, and the temporary M.E.AI.OpenAI/AzureAIInference libs then being deprecated, this PR removes the other surface area from these assemblies. For anything folks found useful, we should find another way to ship it, likely just as sample source somewhere.
```diff
@@ -321,7 +311,7 @@ private ChatCompletionsOptions ToAzureAIOptions(IEnumerable<ChatMessage> chatCon
                 default:
                     if (prop.Value is not null)
                     {
-                        byte[] data = JsonSerializer.SerializeToUtf8Bytes(prop.Value, ToolCallJsonSerializerOptions.GetTypeInfo(typeof(object)));
+                        byte[] data = JsonSerializer.SerializeToUtf8Bytes(prop.Value, AIJsonUtilities.DefaultOptions.GetTypeInfo(typeof(object)));
```
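For context, the pattern on the changed line can be sketched as follows. This is a minimal, hypothetical example: `JsonSerializerOptions.Default` stands in for `AIJsonUtilities.DefaultOptions`, and the property value is made up.

```csharp
using System.Text.Json;

// A hypothetical extra-property value, e.g. pulled from AdditionalProperties.
object value = 0.7f;

// Resolving a JsonTypeInfo from a fixed options instance means that, under
// Native AOT, only types registered with that instance's resolver can be
// serialized; anything else throws at runtime.
JsonSerializerOptions options = JsonSerializerOptions.Default;
byte[] data = JsonSerializer.SerializeToUtf8Bytes(
    value, options.GetTypeInfo(typeof(object)));
```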
I'd forgotten about AdditionalProperties. Hardcoding to the default options effectively forces known types only in AOT.
I suspect primitives are sufficient. We can always add an overload later that takes a JsonSerializerOptions if it proves important. Or do you think it should be done now?
I suspect there always will be a need for storing something less trivial than just primitives.
Quite possibly, but this in particular is about properties that should be sent up to the service as extra properties that a given service understands:
https://learn.microsoft.com/en-us/rest/api/aifoundry/modelinference/#extensibility
and those are generally going to be primitives, like float, int, or string.
(Also, the Azure.AI.Inference library itself is not yet AOT friendly.)
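As a sketch of what such extensibility properties look like from the M.E.AI side: the `ChatOptions.AdditionalProperties` shape is real, but the property names below are made up for illustration.

```csharp
using Microsoft.Extensions.AI;

// Primitive extra properties like these are what the service-side
// extensibility mechanism is expected to pass through.
var options = new ChatOptions
{
    AdditionalProperties = new AdditionalPropertiesDictionary
    {
        ["seed"] = 42,               // hypothetical service-specific knobs
        ["top_k"] = 40,
        ["safety_mode"] = "strict",
    },
};
```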
What about float[], int?, int?[], or string[]? If we're drawing a line in the sand we probably need to document it (and quite possibly reject values whose type is not supported at runtime). Otherwise we find ourselves in a situation where clients that work without issue in CoreCLR end up failing in Native AOT without a workaround.
> What about float[], int?, int?[], or string[]?

int? boxes the same as int, but regardless it's already represented in the default options. If we think float[], int?[], or string[] will be desirable, we can just add them to AIJsonUtilities.
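Adding those types would presumably mean registering them in a source-generated context along these lines. This is hypothetical: AIJsonUtilities' actual context is internal, and both the context name and exact shape here are assumptions.

```csharp
using System.Text.Json.Serialization;

[JsonSerializable(typeof(float[]))]
[JsonSerializable(typeof(int?[]))]
[JsonSerializable(typeof(string[]))]
internal sealed partial class ExtraPropertiesContext : JsonSerializerContext
{
    // The source generator fills in the rest; type info resolution then
    // works without reflection, so it stays Native AOT-safe.
}
```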
src/Libraries/Microsoft.Extensions.AI.OpenAI/OpenAIModelMapper.ChatMessage.cs
This is certainly all good stuff in general. Within the
```csharp
/// <summary>
/// Provides extension methods for working with <see cref="RealtimeConversationSession"/> and related types.
/// </summary>
public static class OpenAIRealtimeExtensions
```
Losing this is a bit unfortunate (for me anyway; I'm certainly using it from various projects and from https://github.com/SteveSandersonMS/dotnet-ai-workshop).

Isn't there still legitimacy in the provider-specific packages including helpers for making more of the provider-specific functionality work with MEAI, even if it's not purely an IChatClient?

If it's important that we don't include anything besides the IChatClient adapters then that's OK and I'll find some other way to ship this, but if the general concept of enabling related functionality is valid then it would be a strong candidate to keep.
> Isn't there still legitimacy in the provider-specific packages including helpers for making more of the provider-specific functionality work with MEAI, even if it's not purely an IChatClient?
Absolutely. But I don't think dotnet/extensions is the place for those to live, and it's a separate conversation about adding such things into the relevant provider packages.
Closes #6094
Closes #5965
Closes #5964