Describe the bug
There's a problem with the way the Ollama and ONNX connectors deserialize their execution settings that makes them fail when they include non-string properties, for example, when using CreateFunctionFromPromptYaml with a template that includes a "temperature" setting.
The reason is that both OllamaPromptExecutionSettings and OnnxRuntimeGenAIPromptExecutionSettings are missing the JsonNumberHandling serialization attribute, which is present in the PromptExecutionSettings of other connectors. For example:
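The attribute's effect can be shown in isolation (a minimal sketch; `DemoSettings` is a hypothetical stand-in, not an actual connector class):

```csharp
using System;
using System.Text.Json;
using System.Text.Json.Serialization;

// Hypothetical stand-in for a connector's execution settings class.
// With the class-level attribute, "temperature" deserializes even when the
// value arrives as a JSON string; without it, the same call throws
// JsonException ("Cannot get the value of a token type 'String' as a number").
[JsonNumberHandling(JsonNumberHandling.AllowReadingFromString)]
public sealed class DemoSettings
{
    [JsonPropertyName("temperature")]
    public float? Temperature { get; set; }
}

public static class Demo
{
    public static void Main()
    {
        var settings = JsonSerializer.Deserialize<DemoSettings>("{\"temperature\":\"0.9\"}");
        Console.WriteLine(settings!.Temperature); // prints 0.9
    }
}
```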
```
System.Text.Json.JsonException
  HResult=0x80131500
  Message=The JSON value could not be converted to System.Nullable`1[System.Single]. Path: $.temperature | LineNumber: 0 | BytePositionInLine: 86.
  Source=System.Text.Json
  StackTrace:
   at System.Text.Json.ThrowHelper.ReThrowWithPath(ReadStack& state, Utf8JsonReader& reader, Exception ex)
   at System.Text.Json.Serialization.JsonConverter`1.ReadCore(Utf8JsonReader& reader, JsonSerializerOptions options, ReadStack& state)
   at System.Text.Json.JsonSerializer.ReadFromSpan[TValue](ReadOnlySpan`1 utf8Json, JsonTypeInfo`1 jsonTypeInfo, Nullable`1 actualByteCount)
   at System.Text.Json.JsonSerializer.ReadFromSpan[TValue](ReadOnlySpan`1 json, JsonTypeInfo`1 jsonTypeInfo)
   at System.Text.Json.JsonSerializer.Deserialize[TValue](String json, JsonSerializerOptions options)
   at Microsoft.SemanticKernel.Connectors.Ollama.OllamaPromptExecutionSettings.FromExecutionSettings(PromptExecutionSettings executionSettings)
   at Microsoft.SemanticKernel.Connectors.Ollama.OllamaChatCompletionService.<GetChatMessageContentsAsync>d__5.MoveNext()
   at Microsoft.SemanticKernel.KernelFunctionFromPrompt.<GetChatCompletionResultAsync>d__20.MoveNext()
   at Microsoft.SemanticKernel.KernelFunctionFromPrompt.<InvokeCoreAsync>d__3.MoveNext()
   at System.Threading.Tasks.ValueTask`1.get_Result()
   at Microsoft.SemanticKernel.KernelFunction.<>c__DisplayClass21_0.<<InvokeAsync>b__0>d.MoveNext()
   at Microsoft.SemanticKernel.Kernel.<InvokeFilterOrFunctionAsync>d__34.MoveNext()
   at Microsoft.SemanticKernel.Kernel.<OnFunctionInvocationAsync>d__33.MoveNext()
   at Microsoft.SemanticKernel.KernelFunction.<InvokeAsync>d__21.MoveNext()
   at Program.<<<Main>$>g__InvokePrompt4|0_3>d.MoveNext() in ...
   at Program.<<Main>$>d__0.MoveNext() in ...
   at Program.<Main>(String[] args)
```

This exception was originally thrown at this call stack:

```
System.Text.Json.ThrowHelper.ThrowInvalidOperationException_ExpectedNumber(System.Text.Json.JsonTokenType)
System.Text.Json.Utf8JsonReader.TryGetSingle(out float)
System.Text.Json.Utf8JsonReader.GetSingle()
System.Text.Json.Serialization.Converters.NullableConverter<T>.Read(ref System.Text.Json.Utf8JsonReader, System.Type, System.Text.Json.JsonSerializerOptions)
System.Text.Json.Serialization.Metadata.JsonPropertyInfo<T>.ReadJsonAndSetMember(object, ref System.Text.Json.ReadStack, ref System.Text.Json.Utf8JsonReader)
System.Text.Json.Serialization.Converters.ObjectDefaultConverter<T>.OnTryRead(ref System.Text.Json.Utf8JsonReader, System.Type, System.Text.Json.JsonSerializerOptions, ref System.Text.Json.ReadStack, out T)
System.Text.Json.Serialization.JsonConverter<T>.TryRead(ref System.Text.Json.Utf8JsonReader, System.Type, System.Text.Json.JsonSerializerOptions, ref System.Text.Json.ReadStack, out T, out bool)
System.Text.Json.Serialization.JsonConverter<T>.ReadCore(ref System.Text.Json.Utf8JsonReader, System.Text.Json.JsonSerializerOptions, ref System.Text.Json.ReadStack)

Inner Exception 1:
InvalidOperationException: Cannot get the value of a token type 'String' as a number.
```
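For reference, the failing template is of roughly this shape (a hypothetical sketch of the YAML passed to `CreateFunctionFromPromptYaml`; the name and template text are invented):

```yaml
name: TellJoke
template: Tell a joke about {{$topic}}
execution_settings:
  default:
    temperature: 0.7
```

The stack trace above shows `$.temperature` arriving as a JSON String token when `FromExecutionSettings` round-trips the settings through `JsonSerializer`, which only converts to `float?` when `JsonNumberHandling.AllowReadingFromString` is in effect.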
Platform
- OS: Windows
- IDE: Visual Studio
- Language: C#
- Source: 1.24.0-alpha (of the Ollama and ONNX connectors)
github-actions bot changed the title from "Bug: JsonException using the Ollama and Onnx connectors" to ".Net: Bug: JsonException using the Ollama and Onnx connectors" on Oct 18, 2024.
…a, ONNX, and Other Connectors (#10055)
### Motivation and Context
This PR addresses a deserialization issue in the `Ollama` and `ONNX`
connectors where execution settings fail to handle non-string properties
(e.g., "temperature") during JSON parsing.
### Description
Adds the missing `JsonNumberHandling` serialization attribute to the affected execution settings classes.
**Impact:**
- Ensures compatibility with templates containing non-string numeric
properties.
- Aligns behavior with `OpenAIPromptExecutionSettings`, which already
includes the necessary attribute.
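For contrast, the pre-fix behavior can be demonstrated in isolation (a minimal sketch; `NoHandlingSettings` is a hypothetical stand-in mirroring a settings class without the attribute):

```csharp
using System;
using System.Text.Json;
using System.Text.Json.Serialization;

// Hypothetical settings class WITHOUT the attribute: deserializing a
// string-valued "temperature" throws JsonException, matching the report.
public sealed class NoHandlingSettings
{
    [JsonPropertyName("temperature")]
    public float? Temperature { get; set; }
}

public static class Demo
{
    public static void Main()
    {
        try
        {
            JsonSerializer.Deserialize<NoHandlingSettings>("{\"temperature\":\"0.7\"}");
            Console.WriteLine("no exception");
        }
        catch (JsonException)
        {
            Console.WriteLine("JsonException"); // this path is taken
        }
    }
}
```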
### Contribution Checklist
- [Y] The code builds clean without any errors or warnings
- [Y] The PR follows the [SK Contribution
Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md)
and the [pre-submission formatting
script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts)
raises no violations
- [Y] All unit tests pass, and I have added new tests where possible
- [Y] I didn't break anyone 😄
- Fixes #9318
Co-authored-by: Adit Sheth <[email protected]>
Co-authored-by: Mark Wallace <[email protected]>
I haven't used them, but I also see that other execution settings classes are missing this attribute as well.