
Autogen Studio not using azure open ai API #5743

Open
dfay88 opened this issue Feb 27, 2025 · 1 comment
dfay88 commented Feb 27, 2025

What happened?

I am using AutoGen Studio and am simply trying to run a single calculator agent from the provided examples. I have set up an Azure OpenAI model and double-checked that the JSON uses the correct syntax for Azure OpenAI. However, when I try to run the team in the playground, I cannot engage the agent because the following error appears in my PowerShell session:

Error processing publish message for assistant_agent/c682597c-86a3-4646-84dc-dfc6f70205b8
Traceback (most recent call last):
  File "C:\Users\dafay\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\autogen_core\_single_threaded_agent_runtime.py", line 505, in _on_message
    return await agent.on_message(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\dafay\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\autogen_core\_base_agent.py", line 113, in on_message
    return await self.on_message_impl(message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\dafay\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\autogen_agentchat\teams\_group_chat\_sequential_routed_agent.py", line 48, in on_message_impl
    return await super().on_message_impl(message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\dafay\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\autogen_core\_routed_agent.py", line 485, in on_message_impl
    return await h(self, message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\dafay\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\autogen_core\_routed_agent.py", line 268, in wrapper
    return_value = await func(self, message, ctx)  # type: ignore
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\dafay\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\autogen_agentchat\teams\_group_chat\_chat_agent_container.py", line 53, in handle_request
    async for msg in self._agent.on_messages_stream(self._message_buffer, ctx.cancellation_token):
  File "C:\Users\dafay\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\autogen_agentchat\agents\_assistant_agent.py", line 416, in on_messages_stream
    model_result = await self._model_client.create(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\dafay\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\autogen_ext\models\openai\_openai_client.py", line 534, in create
    result: Union[ParsedChatCompletion[BaseModel], ChatCompletion] = await future
                                                                     ^^^^^^^^^^^^
  File "C:\Users\dafay\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\openai\resources\chat\completions\completions.py", line 1927, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "C:\Users\dafay\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\openai\_base_client.py", line 1856, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\dafay\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\openai\_base_client.py", line 1550, in request
    return await self._request(
           ^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\dafay\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\openai\_base_client.py", line 1651, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}

It looks as if the backend code is still calling the OpenAI endpoint rather than Azure OpenAI.
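For reference, in AutoGen 0.4 the Azure client is selected by naming it in the component's `provider` field; a config that instead resolves to `OpenAIChatCompletionClient` would produce exactly this kind of 404. The sketch below shows the general shape of such a component config as a Python dict; the endpoint, deployment name, api_version, and key values are placeholders, not values taken from this issue.

```python
import json

# Sketch of an AutoGen Studio (v0.4) model component config targeting
# Azure OpenAI. The <resource>, <deployment-name>, api_version, and key
# values are placeholder assumptions -- substitute your own. The key point
# is that "provider" must name AzureOpenAIChatCompletionClient, not
# OpenAIChatCompletionClient.
azure_model_config = {
    "provider": "autogen_ext.models.openai.AzureOpenAIChatCompletionClient",
    "component_type": "model",
    "config": {
        "model": "gpt-4o",
        "azure_endpoint": "https://<resource>.openai.azure.com",
        "azure_deployment": "<deployment-name>",
        "api_version": "2024-06-01",
        "api_key": "<your-api-key>",
    },
}

print(json.dumps(azure_model_config, indent=2))
```

Note that an Azure 404 "Resource not found" often points at the Azure side rather than the client class: a wrong `azure_endpoint`, a deployment name that does not match the one in the Azure portal, or an unsupported `api_version`.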

Which packages was the bug in?

AutoGen Studio (autogenstudio)

AutoGen library version.

Python 0.4.1

Other library version.

No response

Model used

gpt-4o

Model provider

Azure AI Foundry (Azure AI Studio)

Other model provider

No response

Python version

None

.NET version

None

Operating system

None

ekzhu (Collaborator) commented Feb 27, 2025

Could you post your configuration?

Both OpenAIChatCompletionClient and AzureOpenAIChatCompletionClient share the same code path for create, so the error in your stack trace can also come from AzureOpenAIChatCompletionClient.
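Before posting the configuration, a quick pre-flight check can rule out the most common causes. The ad-hoc checker below is illustrative only (it is not part of AutoGen); it flags a component config whose `provider` does not resolve to the Azure client, or which is missing the Azure-specific fields that typically cause a 404.

```python
# Ad-hoc sanity checker for an AutoGen Studio model component config.
# This helper is illustrative and not part of AutoGen -- it just flags the
# misconfigurations that most often produce Azure 404 "Resource not found"
# errors: a non-Azure provider class, or missing/empty Azure fields.
REQUIRED_AZURE_FIELDS = ("model", "azure_endpoint", "azure_deployment", "api_version")


def check_azure_component(component: dict) -> list[str]:
    """Return a list of human-readable problems; empty means it looks OK."""
    problems = []
    provider = component.get("provider", "")
    if not provider.endswith("AzureOpenAIChatCompletionClient"):
        problems.append(
            f"provider is {provider!r}, expected AzureOpenAIChatCompletionClient"
        )
    config = component.get("config", {})
    for field in REQUIRED_AZURE_FIELDS:
        if not config.get(field):
            problems.append(f"missing or empty config field: {field!r}")
    return problems


# Example: a config that silently fell back to the plain OpenAI client.
bad = {
    "provider": "autogen_ext.models.openai.OpenAIChatCompletionClient",
    "config": {"model": "gpt-4o"},
}
for problem in check_azure_component(bad):
    print("-", problem)
```

Running the checker on the example prints one problem for the provider and one for each missing Azure field, which is a reasonable first filter before digging into the Azure resource itself.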
