APIConnectionError raised when using AsyncOpenAI along FastAPI and uvicorn(uvloop) #1927
Comments
Thanks for the report, does it work if you move the client instantiation outside of the request handler?
Hey, thanks for your quick reply. Nah, it does not work either when moving the client outside of the request handler:

```python
from fastapi import FastAPI
from openai import AsyncOpenAI

apiKey = ""

app = FastAPI()
client = AsyncOpenAI(api_key=apiKey, timeout=30)


@app.get("/test")
async def test():
    systemMessage = "Get a title for the conversation, the title shall not have more than 4 words"
    messages = [
        {"role": "system", "content": systemMessage},
        {
            "role": "user",
            "content": "Is the earth flat",
        },
    ]
    result = await client.chat.completions.create(
        model="gpt-4o-mini",
        messages=messages,
    )
    title = result.choices[0].message.content
    return title
```

I also tried creating the client in the lifespan instead and using it directly from the app object; same issue:

```python
from contextlib import asynccontextmanager
from typing import Any, AsyncGenerator

import uvicorn
from fastapi import FastAPI
from openai import AsyncOpenAI

apiKey = ""


@asynccontextmanager
async def lifespan(app: FastAPI) -> AsyncGenerator[None, Any]:
    app.client_openAI = AsyncOpenAI(api_key=apiKey, timeout=30)
    try:
        yield
    finally:
        # Clean up resources if necessary
        pass


app = FastAPI(lifespan=lifespan)


@app.get("/test")
async def test():
    systemMessage = "Get a title for the conversation, the title shall not have more than 4 words"
    messages = [
        {"role": "system", "content": systemMessage},
        {
            "role": "user",
            "content": "Is the earth flat",
        },
    ]
    result = await app.client_openAI.chat.completions.create(
        model="gpt-4o-mini",
        messages=messages,
    )
    title = result.choices[0].message.content
    return title


if __name__ == '__main__':
    uvicorn.run(app, host="0.0.0.0", port=50051)
```

When removing uvloop from the dependencies, everything works. Not sure if uvloop leads to some compatibility issues with httpx.
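A quick way to confirm that uvloop is the variable, without changing the installed dependencies, is to force uvicorn onto the stock asyncio event loop via its `loop` option ("auto", "asyncio", or "uvloop"). This is only a sketch of that experiment; it assumes the reproduction lives in main.py:

```python
import uvicorn

if __name__ == "__main__":
    # Keep uvicorn[standard] installed, but opt out of uvloop by forcing the
    # standard asyncio event loop; if the APIConnectionError goes away, the
    # problem is specific to uvloop.
    uvicorn.run("main:app", host="0.0.0.0", port=50051, loop="asyncio")
```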
Hi there! I noticed this issue and thought I’d share a few links that discuss similar challenges. They might offer some helpful insights or potential solutions:
Confirm this is an issue with the Python library and not an underlying OpenAI API
Describe the bug
Exception raised when using AsyncOpenAI with FastAPI and uvicorn. The issue seems to be a compatibility problem with uvloop, which is pulled in through extras = ["standard"]. Installing the dependencies manually and leaving uvloop out works.
To Reproduce
pyproject.toml with Poetry dependencies:

```toml
[tool.poetry.dependencies]
python = ">=3.11,<3.12"
fastapi = "0.115.6"
uvicorn = {extras = ["standard"], version = "0.32.1"}
openai = "^1.54.4"
```
Run the app (main.py), then call the endpoint:

http://localhost:50051/test
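For completeness, the endpoint can also be exercised from Python; httpx ships as a dependency of the openai package, so a quick check along these lines should work (URL as in the step above):

```python
import httpx

# Call the /test endpoint of the locally running app; an unhandled
# APIConnectionError on the server side surfaces here as a 500 response.
response = httpx.get("http://localhost:50051/test", timeout=60)
print(response.status_code, response.text)
```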
Exception:

```
File "/Users/badrelfarri/Documents/Code/RevVue/simple-async-openai-assistant/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1610, in _request
    raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.
```
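Since the library raises APIConnectionError from the underlying transport error (`raise ... from err`), inspecting `__cause__` on the caught exception can show what actually fails under uvloop. A minimal debugging sketch, assuming a module-level client as in the comment above; the /test-debug route name is made up for illustration:

```python
import openai
from fastapi import FastAPI
from openai import AsyncOpenAI

app = FastAPI()
client = AsyncOpenAI(api_key="", timeout=30)


@app.get("/test-debug")  # hypothetical debug route
async def test_debug():
    try:
        result = await client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": "Is the earth flat"}],
        )
    except openai.APIConnectionError as exc:
        # APIConnectionError is raised "from err", so the original
        # transport-level exception (typically from httpx) is kept as __cause__.
        return {"cause_type": type(exc.__cause__).__name__, "cause": repr(exc.__cause__)}
    return result.choices[0].message.content
```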
Code snippets
OS
macOS
Python version
Python 3.11.4
Library version
openai v1.54.4