
Cloud Native LLM runtime proposal #164

Open
daixiang0 opened this issue Jun 20, 2024 · 2 comments
Labels
cnai Issues related to the CNAI WG

Comments


daixiang0 commented Jun 20, 2024

For Python developers, litellm may be a good choice. It would be great if we could do the same thing in a runtime, so that developers in any language can use LLMs natively instead of hand-packaging HTTP/gRPC calls themselves.

Today there are many LLM APIs, such as OpenAI, Azure AI, Cohere, LLaMA, AWS Bedrock, KServe, OpenVINO, and so on, and migrating from one to another still requires many code changes.

I propose that we build this as a Cloud Native LLM runtime, so developers can migrate from one provider to another through configuration alone.
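To make the "migrate by config" idea concrete, here is a minimal sketch in Python. All names here (`RuntimeConfig`, `build_request`, the example endpoints and model names) are hypothetical illustrations, not part of the proposal or of any real runtime; the point is only that application code calls one uniform function while the provider is selected by configuration.

```python
from dataclasses import dataclass


@dataclass
class RuntimeConfig:
    """Hypothetical per-provider config the runtime would load from a file."""
    provider: str  # e.g. "openai", "bedrock", "kserve"
    endpoint: str  # provider-specific HTTP endpoint
    model: str     # provider-specific model name


# Illustrative registry; in a real runtime these would come from configuration,
# not from application code.
CONFIGS = {
    "openai": RuntimeConfig(
        "openai", "https://api.openai.com/v1/chat/completions", "gpt-4o"
    ),
    "kserve": RuntimeConfig(
        "kserve", "http://llm.svc.cluster.local/v2/models/llama/infer", "llama"
    ),
}


def build_request(cfg: RuntimeConfig, prompt: str) -> dict:
    """Build one uniform request shape; the runtime would translate it
    into each provider's wire format behind the scenes."""
    return {"url": cfg.endpoint, "model": cfg.model, "prompt": prompt}


# Application code stays identical; only the config key changes:
req = build_request(CONFIGS["openai"], "hello")
```

Switching from OpenAI to KServe in this sketch is just `CONFIGS["kserve"]` instead of `CONFIGS["openai"]`; no call sites change.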

@zanetworker zanetworker added the cnai Issues related to the CNAI WG label Jun 20, 2024

@daixiang0 daixiang0 changed the title Cloud Native AI runtime proposal Cloud Native LLM runtime proposal Jun 28, 2024
daixiang0 (Author) commented

The whole proposal is here; it also points out the difference between a gateway and a runtime.

Projects
Status: In progress
Development

No branches or pull requests

2 participants