Built Ondine on top of LiteLLM, a batch DataFrame processing SDK, would love an integration listing #26006
ptimizeroracle started this conversation in Show and tell
Hey LiteLLM team 👋
Built an open-source SDK called Ondine that uses LiteLLM under the hood for provider routing. It wraps batch LLM processing around a Pandas or Polars DataFrame: you give it a prompt, a model, and output columns, and it handles retries, checkpointing, cost caps, and structured output via Pydantic.
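To make the batch-processing claim concrete, here is a toy sketch of the loop a tool like Ondine automates: rows go through an LLM call with per-row retries and a checkpoint file so an interrupted run resumes where it left off. The function names are hypothetical illustrations, not Ondine's actual API, and the "LLM" is a stub.

```python
import json
import os

def flaky_llm(prompt, _fails={"n": 1}):
    # Stub standing in for a provider call: fails once, then succeeds.
    if _fails["n"] > 0:
        _fails["n"] -= 1
        raise RuntimeError("transient provider error")
    return prompt.upper()

def process_rows(rows, checkpoint_path, max_retries=3):
    # Resume from the checkpoint if a previous run was interrupted.
    done = {}
    if os.path.exists(checkpoint_path):
        with open(checkpoint_path) as f:
            done = json.load(f)
    for i, row in enumerate(rows):
        key = str(i)
        if key in done:
            continue  # already processed in an earlier run
        for attempt in range(max_retries):
            try:
                done[key] = flaky_llm(row["prompt"])
                break
            except RuntimeError:
                if attempt == max_retries - 1:
                    raise
        # Persist progress after each row so a crash loses nothing.
        with open(checkpoint_path, "w") as f:
            json.dump(done, f)
    return [done[str(i)] for i in range(len(rows))]
```

The real SDK layers cost caps and Pydantic-validated outputs on top of a loop like this; the sketch only shows the retry-and-checkpoint skeleton.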
LiteLLM is what makes the "same code, any provider" part work. Users swap `model="gpt-5.4-mini"` for `model="ollama/qwen3.5"` or `model="anthropic/claude-sonnet-4-6"` and nothing else changes. That's entirely thanks to your abstraction.
Repo: https://github.com/ptimizeroracle/ondine
Website: https://ondine.dev
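For readers unfamiliar with the pattern, here is a minimal sketch of the routing idea the post relies on: the provider is encoded as a prefix in the model string, so calling code never changes when the backend does. This is an illustration of the concept, not LiteLLM's actual implementation.

```python
def route(model: str) -> str:
    # The provider is the prefix before the first "/"; bare model names
    # are treated as OpenAI-style, mirroring the convention LiteLLM uses.
    if "/" in model:
        provider, _ = model.split("/", 1)
        return provider
    return "openai"

def complete(model: str, prompt: str) -> str:
    backend = route(model)
    # A real router would dispatch to provider-specific adapters here;
    # this stub just reports which backend would handle the call.
    return f"[{backend}] {prompt}"
```

Swapping the `model` argument is the only change a caller makes, which is the property the post credits to LiteLLM.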
Ask: Would it be worth adding Ondine to your integrations / projects-using-LiteLLM list? Happy to open a PR against the docs if that's the preferred path.
Also, genuinely, thanks. LiteLLM saved me weeks of provider-specific adapter work.