
Ragbits

Building blocks for rapid development of GenAI applications

Documentation | Contact



Features

πŸ”¨ Build Reliable & Scalable GenAI Apps

πŸ“š Fast & Flexible RAG Processing

  • Ingest 20+ formats – Process PDFs, HTML, spreadsheets, presentations, and more. Parse content with the unstructured library or implement a custom provider.
  • Handle complex data – Extract tables, images, and structured content with built-in VLM support.
  • Connect to any data source – Use prebuilt connectors for S3, GCS, Azure, or implement your own.
  • Scale ingestion – Process large datasets quickly with Ray-based parallel processing.

πŸš€ Deploy & Monitor with Confidence

  • Real-time observability – Track performance with OpenTelemetry and CLI insights.
  • Built-in testing – Validate prompts with promptfoo before deployment.
  • Auto-optimization – Continuously evaluate and refine model performance.
  • Visual testing UI (Coming Soon) – Test and optimize applications with a visual interface.

What's Included?

  • Core - Fundamental tools for working with prompts and LLMs.
  • Document Search - Handles vector search to retrieve relevant documents.
  • CLI - The ragbits shell command, enabling features such as GUI-based prompt management.
  • Guardrails - Ensures response safety and relevance.
  • Evaluation - Unified evaluation framework for Ragbits components.
  • Flow Controls - Manages multi-stage chat flows for performing advanced actions (coming soon).
  • Structured Querying - Queries structured data sources in a predictable manner (coming soon).
  • Caching - Adds a caching layer to reduce costs and response times (coming soon).

Installation

To use the complete Ragbits stack, install the ragbits package:

pip install ragbits

Alternatively, you can use individual components of the stack by installing their respective packages: ragbits-core, ragbits-document-search, ragbits-cli.

Quickstart

First, create a prompt and a model for the data used in the prompt:

from pydantic import BaseModel
from ragbits.core.prompt import Prompt

class Dog(BaseModel):
    breed: str
    age: int
    temperament: str

class DogNamePrompt(Prompt[Dog, str]):
    system_prompt = """
    You are a dog name generator. You come up with funny names for dogs given the dog details.
    """

    user_prompt = """
    The dog is a {breed} breed, {age} years old, and has a {temperament} temperament.
    """

Next, create an instance of the LLM and the prompt:

from ragbits.core.llms.litellm import LiteLLM

llm = LiteLLM("gpt-4o")
example_dog = Dog(breed="Golden Retriever", age=3, temperament="friendly")
prompt = DogNamePrompt(example_dog)

Finally, generate a response from the LLM using the prompt:

response = await llm.generate(prompt)
print(f"Generated dog name: {response}")
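
Note that generate is a coroutine, so the await call above only works inside an async function (or a REPL that supports top-level await). A minimal sketch of the surrounding event-loop boilerplate, with a hypothetical stand-in for the LLM call so it runs without an API key:

```python
import asyncio

async def generate(prompt: str) -> str:
    # Hypothetical stand-in for an async LLM call such as llm.generate(prompt).
    await asyncio.sleep(0)  # simulate non-blocking I/O
    return f"Generated dog name for: {prompt}"

async def main() -> str:
    return await generate("a friendly 3-year-old Golden Retriever")

result = asyncio.run(main())
print(result)
```

In a real script, you would replace the stand-in with the llm.generate(prompt) call from the snippet above and keep asyncio.run(main()) as the entry point.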

How Ragbits documentation is organized

  • Quickstart - Get started with Ragbits in a few minutes
  • How-to guides - Learn how to use Ragbits in your projects
  • CLI - Learn how to run Ragbits in your terminal
  • API reference - Explore the underlying API of Ragbits

License

Ragbits is licensed under the MIT License.

Contributing

We welcome contributions! Please read CONTRIBUTING.md for more information.