Merged
6 changes: 5 additions & 1 deletion .gitignore
Original file line number Diff line number Diff line change
Expand Up @@ -12,4 +12,8 @@ __pycache__/
.cache/
.DS_Store
output*
storage/*.json
*.csv
134 changes: 133 additions & 1 deletion README.md
@@ -1 +1,133 @@
# agentic-chatbot
# 🤖 Agentic Chatbot: Navigating Red Hat Internal Resources from The Source

## Overview

The Agentic Chatbot is an intelligent assistant designed to help Red Hat associates navigate internal resources efficiently.

## Project Goals

- Simplify access to internal documentation
- Reduce onboarding friction
- Provide intelligent, context-aware resource discovery

## Use Case

Instead of navigating the Source portal manually, associates can ask:

- "What is the process for updating a Red Hat product?"
- "Where can I find the latest documentation for a specific product?"
- "How do I resolve a common issue with a Red Hat product?"

And receive:

- Direct answers
- Links to the exact documents/pages
- Context-aware follow-up suggestions

## Technical Architecture

### Core Components

| Layer | Technology |
|---------------|--------------------------------------|
| **Backend** | Python + FastAPI / Flask |
| **LLMs** | OpenAI GPT, Ollama, Hugging Face |
| **Vector DB** | ChromaDB / FAISS |
| **Embeddings**| Hugging Face Transformers |
| **Frontend**  | Slack Bot (via Slack Bolt SDK)       |

### Agentic Workflow

The chatbot operates on a **multi-agent architecture**, where each agent specializes in a sub-task:

- **Query Understanding Agent**
Interprets user inputs into actionable formats.

- **Content Indexing Agent**
Parses documents and pushes them to a vector database after generating embeddings.

- **Response Generation Agent**
Retrieves relevant data and composes human-like, informative replies.
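
The three-agent flow above can be sketched as a plain Python pipeline. This is a minimal illustration only: the class names mirror the agents described here, but the keyword-overlap "search" is a toy stand-in for the real embedding lookup against ChromaDB/FAISS, and none of these names are the repo's actual API.

```python
# Illustrative sketch of the multi-agent pipeline; all names are hypothetical.

class QueryUnderstandingAgent:
    def parse(self, user_input: str) -> dict:
        # Normalize the raw question into an actionable structure
        return {"intent": "search", "terms": user_input.lower().split()}


class ContentIndexingAgent:
    def __init__(self):
        self.index = []  # stand-in for a vector store (ChromaDB / FAISS)

    def add(self, doc_id: str, text: str):
        self.index.append((doc_id, set(text.lower().split())))

    def search(self, terms):
        # Toy keyword-overlap scoring in place of embedding similarity
        scored = [(len(set(terms) & words), doc_id) for doc_id, words in self.index]
        scored.sort(reverse=True)
        return [doc_id for score, doc_id in scored if score > 0]


class ResponseGenerationAgent:
    def compose(self, query: dict, hits) -> str:
        if not hits:
            return "No matching documents found."
        return f"Top match for your question: {hits[0]}"


# Wire the agents together
query_agent = QueryUnderstandingAgent()
index_agent = ContentIndexingAgent()
response_agent = ResponseGenerationAgent()

index_agent.add("source/product-update.md", "process for updating a red hat product")
index_agent.add("source/onboarding.md", "new hire onboarding checklist")

parsed = query_agent.parse("What is the process for updating a Red Hat product?")
answer = response_agent.compose(parsed, index_agent.search(parsed["terms"]))
print(answer)  # Top match for your question: source/product-update.md
```

In the real system each agent would run as its own service, which is what makes independent scaling and upgrades possible.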

## Slack Integration

The chatbot is tightly integrated with Slack for enterprise accessibility:

- Users interact via DM or threads
- Replies include embedded links to Source documents
- Backend uses Slack Bolt SDK with FastAPI
- Events are securely handled with workspace-level tokens
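
"Securely handled" in practice means every incoming Slack request must be verified before it is processed. The sketch below follows Slack's documented v0 request-signing scheme (HMAC-SHA256 over `v0:{timestamp}:{body}` with the signing secret); the secret and payload values are made up for the example, and in this repo the equivalent check is done by `slack_sdk`'s `SignatureVerifier`.

```python
import hashlib
import hmac
import time


def verify_slack_signature(signing_secret: str, timestamp: str, body: str, signature: str) -> bool:
    """Check a request against Slack's v0 request-signing scheme."""
    # Reject replays older than five minutes
    if abs(time.time() - int(timestamp)) > 60 * 5:
        return False
    basestring = f"v0:{timestamp}:{body}"
    expected = "v0=" + hmac.new(
        signing_secret.encode(), basestring.encode(), hashlib.sha256
    ).hexdigest()
    # Constant-time comparison to avoid timing attacks
    return hmac.compare_digest(expected, signature)


# Example with a made-up secret and payload
secret = "my-signing-secret"  # hypothetical; read from SLACK_SIGNING_SECRET in practice
ts = str(int(time.time()))
body = "token=abc&command=%2Fask&text=hello"
sig = "v0=" + hmac.new(secret.encode(), f"v0:{ts}:{body}".encode(), hashlib.sha256).hexdigest()
print(verify_slack_signature(secret, ts, body, sig))  # True
```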

## Key Features

- **Semantic Search**
Leverages vector embeddings to match intent with content, even for vague queries

- **Model Flexibility**
Easily swap between OpenAI, Hugging Face, or locally hosted models via Ollama

- **Microservice Architecture**
Modular design allows for scaling and independent agent upgrades

- **Multilingual Support**
Via Hugging Face embeddings and tokenizers

- **Secure Document Handling**
Respects access controls and data privacy protocols
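
At its core, the semantic search feature ranks documents by the similarity of their embedding vectors to the query's embedding. The toy example below shows the ranking step with hand-made 3-d vectors and pure-Python cosine similarity; in the actual system the vectors would come from a Hugging Face embedding model and the lookup would go through the vector database.

```python
# Toy illustration of semantic search ranking; vectors are made up for the sketch.
from math import sqrt


def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0


# Pretend these are document embeddings from a Hugging Face model
docs = {
    "update-process.md": [0.9, 0.1, 0.0],
    "vpn-setup.md": [0.1, 0.8, 0.3],
}
query_vec = [0.85, 0.15, 0.05]  # embedding of "how do I update a product?"

# Return the document whose embedding is closest to the query
best = max(docs, key=lambda d: cosine(query_vec, docs[d]))
print(best)  # update-process.md
```

Because similarity is measured in embedding space rather than by keyword match, a vague query can still land on the right document.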

## Impact

- **Significantly reduces time** spent navigating documentation manually
- **Improves onboarding** experience for new Red Hatters
- **Promotes self-service** culture and reduces dependency on internal channels
- **Lays foundation** for enterprise-grade knowledge retrieval systems

## Future Scope

- **EagleView API Integration**
Post-authentication, the chatbot can dynamically fetch the complete Source portal data and continuously update its knowledge base

- **Scalable to Entire Source Ecosystem**
Once EagleView API access is enabled, the chatbot will be able to answer **all** questions across teams and departments

- **Credential-Aware Access Control**
Role-based response customization based on user credentials

- **Intelligent Logging & Feedback Loop**
Enable query analytics to improve answers through fine-tuning

- **Horizontally Scalable Architecture**
Multi-agent system allows parallel processing, independent agent upgrades, and multi-tenant support

## Installation

```bash
# Clone the repository
git clone https://github.com/your-org/agentic-chatbot.git

# Install dependencies
pip install -r requirements.txt

# Set up environment variables
cp sample.env .env
# Edit .env with your configurations
```

## Quick Start

### Running the Slack Bot
```bash
python services/slack_service.py
```

## Contributing

1. Fork the repository
2. Create your feature branch (`git checkout -b feature/AmazingFeature`)
3. Commit your changes (`git commit -m 'Add some AmazingFeature'`)
4. Push to the branch (`git push origin feature/AmazingFeature`)
5. Open a Pull Request

## License

Distributed under the MIT License. See `LICENSE` for more information.
Empty file removed agents/__init__.py
Empty file.
Empty file removed agents/indexing_agent.py
Empty file.
Empty file removed agents/query_agent.py
Empty file.
Empty file removed agents/response_agent.py
Empty file.
3 changes: 3 additions & 0 deletions app/main.py
Expand Up @@ -14,6 +14,7 @@
client = WebClient(token=slack_token)
verifier = SignatureVerifier(signing_secret)


@app.route("/slack/commands", methods=["POST"])
def slack_commands():
data = request.form
Expand All @@ -25,6 +26,7 @@ def slack_commands():
})
return "Unknown command", 200


@app.route("/slack/prompt", methods=["POST"])
def slack_prompt():
data = request.form
Expand All @@ -42,5 +44,6 @@ def slack_prompt():
"text": f"<@{user_id}> {response}"
})


if __name__ == "__main__":
app.run(debug=True, port=3000)
Empty file removed data/__init__.py
Empty file.
Empty file removed embeddings/__init__.py
Empty file.
Empty file removed embeddings/vector_store.py
Empty file.
3 changes: 3 additions & 0 deletions scripts/bot_script.py
Expand Up @@ -20,6 +20,7 @@
DATA_DIR = ROOT_DIR / "data"
STORAGE_DIR = ROOT_DIR / "storage"


async def build_and_save_index():
"""Builds the index from PDF files and saves it."""
print("Loading PDF documents...")
Expand All @@ -39,6 +40,7 @@ async def build_and_save_index():
index.storage_context.persist(persist_dir=STORAGE_DIR)
return index


async def load_or_build_index():
"""Loads existing index or builds a new one if not found."""
if os.path.exists(STORAGE_DIR):
Expand All @@ -49,6 +51,7 @@ async def load_or_build_index():
index = await build_and_save_index()
return index


async def main():
# Step 1: Setup global Settings
Settings.embed_model = OpenAIEmbedding(api_key=os.getenv("OPENAI_API_KEY"))
Expand Down
Empty file removed services/__init__.py
Empty file.
18 changes: 11 additions & 7 deletions services/slack_service.py
Expand Up @@ -18,43 +18,45 @@
client = WebClient(token=slack_token)
verifier = SignatureVerifier(signing_secret)


def get_rag_response(query: str) -> str:
"""
Query the agent service and get response
"""
import requests
import json

# Configure the endpoint URL - adjust host/port as needed
# Configure the endpoint URL - adjust host/port as needed
agent_service_url = f"{os.getenv('AGENT_SERVICE_URL')}/query"

try:
# Make POST request to the agent service
response = requests.post(
agent_service_url,
json={"query": query},
headers={"Content-Type": "application/json"}
)

# Check if request was successful
response.raise_for_status()

# Parse JSON response
result = response.json()

# Return the response text
return result.get("response", "No response received from agent")

except requests.exceptions.RequestException as e:
# Handle connection errors
print(f"Error querying agent service: {e}")
return f"Sorry, I couldn't reach the knowledge base. Error: {str(e)}"

except json.JSONDecodeError:
# Handle invalid JSON response
print(f"Invalid response from agent service: {response.text}")
return "Sorry, I received an invalid response from the knowledge base."


@app.route("/slack/commands", methods=["POST"])
def slack_commands():
data = request.form
Expand All @@ -66,6 +68,7 @@ def slack_commands():
})
return "Unknown command", 200


@app.route("/slack/prompt", methods=["POST"])
def slack_prompt():
data = request.form
Expand All @@ -83,5 +86,6 @@ def slack_prompt():
"text": f"<@{user_id}> {response}"
})


if __name__ == "__main__":
app.run(debug=True, port=3000)