diff --git a/.github/media/app5-ui.png b/.github/media/app5-ui.png
new file mode 100644
index 00000000..939e8531
Binary files /dev/null and b/.github/media/app5-ui.png differ
diff --git a/readme.md b/readme.md
index 351b69c2..b0c84b3e 100644
--- a/readme.md
+++ b/readme.md
@@ -48,6 +48,17 @@ docker compose down
 ```
 
 # Applications
+
+Here's what's in this repo:
+
+| Name | Main files | Compose name | URL | Description |
+|---|---|---|---|---|
+| Code support bot | `bot.py` | `bot` | http://localhost:8501 | Main use case. Full-stack Python application. |
+| Stack Overflow Loader | `loader.py` | `loader` | http://localhost:8502 | Loads Stack Overflow data into the database (creates vector embeddings etc.). Full-stack Python application. |
+| PDF Reader | `pdf_bot.py` | `pdf_bot` | http://localhost:8503 | Read a local PDF and ask it questions. Full-stack Python application. |
+| Standalone Bot API | `api.py` | `api` | http://localhost:8504 | Standalone HTTP API with streaming (SSE) and non-streaming endpoints. Python. |
+| Standalone UI | `front-end/` | `front-end` | http://localhost:8505 | Standalone client that uses the Standalone Bot API to interact with the model. JavaScript (Svelte) front-end. |
+
 ## App 1 - Support Agent Bot
 
 UI: http://localhost:8501
@@ -63,11 +74,10 @@ DB client: http://localhost:7474
 
 ![](.github/media/app1-rag-selector.png)
 *(Chat input + RAG mode selector)*
 
-![](.github/media/app1-generate.png)
-*(CTA to auto generate support ticket draft)*
-
-![](.github/media/app1-ticket.png)
-*(UI of the auto generated support ticket draft)*
+| | |
+|---|---|
+| ![](.github/media/app1-generate.png) | ![](.github/media/app1-ticket.png) |
+| *(CTA to auto-generate a support ticket draft)* | *(UI of the auto-generated support ticket draft)* |
 
 ---
@@ -81,8 +91,12 @@ DB client: http://localhost:7474
 - UI: choose tags, run import, see progress, some stats of data in the database
 - Load high ranked questions (regardless of tags) to support the ticket generation feature of App 1.
 
-![](.github/media/app2-ui-1.png)
-![](.github/media/app2-model.png)
+
+
+
+| | |
+|---|---|
+| ![](.github/media/app2-ui-1.png) | ![](.github/media/app2-model.png) |
 
 ## App 3 Question / Answer with a local PDF
 UI: http://localhost:8503
@@ -94,3 +108,19 @@
 its contents and have the LLM answer them using vector similarity search.
 
 ![](.github/media/app3-ui.png)
+
+## App 4 Standalone HTTP API
+Endpoints:
+ - http://localhost:8504/query?text=hello&rag=false (non-streaming)
+ - http://localhost:8504/query-stream?text=hello&rag=false (SSE streaming)
+
+Exposes the same question-answering functionality as App 1 above, using the
+same code and prompts.
+
+## App 5 Static front-end
+UI: http://localhost:8505
+
+This application has the same features as App 1, but is built separately from
+the back-end code using modern best practices (Vite, Svelte, Tailwind).
+Auto-reload on changes is instant thanks to the Docker watch `sync` config.
+![](.github/media/app5-ui.png)
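
The App 4 endpoints added above take `text` and `rag` as query parameters, and the streaming variant responds with Server-Sent Events. A minimal sketch of building such request URLs and extracting `data:` payloads from an SSE response body — the helper names are hypothetical, and the one-`data:`-line-per-event shape is an assumption about the server, not something the diff specifies:

```python
from urllib.parse import urlencode

# Base URL of the Standalone Bot API from the table above.
API_BASE = "http://localhost:8504"

def build_query_url(text: str, rag: bool = False, stream: bool = False) -> str:
    """Build a /query or /query-stream URL with properly encoded parameters."""
    endpoint = "/query-stream" if stream else "/query"
    params = urlencode({"text": text, "rag": str(rag).lower()})
    return f"{API_BASE}{endpoint}?{params}"

def parse_sse(body: str) -> list[str]:
    """Collect the payload of every `data:` line in an SSE response body.

    Assumes one `data:` line per event, which matches simple
    token-streaming servers; multi-line events would need merging.
    """
    return [
        line[len("data:"):].strip()
        for line in body.splitlines()
        if line.startswith("data:")
    ]
```

Whether each `data:` payload is plain text or JSON depends on `api.py`; the diff doesn't show the payload format, so check the server before parsing further.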