This repository is a full-stack application template designed for hobbyists and developers who want to practice and explore building a full-stack chatbot application.
It combines a modern Next.js frontend with a robust FastAPI backend, integrating LangChain and LLM APIs for AI-powered features. The architecture includes a PostgreSQL database, SSO authentication with Google via NextAuth, and support for real-time Server-Sent Events (SSE), making it an excellent starting point for learning how to build, connect, and deploy AI-driven applications!
The demo in this repository also uses the free Typhoon API. You can register and try it out for free as well, hooray 🎊.
In this architecture, there are two main components, as follows:
- Frontend service powered by Next.js
- Backend service powered by FastAPI (and Next.js as a proxy)
With this setup, there is no need to set up a security layer on the backend LLM API server, since that API server is hosted on a private network and is only reachable through the proxy server.
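To make this concrete, the proxy pattern looks roughly like the sketch below: a Next.js Route Handler that forwards chat requests to the private FastAPI server and streams the response straight back to the browser. The `app/api/chat/route.ts` path and the `/chat` backend route are assumptions made for this sketch, not necessarily the repository's actual layout; `LLM_BACKEND_ENDPOINT` is the environment variable described in the frontend setup below.

```ts
// app/api/chat/route.ts -- hypothetical path for this sketch.
// Forwards the chat request to the private FastAPI server and
// streams the SSE body straight back to the browser.
export async function POST(req: Request): Promise<Response> {
  const upstream = await fetch(`${process.env.LLM_BACKEND_ENDPOINT}/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: await req.text(),
  });

  // Pass the streamed body through untouched so SSE tokens reach
  // the client as soon as the backend emits them.
  return new Response(upstream.body, {
    status: upstream.status,
    headers: { "Content-Type": "text/event-stream" },
  });
}
```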
Communication between the frontend browser and the backend proxy server is also secured by default using a secret from Next.js, which additionally implements CORS for security.
Lastly, there is also a NextAuth middleware that blocks all requests to the frontend web server and the backend proxy server unless they are authenticated and authorized through SSO with Google (GCP).
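A minimal sketch of that middleware is shown below, using NextAuth's built-in middleware export; the matcher paths here are illustrative assumptions, and the repository's actual configuration may differ.

```ts
// middleware.ts -- minimal sketch using NextAuth's built-in middleware.
// Any request matching the paths below is redirected to the sign-in
// page unless it carries a valid session token.
export { default } from "next-auth/middleware";

export const config = {
  // Hypothetical matcher: protect everything except NextAuth's own
  // routes and Next.js static assets.
  matcher: ["/((?!api/auth|_next/static|_next/image|favicon.ico).*)"],
};
```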
The backend service provides:
- Server-Sent Events (SSE) support for streaming LLM tokens
- FastAPI web framework
- SQLAlchemy with PostgreSQL database
- LangChain + OpenAI integration
- Alembic for database migrations
- Uvicorn ASGI server
The frontend service provides:
- Server-Sent Events (SSE) support for streaming LLM tokens (see the sketch after this list)
- Next.js React framework
- SSO Integration with Google
- Zustand state management
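As a rough illustration of how streamed tokens and Zustand fit together on the frontend, here is a minimal sketch; the store shape, the `/api/chat` path, and the raw-chunk handling are assumptions made for this example rather than the repository's actual code.

```ts
// A minimal sketch: a Zustand store that accumulates streamed tokens.
import { create } from "zustand";

interface ChatState {
  answer: string;
  appendToken: (token: string) => void;
}

export const useChatStore = create<ChatState>((set) => ({
  answer: "",
  appendToken: (token) => set((state) => ({ answer: state.answer + token })),
}));

// Read the response stream from the (hypothetical) /api/chat proxy route
// and push each decoded chunk into the store as it arrives.
// (SSE framing such as "data:" prefixes is omitted for brevity.)
export async function streamChat(prompt: string): Promise<void> {
  const res = await fetch("/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    useChatStore.getState().appendToken(decoder.decode(value, { stream: true }));
  }
}
```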
Please install these required tools first for local development:
- uv: super-fast Python environment/dependency management
- pre-commit hooks: for code quality checks even before committing to git. You can check the current setup in `.pre-commit-config.yaml`
- nvm: for Node version management
- nerdctl: open-source drop-in replacement for `docker` (this is just my preference though 🤣)
- Run `nerdctl compose up --build frontend-webapp`. You can use `docker` as well ;)
- Access the frontend webapp at http://localhost:3000

Note that the backend API server cannot be accessed by default in the `nerdctl compose` setup. You need to expose the port in `docker-compose.yaml` manually.
- Navigate to the backend directory:
cd typhoon-be
- Install dependencies:
uv sync --frozen
- Set up environment variables:
DATABASE_URL=<async-database-url>
ALEMBIC_DATABASE_URL=<sync-database-url>
LLM_ENDPOINT=<llm-endpoint>
API_KEY=<your-api-key> # API key for accessing the Typhoon model
- Run database migrations:
uv run alembic upgrade head
- Run the API server:
uv run fastapi run src/app.py
- Navigate to the frontend directory:
cd typhoon-fe
- Install dependencies:
npm install
- Set up environment variables:
NEXTAUTH_SECRET=
LLM_BACKEND_ENDPOINT=http://localhost:3000
# Google IdP credentials for SSO with Google (see the provider sketch at the end of this section)
GOOGLE_CLIENT_ID=
GOOGLE_CLIENT_SECRET=
- Start the development server:
npm run dev
Access the application at http://localhost:3000
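For reference, the Google credentials above are typically consumed by NextAuth's Google provider, roughly as in the sketch below (assuming the Pages Router convention of `pages/api/auth/[...nextauth].ts`; the repository's actual setup may differ).

```ts
// Hypothetical NextAuth configuration consuming the env vars above.
import NextAuth from "next-auth";
import GoogleProvider from "next-auth/providers/google";

export const authOptions = {
  secret: process.env.NEXTAUTH_SECRET,
  providers: [
    GoogleProvider({
      clientId: process.env.GOOGLE_CLIENT_ID!,
      clientSecret: process.env.GOOGLE_CLIENT_SECRET!,
    }),
  ],
};

export default NextAuth(authOptions);
```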