An AI-driven application that enables users to define natural language alert rules for their credit card transactions. The system provides rule-based fraud detection and location-based security monitoring, ingesting real-time data, evaluating transactions against user-defined rules, applying AI/ML analysis for anomaly detection, and sending alerts through preferred channels such as email or SMS.
The Spending Transaction Monitor acts as an intelligent intermediary between credit card transaction streams and customers.
- Users define alert rules in plain natural language (e.g., "Alert me if I spend more than $200 at restaurants this month").
- The system ingests transaction events in real time.
- An NLP engine converts rules into structured criteria.
- A rule engine evaluates each new transaction against rules, user data, and behavioral patterns.
- Alerts are triggered and delivered via email or SMS notifications.
This project demonstrates how to combine OpenShift AI and modern ML frameworks to deliver real-time, user-centric financial monitoring.
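To make the flow above concrete, here is a minimal sketch of the kind of structured criteria the NLP engine might produce from a natural language rule. The RuleCriteria model and its field names are illustrative assumptions, not the project's actual schema.

```python
# Hypothetical sketch of a parsed alert rule -- field names are illustrative,
# not the project's actual schema.
from dataclasses import dataclass
from typing import Optional


@dataclass
class RuleCriteria:
    metric: str                          # e.g. "transaction_amount"
    operator: str                        # e.g. ">"
    threshold: float                     # e.g. 200.0
    merchant_category: Optional[str] = None
    timeframe: Optional[str] = None


# "Alert me if I spend more than $200 at restaurants this month"
# might be parsed into something like:
rule = RuleCriteria(
    metric="transaction_amount",
    operator=">",
    threshold=200.0,
    merchant_category="restaurants",
    timeframe="current_month",
)
```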
The solution is deployed on OpenShift and integrates multiple components:
- Transaction Ingestion Service: Securely receives credit card transaction data in real time and stores it in the database.
- Transaction Data Store: Stores both historical and streaming data (PostgreSQL).
- Customer UI: React frontend for defining and managing alerts with location-based fraud detection.
- NLP Module (LlamaStack + LangGraph Agent): Parses natural language into machine-readable rules.
- Rules Engine / Alerting Service: Evaluates transactions against user rules, behavioral patterns, and location-based risk assessment.
- AI/ML Behavioral Analysis: Detects anomalies, spending spikes, recurring patterns, and location-based fraud indicators.
- Location-based Security: Captures user GPS coordinates for enhanced security monitoring and fraud detection.
- Notification Service: Sends alerts via email, SMS, push notifications, or webhooks.
- Users create alert rules (amount, merchant, category, timeframe, location; notification methods: email/SMS/push/webhook).
- Location-based fraud detection captures user GPS coordinates for enhanced security monitoring.
- Incoming transactions are stored and evaluated against active rules, including location-based risk assessment.
- Triggered rules produce alert notifications which are delivered via configured channels.
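The sketch below illustrates this evaluate-then-notify flow in simplified form. The function names, dictionary fields, and notification interface are assumptions for illustration, not the actual service interfaces.

```python
# Simplified, hypothetical evaluation flow -- names and fields are illustrative only.
def evaluate_transaction(transaction: dict, active_rules: list[dict]) -> list[dict]:
    """Return the rules triggered by a single incoming transaction."""
    triggered = []
    for rule in active_rules:
        category = rule.get("merchant_category")
        if category and category != transaction["category"]:
            continue  # rule is scoped to a different merchant category
        if transaction["amount"] > rule["threshold"]:
            triggered.append(rule)
    return triggered


def dispatch_alerts(transaction: dict, triggered_rules: list[dict], notify) -> None:
    """Create a notification for each triggered rule and send it on the rule's channel."""
    for rule in triggered_rules:
        message = (
            f"Rule '{rule['name']}' triggered by a ${transaction['amount']:.2f} "
            f"charge at {transaction['merchant']}"
        )
        notify(channel=rule["channel"], message=message)  # e.g. email or SMS
```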
graph TD
%% UI
subgraph UI["UI (packages/ui)"]
U["User"] --> WUI["Web UI"]
end
%% API
subgraph API["API (packages/api)"]
API_APP["FastAPI App"]
IN["Transaction API"]
end
%% Evaluation
subgraph EVAL["Evaluation (packages/evaluation)"]
EV["Rule Evaluation Service"]
end
%% Alerts
subgraph ALERTS["Alerts (packages/alerts)"]
AL["Alerts Service"]
end
%% DB
subgraph DB["DB (packages/db) - PostgreSQL"]
USERS["users"]
CARDS["credit_cards"]
AR["alert_rules"]
TRX["transactions"]
AN["alert_notifications"]
end
%% Delivery
subgraph DELIV["Delivery Channels"]
EM["Email"]
SM["SMS"]
PS["Push"]
WH["Webhook"]
end
%% External Source
subgraph EXT["External"]
TS["Transaction Source"]
end
%% Rule authoring
WUI -->|Create/Update Rule| API_APP
API_APP -->|Persist| AR
%% Transaction ingestion
TS --> IN
IN --> API_APP
API_APP -->|Store| TRX
%% Evaluation path
API_APP -->|Evaluate| EV
EV -->|Read| AR
EV -->|Read| TRX
EV -->|Create| AN
EV -->|Dispatch| AL
%% Alerts delivery
AL -->|Update| AN
AL --> EM
AL --> SM
AL --> PS
AL --> WH
- Natural Language Rule Creation: Define alerts in everyday language (e.g., "Notify me if I spend 3x more than usual on dining").
- Flexible Rule Management: Add, delete, or pause rules at any time through the UI.
- Behavioral AI Analysis: Detect anomalies such as unusual merchant categories, high-frequency spending, or out-of-pattern locations.
- Location-Aware Alerts: Compare the transaction location with the user's home, last transaction, or last known mobile location.
- Historical Spend Analysis: Evaluate transactions against aggregated historical trends (e.g., monthly average dining spend).
- Multi-Channel Notifications: Deliver alerts via email or SMS.
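For example, a location-aware check can compare a transaction's coordinates against the user's last known location using a haversine distance. The sketch below is a minimal illustration with an assumed distance threshold, not the project's actual risk model.

```python
# Minimal haversine-based location check -- the 500 km threshold and function
# names are assumptions for illustration, not the project's actual risk model.
from math import asin, cos, radians, sin, sqrt


def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))


def is_location_suspicious(txn_coords, last_known_coords, max_km: float = 500.0) -> bool:
    """Flag a transaction that occurs far from the user's last known location."""
    return haversine_km(*txn_coords, *last_known_coords) > max_km


# A Boston transaction vs. a last known location in Los Angeles (roughly 4,200 km apart)
print(is_location_suspicious((42.36, -71.06), (34.05, -118.24)))  # True
```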
- Frontend: React
- Backend: FastAPI, Python
- Databases: PostgreSQL (transactions, rules, users)
- AI/ML: LlamaStack, LangGraph, TensorFlow/PyTorch, RHOAI
- Deployment: OpenShift, Kubernetes-native microservices
- Notifications: Email, SMS
| Category | Example Trigger |
|---|---|
| Spending Pattern | "Your dining expense of $98 is 45% higher than your average of $67 over 30 days." |
| Recurring Payment | "Netflix charged $18.99 this month vs. your usual $15.49, a 22% increase." |
| Location-Based | "Transaction in Boston detected. Your last known location was Los Angeles." |
| Merchant-Based | "Uber ride was $47.89, up from your last 5 ride average of $28.40." |
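The spending-pattern style of message above boils down to a percentage increase over a historical average. A minimal illustration follows; the numbers and wording are made up for the example, not taken from the seeded data.

```python
# Illustrative only: deriving a spending-spike message from a transaction
# amount and a 30-day average. Numbers here are made up.
def percent_increase(amount: float, average: float) -> float:
    return (amount - average) / average * 100


amount, avg_30d = 120.00, 80.00
print(
    f"Your dining expense of ${amount:.2f} is {percent_increase(amount, avg_30d):.0f}% "
    f"higher than your average of ${avg_30d:.2f} over 30 days."
)
# -> Your dining expense of $120.00 is 50% higher than your average of $80.00 over 30 days.
```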
spending-transaction-monitor/
├── packages/
│   ├── api/
│   ├── db/
│   ├── ui/
│   ├── ingestion-service/
│   └── configs/
├── docs/
├── deploy/
├── .env.example
├── turbo.json
├── Makefile
├── pnpm-workspace.yaml
├── package.json
└── README.md
- OpenShift cluster with RHOAI
- PostgreSQL instance
- Python 3.11+
- Node.js 18+
# Clone the repo
git clone https://github.com/rh-ai-quickstart/spending-transaction-monitor.git
cd spending-transaction-monitor

Start with pre-built images:

make run-local

Build and run from source:

make build-run-local

Container URLs:
- Frontend: http://localhost:3000
- API: http://localhost:3000/api/* (proxied)
- API Docs: http://localhost:8000/docs
- SMTP Web UI: http://localhost:3002
- Database: localhost:5432
Container Management:
make run-local # Start with registry images
make build-local # Build images from source
make build-run-local # Build and start
make stop-local # Stop all services
make logs-local # View service logs
make reset-local # Reset with fresh data

Setting Up Data:
After starting services, set up the database and Keycloak:
pnpm setup:data # Complete setup: Start DB + migrations + seed all data
pnpm seed:all # Just seed data (DB + Keycloak) - migrations already run
pnpm seed:db # Seed only database
pnpm seed:keycloak # Setup only Keycloak realm
# Or using make
make setup-data # Complete data setup: Start DB + migrations + all data
make seed-all # Seed both DB and Keycloak

Note: pnpm setup:data now automatically starts the database, so you don't need to run pnpm db:start separately.
See SEEDING.md for complete seeding documentation.
The application supports two authentication modes:
By default, the application uses Keycloak for secure authentication:
- Automatic Setup: Keycloak realm and test users are automatically created on startup
- OAuth2/OIDC Flow: Implements OpenID Connect with PKCE for secure authentication
- Automatic Token Refresh: Tokens are automatically refreshed before expiration
- Test Users:
  - testuser / password (Regular user)
  - adminuser / password (Admin user)
Access Points:
- Frontend: http://localhost:3000 (redirects to Keycloak login)
- Keycloak Admin: http://localhost:8080 (admin / admin)
- API Docs: http://localhost:8000/docs
For local development, you can bypass authentication:
# Set environment variables for bypass mode
BYPASS_AUTH=true VITE_BYPASS_AUTH=true VITE_ENVIRONMENT=development make build-run-local

In bypass mode:
- No login required - automatic authentication as dev user
- Yellow "DEV MODE - Authentication Bypassed" banner visible
- Faster development iteration

⚠️ NOT for production use
Switching Between Modes:
# Production mode (Keycloak authentication)
make build-run-local
# Development mode (auth bypass)
BYPASS_AUTH=true VITE_BYPASS_AUTH=true VITE_ENVIRONMENT=development make build-run-local

Environment Variables:
| Variable | Values | Description |
|---|---|---|
| BYPASS_AUTH | true/false | Backend auth bypass |
| VITE_BYPASS_AUTH | true/false | Frontend auth bypass |
| VITE_ENVIRONMENT | development/staging/production | Environment mode |
| KEYCLOAK_URL | URL | Keycloak server URL (default: http://localhost:8080) |
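As a rough sketch of how a backend bypass flag like this could gate authentication, the FastAPI dependency below short-circuits token checks when BYPASS_AUTH is set. The dependency name, dev-user payload, and route are assumptions for illustration, not the project's actual implementation.

```python
# Hypothetical sketch: a BYPASS_AUTH flag short-circuiting auth in development.
# Names and structure are illustrative only.
import os

from fastapi import Depends, FastAPI, HTTPException
from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer

app = FastAPI()
bearer = HTTPBearer(auto_error=False)
BYPASS_AUTH = os.getenv("BYPASS_AUTH", "false").lower() == "true"


async def current_user(creds: HTTPAuthorizationCredentials | None = Depends(bearer)) -> dict:
    if BYPASS_AUTH:
        # Dev mode: skip Keycloak entirely and act as a fixed development user.
        return {"sub": "dev-user", "roles": ["user"]}
    if creds is None:
        raise HTTPException(status_code=401, detail="Missing bearer token")
    # In real deployments the token would be validated against Keycloak's JWKS here.
    return {"sub": "validated-user", "token": creds.credentials}


@app.get("/api/users/me")
async def me(user: dict = Depends(current_user)):
    return user
```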
For local development without containers, use these pnpm commands:
# Install dependencies
pnpm setup
# Start in development mode (auth bypassed)
pnpm start:dev
# Or start individual services
pnpm backend:setup # Setup database
pnpm backend:start # Start API (port 8002, auth bypass)
pnpm --filter @*/ui dev # Start UI (port 3000)

# Start with Keycloak authentication
pnpm start:prod
# Access points:
# - Frontend: http://localhost:3000
# - API: http://localhost:8002
# - Keycloak: http://localhost:8080

# With Keycloak authentication (default)
pnpm dev:containers:auth
# With auth bypass (no login required) - fastest iteration
pnpm dev:containers:noauth
# Standard container startup (without rebuild)
pnpm dev:containers

Or using Make directly:
# Build and run with Keycloak authentication (default)
make build-run-local
# Build and run with auth bypass (no authentication)
make build-run-local-noauth
# Run without rebuilding
make run-local

# Database management
pnpm db:start # Start PostgreSQL container
pnpm db:stop # Stop PostgreSQL container
pnpm db:upgrade # Run migrations
pnpm db:seed # Load sample data
pnpm db:verify # Verify database connection
# Authentication
pnpm auth:start # Start Keycloak container
pnpm auth:stop # Stop Keycloak container
pnpm auth:setup-keycloak # Setup Keycloak realm/client
pnpm auth:setup-keycloak-with-users # Setup Keycloak with DB users
# Code quality
pnpm lint # Run all linters
pnpm lint:fix # Auto-fix linting issues
pnpm format # Format code
pnpm test # Run tests
pnpm type-check # Run TypeScript checks

After starting the application with make run-local, you can test alert rules interactively:
List available sample alert rules:
make list-alert-samples

Shows all available test scenarios with their descriptions, such as:
- "Alert when spending more than $500 in one transaction"
- "Alert me if my dining expense exceeds the average of the last 30 days by more than 40%"
- "Alert me if a transaction happens outside my home state"
Interactive testing menu:
make test-alert-rules

This command provides:
- Alert Rule Menu showing alert rule descriptions
- Data preview with realistic transaction data adjusted to current time
- User context showing the test user profile and transaction history
- Confirmation prompt before running the actual test
After confirming a rule test:
- The system sends a test notification via the configured test SMTP server.
- To verify:
  - Open the SMTP server Web UI: http://localhost:3002
  - Check the inbox for the test email.
  - Open the email to confirm:
    - The rule name/description is included.
    - The transaction details that triggered the rule are shown.
- Start the application:
  make run-local
- Browse available test scenarios:
  make list-alert-samples
- Run interactive testing:
  make test-alert-rules
  - Select an alert rule by number (1-16)
  - Review the data preview showing exactly what will be tested
  - Confirm to proceed with the test
  - Watch the complete validation and creation process
The test process:
- Seeds database with realistic user and transaction data
- Validates the alert rule using the NLP validation API
- Creates the alert rule if validation passes
- Shows step-by-step results including SQL queries and processing steps
Note: Make sure the API server is running (make run-local) before testing alert rules.
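To give a feel for the validation step, here is a hedged sketch of what a call to a rule-validation endpoint could look like. The /api/alerts/validate path and payload shape are assumptions for illustration; consult the API docs at http://localhost:8000/docs for the actual routes.

```python
# Illustrative only -- the endpoint path and payload shape are assumptions;
# check http://localhost:8000/docs for the real API surface.
import requests

payload = {
    "user_id": "test-user",
    "rule_text": "Alert me if a transaction happens outside my home state",
}

resp = requests.post("http://localhost:8000/api/alerts/validate", json=payload, timeout=30)
resp.raise_for_status()
print(resp.json())  # e.g. parsed criteria plus a valid/invalid verdict
```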
Quick Deploy:
make full-deploy

Step-by-step:
# Login and setup
make login
make create-project
# Build and push images
make build-all
make push-all
# Deploy
make deploy

OpenShift Management:
make deploy # Deploy to OpenShift
make undeploy # Remove deployment
make status # Check deployment status
make logs-api # View API logs
make logs-ui # View UI logs

Contributions are welcome! Please fork the repo and submit a PR.
See our CONTRIBUTING.md for guidelines.
This project is licensed under the Apache License 2.0. See LICENSE for details.