diff --git a/.env.example b/.env.example
deleted file mode 100644
index 53e921e..0000000
--- a/.env.example
+++ /dev/null
@@ -1,5 +0,0 @@
-# Replace the placeholders with your actual API keys
-
-NEWSAPI_KEY="your_actual_newsapi_key_here"
-GOOGLE_API_KEY="your_actual_google_api_key_here"
-
diff --git a/.github/CICD_SETUP.md b/.github/CICD_SETUP.md
deleted file mode 100644
index c88decd..0000000
--- a/.github/CICD_SETUP.md
+++ /dev/null
@@ -1,239 +0,0 @@
-# GitHub Actions CI Setup Guide
-
-## 🚀 Setting Up Continuous Integration for StockSense Agent
-
-This guide walks you through setting up automated testing with GitHub Actions for your StockSense Agent project.
-
-## 📁 Files Created
-
-### 1. Main CI Workflow: `.github/workflows/ci.yml`
-
-**Complete enterprise-grade CI pipeline with:**
-
-- ✅ Unit testing
-- ✅ Integration testing
-- ✅ Code quality checks
-- ✅ Security scanning
-- ✅ Coverage reporting
-- ✅ Multi-job pipeline
-
-### 2. Simple CI Workflow: `.github/workflows/ci-simple.yml`
-
-**Minimal CI pipeline that exactly matches your requirements:**
-
-- ✅ Triggers on push/PR to main
-- ✅ Python 3.10 setup
-- ✅ Dependency installation
-- ✅ Pytest execution
-- ✅ Environment variable support
-
-## 🔧 Setup Instructions
-
-### Step 1: Choose Your Workflow
-
-**Option A: Use Simple Workflow (Recommended for start)**
-
-```bash
-# Keep only the simple workflow
-mv .github/workflows/ci-simple.yml .github/workflows/ci.yml
-rm .github/workflows/ci-simple.yml
-```
-
-**Option B: Use Full-Featured Workflow**
-
-```bash
-# Keep the comprehensive workflow
-rm .github/workflows/ci-simple.yml
-# The main ci.yml is already in place
-```
-
-### Step 2: Configure GitHub Secrets
-
-In your GitHub repository, go to **Settings > Secrets and Variables > Actions**
-
-Add the following secrets:
-
-#### Required Secrets:
-
-- **`GOOGLE_API_KEY`**: Your Google AI API key for sentiment analysis
-- **`NEWSAPI_KEY`**: Your NewsAPI key for fetching news headlines
-
-#### How to Add Secrets:
-
-1. Go to your GitHub repository
-2. Click **Settings** tab
-3. Navigate to **Secrets and variables** > **Actions**
-4. Click **New repository secret**
-5. Add each secret with the exact names above
-
-### Step 3: Push to GitHub
-
-```bash
-# Add the workflow files
-git add .github/
-
-# Commit the changes
-git commit -m "Add GitHub Actions CI workflow for automated testing"
-
-# Push to main branch (this will trigger the first CI run)
-git push origin main
-```
-
-## 🎯 What Happens After Setup
-
-### Automatic Triggers:
-
-- **Every push to main**: Full CI pipeline runs
-- **Every pull request to main**: CI validation runs
-- **Manual trigger**: Can be run manually from GitHub Actions tab
-
-### Test Execution:
-
-- **Unit Tests**: Tests individual tools (news, price data)
-- **Integration Tests**: Tests API endpoints (requires server)
-- **Smoke Tests**: Quick functionality validation
-
-### CI Pipeline Results:
-
-- ✅ **Green checkmark**: All tests passed
-- ❌ **Red X**: Tests failed (blocks merging)
-- 🟡 **Yellow circle**: Tests running
-
-## 📊 Monitoring CI Results
-
-### View Results:
-
-1. Go to your GitHub repository
-2. Click **Actions** tab
-3. See all workflow runs and their status
-
-### Debug Failed Tests:
-
-1. Click on the failed workflow run
-2. Click on the **Build and Test** job
-3. Expand the failed step to see error details
-4. Review logs and fix issues locally
-
-## 🔄 Workflow Customization
-
-### Modify Test Commands:
-
-Edit the workflow file to change test execution:
-
-```yaml
-# Run specific test types
-- name: Run tests
- run: |
- python run_tests.py unit # Only unit tests
- # OR
- python run_tests.py smoke # Only smoke tests
- # OR
- pytest tests/test_tools.py -v # Specific test file
-```
-
-### Add More Python Versions:
-
-```yaml
-strategy:
- matrix:
- python-version: ['3.9', '3.10', '3.11']
-```
-
-### Add Different Operating Systems:
-
-```yaml
-strategy:
- matrix:
- os: [ubuntu-latest, windows-latest, macos-latest]
-runs-on: ${{ matrix.os }}
-```
-
-## ⚠️ Troubleshooting
-
-### Common Issues:
-
-#### 1. **Tests fail due to missing secrets**
-
-- Ensure `GOOGLE_API_KEY` and `NEWSAPI_KEY` are set in repository secrets
-- Check secret names match exactly (case-sensitive)
-
-#### 2. **Dependencies installation fails**
-
-- Verify `requirements.txt` includes all needed packages
-- Check for conflicts between package versions
-
-#### 3. **Tests timeout**
-
-- Network-dependent tests may be slow
-- Consider mocking external API calls for CI
-
-#### 4. **Import errors**
-
-- Ensure project structure is correct
-- Check that `__init__.py` files exist in packages
-
-### Debug Commands:
-
-```bash
-# Test locally before pushing
-python run_tests.py smoke
-python run_tests.py unit
-
-# Check requirements file
-pip install -r requirements.txt
-
-# Validate workflow syntax
-# Use GitHub CLI or online YAML validators
-```
-
-## 🚀 Next Steps
-
-### 1. **Add Branch Protection Rules**
-
-- Require CI to pass before merging
-- Require pull request reviews
-
-### 2. **Add Code Coverage**
-
-- Use codecov.io integration
-- Set coverage thresholds
-
-### 3. **Add Deployment Pipeline**
-
-- Deploy on successful CI
-- Environment-specific deployments
-
-### 4. **Add Notifications**
-
-- Slack/Discord integration
-- Email notifications on failures
-
-## 📝 Best Practices
-
-### 1. **Keep Secrets Secure**
-
-- Never commit API keys to code
-- Use environment variables consistently
-- Rotate secrets regularly
-
-### 2. **Fast Feedback**
-
-- Run quick tests first
-- Fail fast on syntax errors
-- Parallel job execution
-
-### 3. **Reliable Tests**
-
-- Mock external dependencies when possible
-- Use deterministic test data
-- Handle network timeouts gracefully
-
-### 4. **Clear Documentation**
-
-- Comment workflow steps
-- Document secret requirements
-- Maintain setup instructions
-
----
-
-Your StockSense Agent now has **professional-grade Continuous Integration** that will automatically validate every code change and ensure quality standards! 🎉
diff --git a/.github/CI_STATUS.md b/.github/CI_STATUS.md
deleted file mode 100644
index 179295b..0000000
--- a/.github/CI_STATUS.md
+++ /dev/null
@@ -1,52 +0,0 @@
-# 🔄 CI/CD Status
-
-## Build Status
-
-[](https://github.com/Spkap/StockSense-AI/actions/workflows/ci.yml)
-
-## Quick Setup Commands
-
-### 1. **Choose Your Workflow**
-
-```bash
-# Use the simple workflow (recommended)
-cp .github/workflows/ci-simple.yml .github/workflows/ci.yml
-rm .github/workflows/ci-simple.yml
-
-# OR use the comprehensive workflow (advanced)
-rm .github/workflows/ci-simple.yml
-```
-
-### 2. **Configure Secrets in GitHub**
-
-Go to: **Repository Settings > Secrets and Variables > Actions**
-
-Add these secrets:
-
-- `GOOGLE_API_KEY` - Your Google AI API key
-- `NEWSAPI_KEY` - Your News API key
-
-### 3. **Deploy CI**
-
-```bash
-git add .github/
-git commit -m "Add GitHub Actions CI pipeline"
-git push origin main
-```
-
-## 🎯 What This Achieves
-
-✅ **Automated Testing** - Every push and PR runs full test suite
-✅ **Quality Assurance** - Code quality gates prevent broken code
-✅ **Fast Feedback** - Know immediately if changes break anything
-✅ **Professional Standards** - Industry-standard CI/CD practices
-
-## 📊 Monitoring
-
-- **View CI Status**: Go to repository **Actions** tab
-- **Debug Failures**: Click on failed runs for detailed logs
-- **Badge Status**: Shows current build status in README
-
----
-
-_Your StockSense Agent is now enterprise-ready with automated CI/CD! 🚀_
diff --git a/.github/workflows/ai-service.yml b/.github/workflows/ai-service.yml
new file mode 100644
index 0000000..5d66688
--- /dev/null
+++ b/.github/workflows/ai-service.yml
@@ -0,0 +1,35 @@
+name: AI Service CI/CD
+
+on:
+  push:
+    branches: [ main ]
+    paths:
+      - 'stocksense/**'
+
+jobs:
+  deploy:
+    runs-on: ubuntu-latest
+    if: github.ref == 'refs/heads/main'
+    steps:
+      - name: Checkout code
+        uses: actions/checkout@v4
+
+      - name: Install SSH Client
+        run: sudo apt-get install -y openssh-client
+
+      - name: Setup SSH Key
+        run: |
+          mkdir -p ~/.ssh
+          echo "${{ secrets.EC2_KEY }}" > ~/.ssh/id_rsa
+          chmod 600 ~/.ssh/id_rsa
+
+      - name: Deploy to EC2
+        run: |
+          ssh -o StrictHostKeyChecking=no ${{ secrets.EC2_USER }}@${{ secrets.EC2_HOST }} "\
+            export AWS_REGION=${{ secrets.AWS_REGION }} && \
+            export ECR_REPO=${{ secrets.ECR_REPO }} && \
+            aws ecr get-login-password --region \$AWS_REGION | docker login --username AWS --password-stdin \$ECR_REPO && \
+            docker pull \$ECR_REPO:latest && \
+            docker stop stocksense-ai || true && docker rm stocksense-ai || true && \
+            docker run -d -p 8001:8080 --name stocksense-ai \$ECR_REPO:latest \
+          "
diff --git a/.github/workflows/backend.yml b/.github/workflows/backend.yml
new file mode 100644
index 0000000..734f76a
--- /dev/null
+++ b/.github/workflows/backend.yml
@@ -0,0 +1,171 @@
+name: Backend CI/CD
+
+on:
+  push:
+    branches: [ main ]
+    paths:
+      - 'backend/**'
+  pull_request:
+    branches: [ main ]
+    paths:
+      - 'backend/**'
+
+jobs:
+  test:
+    runs-on: ubuntu-latest
+
+    services:
+      postgres:
+        image: postgres:15
+        env:
+          POSTGRES_USER: postgres
+          POSTGRES_PASSWORD: postgres
+          POSTGRES_DB: test_db
+        ports:
+          - 5432:5432
+        options: >-
+          --health-cmd="pg_isready -U postgres"
+          --health-interval=10s
+          --health-timeout=5s
+          --health-retries=5
+
+    env:
+      DJANGO_SETTINGS_MODULE: backend.settings
+      DATABASE_URL: postgres://postgres:postgres@localhost:5432/test_db
+      SECRET_KEY: test-secret
+      DEBUG: "0"
+      TESTING: "true"
+
+    steps:
+      - name: Checkout code
+        uses: actions/checkout@v4
+
+      - name: Set up Python
+        uses: actions/setup-python@v5
+        with:
+          python-version: "3.11"
+
+      - name: Install dependencies
+        working-directory: backend
+        run: |
+          python -m pip install --upgrade pip
+          pip install -r requirements.txt
+
+      - name: Wait for Postgres
+        run: |
+          until pg_isready -h localhost -p 5432 -U postgres; do
+            echo "Waiting for Postgres..."
+            sleep 2
+          done
+
+      - name: Run migrations
+        working-directory: backend
+        run: python manage.py migrate
+
+      - name: Run tests
+        working-directory: backend
+        run: python manage.py test --noinput
+
+  deploy:
+    runs-on: ubuntu-latest
+    needs: test
+    if: github.ref == 'refs/heads/main'
+    env:
+      AWS_REGION: ${{ secrets.AWS_REGION }}
+
+    steps:
+      - name: Checkout code
+        uses: actions/checkout@v4
+
+      - name: Set up Docker Buildx
+        uses: docker/setup-buildx-action@v3
+
+      - name: Configure AWS credentials
+        uses: aws-actions/configure-aws-credentials@v4
+        with:
+          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
+          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
+          aws-region: ${{ secrets.AWS_REGION }}
+
+      - name: Login to Amazon ECR
+        uses: aws-actions/amazon-ecr-login@v2
+        with:
+          registry-type: private
+
+      - name: Create .env file
+        run: |
+          echo "DATABASE_URL=${{ secrets.DATABASE_URL }}" >> backend/.env
+          echo "SECRET_KEY=${{ secrets.SECRET_KEY }}" >> backend/.env
+          echo "DEBUG=${{ secrets.DEBUG }}" >> backend/.env
+          echo "FIREBASE_TYPE=${{ secrets.FIREBASE_TYPE }}" >> backend/.env
+          echo "FIREBASE_PROJECT_ID=${{ secrets.FIREBASE_PROJECT_ID }}" >> backend/.env
+          echo "FIREBASE_PRIVATE_KEY_ID=${{ secrets.FIREBASE_PRIVATE_KEY_ID }}" >> backend/.env
+          echo "FIREBASE_PRIVATE_KEY=${{ secrets.FIREBASE_PRIVATE_KEY }}" >> backend/.env
+          echo "FIREBASE_CLIENT_EMAIL=${{ secrets.FIREBASE_CLIENT_EMAIL }}" >> backend/.env
+          echo "FIREBASE_CLIENT_ID=${{ secrets.FIREBASE_CLIENT_ID }}" >> backend/.env
+          echo "FIREBASE_AUTH_URI=${{ secrets.FIREBASE_AUTH_URI }}" >> backend/.env
+          echo "FIREBASE_TOKEN_URI=${{ secrets.FIREBASE_TOKEN_URI }}" >> backend/.env
+          echo "FIREBASE_AUTH_PROVIDER_X509_CERT_URL=${{ secrets.FIREBASE_AUTH_PROVIDER_X509_CERT_URL }}" >> backend/.env
+          echo "FIREBASE_CLIENT_X509_CERT_URL=${{ secrets.FIREBASE_CLIENT_X509_CERT_URL }}" >> backend/.env
+          echo "FIREBASE_UNIVERSE_DOMAIN=${{ secrets.FIREBASE_UNIVERSE_DOMAIN }}" >> backend/.env
+          echo "FIREBASE_STORAGE_BUCKET=${{ secrets.FIREBASE_STORAGE_BUCKET }}" >> backend/.env
+
+      - name: Build and push Docker image
+        uses: docker/build-push-action@v5
+        with:
+          context: ./backend
+          push: true
+          tags: ${{ secrets.ECR_REPO }}:latest
+
+      - name: Install SSH Client
+        run: sudo apt-get install -y openssh-client
+
+      - name: Setup SSH Key
+        run: |
+          mkdir -p ~/.ssh
+          echo "${{ secrets.EC2_KEY }}" > ~/.ssh/id_rsa
+          chmod 600 ~/.ssh/id_rsa
+
+      - name: Deploy to EC2
+        run: |
+          ssh -o StrictHostKeyChecking=no ${{ secrets.EC2_USER }}@${{ secrets.EC2_HOST }} "\
+            docker pull ${{ secrets.ECR_REPO }}:latest && \
+            docker stop stocksense || true && \
+            docker rm stocksense || true && \
+            docker run -d -p 8000:8000 --name stocksense \
+              -e DATABASE_URL='${{ secrets.DATABASE_URL }}' \
+              -e SECRET_KEY='${{ secrets.SECRET_KEY }}' \
+              -e DEBUG='${{ secrets.DEBUG }}' \
+              -e FIREBASE_TYPE='${{ secrets.FIREBASE_TYPE }}' \
+              -e FIREBASE_PROJECT_ID='${{ secrets.FIREBASE_PROJECT_ID }}' \
+              -e FIREBASE_PRIVATE_KEY_ID='${{ secrets.FIREBASE_PRIVATE_KEY_ID }}' \
+              -e FIREBASE_PRIVATE_KEY='${{ secrets.FIREBASE_PRIVATE_KEY }}' \
+              -e FIREBASE_CLIENT_EMAIL='${{ secrets.FIREBASE_CLIENT_EMAIL }}' \
+              -e FIREBASE_CLIENT_ID='${{ secrets.FIREBASE_CLIENT_ID }}' \
+              -e FIREBASE_AUTH_URI='${{ secrets.FIREBASE_AUTH_URI }}' \
+              -e FIREBASE_TOKEN_URI='${{ secrets.FIREBASE_TOKEN_URI }}' \
+              -e FIREBASE_AUTH_PROVIDER_X509_CERT_URL='${{ secrets.FIREBASE_AUTH_PROVIDER_X509_CERT_URL }}' \
+              -e FIREBASE_CLIENT_X509_CERT_URL='${{ secrets.FIREBASE_CLIENT_X509_CERT_URL }}' \
+              -e FIREBASE_UNIVERSE_DOMAIN='${{ secrets.FIREBASE_UNIVERSE_DOMAIN }}' \
+              -e FIREBASE_STORAGE_BUCKET='${{ secrets.FIREBASE_STORAGE_BUCKET }}' \
+              ${{ secrets.ECR_REPO }}:latest"
+
+      - name: Health Check
+        run: |
+          ssh -o StrictHostKeyChecking=no ${{ secrets.EC2_USER }}@${{ secrets.EC2_HOST }} "\
+            echo 'Waiting for container to start...' && \
+            sleep 30 && \
+            echo 'Checking container status...' && \
+            docker ps | grep stocksense && \
+            echo 'Testing health endpoint...' && \
+            RESPONSE=\$(curl -s -w '%{http_code}' http://localhost:8000/ 2>/dev/null); \
+            HTTP_CODE=\${RESPONSE: -3}; \
+            BODY=\${RESPONSE%???}; \
+            if [ \"\$HTTP_CODE\" = \"200\" ]; then \
+              echo 'Health check passed!'; \
+              echo 'Response: '\$BODY; \
+            else \
+              echo \"Health check failed (HTTP \$HTTP_CODE)\"; \
+              docker logs stocksense --tail 50; \
+              exit 1; \
+            fi"
diff --git a/.github/workflows/ci-simple.yml b/.github/workflows/ci-simple.yml
deleted file mode 100644
index 1413967..0000000
--- a/.github/workflows/ci-simple.yml
+++ /dev/null
@@ -1,50 +0,0 @@
-# GitHub Actions CI Workflow for StockSense Agent
-# Automated testing pipeline for Python application
-
-name: Python Application CI
-
-# Trigger the workflow on push and pull requests to main branch
-on:
- push:
- branches: [main]
- pull_request:
- branches: [main]
-
-# Define the CI job
-jobs:
- build:
- # Job name and runner configuration
- name: Build and Test
- runs-on: ubuntu-latest
-
- steps:
- # Step 1: Checkout code from repository
- - name: Checkout code
- uses: actions/checkout@v4
-
- # Step 2: Set up Python 3.10 environment
- - name: Set up Python 3.10
- uses: actions/setup-python@v5
- with:
- python-version: '3.10'
-
- # Step 3: Install dependencies
- - name: Upgrade pip
- run: |
- python -m pip install --upgrade pip
-
- - name: Install project dependencies
- run: |
- pip install -r requirements.txt
-
- - name: Install testing dependencies
- run: |
- pip install pytest requests
-
- # Step 4: Run tests with environment variables
- - name: Run tests
- run: |
- pytest tests/ -v
- env:
- GOOGLE_API_KEY: ${{ secrets.GOOGLE_API_KEY }}
- NEWSAPI_KEY: ${{ secrets.NEWSAPI_KEY }}
diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
deleted file mode 100644
index 1cfe621..0000000
--- a/.github/workflows/ci.yml
+++ /dev/null
@@ -1,194 +0,0 @@
-# GitHub Actions CI Workflow for StockSense Agent
-# This workflow runs automated tests on every push and pull request to main branch
-
-name: Python Application CI
-
-# Trigger Configuration
-# Runs on pushes to main branch and all pull requests targeting main
-on:
- push:
- branches: [main]
- pull_request:
- branches: [main]
-
-# Define CI Jobs
-jobs:
- build:
- # Job Configuration
- name: Build and Test
- runs-on: ubuntu-latest
-
- # Define build steps
- steps:
- # Step 1: Checkout Repository Code
- - name: Checkout code
- uses: actions/checkout@v4
- with:
- # Fetch full history for better Git operations
- fetch-depth: 0
-
- # Step 2: Set up Python Environment
- - name: Set up Python 3.10
- uses: actions/setup-python@v5
- with:
- python-version: '3.10'
- # Cache pip dependencies for faster builds
- cache: 'pip'
-
- # Step 3: Install Dependencies
- - name: Upgrade pip
- run: |
- python -m pip install --upgrade pip
-
- - name: Install project dependencies
- run: |
- pip install -r requirements.txt
-
- - name: Install testing dependencies
- run: |
- pip install pytest requests
-
- # Step 4: Run Linting (Optional but recommended)
- - name: Run code quality checks
- run: |
- # Install flake8 for linting
- pip install flake8
- # Stop the build if there are Python syntax errors or undefined names
- flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics
- # Exit-zero treats all errors as warnings. GitHub editor is 127 chars wide
- flake8 . --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics
- continue-on-error: true
-
- # Step 5: Run Unit Tests (No external dependencies required)
- - name: Run unit tests
- run: |
- # Run only unit tests that don't require external services
- python run_tests.py unit
- env:
- # Set environment variables from GitHub secrets
- GOOGLE_API_KEY: ${{ secrets.GOOGLE_API_KEY }}
- NEWSAPI_KEY: ${{ secrets.NEWSAPI_KEY }}
-
- # Step 6: Run Smoke Tests (Quick validation)
- - name: Run smoke tests
- run: |
- # Run quick smoke tests for basic functionality validation
- python run_tests.py smoke
- env:
- GOOGLE_API_KEY: ${{ secrets.GOOGLE_API_KEY }}
- NEWSAPI_KEY: ${{ secrets.NEWSAPI_KEY }}
-
- # Step 7: Run Full Test Suite with Coverage (Optional)
- - name: Run tests with coverage
- run: |
- # Install coverage tool
- pip install coverage
- # Run pytest with coverage reporting
- coverage run -m pytest tests/test_tools.py -v
- # Generate coverage report
- coverage report -m
- # Generate HTML coverage report
- coverage html
- env:
- GOOGLE_API_KEY: ${{ secrets.GOOGLE_API_KEY }}
- NEWSAPI_KEY: ${{ secrets.NEWSAPI_KEY }}
- continue-on-error: true
-
- # Step 8: Upload Coverage Reports (Optional)
- - name: Upload coverage reports
- uses: actions/upload-artifact@v4
- with:
- name: coverage-report
- path: htmlcov/
- if: always()
-
- # Step 9: Upload Test Results
- - name: Upload test results
- uses: actions/upload-artifact@v4
- with:
- name: test-results
- path: |
- .pytest_cache/
- test-results.xml
- if: always()
-
- # Additional Jobs for Different Scenarios
- integration-tests:
- # Only run integration tests if unit tests pass
- needs: build
- name: Integration Tests
- runs-on: ubuntu-latest
-
- steps:
- - name: Checkout code
- uses: actions/checkout@v4
-
- - name: Set up Python 3.10
- uses: actions/setup-python@v5
- with:
- python-version: '3.10'
- cache: 'pip'
-
- - name: Install dependencies
- run: |
- python -m pip install --upgrade pip
- pip install -r requirements.txt
- pip install pytest requests
-
- # Start the API server in background for integration tests
- - name: Start API server
- run: |
- # Start the StockSense API server in background
- python -m stocksense.main &
- # Wait for server to start
- sleep 10
- env:
- GOOGLE_API_KEY: ${{ secrets.GOOGLE_API_KEY }}
- NEWSAPI_KEY: ${{ secrets.NEWSAPI_KEY }}
-
- - name: Run integration tests
- run: |
- # Run API integration tests
- python run_tests.py api
- env:
- GOOGLE_API_KEY: ${{ secrets.GOOGLE_API_KEY }}
- NEWSAPI_KEY: ${{ secrets.NEWSAPI_KEY }}
-
- # Security and Code Quality Job
- security-scan:
- name: Security Scan
- runs-on: ubuntu-latest
-
- steps:
- - name: Checkout code
- uses: actions/checkout@v4
-
- - name: Set up Python 3.10
- uses: actions/setup-python@v5
- with:
- python-version: '3.10'
-
- - name: Install security tools
- run: |
- python -m pip install --upgrade pip
- pip install safety bandit
-
- - name: Run safety check
- run: |
- # Check for known security vulnerabilities in dependencies
- safety check --file requirements.txt
- continue-on-error: true
-
- - name: Run bandit security scan
- run: |
- # Run security analysis on Python code
- bandit -r stocksense/ -f json -o bandit-report.json
- continue-on-error: true
-
- - name: Upload security reports
- uses: actions/upload-artifact@v4
- with:
- name: security-reports
- path: |
- bandit-report.json
- if: always()
diff --git a/.github/workflows/frontend.yml b/.github/workflows/frontend.yml
new file mode 100644
index 0000000..2b50eaf
--- /dev/null
+++ b/.github/workflows/frontend.yml
@@ -0,0 +1,33 @@
+name: Frontend CI/CD
+
+on:
+  push:
+    branches: [ main ]
+    paths:
+      - 'frontend/**'
+
+jobs:
+  build-and-deploy:
+    runs-on: ubuntu-latest
+
+    steps:
+      - name: Checkout code
+        uses: actions/checkout@v4
+
+      - name: Setup Node
+        uses: actions/setup-node@v4
+        with:
+          node-version: 18
+
+      - name: Install dependencies
+        run: npm install
+        working-directory: frontend
+
+      - name: Build app
+        run: npm run build
+        working-directory: frontend
+
+      - name: Trigger Render deployment
+        run: curl -X POST "$RENDER_DEPLOY_HOOK"
+        env:
+          RENDER_DEPLOY_HOOK: ${{ secrets.RENDER_DEPLOY_HOOK }}
diff --git a/.gitignore b/.gitignore
index 16cccec..37f2ac6 100644
--- a/.gitignore
+++ b/.gitignore
@@ -38,6 +38,7 @@ parts/
sdist/
var/
wheels/
+.vscode/
*.egg-info/
.installed.cfg
*.egg
@@ -108,3 +109,7 @@ docker-compose.yml
.Trashes
ehthumbs.db
Thumbs.db
+
+stocksense-e7226-firebase-adminsdk-fbsvc-b8cae3c098.json
+nasdaq_screener_1756622606980.csv
+data.json
\ No newline at end of file
diff --git a/.vscode/settings.json b/.vscode/settings.json
deleted file mode 100644
index a32e2b4..0000000
--- a/.vscode/settings.json
+++ /dev/null
@@ -1,5 +0,0 @@
-{
- "python-envs.defaultEnvManager": "ms-python.python:conda",
- "python-envs.defaultPackageManager": "ms-python.python:conda",
- "python-envs.pythonProjects": []
-}
diff --git a/.vscode/tasks.json b/.vscode/tasks.json
deleted file mode 100644
index 17b85ec..0000000
--- a/.vscode/tasks.json
+++ /dev/null
@@ -1,13 +0,0 @@
-{
- "version": "2.0.0",
- "tasks": [
- {
- "label": "Run StockSense ReAct Agent",
- "type": "shell",
- "command": "cd /Users/sourabhkapure/Developer/Projects/StockSense-Agent && python -m stocksense.main",
- "group": "build",
- "isBackground": true,
- "problemMatcher": []
- }
- ]
-}
diff --git a/Dockerfile.backend b/Dockerfile.backend
deleted file mode 100644
index e627911..0000000
--- a/Dockerfile.backend
+++ /dev/null
@@ -1,34 +0,0 @@
-# Use Python 3.10 slim image for smaller size
-FROM python:3.10-slim
-
-# Set working directory
-WORKDIR /app
-
-# Install system dependencies
-RUN apt-get update && apt-get install -y \
- gcc \
- g++ \
- curl \
- && rm -rf /var/lib/apt/lists/*
-
-# Copy requirements first (for better caching)
-COPY requirements.txt .
-
-# Install Python dependencies
-RUN pip install --no-cache-dir -r requirements.txt
-
-# Copy application code
-COPY . .
-
-# Create directory for SQLite database
-RUN mkdir -p /app/data
-
-# Expose port for FastAPI backend
-EXPOSE 8000
-
-# Health check for backend API
-HEALTHCHECK --interval=30s --timeout=30s --start-period=5s --retries=3 \
- CMD curl -f http://localhost:8000/health || exit 1
-
-# Run the FastAPI backend application
-CMD ["python", "-m", "stocksense.main"]
diff --git a/Dockerfile.frontend b/Dockerfile.frontend
deleted file mode 100644
index b9fdf9d..0000000
--- a/Dockerfile.frontend
+++ /dev/null
@@ -1,41 +0,0 @@
-# Use Python 3.10 slim image
-FROM python:3.10-slim
-
-# Set working directory
-WORKDIR /app
-
-# Install system dependencies
-RUN apt-get update && apt-get install -y \
- gcc \
- g++ \
- curl \
- && rm -rf /var/lib/apt/lists/*
-
-# Copy requirements first (for better caching)
-COPY requirements.txt .
-
-# Install Python dependencies
-RUN pip install --no-cache-dir -r requirements.txt
-
-# Copy application code
-COPY . .
-
-# Expose port for Streamlit
-EXPOSE 8501
-
-# Create Streamlit config directory
-RUN mkdir -p ~/.streamlit
-
-# Configure Streamlit
-RUN echo '[server]' > ~/.streamlit/config.toml && \
- echo 'headless = true' >> ~/.streamlit/config.toml && \
- echo 'port = 8501' >> ~/.streamlit/config.toml && \
- echo 'enableCORS = false' >> ~/.streamlit/config.toml && \
- echo 'enableXsrfProtection = false' >> ~/.streamlit/config.toml
-
-# Health check
-HEALTHCHECK --interval=30s --timeout=30s --start-period=5s --retries=3 \
- CMD curl -f http://localhost:8501/_stcore/health || exit 1
-
-# Run Streamlit app
-CMD ["streamlit", "run", "app.py", "--server.address", "0.0.0.0"]
diff --git a/app.py b/app.py
deleted file mode 100644
index f8b3778..0000000
--- a/app.py
+++ /dev/null
@@ -1,1202 +0,0 @@
-import streamlit as st
-import requests
-import time
-import pandas as pd
-import numpy as np
-from datetime import datetime, timedelta
-from typing import Optional, Dict, Any
-import json
-import yfinance as yf
-import plotly.graph_objects as go
-import plotly.express as px
-from pathlib import Path
-import os
-PLOTLY_AVAILABLE = True
-
-st.set_page_config(
- page_title="StockSense AI Agent",
- page_icon="📈",
- layout="wide",
- initial_sidebar_state="expanded",
- menu_items={
- 'Get Help': 'https://github.com/Spkap/StockSense-AI',
- 'Report a bug': 'https://github.com/Spkap/StockSense-AI/issues',
- 'About': "# StockSense ReAct Agent\nAI-powered autonomous stock analysis using ReAct pattern"
- }
-)
-
-st.markdown("""
-
-""", unsafe_allow_html=True)
-
-def initialize_session_state():
- if 'analysis_result' not in st.session_state:
- st.session_state.analysis_result = None
- if 'analysis_history' not in st.session_state:
- st.session_state.analysis_history = []
- if 'backend_status' not in st.session_state:
- st.session_state.backend_status = None
- if 'selected_ticker' not in st.session_state:
- st.session_state.selected_ticker = ""
-
-initialize_session_state()
-
-@st.cache_data
-def load_company_ticker_mapping():
- """Load company name to ticker mapping from CSV file."""
- csv_path = Path("nasdaq_screener.csv")
-
- if csv_path.exists():
- try:
- df = pd.read_csv(csv_path)
- if 'Symbol' in df.columns and 'Name' in df.columns:
- # Create a mapping from company name to ticker
- mapping = dict(zip(df['Name'], df['Symbol']))
- return mapping, df
- except Exception as e:
- st.warning(f"Error loading CSV file: {e}")
-
- # Fallback to popular stocks if CSV not available
- fallback_mapping = {
- "Apple Inc.": "AAPL",
- "Microsoft Corporation": "MSFT",
- "Alphabet Inc.": "GOOGL",
- "Amazon.com Inc.": "AMZN",
- "Tesla Inc.": "TSLA",
- "NVIDIA Corporation": "NVDA",
- "Meta Platforms Inc.": "META",
- "Netflix Inc.": "NFLX",
- "Advanced Micro Devices Inc.": "AMD",
- "Intel Corporation": "INTC"
- }
-
- return fallback_mapping, None
-
-def create_candlestick_chart(price_data: list) -> go.Figure:
- """Create an interactive candlestick chart with moving averages."""
- if not price_data:
- return None
-
- # Convert to DataFrame
- df = pd.DataFrame(price_data)
- df['Date'] = pd.to_datetime(df['Date'])
- df = df.sort_values('Date')
-
- # Calculate moving averages
- df['SMA_20'] = df['Close'].rolling(window=20, min_periods=1).mean()
- df['SMA_50'] = df['Close'].rolling(window=50, min_periods=1).mean()
-
- # Create candlestick chart
- fig = go.Figure()
-
- # Add candlestick
- fig.add_trace(go.Candlestick(
- x=df['Date'],
- open=df['Open'],
- high=df['High'],
- low=df['Low'],
- close=df['Close'],
- name="Price",
- increasing_line_color='#26a69a',
- decreasing_line_color='#ef5350'
- ))
-
- # Add moving averages
- fig.add_trace(go.Scatter(
- x=df['Date'],
- y=df['SMA_20'],
- mode='lines',
- name='SMA 20',
- line=dict(color='orange', width=2)
- ))
-
- fig.add_trace(go.Scatter(
- x=df['Date'],
- y=df['SMA_50'],
- mode='lines',
- name='SMA 50',
- line=dict(color='blue', width=2)
- ))
-
- # Update layout
- fig.update_layout(
- title="Stock Price with Moving Averages",
- yaxis_title="Price ($)",
- xaxis_title="Date",
- template="plotly_white",
- height=500,
- showlegend=True,
- xaxis_rangeslider_visible=False
- )
-
- return fig
-
-BACKEND_URL = os.getenv("BACKEND_URL", "http://127.0.0.1:8000")
-
-def check_backend_status() -> bool:
- """Check if the backend is online."""
- try:
- response = requests.get(f"{BACKEND_URL}/health", timeout=5)
- status = response.status_code == 200
- st.session_state.backend_status = status
- return status
- except:
- st.session_state.backend_status = False
- return False
-
-
-def create_styled_header():
- st.markdown("""
-
-
- 📈 StockSense AI Agent
-
-
- AI-Powered Stock Analysis Using Reasoning & Action
-
-
- """, unsafe_allow_html=True)
-
-
-def display_hero_section():
- create_styled_header()
-
- col1, col2, col3 = st.columns([1, 2, 1])
- with col2:
- if check_backend_status():
- st.success("🟢 Backend Connected & Ready", icon="✅")
- else:
- st.error("🔴 Backend Connection Failed", icon="❌")
-
- st.markdown(" ", unsafe_allow_html=True)
-
-
-def display_ticker_input():
- st.markdown("### Select Stock to Analyze")
-
- # Load company mapping
- company_mapping, df = load_company_ticker_mapping()
- company_names = list(company_mapping.keys())
-
- col1, col2 = st.columns([2, 1])
-
- with col1:
- st.markdown("**Select Company:**")
-
- # Company selectbox
- selected_company = st.selectbox(
- "Choose a company",
- options=[""] + company_names,
- index=0,
- help="Select a company from the dropdown",
- label_visibility="collapsed"
- )
-
- if selected_company:
- ticker = company_mapping[selected_company]
- st.session_state.selected_ticker = ticker
- st.info(f"Selected: **{selected_company}** → **{ticker}**")
-
- with col2:
- st.markdown("**Or enter manually:**")
- manual_ticker = st.text_input(
- "Stock Ticker",
- value=st.session_state.selected_ticker if not selected_company else "",
- placeholder="e.g., AAPL",
- help="Enter any valid stock ticker symbol",
- label_visibility="collapsed"
- )
-
- if manual_ticker and manual_ticker != st.session_state.selected_ticker:
- st.session_state.selected_ticker = manual_ticker.upper().strip()
-
- # Quick select buttons for popular stocks
- st.markdown("**Quick Select Popular Stocks:**")
- cols = st.columns(5)
- popular_tickers = ["AAPL", "MSFT", "GOOGL", "AMZN", "TSLA"]
-
- for i, ticker in enumerate(popular_tickers):
- with cols[i]:
- if st.button(ticker, key=f"quick_{ticker}", use_container_width=True):
- st.session_state.selected_ticker = ticker
-
- cols = st.columns(5)
- more_tickers = ["NVDA", "META", "NFLX", "AMD", "INTC"]
-
- for i, ticker in enumerate(more_tickers):
- with cols[i]:
- if st.button(ticker, key=f"quick_{ticker}", use_container_width=True):
- st.session_state.selected_ticker = ticker
-
- return st.session_state.selected_ticker
-
-
-def validate_ticker(ticker: str) -> tuple[bool, str]:
- """Validate ticker input."""
- if not ticker:
- return False, "Please select or enter a stock ticker symbol"
-
- if not ticker.replace('.', '').isalpha() or len(ticker) < 1 or len(ticker) > 10:
- return False, "Please enter a valid ticker (1-10 letters, dots allowed)"
-
- return True, ""
-
-
-def trigger_analysis(ticker: str) -> Optional[Dict[str, Any]]:
- """Triggers stock analysis via backend API."""
- try:
- if not check_backend_status():
- st.error("🔌 Backend server is offline. Please start the FastAPI server.")
- return None
-
- with st.spinner("🤖 ReAct Agent is analyzing..."):
- response = requests.post(
- f"{BACKEND_URL}/analyze/{ticker}",
- timeout=60
- )
-
- if response.status_code == 200:
- result = response.json()
-
- st.success(f"Analysis for **{ticker}** has been triggered!")
-
- # Check if this is a fresh analysis or cached result
- analysis_data = result.get('data', {})
- if analysis_data.get('source') == 'react_analysis':
- # Fresh analysis - use the data directly from /analyze response
- st.success("✅ Fresh analysis completed!")
-
- result_obj = {
- 'ticker': ticker,
- 'data': analysis_data,
- 'timestamp': datetime.now().isoformat(),
- 'success': True
- }
-
- st.session_state.analysis_result = result_obj
- st.session_state.analysis_history.insert(0, result_obj)
- if len(st.session_state.analysis_history) > 10:
- st.session_state.analysis_history.pop()
-
- return result_obj
-
- elif analysis_data.get('source') == 'cache':
- # Cached result - data is already complete
- st.info("📚 Retrieved from cache")
-
- result_obj = {
- 'ticker': ticker,
- 'data': analysis_data,
- 'timestamp': datetime.now().isoformat(),
- 'success': True
- }
-
- st.session_state.analysis_result = result_obj
- st.session_state.analysis_history.insert(0, result_obj)
- if len(st.session_state.analysis_history) > 10:
- st.session_state.analysis_history.pop()
-
- return result_obj
-
- else:
- # Fallback to old behavior for backward compatibility
- with st.expander("Backend Response Details"):
- st.json(result)
-
- st.markdown("---")
- progress_bar = st.progress(0)
- status_text = st.empty()
-
- max_attempts = 10
- for attempt in range(1, max_attempts + 1):
- status_text.text(f"Fetching results... (Attempt {attempt}/{max_attempts})")
- progress_bar.progress(attempt / max_attempts)
-
- try:
- results_response = requests.get(
- f"{BACKEND_URL}/results/{ticker}",
- timeout=10
- )
-
- if results_response.status_code == 200:
- analysis_data = results_response.json()
- status_text.text("Results retrieved successfully!")
- progress_bar.progress(1.0)
-
- result_obj = {
- 'ticker': ticker,
- 'data': analysis_data.get('data', analysis_data),
- 'timestamp': datetime.now().isoformat(),
- 'success': True
- }
-
- st.session_state.analysis_result = result_obj
-
- st.session_state.analysis_history.insert(0, result_obj)
- if len(st.session_state.analysis_history) > 10:
- st.session_state.analysis_history.pop()
-
- progress_bar.empty()
- status_text.empty()
- return result_obj
-
- elif results_response.status_code == 404:
- if attempt < max_attempts:
- time.sleep(2)
- continue
- else:
- st.error(f"❌ Error fetching results: Status {results_response.status_code}")
- break
-
- except requests.exceptions.RequestException as e:
- st.error(f"❌ Network error: {str(e)}")
- break
-
- progress_bar.empty()
- status_text.empty()
- st.error("⏱️ Analysis timed out. Please try again.")
- return None
-
- else:
- st.error(f"❌ Analysis failed: Status {response.status_code}")
- if response.text:
- st.error(f"Details: {response.text}")
- return None
-
- except requests.exceptions.Timeout:
- st.error("⏱️ Request timed out. Analysis may still be processing.")
- return None
- except requests.exceptions.ConnectionError:
- st.error("🔌 Cannot connect to backend. Please ensure the server is running.")
- return None
- except Exception as e:
- st.error(f"❌ Unexpected error: {str(e)}")
- return None
-
-
-def display_enhanced_analysis_results(data: Dict[str, Any], ticker: str):
- """Display enhanced analysis results with tabbed interface."""
-
- # Create tabs
- tab1, tab2, tab3 = st.tabs(["📊 Analysis Dashboard", "📰 News & Sentiment", "⚙️ Agent Reasoning"])
-
- with tab1:
- st.markdown("### 📊 Analysis Dashboard")
-
- # Display analysis summary
- summary = (data.get('summary') or
- data.get('analysis_summary') or
- data.get('analysis') or
- "Analysis completed successfully")
-
- if summary and summary != "Analysis completed successfully":
- st.markdown(f"""
-
-
- 📊 Analysis Summary
-
-
- Stock: {ticker}
-
- {summary}
-
- """, unsafe_allow_html=True)
- else:
- st.info("📊 Detailed analysis summary not available for this request")
-
- # Display interactive candlestick chart
- price_data = data.get('price_data', [])
- if not price_data:
- # Try alternative key names
- price_data = (data.get('historical_data') or
- data.get('price_history') or
- data.get('ohlcv_data') or [])
-
- if price_data and len(price_data) > 0:
- st.markdown("#### 📈 Interactive Price Chart with Moving Averages")
-
- try:
- fig = create_candlestick_chart(price_data)
- if fig:
- st.plotly_chart(fig, use_container_width=True)
- st.success(f"📊 Displaying {len(price_data)} days of price data with technical indicators")
- else:
- st.warning("Unable to create candlestick chart - trying fallback visualization")
- # Fallback to simple line chart
- try:
- df = pd.DataFrame(price_data)
- df['Date'] = pd.to_datetime(df['Date'])
- df = df.sort_values('Date').set_index('Date')
- st.line_chart(df[['Close']], use_container_width=True)
- except Exception as fallback_error:
- st.error(f"Chart creation failed: {str(fallback_error)}")
- except Exception as e:
- st.error(f"Error creating chart: {str(e)}")
- # Show raw data for debugging
- with st.expander("🔍 Debug: Raw Price Data"):
- st.json(price_data[:3]) # Show first 3 records
- else:
- # Show more helpful message
- available_keys = list(data.keys())
- st.info(f"📈 No historical price data available in this analysis. Available data: {', '.join(available_keys)}")
-
- # Try to get real-time price data as fallback
- if st.button("🔄 Fetch Current Price Data", key="fetch_price_fallback"):
- with st.spinner("Fetching current market data..."):
- try:
- from stocksense.data_collectors import get_price_history
- df = get_price_history(ticker, period="1mo")
- if df is not None and not df.empty:
- # Convert to our format
- df_reset = df.reset_index()
- df_reset['Date'] = df_reset['Date'].dt.strftime('%Y-%m-%d')
- fallback_data = df_reset.to_dict(orient='records')
-
- fig = create_candlestick_chart(fallback_data)
- if fig:
- st.plotly_chart(fig, use_container_width=True)
- st.success("📊 Showing current market data (not from AI analysis)")
- except Exception as e:
- st.error(f"Failed to fetch fallback data: {str(e)}")
-
- # Display key metrics
- display_key_metrics(ticker)
-
- with tab2:
- st.markdown("### 📰 News & Sentiment Analysis")
-
- # Display detailed sentiment report
- sentiment_report_raw = data.get('sentiment_report')
-
- if sentiment_report_raw:
- st.markdown("""
-
- 📊 Market Sentiment Report
-
- """, unsafe_allow_html=True)
-
- # Try to parse as JSON first
- if isinstance(sentiment_report_raw, str):
- try:
- sentiment_data = json.loads(sentiment_report_raw)
- if isinstance(sentiment_data, list):
- # Display structured sentiment data
- for i, item in enumerate(sentiment_data, 1):
- headline = item.get('headline', 'N/A')
- sentiment = item.get('sentiment', 'N/A')
- justification = item.get('justification', 'N/A')
-
- with st.expander(f"📰 Article {i}: {headline[:80]}..."):
- st.markdown(f"**Headline:** {headline}")
- st.markdown(f"**Sentiment:** {sentiment}")
- st.markdown(f"**Analysis:** {justification}")
- else:
- st.markdown(sentiment_report_raw)
- except json.JSONDecodeError:
- st.markdown(sentiment_report_raw)
- elif isinstance(sentiment_report_raw, list):
- # Already structured data
- for i, item in enumerate(sentiment_report_raw, 1):
- headline = item.get('headline', 'N/A')
- sentiment = item.get('sentiment', 'N/A')
- justification = item.get('justification', 'N/A')
-
- with st.expander(f"📰 Article {i}: {headline[:80]}..."):
- st.markdown(f"**Headline:** {headline}")
- st.markdown(f"**Sentiment:** {sentiment}")
- st.markdown(f"**Analysis:** {justification}")
- else:
- st.markdown(str(sentiment_report_raw))
- else:
- # Try to show any related sentiment data
- sentiment_info = []
- if data.get('headlines_count'):
- sentiment_info.append(f"Analyzed {data['headlines_count']} headlines")
- if data.get('sentiment'):
- sentiment_info.append(f"Overall sentiment: {data['sentiment']}")
-
- if sentiment_info:
- st.info("Sentiment analysis completed. " + " | ".join(sentiment_info))
- else:
- st.info("No detailed sentiment analysis data available")
-
- with tab3:
- st.markdown("### ⚙️ Agent Reasoning Process")
-
- # Check if we have any reasoning data - be more specific about what constitutes "available"
- reasoning_steps = data.get('reasoning_steps') or []
- has_reasoning_steps = len(reasoning_steps) > 0
- has_tools_used = data.get('tools_used') and len(data.get('tools_used', [])) > 0
- has_iterations = data.get('iterations') and data.get('iterations', 0) > 0
- has_source_info = data.get('source') == 'react_analysis' # Fresh analysis vs cached
-
- # For cached data, don't show reasoning even if it has generic cache steps
- is_cached = data.get('source') == 'cache'
- has_meaningful_reasoning = has_reasoning_steps and not is_cached
-
- if has_meaningful_reasoning or (has_tools_used and not is_cached) or (has_iterations and not is_cached):
- st.markdown("""
-
- 🧠 ReAct Agent Decision Process
-
- """, unsafe_allow_html=True)
-
- # Show reasoning steps if available
- if has_meaningful_reasoning:
- reasoning_data = reasoning_steps # Use the cleaned version
- st.markdown("**📋 Agent Reasoning Steps:**")
- if isinstance(reasoning_data, str):
- st.markdown(reasoning_data)
- elif isinstance(reasoning_data, list):
- for i, step in enumerate(reasoning_data, 1):
- st.markdown(f"**Step {i}:** {step}")
- else:
- st.markdown(str(reasoning_data))
-
- # Show tools used if available (but not for cached data)
- if has_tools_used and not is_cached:
- st.markdown("**🔧 Tools Used:**")
- tools = data['tools_used']
- if isinstance(tools, list):
- for tool in set(tools): # Remove duplicates
- st.markdown(f"• {tool}")
- else:
- st.markdown(str(tools))
-
- # Show iteration count if available (but not for cached data)
- if has_iterations and not is_cached:
- st.markdown(f"**🔄 Analysis Iterations:** {data['iterations']}")
-
- # Show analysis type
- if has_source_info and not is_cached:
- st.info(f"✅ Fresh ReAct analysis completed with {data.get('iterations', 0)} reasoning iterations")
-
- else:
- # Determine why no reasoning data is available
- source = data.get('source', 'unknown')
- if source == 'cache':
- st.info("📚 This is a cached result from a previous analysis. Detailed reasoning steps are not available for cached results.")
- elif not has_source_info:
- st.warning("⚠️ No detailed reasoning information available. This may be an incomplete analysis or cached result.")
- else:
- # This shouldn't happen for fresh analyses
- st.error("❌ No reasoning data available despite being a fresh analysis. This may indicate an issue with the ReAct agent.")
-
- # Display raw data for debugging
- with st.expander("🔍 Raw Analysis Data (Debug)"):
- st.json(data)
-
-
-def display_visualizations(ticker: str):
- """Displays data visualizations for price trends and sentiment distribution."""
- st.markdown("### 📊 Market Insights")
-
- col1, col2 = st.columns(2)
-
- with col1:
- st.markdown("**📈 30-Day Price Trend**")
-
- from stocksense.data_collectors import get_price_history
-
- with st.spinner("Fetching real market data..."):
- price_data = get_price_history(ticker, period="1mo")
-
- if price_data is not None and not price_data.empty:
- price_df = pd.DataFrame({'Price': price_data['Close']})
- st.line_chart(price_df, use_container_width=True)
-
- current_price = price_data['Close'].iloc[-1]
- prev_price = price_data['Close'].iloc[-2] if len(price_data) > 1 else current_price
- change_pct = ((current_price - prev_price) / prev_price) * 100 if prev_price != 0 else 0
- trend_icon = "📈" if change_pct > 0 else "📉"
-
- st.markdown(f"""
-
-
- {trend_icon} Current Price (Real Data)
-
- ${current_price:.2f}
-
- {change_pct:+.2f}% from previous day
-
-
- Source: Yahoo Finance
-
- """, unsafe_allow_html=True)
- else:
- st.error(f"❌ Unable to fetch real price data for {ticker}")
- st.info("Please check if the ticker symbol is valid and try again.")
-
- with col2:
- st.markdown("**📊 Sentiment Distribution**")
-
- from stocksense.database import get_latest_analysis
-
- try:
- cached_result = get_latest_analysis(ticker)
- if cached_result and cached_result.get('sentiment_report'):
- sentiment_data = cached_result['sentiment_report']
-
- if isinstance(sentiment_data, str) and sentiment_data.strip():
- sentiment_counts = {'Positive': 0, 'Negative': 0, 'Neutral': 0}
-
- text_lower = sentiment_data.lower()
-
- lines = sentiment_data.split('\n')
- for line in lines:
- line_lower = line.lower()
- if 'sentiment:' in line_lower or 'sentiment is' in line_lower:
- if 'positive' in line_lower:
- sentiment_counts['Positive'] += 1
- elif 'negative' in line_lower:
- sentiment_counts['Negative'] += 1
- elif 'neutral' in line_lower:
- sentiment_counts['Neutral'] += 1
-
- if sum(sentiment_counts.values()) == 0:
- positive_words = ['positive', 'bullish', 'optimistic', 'strong', 'good', 'gains', 'up', 'growth', 'beat', 'exceeds']
- negative_words = ['negative', 'bearish', 'pessimistic', 'weak', 'bad', 'losses', 'down', 'decline', 'miss', 'disappoints']
- neutral_words = ['neutral', 'mixed', 'stable', 'unchanged', 'moderate']
-
- for word in positive_words:
- sentiment_counts['Positive'] += text_lower.count(word)
- for word in negative_words:
- sentiment_counts['Negative'] += text_lower.count(word)
- for word in neutral_words:
- sentiment_counts['Neutral'] += text_lower.count(word)
-
- max_count = max(sentiment_counts.values())
- if max_count > 10:
- for key in sentiment_counts:
- sentiment_counts[key] = min(sentiment_counts[key], 10)
-
- if sum(sentiment_counts.values()) == 0:
- sentiment_counts['Neutral'] = 1
-
- sentiment_df = pd.DataFrame(
- list(sentiment_counts.items()),
- columns=['Sentiment', 'Count']
- )
-
- if PLOTLY_AVAILABLE:
- colors = {
- 'Positive': '#28a745',
- 'Negative': '#dc3545',
- 'Neutral': '#6c757d'
- }
-
- fig = go.Figure(data=[
- go.Bar(
- x=sentiment_df['Sentiment'],
- y=sentiment_df['Count'],
- marker_color=[colors.get(sent, '#6c757d') for sent in sentiment_df['Sentiment']],
- text=sentiment_df['Count'],
- textposition='auto',
- )
- ])
-
- fig.update_layout(
- title="Sentiment Distribution",
- xaxis_title="Sentiment",
- yaxis_title="Count",
- showlegend=False,
- height=300,
- margin=dict(l=20, r=20, t=40, b=20)
- )
-
- st.plotly_chart(fig, use_container_width=True)
-
- else:
- st.bar_chart(sentiment_df.set_index('Sentiment'), use_container_width=True)
-
- total_headlines = sum(sentiment_counts.values())
- dominant_sentiment = max(sentiment_counts, key=sentiment_counts.get) if total_headlines > 0 else "Neutral"
-
- st.markdown(f"""
-
-
- 📊 Sentiment Summary
-
- Analysis Available: ✅
-
- Dominant: {dominant_sentiment}
-
- From real market data
-
- """, unsafe_allow_html=True)
- else:
- st.info("No sentiment analysis available. Run a new analysis to see sentiment data.")
- placeholder_df = pd.DataFrame({
- 'Sentiment': ['Positive', 'Negative', 'Neutral'],
- 'Count': [0, 0, 0]
- })
- st.bar_chart(placeholder_df.set_index('Sentiment'), use_container_width=True)
- else:
- st.info("No recent sentiment analysis available. Run a new analysis to see sentiment data.")
- placeholder_df = pd.DataFrame({
- 'Sentiment': ['Positive', 'Negative', 'Neutral'],
- 'Count': [0, 0, 0]
- })
- st.bar_chart(placeholder_df.set_index('Sentiment'), use_container_width=True)
- except Exception as e:
- st.warning("Unable to load sentiment data. Run a new analysis to generate sentiment insights.")
- placeholder_df = pd.DataFrame({
- 'Sentiment': ['Positive', 'Negative', 'Neutral'],
- 'Count': [0, 0, 0]
- })
- st.bar_chart(placeholder_df.set_index('Sentiment'), use_container_width=True)
-
-
-def display_key_metrics(ticker: str):
- """Displays key financial metrics."""
- st.markdown("### 💰 Key Metrics")
-
- from stocksense.data_collectors import get_price_history
-
- with st.spinner("Loading real market metrics..."):
- price_data = get_price_history(ticker, period="1mo")
-
- if price_data is not None and not price_data.empty:
- current_price = price_data['Close'].iloc[-1]
- prev_price = price_data['Close'].iloc[-2] if len(price_data) > 1 else current_price
- high_30d = price_data['High'].max()
- low_30d = price_data['Low'].min()
-
- returns = price_data['Close'].pct_change().dropna()
- volatility = returns.std() * 100 if len(returns) > 1 else 0
-
- col1, col2, col3, col4 = st.columns(4)
-
- with col1:
- change_pct = ((current_price - prev_price) / prev_price * 100) if prev_price != 0 else 0
- st.metric(
- "💵 Current Price",
- f"${current_price:.2f}",
- f"{change_pct:+.2f}%"
- )
-
- with col2:
- st.metric(
- "📊 30D High",
- f"${high_30d:.2f}",
- help="Highest price in the last 30 days"
- )
-
- with col3:
- st.metric(
- "📉 30D Low",
- f"${low_30d:.2f}",
- help="Lowest price in the last 30 days"
- )
-
- with col4:
- st.metric(
- "📈 Volatility",
- f"{volatility:.1f}%",
- help="Price volatility over the period"
- )
-
- st.caption("📊 All metrics sourced from Yahoo Finance real-time data")
- else:
- st.error(f"❌ Unable to fetch real market data for {ticker}")
- st.info("Please verify the ticker symbol and try again.")
-
-
-def clear_database_cache() -> tuple[bool, int | str]:
-    """Clear cached analysis results; return (True, rows_deleted) or (False, error_message)."""
- try:
- import sqlite3
- import os
-
- # Get database path (same logic as in database.py)
- current_file_dir = os.path.dirname(os.path.abspath(__file__))
- db_path = os.path.join(current_file_dir, 'stocksense.db')
-
- if os.path.exists(db_path):
- with sqlite3.connect(db_path) as conn:
- cursor = conn.cursor()
- cursor.execute('DELETE FROM analysis_cache')
- rows_deleted = cursor.rowcount
- conn.commit()
- return True, rows_deleted
- else:
- return True, 0 # No database file, so "cleared"
-
- except Exception as e:
- return False, str(e)
-
-
-def get_cache_stats() -> dict:
- """Get statistics about cached analysis results."""
- try:
- import sqlite3
- import os
-
- current_file_dir = os.path.dirname(os.path.abspath(__file__))
- db_path = os.path.join(current_file_dir, 'stocksense.db')
-
- if not os.path.exists(db_path):
- return {"total_analyses": 0, "unique_tickers": 0, "db_size_mb": 0}
-
- # Get file size
- db_size_mb = round(os.path.getsize(db_path) / (1024 * 1024), 2)
-
- with sqlite3.connect(db_path) as conn:
- cursor = conn.cursor()
-
- # Check if table exists
- cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='analysis_cache'")
- if not cursor.fetchone():
- return {"total_analyses": 0, "unique_tickers": 0, "db_size_mb": db_size_mb}
-
- # Get total analyses
- cursor.execute('SELECT COUNT(*) FROM analysis_cache')
- total_analyses = cursor.fetchone()[0]
-
- # Get unique tickers
- cursor.execute('SELECT COUNT(DISTINCT ticker) FROM analysis_cache')
- unique_tickers = cursor.fetchone()[0]
-
- return {
- "total_analyses": total_analyses,
- "unique_tickers": unique_tickers,
- "db_size_mb": db_size_mb
- }
-
- except Exception as e:
- return {"total_analyses": 0, "unique_tickers": 0, "db_size_mb": 0, "error": str(e)}
-
-
-def get_cached_tickers() -> list:
- """Get list of all cached ticker symbols."""
- try:
- import sqlite3
- import os
-
- current_file_dir = os.path.dirname(os.path.abspath(__file__))
- db_path = os.path.join(current_file_dir, 'stocksense.db')
-
- if not os.path.exists(db_path):
- return []
-
- with sqlite3.connect(db_path) as conn:
- cursor = conn.cursor()
-
- # Check if table exists
- cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='analysis_cache'")
- if not cursor.fetchone():
- return []
-
- cursor.execute('''
- SELECT DISTINCT ticker, MAX(timestamp) as latest_timestamp, COUNT(*) as analysis_count
- FROM analysis_cache
- GROUP BY ticker
- ORDER BY latest_timestamp DESC
- ''')
-
- results = cursor.fetchall()
- return [{"ticker": result[0], "latest": result[1], "count": result[2]} for result in results]
-
- except Exception as e:
- return []
-
-
-def display_analysis_history():
- """Displays analysis history in sidebar."""
- if st.session_state.analysis_history:
- st.markdown("### 📚 Recent Analyses")
-
- for i, analysis in enumerate(st.session_state.analysis_history[:5]):
- ticker = analysis['ticker']
- timestamp = analysis['timestamp']
- dt = datetime.fromisoformat(timestamp)
- time_str = dt.strftime("%m/%d %H:%M")
-
- if st.button(f"📊 {ticker} - {time_str}", key=f"history_{i}"):
- st.session_state.analysis_result = analysis
- st.rerun()
-
-
-def display_sidebar():
- with st.sidebar:
- st.markdown("### 🔧 System Status")
-
- status = check_backend_status()
- if status:
- st.markdown('🟢 Backend Online', unsafe_allow_html=True)
- else:
- st.markdown('🔴 Backend Offline', unsafe_allow_html=True)
- st.warning("Start the FastAPI server to enable analysis")
-
- display_analysis_history()
-
- st.markdown("---")
-
- st.markdown("### ℹ️ About ReAct Agent")
-
- with st.expander("How it works"):
- st.markdown("""
- **ReAct Pattern (Reasoning + Action):**
-
- 1. 🧠 **Reasons** about market conditions
- 2. 🔧 **Acts** by selecting appropriate tools
- 3. 📊 **Observes** the results
- 4. 🔄 **Adapts** strategy based on findings
- 5. ✅ **Concludes** with comprehensive analysis
-
- **Features:**
- - Autonomous decision making
- - Dynamic tool selection
- - Real-time sentiment analysis
- - Market trend identification
- """)
-
- with st.expander("Data Sources"):
- st.markdown("""
- - 📰 **NewsAPI**: Latest market news
- - 📈 **Yahoo Finance**: Price data
- - 🤖 **Google Gemini**: AI analysis
- - 💾 **SQLite**: Result caching
- """)
-
- st.markdown("---")
-
- # Cache Management Section
- st.markdown("### 💾 Cache Management")
-
- # Get cache statistics
- cache_stats = get_cache_stats()
-
- if cache_stats.get("error"):
- st.error(f"❌ Cache error: {cache_stats['error']}")
- else:
- # Display cache statistics in a nice format
- col1, col2 = st.columns(2)
- with col1:
- st.metric("📊 Total Analyses", cache_stats["total_analyses"])
- with col2:
- st.metric("🎯 Unique Stocks", cache_stats["unique_tickers"])
-
- if cache_stats["db_size_mb"] > 0:
- st.metric("💾 Database Size", f"{cache_stats['db_size_mb']} MB")
-
- # Clear cache section
- if cache_stats["total_analyses"] > 0:
- with st.expander("🗑️ Clear Cache", expanded=False):
- st.markdown(f"""
- **Current Cache Status:**
- - {cache_stats['total_analyses']} cached analyses
- - {cache_stats['unique_tickers']} different stocks
- - {cache_stats['db_size_mb']} MB database size
- """)
-
- # Show cached tickers
- cached_tickers = get_cached_tickers()
- if cached_tickers:
- st.markdown("**Cached Stocks:**")
- ticker_display = ", ".join([f"`{item['ticker']}`" for item in cached_tickers[:10]])
- if len(cached_tickers) > 10:
- ticker_display += f" *(+{len(cached_tickers)-10} more)*"
- st.markdown(ticker_display)
-
- st.markdown("""
- ⚠️ **Warning:** This will permanently delete all cached analysis results.
- Fresh analyses will take longer but will use the latest data.
- """)
-
- # Confirmation checkbox
- confirm_clear = st.checkbox("I understand this action cannot be undone", key="confirm_cache_clear")
-
- # Clear button (only enabled when confirmed)
- if st.button(
- "🗑️ Clear All Cached Results",
- type="secondary",
- disabled=not confirm_clear,
- help="Permanently delete all cached analysis results",
- use_container_width=True
- ):
- with st.spinner("Clearing cache..."):
- success, result = clear_database_cache()
-
- if success:
- st.success(f"✅ Successfully cleared {result} cached analyses!")
- # Also clear session state
- st.session_state.analysis_result = None
- st.session_state.analysis_history = []
- time.sleep(1) # Brief pause to show success message
- st.rerun()
- else:
- st.error(f"❌ Failed to clear cache: {result}")
- else:
- st.info("📭 No cached results to clear")
-
- st.markdown("---")
-
- if st.button("🗑️ Clear Session Data", help="Clear current analysis results and history from this session only"):
- st.session_state.analysis_result = None
- st.session_state.analysis_history = []
- st.rerun()
-
-def main():
- """Main function for the Streamlit application."""
- display_hero_section()
- display_sidebar()
-
- main_container = st.container()
-
- with main_container:
- ticker = display_ticker_input()
-
- col1, col2, col3 = st.columns([1, 2, 1])
-
- with col2:
- is_valid, error_msg = validate_ticker(ticker)
-
- if not is_valid and ticker:
- st.error(f"❌ {error_msg}")
-
- analyze_button = st.button(
- "🚀 Analyze with ReAct Agent",
- type="primary",
- use_container_width=True,
- disabled=not is_valid,
- help="Trigger autonomous AI analysis using the ReAct pattern"
- )
-
- if analyze_button and is_valid:
- result = trigger_analysis(ticker)
- if result and result.get('success'):
- st.success(f"✅ Analysis completed for **{ticker}**!")
-
- st.markdown("---")
-
- if st.session_state.analysis_result:
- result_data = st.session_state.analysis_result
- ticker = result_data['ticker']
- data = result_data['data']
-
- col1, col2 = st.columns([4, 1])
-
- with col1:
- st.markdown(f"## 📊 Analysis Results: {ticker}")
-
- with col2:
- if st.button("🗑️ Clear", help="Clear current results"):
- st.session_state.analysis_result = None
- st.rerun()
-
- st.markdown(" ", unsafe_allow_html=True)
-
- # Use the enhanced tabbed display
- display_enhanced_analysis_results(data, ticker)
-
- else:
- st.markdown("""
-
-
- 👋 Welcome to StockSense AI Agent
-
- Select a stock ticker above to begin your AI-powered market analysis
-
-
- Our ReAct agent will autonomously reason about market conditions
- and select the best tools for comprehensive analysis
-
-
- """, unsafe_allow_html=True)
-
-if __name__ == "__main__":
- main()
\ No newline at end of file
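Reviewer note: the deleted frontend above inlined its own SQLite cache helpers (`clear_database_cache`, `get_cache_stats`, `get_cached_tickers`), all following the same pattern — open the database, guard on table existence via `sqlite_master`, then aggregate. A self-contained sketch of that pattern against an in-memory database (the table name matches the deleted code; the sample rows are made up):

```python
import sqlite3

def cache_stats(conn: sqlite3.Connection) -> dict:
    """Count total rows and distinct tickers in analysis_cache, tolerating a missing table."""
    cur = conn.cursor()
    cur.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='analysis_cache'")
    if not cur.fetchone():
        # Table has never been created: report an empty cache instead of raising
        return {"total_analyses": 0, "unique_tickers": 0}
    cur.execute("SELECT COUNT(*), COUNT(DISTINCT ticker) FROM analysis_cache")
    total, unique = cur.fetchone()
    return {"total_analyses": total, "unique_tickers": unique}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE analysis_cache (ticker TEXT, timestamp TEXT)")
conn.executemany(
    "INSERT INTO analysis_cache VALUES (?, ?)",
    [("AAPL", "2024-01-01"), ("AAPL", "2024-01-02"), ("MSFT", "2024-01-01")],
)
print(cache_stats(conn))
```

The `sqlite_master` probe is what lets the helpers return clean zero-valued stats on a fresh install rather than crashing on `no such table`.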
diff --git a/backend/.dockerignore b/backend/.dockerignore
new file mode 100644
index 0000000..7d1e921
--- /dev/null
+++ b/backend/.dockerignore
@@ -0,0 +1,11 @@
+# .dockerignore - for development builds
+Dockerfile
+.git
+.gitignore
+README.md
+__pycache__
+*.pyc
+.pytest_cache
+node_modules
+.vscode
+.idea
\ No newline at end of file
diff --git a/backend/Dockerfile b/backend/Dockerfile
new file mode 100644
index 0000000..a5a152e
--- /dev/null
+++ b/backend/Dockerfile
@@ -0,0 +1,19 @@
+FROM python:3.11-slim
+
+ENV PYTHONDONTWRITEBYTECODE=1
+ENV PYTHONUNBUFFERED=1
+
+WORKDIR /app
+
+COPY requirements.txt .
+
+RUN pip install --no-cache-dir --upgrade pip
+RUN pip install --no-cache-dir -r requirements.txt
+
+COPY . .
+
+EXPOSE 8000
+
+CMD [ "python", "manage.py", "runserver", "0.0.0.0:8000" ]
+# NOTE: runserver is Django's development server; use gunicorn (or another WSGI/ASGI server) in production images
+
diff --git a/stocksense/__init__.py b/backend/backend/__init__.py
similarity index 100%
rename from stocksense/__init__.py
rename to backend/backend/__init__.py
diff --git a/backend/backend/asgi.py b/backend/backend/asgi.py
new file mode 100644
index 0000000..6aa1b52
--- /dev/null
+++ b/backend/backend/asgi.py
@@ -0,0 +1,16 @@
+"""
+ASGI config for backend project.
+
+It exposes the ASGI callable as a module-level variable named ``application``.
+
+For more information on this file, see
+https://docs.djangoproject.com/en/5.2/howto/deployment/asgi/
+"""
+
+import os
+
+from django.core.asgi import get_asgi_application
+
+os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'backend.settings')
+
+application = get_asgi_application()
diff --git a/backend/backend/settings.py b/backend/backend/settings.py
new file mode 100644
index 0000000..8c6abec
--- /dev/null
+++ b/backend/backend/settings.py
@@ -0,0 +1,171 @@
+"""
+Django settings for backend project.
+
+Generated by 'django-admin startproject' using Django 5.2.5.
+
+For more information on this file, see
+https://docs.djangoproject.com/en/5.2/topics/settings/
+
+For the full list of settings and their values, see
+https://docs.djangoproject.com/en/5.2/ref/settings/
+"""
+
+from pathlib import Path
+import os
+from dotenv import load_dotenv
+
+# Load environment variables from .env file
+load_dotenv()
+
+# Build paths inside the project like this: BASE_DIR / 'subdir'.
+BASE_DIR = Path(__file__).resolve().parent.parent
+
+# Quick-start development settings - unsuitable for production
+# See https://docs.djangoproject.com/en/5.2/howto/deployment/checklist/
+
+SECRET_KEY = os.getenv('SECRET_KEY')
+
+DEBUG = False
+
+ALLOWED_HOSTS = ["*"] # Allow all hosts for development; restrict in production
+
+REST_FRAMEWORK = {
+ "DEFAULT_AUTHENTICATION_CLASSES": [
+ "users.authentication.FirebaseAuthentication",
+ ],
+ "DEFAULT_PERMISSION_CLASSES": [
+ "rest_framework.permissions.AllowAny",
+ ],
+ "DEFAULT_RENDERER_CLASSES": [
+ "rest_framework.renderers.JSONRenderer",
+ ],
+ "DEFAULT_PARSER_CLASSES": [
+ "rest_framework.parsers.JSONParser",
+ ],
+}
+
+# Application definition
+
+INSTALLED_APPS = [
+ 'django.contrib.admin',
+ 'django.contrib.auth',
+ 'django.contrib.contenttypes',
+ 'django.contrib.sessions',
+ 'django.contrib.messages',
+ 'django.contrib.staticfiles',
+ 'rest_framework',
+ 'corsheaders',
+ 'users',
+ 'stocks',
+ 'watchlists',
+]
+
+MIDDLEWARE = [
+ 'corsheaders.middleware.CorsMiddleware',
+ 'django.middleware.security.SecurityMiddleware',
+ 'django.contrib.sessions.middleware.SessionMiddleware',
+ 'django.middleware.common.CommonMiddleware',
+ 'django.contrib.auth.middleware.AuthenticationMiddleware',
+ 'django.contrib.messages.middleware.MessageMiddleware',
+ 'django.middleware.clickjacking.XFrameOptionsMiddleware',
+]
+
+ROOT_URLCONF = 'backend.urls'
+
+TEMPLATES = [
+ {
+ 'BACKEND': 'django.template.backends.django.DjangoTemplates',
+ 'DIRS': [],
+ 'APP_DIRS': True,
+ 'OPTIONS': {
+ 'context_processors': [
+ 'django.template.context_processors.request',
+ 'django.contrib.auth.context_processors.auth',
+ 'django.contrib.messages.context_processors.messages',
+ ],
+ },
+ },
+]
+
+WSGI_APPLICATION = 'backend.wsgi.application'
+
+# Database
+# https://docs.djangoproject.com/en/5.2/ref/settings/#databases
+
+import dj_database_url
+
+DATABASES = {
+ 'default': dj_database_url.config()
+}
+
+# Password validation
+# https://docs.djangoproject.com/en/5.2/ref/settings/#auth-password-validators
+
+AUTH_PASSWORD_VALIDATORS = [
+ {
+ 'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
+ },
+ {
+ 'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
+ },
+ {
+ 'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
+ },
+ {
+ 'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
+ },
+]
+
+# Internationalization
+# https://docs.djangoproject.com/en/5.2/topics/i18n/
+
+LANGUAGE_CODE = 'en-us'
+
+TIME_ZONE = 'UTC'
+
+USE_I18N = True
+
+USE_TZ = True
+
+# Static files (CSS, JavaScript, Images)
+# https://docs.djangoproject.com/en/5.2/howto/static-files/
+
+STATIC_URL = 'static/'
+
+# Default primary key field type
+# https://docs.djangoproject.com/en/5.2/ref/settings/#default-auto-field
+
+DEFAULT_AUTO_FIELD = 'django.db.models.BigAutoField'
+
+# CORS settings for frontend communication
+CORS_ALLOWED_ORIGINS = [
+ "http://localhost:3000",
+ "http://127.0.0.1:3000",
+ "http://localhost:5173", # Vite default port
+ "http://127.0.0.1:5173",
+]
+
+CORS_ALLOW_CREDENTIALS = True
+CORS_ALLOW_ALL_ORIGINS = True  # Development only; this overrides the CORS_ALLOWED_ORIGINS allow-list above
+CORS_ALLOW_HEADERS = [
+ 'accept',
+ 'accept-encoding',
+ 'authorization',
+ 'content-type',
+ 'dnt',
+ 'origin',
+ 'user-agent',
+ 'x-csrftoken',
+ 'x-requested-with',
+]
+
+# Custom user model
+AUTH_USER_MODEL = 'users.User'
+
+# Front-end origins trusted for CSRF checks (CSRF protection itself stays enabled)
+CSRF_TRUSTED_ORIGINS = [
+ "http://localhost:3000",
+ "http://127.0.0.1:3000",
+ "http://localhost:5173",
+ "http://127.0.0.1:5173",
+]
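Reviewer note on the new `DATABASES` block: `dj_database_url.config()` reads the `DATABASE_URL` environment variable and expands it into Django's `DATABASES['default']` dict. A rough stdlib-only sketch of that mapping for a Postgres URL — illustrative only; the real package also handles other engines, query-string options, and connection pooling:

```python
from urllib.parse import urlparse

def parse_database_url(url: str) -> dict:
    """Approximate what dj_database_url.config() derives from a postgres:// URL."""
    parts = urlparse(url)
    return {
        "ENGINE": "django.db.backends.postgresql",  # assumes a postgres:// scheme
        "NAME": parts.path.lstrip("/"),
        "USER": parts.username or "",
        "PASSWORD": parts.password or "",
        "HOST": parts.hostname or "",
        "PORT": parts.port or 5432,
    }

cfg = parse_database_url("postgres://app:secret@db.example.com:5432/stocksense")
print(cfg["NAME"], cfg["USER"], cfg["HOST"], cfg["PORT"])
```

Because the whole connection string lives in one environment variable, the same settings file works unchanged across local Docker, CI, and hosted deployments.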
diff --git a/backend/backend/urls.py b/backend/backend/urls.py
new file mode 100644
index 0000000..c9e5206
--- /dev/null
+++ b/backend/backend/urls.py
@@ -0,0 +1,44 @@
+"""
+URL configuration for backend project.
+
+The `urlpatterns` list routes URLs to views. For more information please see:
+ https://docs.djangoproject.com/en/5.2/topics/http/urls/
+Examples:
+Function views
+ 1. Add an import: from my_app import views
+ 2. Add a URL to urlpatterns: path('', views.home, name='home')
+Class-based views
+ 1. Add an import: from other_app.views import Home
+ 2. Add a URL to urlpatterns: path('', Home.as_view(), name='home')
+Including another URLconf
+ 1. Import the include() function: from django.urls import include, path
+ 2. Add a URL to urlpatterns: path('blog/', include('blog.urls'))
+"""
+from django.contrib import admin
+from django.urls import path, include
+from django.http import JsonResponse
+from django.views.decorators.csrf import csrf_exempt
+from datetime import datetime
+import json
+
+def health_check(request):
+ """Simple health check endpoint"""
+ try:
+ return JsonResponse({
+ 'message': 'StockSense Backend API is running',
+ 'timestamp': datetime.now().isoformat(),
+ }, status=200)
+
+ except Exception as e:
+ return JsonResponse({
+ 'message': f'Health check failed: {str(e)}',
+ 'timestamp': datetime.now().isoformat()
+ }, status=503)
+
+urlpatterns = [
+ path('', health_check, name='health_check'),
+ path('admin/', admin.site.urls),
+ path('api/users/', include('users.urls')),
+ path('api/watchlists/', include('watchlists.urls')),
+ path('api/stocks/', include('stocks.urls')),
+]
diff --git a/backend/backend/wsgi.py b/backend/backend/wsgi.py
new file mode 100644
index 0000000..ce5c079
--- /dev/null
+++ b/backend/backend/wsgi.py
@@ -0,0 +1,16 @@
+"""
+WSGI config for backend project.
+
+It exposes the WSGI callable as a module-level variable named ``application``.
+
+For more information on this file, see
+https://docs.djangoproject.com/en/5.2/howto/deployment/wsgi/
+"""
+
+import os
+
+from django.core.wsgi import get_wsgi_application
+
+os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'backend.settings')
+
+application = get_wsgi_application()
diff --git a/backend/firebase_utils.py b/backend/firebase_utils.py
new file mode 100644
index 0000000..0ba408a
--- /dev/null
+++ b/backend/firebase_utils.py
@@ -0,0 +1,71 @@
+import firebase_admin
+from firebase_admin import credentials, auth
+import os
+from dotenv import load_dotenv
+
+load_dotenv()
+
+# Required Firebase environment variables
+required_env_vars = [
+ 'FIREBASE_PROJECT_ID',
+ 'FIREBASE_PRIVATE_KEY_ID',
+ 'FIREBASE_PRIVATE_KEY',
+ 'FIREBASE_CLIENT_EMAIL',
+ 'FIREBASE_CLIENT_ID',
+ 'FIREBASE_CLIENT_X509_CERT_URL'
+]
+
+# Check if all required environment variables are present
+missing_vars = [var for var in required_env_vars if not os.getenv(var)]
+
+# Allow tests to run without Firebase if we're in a test environment
+is_testing = (
+ os.getenv('DJANGO_SETTINGS_MODULE', '').endswith('test') or
+ 'test' in os.getenv('DATABASE_URL', '') or
+ os.getenv('TESTING', '').lower() == 'true'
+)
+
+if missing_vars and not is_testing:
+ raise EnvironmentError(f"Missing required Firebase environment variables: {', '.join(missing_vars)}")
+
+# Skip Firebase initialization if we're in testing mode and variables are missing
+if missing_vars and is_testing:
+ print("Skipping Firebase initialization in test environment - Firebase variables not set")
+ firebase_config = None
+ FIREBASE_STORAGE_BUCKET = None
+else:
+ # Get Firebase configuration from environment variables ONLY
+ firebase_config = {
+ "type": os.getenv('FIREBASE_TYPE', 'service_account'),
+ "project_id": os.getenv('FIREBASE_PROJECT_ID'),
+ "private_key_id": os.getenv('FIREBASE_PRIVATE_KEY_ID'),
+ "private_key": os.getenv('FIREBASE_PRIVATE_KEY', '').replace('\\n', '\n'), # .env files typically store the key on one line with escaped newlines
+ "client_email": os.getenv('FIREBASE_CLIENT_EMAIL'),
+ "client_id": os.getenv('FIREBASE_CLIENT_ID'),
+ "auth_uri": os.getenv('FIREBASE_AUTH_URI', 'https://accounts.google.com/o/oauth2/auth'),
+ "token_uri": os.getenv('FIREBASE_TOKEN_URI', 'https://oauth2.googleapis.com/token'),
+ "auth_provider_x509_cert_url": os.getenv('FIREBASE_AUTH_PROVIDER_X509_CERT_URL', 'https://www.googleapis.com/oauth2/v1/certs'),
+ "client_x509_cert_url": os.getenv('FIREBASE_CLIENT_X509_CERT_URL'),
+ "universe_domain": os.getenv('FIREBASE_UNIVERSE_DOMAIN', 'googleapis.com')
+ }
+
+ FIREBASE_STORAGE_BUCKET = os.getenv('FIREBASE_STORAGE_BUCKET', 'stocksense-e7226.appspot.com')
+
+# Initialize Firebase app only once using ONLY environment variables
+if not firebase_admin._apps and firebase_config is not None:
+ try:
+ cred = credentials.Certificate(firebase_config)
+ firebase_admin.initialize_app(cred, {
+ 'projectId': firebase_config['project_id'],
+ 'storageBucket': FIREBASE_STORAGE_BUCKET
+ })
+ print("✅ Firebase initialized successfully using environment variables")
+ except Exception as e:
+ if not is_testing:
+ raise Exception(f"Failed to initialize Firebase using environment variables: {str(e)}")
+ else:
+ print(f"⚠️ Firebase initialization failed in test environment: {str(e)}")
+elif firebase_config is None:
+ print("⚠️ Firebase not initialized - running in test mode")
+
+
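Service-account private keys stored in `.env` files are commonly flattened onto one line with `\n` escape sequences, which `credentials.Certificate` rejects. A small helper sketch for restoring the PEM form — the escaping convention is an assumption about how the key was saved, and the key material below is illustrative only:

```python
def unescape_private_key(raw: str) -> str:
    """Convert a single-line, backslash-escaped key from a .env file
    back into a real multi-line PEM string."""
    return raw.replace('\\n', '\n')

# Illustrative escaped value (not a real key)
escaped = "-----BEGIN PRIVATE KEY-----\\nMIIE...\\n-----END PRIVATE KEY-----\\n"
pem = unescape_private_key(escaped)
```

The call is a no-op when the key already contains real newlines, so it is safe to apply unconditionally.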
diff --git a/backend/manage.py b/backend/manage.py
new file mode 100644
index 0000000..8871ba0
--- /dev/null
+++ b/backend/manage.py
@@ -0,0 +1,21 @@
+"""Django's command-line utility for administrative tasks."""
+import os
+import sys
+
+
+def main():
+ """Run administrative tasks."""
+ os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'backend.settings')
+ try:
+ from django.core.management import execute_from_command_line
+ except ImportError as exc:
+ raise ImportError(
+ "Couldn't import Django. Are you sure it's installed and "
+ "available on your PYTHONPATH environment variable? Did you "
+ "forget to activate a virtual environment?"
+ ) from exc
+ execute_from_command_line(sys.argv)
+
+
+if __name__ == '__main__':
+ main()
diff --git a/backend/requirements.txt b/backend/requirements.txt
new file mode 100644
index 0000000..aac7d08
--- /dev/null
+++ b/backend/requirements.txt
@@ -0,0 +1,8 @@
+Django==5.2.5
+djangorestframework==3.14.0
+django-cors-headers==4.3.1
+firebase-admin==6.2.0
+python-decouple==3.8
+python-dotenv==1.0.0
+psycopg2-binary==2.9.7
+dj-database-url==3.0.1
\ No newline at end of file
diff --git a/backend/stocks/__init__.py b/backend/stocks/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/backend/stocks/admin.py b/backend/stocks/admin.py
new file mode 100644
index 0000000..8c38f3f
--- /dev/null
+++ b/backend/stocks/admin.py
@@ -0,0 +1,3 @@
+from django.contrib import admin
+
+# Register your models here.
diff --git a/backend/stocks/apps.py b/backend/stocks/apps.py
new file mode 100644
index 0000000..6190e0c
--- /dev/null
+++ b/backend/stocks/apps.py
@@ -0,0 +1,6 @@
+from django.apps import AppConfig
+
+
+class StocksConfig(AppConfig):
+ default_auto_field = 'django.db.models.BigAutoField'
+ name = 'stocks'
diff --git a/backend/stocks/management/__init__.py b/backend/stocks/management/__init__.py
new file mode 100644
index 0000000..25e3eca
--- /dev/null
+++ b/backend/stocks/management/__init__.py
@@ -0,0 +1 @@
+# This file makes Python treat the directories as packages
diff --git a/backend/stocks/management/commands/__init__.py b/backend/stocks/management/commands/__init__.py
new file mode 100644
index 0000000..25e3eca
--- /dev/null
+++ b/backend/stocks/management/commands/__init__.py
@@ -0,0 +1 @@
+# This file makes Python treat the directories as packages
diff --git a/backend/stocks/management/commands/load_stocks.py b/backend/stocks/management/commands/load_stocks.py
new file mode 100644
index 0000000..e759df5
--- /dev/null
+++ b/backend/stocks/management/commands/load_stocks.py
@@ -0,0 +1,107 @@
+import csv
+import os
+from django.core.management.base import BaseCommand
+from django.conf import settings
+from stocks.models import Stock
+
+
+class Command(BaseCommand):
+ help = 'Load stocks from NASDAQ screener CSV file into the Stock model'
+
+ def add_arguments(self, parser):
+ parser.add_argument(
+ '--csv-file',
+ type=str,
+ default='nasdaq_screener_1756622606980.csv',
+ help='Path to the CSV file (relative to project root or absolute path)'
+ )
+ parser.add_argument(
+ '--clear',
+ action='store_true',
+ help='Clear existing stocks before loading new ones'
+ )
+
+ def handle(self, *args, **options):
+ csv_file = options['csv_file']
+
+ # If it's not an absolute path, assume it's in the project root
+ if not os.path.isabs(csv_file):
+ project_root = settings.BASE_DIR.parent # Go up from backend/ to project root
+ csv_file = os.path.join(project_root, csv_file)
+
+ if not os.path.exists(csv_file):
+ self.stdout.write(
+ self.style.ERROR(f'CSV file not found: {csv_file}')
+ )
+ return
+
+ # Clear existing stocks if requested
+ if options['clear']:
+ count = Stock.objects.count()
+ Stock.objects.all().delete()
+ self.stdout.write(
+ self.style.WARNING(f'Cleared {count} existing stocks')
+ )
+
+ # Load stocks from CSV
+ created_count = 0
+ updated_count = 0
+ error_count = 0
+
+ try:
+ with open(csv_file, 'r', encoding='utf-8') as file:
+ reader = csv.DictReader(file)
+
+ for row in reader:
+ symbol = row['Symbol'].strip().upper()
+ name = row['Name'].strip()
+
+ # Skip empty rows
+ if not symbol or not name:
+ continue
+
+ try:
+ # Use get_or_create to avoid duplicates
+ stock, created = Stock.objects.get_or_create(
+ symbol=symbol,
+ defaults={'name': name}
+ )
+
+ if created:
+ created_count += 1
+ self.stdout.write(f'Created: {symbol} - {name}')
+ else:
+ # Update name if stock exists but name is different
+ if stock.name != name:
+ stock.name = name
+ stock.save()
+ updated_count += 1
+ self.stdout.write(f'Updated: {symbol} - {name}')
+
+ except Exception as e:
+ error_count += 1
+ self.stdout.write(
+ self.style.ERROR(f'Error processing {symbol}: {str(e)}')
+ )
+
+ except Exception as e:
+ self.stdout.write(
+ self.style.ERROR(f'Error reading CSV file: {str(e)}')
+ )
+ return
+
+ # Summary
+ self.stdout.write('\n' + '='*50)
+ self.stdout.write(
+ self.style.SUCCESS(f'Successfully processed CSV file: {csv_file}')
+ )
+ self.stdout.write(f'Stocks created: {created_count}')
+ self.stdout.write(f'Stocks updated: {updated_count}')
+ if error_count > 0:
+ self.stdout.write(
+ self.style.WARNING(f'Errors encountered: {error_count}')
+ )
+
+ total_stocks = Stock.objects.count()
+ self.stdout.write(f'Total stocks in database: {total_stocks}')
+ self.stdout.write('='*50)
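The command's created/updated/unchanged accounting via `get_or_create` can be mirrored in plain Python; this sketch shows the same normalize-then-upsert flow against a dict standing in for the `Stock` table:

```python
def upsert(stocks: dict, symbol: str, name: str) -> str:
    """Pure-Python mirror of the load_stocks flow: normalize the symbol,
    create the row if missing, update the name if it changed."""
    symbol = symbol.strip().upper()
    if symbol not in stocks:
        stocks[symbol] = name       # get_or_create -> created
        return "created"
    if stocks[symbol] != name:
        stocks[symbol] = name       # existing row, name drifted -> save()
        return "updated"
    return "unchanged"              # existing row, nothing to do

db = {}
first = upsert(db, " aapl ", "Apple")
second = upsert(db, "AAPL", "Apple Inc.")
```

The `.strip().upper()` normalization is what makes re-running the command idempotent for case or whitespace variants of the same ticker.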
diff --git a/backend/stocks/migrations/0001_initial.py b/backend/stocks/migrations/0001_initial.py
new file mode 100644
index 0000000..1679ea7
--- /dev/null
+++ b/backend/stocks/migrations/0001_initial.py
@@ -0,0 +1,22 @@
+# Generated by Django 5.2.5 on 2025-08-31 05:05
+
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+ initial = True
+
+ dependencies = [
+ ]
+
+ operations = [
+ migrations.CreateModel(
+ name='Stock',
+ fields=[
+ ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
+ ('symbol', models.CharField(db_index=True, max_length=10, unique=True)),
+ ('name', models.CharField(max_length=200)),
+ ],
+ ),
+ ]
diff --git a/backend/stocks/migrations/__init__.py b/backend/stocks/migrations/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/backend/stocks/models.py b/backend/stocks/models.py
new file mode 100644
index 0000000..6d0ca3e
--- /dev/null
+++ b/backend/stocks/models.py
@@ -0,0 +1,10 @@
+from django.db import models
+
+# Create your models here.
+class Stock(models.Model):
+ symbol = models.CharField(max_length=10, unique=True, db_index=True)
+ name = models.CharField(max_length=200)
+
+
+ def __str__(self):
+ return f"{self.symbol} - {self.name}"
\ No newline at end of file
diff --git a/backend/stocks/serializers.py b/backend/stocks/serializers.py
new file mode 100644
index 0000000..c1e2fa7
--- /dev/null
+++ b/backend/stocks/serializers.py
@@ -0,0 +1,13 @@
+from rest_framework import serializers
+from .models import Stock
+
+
+class StockSerializer(serializers.ModelSerializer):
+ class Meta:
+ model = Stock
+ fields = ['id', 'symbol', 'name']
+
+ def create(self, validated_data):
+ # Ensure symbol is uppercase
+ validated_data['symbol'] = validated_data['symbol'].upper()
+ return super().create(validated_data)
diff --git a/backend/stocks/tests.py b/backend/stocks/tests.py
new file mode 100644
index 0000000..7ff7a9c
--- /dev/null
+++ b/backend/stocks/tests.py
@@ -0,0 +1,147 @@
+from django.test import TestCase
+from django.db import IntegrityError
+from django.core.exceptions import ValidationError
+from .models import Stock
+
+
+class StockModelTests(TestCase):
+ """Test cases for Stock model"""
+
+ def setUp(self):
+ """Set up test data"""
+ self.stock_data = {
+ 'symbol': 'AAPL',
+ 'name': 'Apple Inc.'
+ }
+ # Create a stock for read-only tests
+ self.existing_stock = Stock.objects.create(**self.stock_data)
+
+ def test_stock_str_method(self):
+ """Test Stock __str__ method"""
+ expected_str = "AAPL - Apple Inc."
+ self.assertEqual(str(self.existing_stock), expected_str)
+
+ def test_symbol_max_length(self):
+ """Test symbol field max length constraint"""
+ long_symbol_data = {
+ 'symbol': 'VERYLONGSYMBOL', # More than 10 characters
+ 'name': 'Test Company'
+ }
+
+ stock = Stock(**long_symbol_data)
+ with self.assertRaises(ValidationError):
+ stock.full_clean()
+
+ def test_name_max_length(self):
+ """Test name field max length constraint"""
+ long_name_data = {
+ 'symbol': 'TEST',
+ 'name': 'A' * 201 # More than 200 characters
+ }
+
+ stock = Stock(**long_name_data)
+ with self.assertRaises(ValidationError):
+ stock.full_clean()
+
+ def test_empty_fields_validation(self):
+ """Test validation with empty required fields"""
+ # Test empty symbol
+ with self.assertRaises(ValidationError):
+ stock = Stock(symbol='', name='Test Company')
+ stock.full_clean()
+
+ # Test empty name
+ with self.assertRaises(ValidationError):
+ stock = Stock(symbol='TEST', name='')
+ stock.full_clean()
+
+
+class StockQueryTests(TestCase):
+ """Test cases for Stock model queries and database operations"""
+
+ def setUp(self):
+ """Set up test data"""
+ self.stocks = [
+ Stock.objects.create(symbol='AAPL', name='Apple Inc.'),
+ Stock.objects.create(symbol='GOOGL', name='Alphabet Inc.'),
+ Stock.objects.create(symbol='MSFT', name='Microsoft Corporation'),
+ Stock.objects.create(symbol='TSLA', name='Tesla Inc.'),
+ Stock.objects.create(symbol='NVDA', name='NVIDIA Corporation')
+ ]
+
+ def test_get_stock_by_symbol(self):
+ """Test getting stock by symbol"""
+ stock = Stock.objects.get(symbol='AAPL')
+ self.assertEqual(stock.name, 'Apple Inc.')
+
+ def test_filter_stocks_by_name_contains(self):
+ """Test filtering stocks by name containing text"""
+ results = Stock.objects.filter(name__icontains='Inc')
+
+ # Should return Apple, Tesla, and potentially others with 'Inc'
+ symbols = [stock.symbol for stock in results]
+ self.assertIn('AAPL', symbols)
+ self.assertIn('TSLA', symbols)
+
+ def test_filter_stocks_by_symbol_startswith(self):
+ """Test filtering stocks by symbol starting with letters"""
+ results = Stock.objects.filter(symbol__startswith='A')
+ symbols = [stock.symbol for stock in results]
+ self.assertIn('AAPL', symbols)
+
+ def test_order_stocks_by_symbol(self):
+ """Test ordering stocks by symbol"""
+ ordered_stocks = Stock.objects.order_by('symbol')
+ symbols = [stock.symbol for stock in ordered_stocks]
+
+ # Should be in alphabetical order
+ self.assertEqual(symbols, sorted(symbols))
+
+ def test_order_stocks_by_name(self):
+ """Test ordering stocks by name"""
+ ordered_stocks = Stock.objects.order_by('name')
+ names = [stock.name for stock in ordered_stocks]
+
+ # Should be in alphabetical order
+ self.assertEqual(names, sorted(names))
+
+ def test_count_stocks(self):
+ """Test counting stocks"""
+ count = Stock.objects.count()
+ self.assertEqual(count, 5)
+
+ def test_stock_exists(self):
+ """Test checking if stock exists"""
+ exists = Stock.objects.filter(symbol='AAPL').exists()
+ not_exists = Stock.objects.filter(symbol='NONEXISTENT').exists()
+
+ self.assertTrue(exists)
+ self.assertFalse(not_exists)
+
+
+class StockValidationTests(TestCase):
+ """Test cases for Stock model validation"""
+
+ def setUp(self):
+ """Set up test data"""
+ self.stock = Stock.objects.create(symbol='AAPL', name='Apple Inc.')
+
+ def test_model_field_attributes(self):
+ """Test model field attributes are set correctly"""
+ symbol_field = Stock._meta.get_field('symbol')
+ name_field = Stock._meta.get_field('name')
+
+ # Test symbol field attributes
+ self.assertEqual(symbol_field.max_length, 10)
+ self.assertTrue(symbol_field.unique)
+ self.assertTrue(symbol_field.db_index)
+
+ # Test name field attributes
+ self.assertEqual(name_field.max_length, 200)
+ self.assertFalse(name_field.unique)
+
+ def test_existing_stock_properties(self):
+ """Test properties of existing stock"""
+ self.assertEqual(self.stock.symbol, 'AAPL')
+ self.assertEqual(self.stock.name, 'Apple Inc.')
+ self.assertIsNotNone(self.stock.id)
diff --git a/backend/stocks/urls.py b/backend/stocks/urls.py
new file mode 100644
index 0000000..0d73382
--- /dev/null
+++ b/backend/stocks/urls.py
@@ -0,0 +1,6 @@
+from django.urls import path
+from .views import StockListCreateView
+
+urlpatterns = [
+ path('', StockListCreateView.as_view(), name='stock-list-create'),
+]
diff --git a/backend/stocks/views.py b/backend/stocks/views.py
new file mode 100644
index 0000000..d985472
--- /dev/null
+++ b/backend/stocks/views.py
@@ -0,0 +1,27 @@
+from rest_framework.generics import ListCreateAPIView
+from rest_framework import permissions
+from django.db.models import Q
+from .models import Stock
+from .serializers import StockSerializer
+
+
+class StockListCreateView(ListCreateAPIView):
+ """
+ GET: List all stocks (with search functionality)
+ POST: Create a new stock
+ """
+ serializer_class = StockSerializer
+ permission_classes = [permissions.IsAuthenticated]
+
+ def get_queryset(self):
+ queryset = Stock.objects.all()
+ search = self.request.query_params.get('search', None)
+
+ if search:
+ queryset = queryset.filter(
+ Q(symbol__icontains=search) | Q(name__icontains=search)
+ )
+
+ return queryset.order_by('symbol')
+
+
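The `Q(symbol__icontains=search) | Q(name__icontains=search)` filter above is an OR of two case-insensitive substring matches; the predicate it evaluates per row can be sketched in plain Python:

```python
def matches(stock: dict, search: str) -> bool:
    """Row-level equivalent of the view's Q-object filter:
    case-insensitive substring match on symbol OR name."""
    s = search.lower()
    return s in stock["symbol"].lower() or s in stock["name"].lower()

row = {"symbol": "AAPL", "name": "Apple Inc."}
```

In the actual view the database evaluates this via `ICONTAINS`, so the `db_index` on `symbol` may not be used for substring searches; that is a known trade-off of `icontains` lookups.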
diff --git a/backend/users/__init__.py b/backend/users/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/backend/users/admin.py b/backend/users/admin.py
new file mode 100644
index 0000000..8c38f3f
--- /dev/null
+++ b/backend/users/admin.py
@@ -0,0 +1,3 @@
+from django.contrib import admin
+
+# Register your models here.
diff --git a/backend/users/apps.py b/backend/users/apps.py
new file mode 100644
index 0000000..72b1401
--- /dev/null
+++ b/backend/users/apps.py
@@ -0,0 +1,6 @@
+from django.apps import AppConfig
+
+
+class UsersConfig(AppConfig):
+ default_auto_field = 'django.db.models.BigAutoField'
+ name = 'users'
diff --git a/backend/users/authentication.py b/backend/users/authentication.py
new file mode 100644
index 0000000..ede841c
--- /dev/null
+++ b/backend/users/authentication.py
@@ -0,0 +1,46 @@
+from rest_framework.authentication import BaseAuthentication
+from rest_framework.exceptions import AuthenticationFailed
+from firebase_admin import auth
+from django.contrib.auth import get_user_model
+import os
+import firebase_utils
+
+User = get_user_model()
+
+class FirebaseAuthentication(BaseAuthentication):
+ def authenticate(self, request):
+ # Skip Firebase authentication in test environments
+ is_testing = (
+ os.getenv('DJANGO_SETTINGS_MODULE', '').endswith('test') or
+ 'test' in os.getenv('DATABASE_URL', '') or
+ os.getenv('TESTING', '').lower() == 'true'
+ )
+
+ if is_testing:
+ # In test environment, skip authentication or use a test user
+ return None
+
+ auth_header = request.headers.get("Authorization")
+ if not auth_header or not auth_header.startswith("Bearer "):
+ return None
+
+ id_token = auth_header.split(" ")[1]
+ try:
+ # Check if Firebase is properly initialized
+ if firebase_utils.firebase_config is None:
+ raise AuthenticationFailed("Firebase not configured")
+
+ decoded = auth.verify_id_token(id_token)
+ except Exception as e:
+ raise AuthenticationFailed(f"Invalid Firebase token: {str(e)}")
+
+ uid = decoded["uid"]
+ try:
+ user = User.objects.get(firebase_uid=uid)
+ except User.DoesNotExist:
+ raise AuthenticationFailed("No such user")
+
+ return (user, None)
+
+
+
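The header handling in `FirebaseAuthentication.authenticate` — return `None` (fall through to other authenticators) unless a `Bearer` token is present — reduces to a small pure function:

```python
def extract_bearer_token(auth_header):
    """Return the token portion of an Authorization header,
    or None when the header is missing or not a Bearer scheme."""
    if not auth_header or not auth_header.startswith("Bearer "):
        return None
    # split(" ", 1) keeps the token intact even if it contained spaces
    return auth_header.split(" ", 1)[1]
```

Returning `None` rather than raising is the DRF convention for "this authenticator does not apply"; `AuthenticationFailed` is reserved for credentials that were presented but invalid.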
diff --git a/backend/users/migrations/0001_initial.py b/backend/users/migrations/0001_initial.py
new file mode 100644
index 0000000..8619c96
--- /dev/null
+++ b/backend/users/migrations/0001_initial.py
@@ -0,0 +1,47 @@
+# Generated by Django 5.2.5 on 2025-08-31 05:05
+
+import django.contrib.auth.models
+import django.contrib.auth.validators
+import django.utils.timezone
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+ initial = True
+
+ dependencies = [
+ ('auth', '0012_alter_user_first_name_max_length'),
+ ]
+
+ operations = [
+ migrations.CreateModel(
+ name='User',
+ fields=[
+ ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
+ ('password', models.CharField(max_length=128, verbose_name='password')),
+ ('last_login', models.DateTimeField(blank=True, null=True, verbose_name='last login')),
+ ('is_superuser', models.BooleanField(default=False, help_text='Designates that this user has all permissions without explicitly assigning them.', verbose_name='superuser status')),
+ ('username', models.CharField(error_messages={'unique': 'A user with that username already exists.'}, help_text='Required. 150 characters or fewer. Letters, digits and @/./+/-/_ only.', max_length=150, unique=True, validators=[django.contrib.auth.validators.UnicodeUsernameValidator()], verbose_name='username')),
+ ('first_name', models.CharField(blank=True, max_length=150, verbose_name='first name')),
+ ('last_name', models.CharField(blank=True, max_length=150, verbose_name='last name')),
+ ('email', models.EmailField(blank=True, max_length=254, verbose_name='email address')),
+ ('is_staff', models.BooleanField(default=False, help_text='Designates whether the user can log into this admin site.', verbose_name='staff status')),
+ ('is_active', models.BooleanField(default=True, help_text='Designates whether this user should be treated as active. Unselect this instead of deleting accounts.', verbose_name='active')),
+ ('date_joined', models.DateTimeField(default=django.utils.timezone.now, verbose_name='date joined')),
+ ('firebase_uid', models.CharField(db_index=True, max_length=255, unique=True)),
+ ('picture', models.URLField(blank=True, null=True)),
+ ('email_verified', models.BooleanField(default=False)),
+ ('groups', models.ManyToManyField(blank=True, help_text='The groups this user belongs to. A user will get all permissions granted to each of their groups.', related_name='user_set', related_query_name='user', to='auth.group', verbose_name='groups')),
+ ('user_permissions', models.ManyToManyField(blank=True, help_text='Specific permissions for this user.', related_name='user_set', related_query_name='user', to='auth.permission', verbose_name='user permissions')),
+ ],
+ options={
+ 'verbose_name': 'user',
+ 'verbose_name_plural': 'users',
+ 'abstract': False,
+ },
+ managers=[
+ ('objects', django.contrib.auth.models.UserManager()),
+ ],
+ ),
+ ]
diff --git a/backend/users/migrations/0002_alter_user_email_verified.py b/backend/users/migrations/0002_alter_user_email_verified.py
new file mode 100644
index 0000000..84c6efc
--- /dev/null
+++ b/backend/users/migrations/0002_alter_user_email_verified.py
@@ -0,0 +1,18 @@
+# Generated by Django 5.2.5 on 2025-10-02 16:44
+
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+ dependencies = [
+ ('users', '0001_initial'),
+ ]
+
+ operations = [
+ migrations.AlterField(
+ model_name='user',
+ name='email_verified',
+ field=models.BooleanField(blank=True, default=False, null=True),
+ ),
+ ]
diff --git a/backend/users/migrations/__init__.py b/backend/users/migrations/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/backend/users/models.py b/backend/users/models.py
new file mode 100644
index 0000000..02574f5
--- /dev/null
+++ b/backend/users/models.py
@@ -0,0 +1,14 @@
+from django.db import models
+from django.contrib.auth.models import AbstractUser
+
+
+# Create your models here.
+class User(AbstractUser):
+ firebase_uid = models.CharField(max_length=255, unique=True, db_index=True)
+ picture = models.URLField(blank=True, null=True)
+ email_verified = models.BooleanField(default=False, blank=True, null=True)
+
+ def __str__(self):
+ return self.email
+
+
diff --git a/backend/users/tests.py b/backend/users/tests.py
new file mode 100644
index 0000000..321cfcf
--- /dev/null
+++ b/backend/users/tests.py
@@ -0,0 +1,146 @@
+from django.test import TestCase
+from django.contrib.auth import get_user_model
+# The custom user model is resolved below via get_user_model(), not imported directly
+
+User = get_user_model()
+
+
+class UserModelTests(TestCase):
+ """Test cases for User model"""
+
+ def setUp(self):
+ """Set up test data"""
+ self.user_data = {
+ 'username': 'testuser@example.com',
+ 'email': 'testuser@example.com',
+ 'firebase_uid': 'test_firebase_uid_123',
+ 'first_name': 'Test',
+ 'last_name': 'User',
+ 'picture': 'https://example.com/avatar.jpg',
+ 'email_verified': True
+ }
+
+ def test_create_user(self):
+ """Test creating a new user"""
+ user = User.objects.create_user(**self.user_data)
+
+ self.assertEqual(user.username, 'testuser@example.com')
+ self.assertEqual(user.email, 'testuser@example.com')
+ self.assertEqual(user.firebase_uid, 'test_firebase_uid_123')
+ self.assertEqual(user.first_name, 'Test')
+ self.assertEqual(user.last_name, 'User')
+ self.assertEqual(user.picture, 'https://example.com/avatar.jpg')
+ self.assertTrue(user.email_verified)
+
+ def test_user_str_method(self):
+ """Test User __str__ method returns email"""
+ user = User.objects.create_user(**self.user_data)
+ self.assertEqual(str(user), 'testuser@example.com')
+
+ def test_firebase_uid_unique(self):
+ """Test firebase_uid is unique"""
+ User.objects.create_user(**self.user_data)
+
+ # Try to create another user with same firebase_uid
+ duplicate_data = self.user_data.copy()
+ duplicate_data['username'] = 'different@example.com'
+ duplicate_data['email'] = 'different@example.com'
+
+ with self.assertRaises(Exception):
+ User.objects.create_user(**duplicate_data)
+
+ def test_user_fields_optional(self):
+ """Test that picture and email_verified fields are optional"""
+ minimal_data = {
+ 'username': 'minimal@example.com',
+ 'email': 'minimal@example.com',
+ 'firebase_uid': 'minimal_firebase_uid'
+ }
+ user = User.objects.create_user(**minimal_data)
+
+ self.assertEqual(user.picture, None)
+ self.assertFalse(user.email_verified)
+
+
+class UserMethodTests(TestCase):
+ """Test cases for User model methods and properties"""
+
+ def setUp(self):
+ """Set up test data"""
+ self.user = User.objects.create_user(
+ username='testuser@example.com',
+ email='testuser@example.com',
+ firebase_uid='test_firebase_uid_123',
+ first_name='Test',
+ last_name='User'
+ )
+
+ def test_get_full_name(self):
+ """Test get_full_name method"""
+ full_name = self.user.get_full_name()
+ self.assertEqual(full_name, 'Test User')
+
+ def test_get_short_name(self):
+ """Test get_short_name method"""
+ short_name = self.user.get_short_name()
+ self.assertEqual(short_name, 'Test')
+
+ def test_is_authenticated_property(self):
+ """Test is_authenticated property"""
+ self.assertTrue(self.user.is_authenticated)
+
+ def test_is_anonymous_property(self):
+ """Test is_anonymous property"""
+ self.assertFalse(self.user.is_anonymous)
+
+
+class UserQueryTests(TestCase):
+ """Test cases for User model queries and database operations"""
+
+ def setUp(self):
+ """Set up test data"""
+ self.user1 = User.objects.create_user(
+ username='user1@example.com',
+ email='user1@example.com',
+ firebase_uid='uid_1',
+ email_verified=True
+ )
+ self.user2 = User.objects.create_user(
+ username='user2@example.com',
+ email='user2@example.com',
+ firebase_uid='uid_2',
+ email_verified=False
+ )
+
+ def test_filter_by_firebase_uid(self):
+ """Test filtering users by firebase_uid"""
+ user = User.objects.filter(firebase_uid='uid_1').first()
+ self.assertEqual(user, self.user1)
+
+ def test_filter_by_email_verified(self):
+ """Test filtering users by email_verified status"""
+ verified_users = User.objects.filter(email_verified=True)
+ unverified_users = User.objects.filter(email_verified=False)
+
+ self.assertIn(self.user1, verified_users)
+ self.assertNotIn(self.user2, verified_users)
+ self.assertIn(self.user2, unverified_users)
+ self.assertNotIn(self.user1, unverified_users)
+
+ def test_get_user_by_email(self):
+ """Test getting user by email"""
+ user = User.objects.get(email='user1@example.com')
+ self.assertEqual(user, self.user1)
+
+ def test_user_count(self):
+ """Test counting users"""
+ count = User.objects.count()
+ self.assertEqual(count, 2)
+
+ def test_user_exists(self):
+ """Test checking if user exists"""
+ exists = User.objects.filter(firebase_uid='uid_1').exists()
+ not_exists = User.objects.filter(firebase_uid='nonexistent').exists()
+
+ self.assertTrue(exists)
+ self.assertFalse(not_exists)
diff --git a/backend/users/urls.py b/backend/users/urls.py
new file mode 100644
index 0000000..0ae362d
--- /dev/null
+++ b/backend/users/urls.py
@@ -0,0 +1,7 @@
+from django.urls import path
+from .views import LoginView
+
+
+urlpatterns = [
+ path('login/', LoginView.as_view(), name='login'),
+]
diff --git a/backend/users/views.py b/backend/users/views.py
new file mode 100644
index 0000000..a672bdf
--- /dev/null
+++ b/backend/users/views.py
@@ -0,0 +1,62 @@
+from rest_framework.views import APIView
+from rest_framework.response import Response
+from rest_framework import permissions
+from django.contrib.auth import get_user_model
+from django.views.decorators.csrf import csrf_exempt
+from django.utils.decorators import method_decorator
+from firebase_admin import auth
+import firebase_utils
+
+
+User = get_user_model()
+
+@method_decorator(csrf_exempt, name='dispatch')
+class LoginView(APIView):
+
+ authentication_classes = []
+ permission_classes = [permissions.AllowAny]
+
+ def post(self, request):
+ id_token = request.data.get("idToken")
+ if not id_token:
+ return Response({"detail": "idToken required"}, status=400)
+
+ try:
+ decoded = auth.verify_id_token(id_token)
+ except Exception as e:
+ return Response({"detail": "Invalid token", "error": str(e)}, status=401)
+
+ uid = decoded["uid"]
+ email = decoded.get("email")
+ name = decoded.get("name", "")
+ picture = decoded.get("picture", "")
+ email_verified = decoded.get("email_verified", False)
+
+ user, _ = User.objects.get_or_create(
+ firebase_uid=uid,
+ defaults={
+ "username": email or uid,
+ "email": email or "",
+ "first_name": name.split(" ")[0] if name else "",
+ "last_name": " ".join(name.split(" ")[1:]) if name else "",
+ "picture": picture,
+ "email_verified": email_verified,
+ },
+ )
+
+ return Response({
+ "message": "Login successful",
+ "user": {
+ "id": user.id,
+ "uid": uid,
+ "email": email,
+ "name": name,
+ "picture": picture,
+ "email_verified": email_verified
+ }
+ })
+
+
+
+
+
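The `first_name`/`last_name` defaults passed to `get_or_create` split the Firebase display name on the first space; that logic, isolated so its edge cases (empty name, single-word name) are visible:

```python
def split_name(name: str) -> tuple:
    """Mirror of LoginView's defaults: first token becomes first_name,
    the remainder becomes last_name; empty input yields empty strings."""
    if not name:
        return "", ""
    parts = name.split(" ")
    return parts[0], " ".join(parts[1:])
```

Note this is lossy for multi-part surnames ("Ana de la Cruz" puts everything after "Ana" into `last_name`), which matches the view's behavior.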
diff --git a/backend/watchlists/__init__.py b/backend/watchlists/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/backend/watchlists/admin.py b/backend/watchlists/admin.py
new file mode 100644
index 0000000..8c38f3f
--- /dev/null
+++ b/backend/watchlists/admin.py
@@ -0,0 +1,3 @@
+from django.contrib import admin
+
+# Register your models here.
diff --git a/backend/watchlists/apps.py b/backend/watchlists/apps.py
new file mode 100644
index 0000000..a75e310
--- /dev/null
+++ b/backend/watchlists/apps.py
@@ -0,0 +1,6 @@
+from django.apps import AppConfig
+
+
+class WatchlistsConfig(AppConfig):
+ default_auto_field = 'django.db.models.BigAutoField'
+ name = 'watchlists'
diff --git a/backend/watchlists/management/__init__.py b/backend/watchlists/management/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/backend/watchlists/management/commands/__init__.py b/backend/watchlists/management/commands/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/backend/watchlists/management/commands/clean_watchlists.py b/backend/watchlists/management/commands/clean_watchlists.py
new file mode 100644
index 0000000..e69de29
diff --git a/backend/watchlists/migrations/0001_initial.py b/backend/watchlists/migrations/0001_initial.py
new file mode 100644
index 0000000..90741fa
--- /dev/null
+++ b/backend/watchlists/migrations/0001_initial.py
@@ -0,0 +1,41 @@
+# Generated by Django 5.2.5 on 2025-08-31 05:05
+
+import django.db.models.deletion
+from django.conf import settings
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+ initial = True
+
+ dependencies = [
+ ('stocks', '0001_initial'),
+ migrations.swappable_dependency(settings.AUTH_USER_MODEL),
+ ]
+
+ operations = [
+ migrations.CreateModel(
+ name='Watchlist',
+ fields=[
+ ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
+ ('name', models.CharField(default='My Watchlist', max_length=100)),
+ ('created_at', models.DateTimeField(auto_now_add=True)),
+ ('updated_at', models.DateTimeField(auto_now=True)),
+ ('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='watchlists', to=settings.AUTH_USER_MODEL)),
+ ],
+ ),
+ migrations.CreateModel(
+ name='WatchlistStock',
+ fields=[
+ ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
+ ('added_at', models.DateTimeField(auto_now_add=True)),
+ ('stock', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='stocks.stock')),
+ ('watchlist', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='watchlists.watchlist')),
+ ],
+ options={
+ 'ordering': ['added_at'],
+ 'unique_together': {('watchlist', 'stock')},
+ },
+ ),
+ ]
diff --git a/backend/watchlists/migrations/0002_alter_watchlist_user.py b/backend/watchlists/migrations/0002_alter_watchlist_user.py
new file mode 100644
index 0000000..9d0e782
--- /dev/null
+++ b/backend/watchlists/migrations/0002_alter_watchlist_user.py
@@ -0,0 +1,21 @@
+# Generated by Django 5.2.5 on 2025-08-31 11:09
+
+import django.db.models.deletion
+from django.conf import settings
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+ dependencies = [
+ ('watchlists', '0001_initial'),
+ migrations.swappable_dependency(settings.AUTH_USER_MODEL),
+ ]
+
+ operations = [
+ migrations.AlterField(
+ model_name='watchlist',
+ name='user',
+ field=models.OneToOneField(on_delete=django.db.models.deletion.CASCADE, related_name='watchlist', to=settings.AUTH_USER_MODEL),
+ ),
+ ]
diff --git a/backend/watchlists/migrations/0003_alter_watchlist_name.py b/backend/watchlists/migrations/0003_alter_watchlist_name.py
new file mode 100644
index 0000000..f0f9b44
--- /dev/null
+++ b/backend/watchlists/migrations/0003_alter_watchlist_name.py
@@ -0,0 +1,18 @@
+# Generated by Django 5.2.5 on 2025-10-02 07:32
+
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+ dependencies = [
+ ('watchlists', '0002_alter_watchlist_user'),
+ ]
+
+ operations = [
+ migrations.AlterField(
+ model_name='watchlist',
+ name='name',
+ field=models.CharField(default='My Watchlist', editable=False, max_length=100),
+ ),
+ ]
diff --git a/backend/watchlists/migrations/__init__.py b/backend/watchlists/migrations/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/backend/watchlists/models.py b/backend/watchlists/models.py
new file mode 100644
index 0000000..38bde57
--- /dev/null
+++ b/backend/watchlists/models.py
@@ -0,0 +1,31 @@
+from django.db import models
+from users.models import User
+
+# Create your models here.
+class Watchlist(models.Model):
+ user = models.OneToOneField(User, on_delete=models.CASCADE, related_name='watchlist')
+ name = models.CharField(max_length=100, default='My Watchlist', editable=False)
+ created_at = models.DateTimeField(auto_now_add=True)
+ updated_at = models.DateTimeField(auto_now=True)
+
+ def save(self, *args, **kwargs):
+ # Always set name to 'My Watchlist'
+ self.name = 'My Watchlist'
+ super().save(*args, **kwargs)
+
+ def __str__(self):
+ return f"{self.user.email} - {self.name}"
+
+
+class WatchlistStock(models.Model):
+ """Junction table for Watchlist-Stock many-to-many relationship"""
+ watchlist = models.ForeignKey(Watchlist, on_delete=models.CASCADE)
+ stock = models.ForeignKey('stocks.Stock', on_delete=models.CASCADE)
+
+ added_at = models.DateTimeField(auto_now_add=True)
+
+ class Meta:
+ unique_together = ['watchlist', 'stock']
+ ordering = ['added_at']
+
+
\ No newline at end of file
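The models above encode two invariants: `OneToOneField` allows at most one `Watchlist` per user, `unique_together` allows at most one `WatchlistStock` row per (watchlist, stock) pair, and the `save()` override pins the name to 'My Watchlist'. A minimal in-memory sketch of those rules (illustrative only, no Django involved):

```python
# Illustrative sketch of the constraints the models declare; this mirrors
# OneToOneField(user) and unique_together('watchlist', 'stock') without Django.
class DuplicateError(Exception):
    pass

class InMemoryWatchlists:
    def __init__(self):
        self.by_user = {}     # user -> watchlist name (OneToOne: one per user)
        self.members = set()  # (user, symbol) pairs (unique_together)

    def create_watchlist(self, user):
        if user in self.by_user:
            raise DuplicateError("user already has a watchlist")
        self.by_user[user] = "My Watchlist"  # name is forced, as in Watchlist.save()

    def add_stock(self, user, symbol):
        if (user, symbol) in self.members:
            raise DuplicateError("stock already in watchlist")
        self.members.add((user, symbol))

wl = InMemoryWatchlists()
wl.create_watchlist("alice@example.com")
wl.add_stock("alice@example.com", "AAPL")
```

At the database level the same duplicates surface as `IntegrityError`, which is what the test suite in `backend/watchlists/tests.py` asserts.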
diff --git a/backend/watchlists/serializers.py b/backend/watchlists/serializers.py
new file mode 100644
index 0000000..3d611df
--- /dev/null
+++ b/backend/watchlists/serializers.py
@@ -0,0 +1,27 @@
+from rest_framework import serializers
+from .models import Watchlist, WatchlistStock
+from stocks.models import Stock
+from stocks.serializers import StockSerializer
+
+
+class WatchlistStockSerializer(serializers.ModelSerializer):
+ stock = StockSerializer(read_only=True)
+ stock_id = serializers.IntegerField(write_only=True)
+
+ class Meta:
+ model = WatchlistStock
+ fields = ['id', 'stock', 'stock_id', 'added_at']
+ read_only_fields = ['added_at']
+
+
+class WatchlistSerializer(serializers.ModelSerializer):
+ stocks = WatchlistStockSerializer(source='watchliststock_set', many=True, read_only=True)
+ stock_count = serializers.SerializerMethodField()
+
+ class Meta:
+ model = Watchlist
+ fields = ['id', 'name', 'created_at', 'updated_at', 'stocks', 'stock_count']
+ read_only_fields = ['created_at', 'updated_at', 'name'] # name is read-only
+
+ def get_stock_count(self, obj):
+ return obj.watchliststock_set.count()
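`WatchlistSerializer` nests each junction row (with its stock) under `stocks` and derives `stock_count` from the junction table. A hypothetical example of the resulting JSON shape, assuming `StockSerializer` exposes at least `id`, `symbol`, and `name` (its fields are not shown in this diff):

```python
# Hypothetical payload shape produced by WatchlistSerializer; the field values
# are invented for illustration, only the keys follow the serializer above.
example = {
    "id": 1,
    "name": "My Watchlist",
    "created_at": "2025-08-31T05:05:00Z",
    "updated_at": "2025-08-31T05:05:00Z",
    "stocks": [
        {
            "id": 10,
            "stock": {"id": 3, "symbol": "AAPL", "name": "Apple Inc."},
            "added_at": "2025-08-31T05:06:00Z",
        }
    ],
    "stock_count": 1,
}

# stock_count is computed in get_stock_count() from the junction queryset,
# so it always agrees with the length of the nested list.
assert example["stock_count"] == len(example["stocks"])
```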
diff --git a/backend/watchlists/tests.py b/backend/watchlists/tests.py
new file mode 100644
index 0000000..3fc221b
--- /dev/null
+++ b/backend/watchlists/tests.py
@@ -0,0 +1,356 @@
+from django.test import TestCase
+from django.contrib.auth import get_user_model
+from django.db import IntegrityError
+from django.core.exceptions import ValidationError
+from django.utils import timezone
+from .models import Watchlist, WatchlistStock
+from stocks.models import Stock
+
+User = get_user_model()
+
+
+class WatchlistModelTests(TestCase):
+ """Test cases for Watchlist model"""
+
+ def setUp(self):
+ """Set up test data"""
+ self.user = User.objects.create_user(
+ username='testuser@example.com',
+ email='testuser@example.com',
+ firebase_uid='test_firebase_uid'
+ )
+
+ def test_create_watchlist(self):
+ """Test creating a new watchlist"""
+ watchlist = Watchlist.objects.create(user=self.user)
+
+ self.assertEqual(watchlist.user, self.user)
+ self.assertEqual(watchlist.name, 'My Watchlist')
+ self.assertTrue(isinstance(watchlist.created_at, type(timezone.now())))
+ self.assertTrue(isinstance(watchlist.updated_at, type(timezone.now())))
+
+ def test_watchlist_str_method(self):
+ """Test Watchlist __str__ method"""
+ watchlist = Watchlist.objects.create(user=self.user)
+ expected_str = f"{self.user.email} - My Watchlist"
+ self.assertEqual(str(watchlist), expected_str)
+
+ def test_watchlist_name_auto_set(self):
+ """Test that watchlist name is automatically set to 'My Watchlist'"""
+ watchlist = Watchlist.objects.create(
+ user=self.user,
+ name='Custom Name' # This should be overridden
+ )
+ self.assertEqual(watchlist.name, 'My Watchlist')
+
+ def test_watchlist_save_method(self):
+ """Test that save method always sets name to 'My Watchlist'"""
+ watchlist = Watchlist.objects.create(user=self.user)
+
+ # Try to change the name
+ watchlist.name = 'Different Name'
+ watchlist.save()
+
+ # Should still be 'My Watchlist'
+ watchlist.refresh_from_db()
+ self.assertEqual(watchlist.name, 'My Watchlist')
+
+ def test_one_to_one_relationship(self):
+ """Test one-to-one relationship between User and Watchlist"""
+ watchlist1 = Watchlist.objects.create(user=self.user)
+
+ # Try to create another watchlist for the same user
+ with self.assertRaises(IntegrityError):
+ Watchlist.objects.create(user=self.user)
+
+ def test_watchlist_cascade_delete(self):
+ """Test that watchlist is deleted when user is deleted"""
+ watchlist = Watchlist.objects.create(user=self.user)
+ watchlist_id = watchlist.id
+
+ # Delete the user
+ self.user.delete()
+
+ # Watchlist should be deleted too
+ with self.assertRaises(Watchlist.DoesNotExist):
+ Watchlist.objects.get(id=watchlist_id)
+
+ def test_related_name_access(self):
+ """Test accessing watchlist through user's related name"""
+ watchlist = Watchlist.objects.create(user=self.user)
+
+ # Access watchlist through user
+ user_watchlist = self.user.watchlist
+ self.assertEqual(user_watchlist, watchlist)
+
+
+class WatchlistStockModelTests(TestCase):
+ """Test cases for WatchlistStock model"""
+
+ def setUp(self):
+ """Set up test data"""
+ self.user = User.objects.create_user(
+ username='testuser@example.com',
+ email='testuser@example.com',
+ firebase_uid='test_firebase_uid'
+ )
+ self.watchlist = Watchlist.objects.create(user=self.user)
+ self.stock = Stock.objects.create(symbol='AAPL', name='Apple Inc.')
+
+ def test_create_watchlist_stock(self):
+ """Test creating a new watchlist stock entry"""
+ watchlist_stock = WatchlistStock.objects.create(
+ watchlist=self.watchlist,
+ stock=self.stock
+ )
+
+ self.assertEqual(watchlist_stock.watchlist, self.watchlist)
+ self.assertEqual(watchlist_stock.stock, self.stock)
+ self.assertTrue(isinstance(watchlist_stock.added_at, type(timezone.now())))
+
+ def test_watchlist_stock_unique_together(self):
+ """Test unique_together constraint on watchlist and stock"""
+ WatchlistStock.objects.create(
+ watchlist=self.watchlist,
+ stock=self.stock
+ )
+
+ # Try to create duplicate entry
+ with self.assertRaises(IntegrityError):
+ WatchlistStock.objects.create(
+ watchlist=self.watchlist,
+ stock=self.stock
+ )
+
+ def test_watchlist_stock_cascade_delete_watchlist(self):
+ """Test cascade delete when watchlist is deleted"""
+ watchlist_stock = WatchlistStock.objects.create(
+ watchlist=self.watchlist,
+ stock=self.stock
+ )
+ entry_id = watchlist_stock.id
+
+ # Delete watchlist
+ self.watchlist.delete()
+
+ # WatchlistStock entry should be deleted
+ with self.assertRaises(WatchlistStock.DoesNotExist):
+ WatchlistStock.objects.get(id=entry_id)
+
+ def test_watchlist_stock_cascade_delete_stock(self):
+ """Test cascade delete when stock is deleted"""
+ watchlist_stock = WatchlistStock.objects.create(
+ watchlist=self.watchlist,
+ stock=self.stock
+ )
+ entry_id = watchlist_stock.id
+
+ # Delete stock
+ self.stock.delete()
+
+ # WatchlistStock entry should be deleted
+ with self.assertRaises(WatchlistStock.DoesNotExist):
+ WatchlistStock.objects.get(id=entry_id)
+
+ def test_watchlist_stock_ordering(self):
+ """Test ordering by added_at field"""
+ stock2 = Stock.objects.create(symbol='GOOGL', name='Alphabet Inc.')
+
+        # Create entries sequentially; added_at orders them by insertion time
+ entry1 = WatchlistStock.objects.create(
+ watchlist=self.watchlist,
+ stock=self.stock
+ )
+ entry2 = WatchlistStock.objects.create(
+ watchlist=self.watchlist,
+ stock=stock2
+ )
+
+ # Get ordered entries
+ entries = WatchlistStock.objects.filter(watchlist=self.watchlist)
+
+ # Should be ordered by added_at (earliest first)
+ self.assertEqual(list(entries), [entry1, entry2])
+
+
+class WatchlistQueryTests(TestCase):
+ """Test cases for Watchlist and WatchlistStock queries"""
+
+ def setUp(self):
+ """Set up test data"""
+ self.user1 = User.objects.create_user(
+ username='user1@example.com',
+ email='user1@example.com',
+ firebase_uid='uid_1'
+ )
+ self.user2 = User.objects.create_user(
+ username='user2@example.com',
+ email='user2@example.com',
+ firebase_uid='uid_2'
+ )
+
+ self.watchlist1 = Watchlist.objects.create(user=self.user1)
+ self.watchlist2 = Watchlist.objects.create(user=self.user2)
+
+ self.stocks = [
+ Stock.objects.create(symbol='AAPL', name='Apple Inc.'),
+ Stock.objects.create(symbol='GOOGL', name='Alphabet Inc.'),
+ Stock.objects.create(symbol='MSFT', name='Microsoft Corporation')
+ ]
+
+ # Add stocks to watchlists
+ WatchlistStock.objects.create(watchlist=self.watchlist1, stock=self.stocks[0])
+ WatchlistStock.objects.create(watchlist=self.watchlist1, stock=self.stocks[1])
+ WatchlistStock.objects.create(watchlist=self.watchlist2, stock=self.stocks[0])
+
+ def test_get_watchlist_by_user(self):
+ """Test getting watchlist by user"""
+ watchlist = Watchlist.objects.get(user=self.user1)
+ self.assertEqual(watchlist, self.watchlist1)
+
+ def test_get_stocks_in_watchlist(self):
+ """Test getting all stocks in a watchlist"""
+ watchlist_stocks = WatchlistStock.objects.filter(watchlist=self.watchlist1)
+ stock_symbols = [ws.stock.symbol for ws in watchlist_stocks]
+
+ self.assertIn('AAPL', stock_symbols)
+ self.assertIn('GOOGL', stock_symbols)
+ self.assertNotIn('MSFT', stock_symbols)
+
+ def test_check_stock_in_watchlist(self):
+ """Test checking if a stock is in a watchlist"""
+ exists = WatchlistStock.objects.filter(
+ watchlist=self.watchlist1,
+ stock=self.stocks[0]
+ ).exists()
+
+ not_exists = WatchlistStock.objects.filter(
+ watchlist=self.watchlist1,
+ stock=self.stocks[2]
+ ).exists()
+
+ self.assertTrue(exists)
+ self.assertFalse(not_exists)
+
+ def test_count_stocks_in_watchlist(self):
+ """Test counting stocks in a watchlist"""
+ count1 = WatchlistStock.objects.filter(watchlist=self.watchlist1).count()
+ count2 = WatchlistStock.objects.filter(watchlist=self.watchlist2).count()
+
+ self.assertEqual(count1, 2)
+ self.assertEqual(count2, 1)
+
+ def test_get_watchlists_containing_stock(self):
+ """Test getting all watchlists that contain a specific stock"""
+ watchlists_with_aapl = WatchlistStock.objects.filter(
+ stock=self.stocks[0]
+ ).values_list('watchlist', flat=True)
+
+ self.assertIn(self.watchlist1.id, watchlists_with_aapl)
+ self.assertIn(self.watchlist2.id, watchlists_with_aapl)
+
+ def test_get_or_create_watchlist(self):
+ """Test get_or_create functionality for watchlist"""
+ # Test getting existing watchlist
+ watchlist1, created1 = Watchlist.objects.get_or_create(
+ user=self.user1,
+ defaults={'name': 'My Watchlist'}
+ )
+ self.assertFalse(created1)
+ self.assertEqual(watchlist1, self.watchlist1)
+
+ # Test creating new watchlist for user without one
+ user3 = User.objects.create_user(
+ username='user3@example.com',
+ email='user3@example.com',
+ firebase_uid='uid_3'
+ )
+ watchlist3, created3 = Watchlist.objects.get_or_create(
+ user=user3,
+ defaults={'name': 'My Watchlist'}
+ )
+ self.assertTrue(created3)
+ self.assertEqual(watchlist3.user, user3)
+
+
+class WatchlistBusinessLogicTests(TestCase):
+ """Test cases for business logic related to watchlists"""
+
+ def setUp(self):
+ """Set up test data"""
+ self.user = User.objects.create_user(
+ username='testuser@example.com',
+ email='testuser@example.com',
+ firebase_uid='test_firebase_uid'
+ )
+ self.watchlist = Watchlist.objects.create(user=self.user)
+ self.stocks = [
+ Stock.objects.create(symbol='AAPL', name='Apple Inc.'),
+ Stock.objects.create(symbol='GOOGL', name='Alphabet Inc.'),
+ Stock.objects.create(symbol='MSFT', name='Microsoft Corporation')
+ ]
+
+ def test_add_stock_to_watchlist(self):
+ """Test adding a stock to watchlist"""
+ initial_count = WatchlistStock.objects.filter(watchlist=self.watchlist).count()
+
+ WatchlistStock.objects.create(
+ watchlist=self.watchlist,
+ stock=self.stocks[0]
+ )
+
+ new_count = WatchlistStock.objects.filter(watchlist=self.watchlist).count()
+ self.assertEqual(new_count, initial_count + 1)
+
+ def test_remove_stock_from_watchlist(self):
+ """Test removing a stock from watchlist"""
+ # Add stock first
+ watchlist_stock = WatchlistStock.objects.create(
+ watchlist=self.watchlist,
+ stock=self.stocks[0]
+ )
+
+ initial_count = WatchlistStock.objects.filter(watchlist=self.watchlist).count()
+
+ # Remove stock
+ watchlist_stock.delete()
+
+ new_count = WatchlistStock.objects.filter(watchlist=self.watchlist).count()
+ self.assertEqual(new_count, initial_count - 1)
+
+ def test_bulk_add_stocks_to_watchlist(self):
+ """Test adding multiple stocks to watchlist"""
+ watchlist_stocks = [
+ WatchlistStock(watchlist=self.watchlist, stock=stock)
+ for stock in self.stocks
+ ]
+
+ WatchlistStock.objects.bulk_create(watchlist_stocks)
+
+ count = WatchlistStock.objects.filter(watchlist=self.watchlist).count()
+ self.assertEqual(count, len(self.stocks))
+
+ def test_watchlist_stock_timestamps(self):
+ """Test that timestamps are set correctly"""
+ before_creation = timezone.now()
+
+ watchlist_stock = WatchlistStock.objects.create(
+ watchlist=self.watchlist,
+ stock=self.stocks[0]
+ )
+
+ after_creation = timezone.now()
+
+ # Check that added_at is between before and after creation
+ self.assertGreaterEqual(watchlist_stock.added_at, before_creation)
+ self.assertLessEqual(watchlist_stock.added_at, after_creation)
+
+ def test_watchlist_updated_at_auto_update(self):
+ """Test that updated_at is automatically updated"""
+ original_updated_at = self.watchlist.updated_at
+
+ # Save the watchlist again
+ self.watchlist.save()
+
+ self.watchlist.refresh_from_db()
+ self.assertGreater(self.watchlist.updated_at, original_updated_at)
diff --git a/backend/watchlists/urls.py b/backend/watchlists/urls.py
new file mode 100644
index 0000000..500d01f
--- /dev/null
+++ b/backend/watchlists/urls.py
@@ -0,0 +1,14 @@
+from django.urls import path
+from .views import (
+ WatchlistView,
+ WatchlistStockView
+)
+
+urlpatterns = [
+ # Single watchlist per user
+ path('', WatchlistView.as_view(), name='user-watchlist'),
+
+ # Stock operations within the user's watchlist
+ path('stocks/', WatchlistStockView.as_view(), name='watchlist-add-stock'),
+    path('stocks/<int:stock_id>/', WatchlistStockView.as_view(), name='watchlist-remove-stock'),
+]
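The remove-stock route relies on a Django path converter such as `<int:stock_id>` to capture the id segment of the URL and pass it to `WatchlistStockView.delete` as an `int`. Its matching behaviour can be sketched with a plain regex:

```python
import re

# Sketch of what the 'stocks/<int:stock_id>/' route matches: the int converter
# accepts one or more digits and hands the captured value to the view as an int.
route = re.compile(r"^stocks/(?P<stock_id>[0-9]+)/$")

m = route.match("stocks/42/")
assert m is not None and int(m.group("stock_id")) == 42

# Non-numeric segments do not match, so they never reach the view.
assert route.match("stocks/abc/") is None
```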
diff --git a/backend/watchlists/views.py b/backend/watchlists/views.py
new file mode 100644
index 0000000..669176e
--- /dev/null
+++ b/backend/watchlists/views.py
@@ -0,0 +1,113 @@
+from rest_framework.views import APIView
+from rest_framework.response import Response
+from rest_framework import status, permissions
+from django.shortcuts import get_object_or_404
+
+from .models import Watchlist, WatchlistStock
+from stocks.models import Stock
+from .serializers import (
+ WatchlistSerializer,
+ WatchlistStockSerializer
+)
+
+
+class WatchlistView(APIView):
+ """
+    GET: Return the user's single watchlist, creating it if it doesn't exist
+ """
+ permission_classes = [permissions.IsAuthenticated]
+
+ def get(self, request):
+ """Get or create user's watchlist"""
+ watchlist, created = Watchlist.objects.get_or_create(
+ user=request.user,
+ defaults={'name': 'My Watchlist'}
+ )
+ serializer = WatchlistSerializer(watchlist)
+ return Response(serializer.data)
+
+
+class WatchlistStockView(APIView):
+ """
+ POST: Add a stock to the user's watchlist
+ DELETE: Remove a stock from the user's watchlist
+ """
+ permission_classes = [permissions.IsAuthenticated]
+
+ def post(self, request):
+ """Add stock to user's watchlist"""
+ watchlist, created = Watchlist.objects.get_or_create(
+ user=request.user,
+ defaults={'name': 'My Watchlist'}
+ )
+
+ stock_symbol = request.data.get('symbol')
+ stock_id = request.data.get('stock_id')
+
+ if not stock_symbol and not stock_id:
+ return Response(
+ {"detail": "Either 'symbol' or 'stock_id' is required"},
+ status=status.HTTP_400_BAD_REQUEST
+ )
+
+ try:
+ if stock_id:
+ stock = get_object_or_404(Stock, id=stock_id)
+ else:
+ # Try to find stock by symbol, create if doesn't exist
+ stock, created = Stock.objects.get_or_create(
+ symbol=stock_symbol.upper(),
+ defaults={'name': request.data.get('name', stock_symbol.upper())}
+ )
+
+ # Check if stock is already in watchlist
+ if WatchlistStock.objects.filter(watchlist=watchlist, stock=stock).exists():
+ return Response(
+ {"detail": "Stock already exists in your watchlist"},
+ status=status.HTTP_400_BAD_REQUEST
+ )
+
+ # Add stock to watchlist
+ watchlist_stock = WatchlistStock.objects.create(
+ watchlist=watchlist,
+ stock=stock
+ )
+
+ serializer = WatchlistStockSerializer(watchlist_stock)
+ return Response(serializer.data, status=status.HTTP_201_CREATED)
+
+ except Exception as e:
+ return Response(
+ {"detail": f"Error adding stock: {str(e)}"},
+ status=status.HTTP_400_BAD_REQUEST
+ )
+
+    def delete(self, request, stock_id):
+        """Remove stock from user's watchlist"""
+        # get_object_or_404 raises Http404 on a miss, which DRF already turns
+        # into a 404 response, so no explicit DoesNotExist handling is needed.
+        watchlist = get_object_or_404(Watchlist, user=request.user)
+
+        # Find and delete the watchlist stock entry
+        watchlist_stock = get_object_or_404(
+            WatchlistStock,
+            watchlist=watchlist,
+            stock_id=stock_id
+        )
+        watchlist_stock.delete()
+
+        # A 204 response must not carry a body
+        return Response(status=status.HTTP_204_NO_CONTENT)
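`WatchlistStockView.post` accepts either a `symbol` or a `stock_id` and upper-cases symbols before the `get_or_create` lookup. That branching, extracted as a small standalone function (the names here are illustrative, not part of the view):

```python
# Sketch of the input validation in WatchlistStockView.post: the request body
# must carry 'symbol' or 'stock_id'; symbols are upper-cased before lookup.
def normalize_add_request(data):
    symbol = data.get("symbol")
    stock_id = data.get("stock_id")
    if not symbol and not stock_id:
        return None, "Either 'symbol' or 'stock_id' is required"
    if stock_id:
        # Explicit id wins, mirroring the view's `if stock_id:` branch
        return {"stock_id": stock_id}, None
    # Fall back to symbol lookup; 'name' defaults to the symbol itself
    return {"symbol": symbol.upper(), "name": data.get("name", symbol.upper())}, None

assert normalize_add_request({})[1] is not None
assert normalize_add_request({"symbol": "aapl"})[0] == {"symbol": "AAPL", "name": "AAPL"}
```

In the view itself the same branches end in a `Stock` lookup (or creation) followed by the duplicate check against `WatchlistStock`.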
diff --git a/frontend/.gitignore b/frontend/.gitignore
new file mode 100644
index 0000000..a547bf3
--- /dev/null
+++ b/frontend/.gitignore
@@ -0,0 +1,24 @@
+# Logs
+logs
+*.log
+npm-debug.log*
+yarn-debug.log*
+yarn-error.log*
+pnpm-debug.log*
+lerna-debug.log*
+
+node_modules
+dist
+dist-ssr
+*.local
+
+# Editor directories and files
+.vscode/*
+!.vscode/extensions.json
+.idea
+.DS_Store
+*.suo
+*.ntvs*
+*.njsproj
+*.sln
+*.sw?
diff --git a/frontend/eslint.config.js b/frontend/eslint.config.js
new file mode 100644
index 0000000..cee1e2c
--- /dev/null
+++ b/frontend/eslint.config.js
@@ -0,0 +1,29 @@
+import js from '@eslint/js'
+import globals from 'globals'
+import reactHooks from 'eslint-plugin-react-hooks'
+import reactRefresh from 'eslint-plugin-react-refresh'
+import { defineConfig, globalIgnores } from 'eslint/config'
+
+export default defineConfig([
+ globalIgnores(['dist']),
+ {
+ files: ['**/*.{js,jsx}'],
+ extends: [
+ js.configs.recommended,
+ reactHooks.configs['recommended-latest'],
+ reactRefresh.configs.vite,
+ ],
+ languageOptions: {
+ ecmaVersion: 2020,
+ globals: globals.browser,
+ parserOptions: {
+ ecmaVersion: 'latest',
+ ecmaFeatures: { jsx: true },
+ sourceType: 'module',
+ },
+ },
+ rules: {
+ 'no-unused-vars': ['error', { varsIgnorePattern: '^[A-Z_]' }],
+ },
+ },
+])
diff --git a/frontend/index.html b/frontend/index.html
new file mode 100644
index 0000000..6006b3d
--- /dev/null
+++ b/frontend/index.html
@@ -0,0 +1,13 @@
+<!doctype html>
+<html lang="en">
+  <head>
+    <meta charset="UTF-8" />
+    <link rel="icon" type="image/svg+xml" href="/vite.svg" />
+    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
+    <title>Stocksense</title>
+  </head>
+  <body>
+    <div id="root"></div>
+    <script type="module" src="/src/main.jsx"></script>
+  </body>
+</html>
diff --git a/frontend/package-lock.json b/frontend/package-lock.json
new file mode 100644
index 0000000..85020b9
--- /dev/null
+++ b/frontend/package-lock.json
@@ -0,0 +1,4211 @@
+{
+ "name": "frontend",
+ "version": "0.0.0",
+ "lockfileVersion": 3,
+ "requires": true,
+ "packages": {
+ "": {
+ "name": "frontend",
+ "version": "0.0.0",
+ "dependencies": {
+ "axios": "^1.11.0",
+ "chart.js": "^4.5.0",
+ "chartjs-adapter-date-fns": "^3.0.0",
+ "date-fns": "^2.30.0",
+ "firebase": "^12.2.1",
+ "lucide-react": "^0.544.0",
+ "react": "^19.1.1",
+ "react-chartjs-2": "^5.3.0",
+ "react-dom": "^19.1.1",
+ "react-router-dom": "^7.8.2"
+ },
+ "devDependencies": {
+ "@eslint/js": "^9.33.0",
+ "@types/react": "^19.1.10",
+ "@types/react-dom": "^19.1.7",
+ "@vitejs/plugin-react": "^5.0.0",
+ "eslint": "^9.33.0",
+ "eslint-plugin-react-hooks": "^5.2.0",
+ "eslint-plugin-react-refresh": "^0.4.20",
+ "globals": "^16.3.0",
+ "vite": "^7.1.2"
+ }
+ },
+ "node_modules/@ampproject/remapping": {
+ "version": "2.3.0",
+ "resolved": "https://registry.npmjs.org/@ampproject/remapping/-/remapping-2.3.0.tgz",
+ "integrity": "sha512-30iZtAPgz+LTIYoeivqYo853f02jBYSd5uGnGpkFV0M3xOt9aN73erkgYAmZU43x4VfqcnLxW9Kpg3R5LC4YYw==",
+ "dev": true,
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@jridgewell/gen-mapping": "^0.3.5",
+ "@jridgewell/trace-mapping": "^0.3.24"
+ },
+ "engines": {
+ "node": ">=6.0.0"
+ }
+ },
+ "node_modules/@babel/code-frame": {
+ "version": "7.27.1",
+ "resolved": "https://registry.npmjs.org/@babel/code-frame/-/code-frame-7.27.1.tgz",
+ "integrity": "sha512-cjQ7ZlQ0Mv3b47hABuTevyTuYN4i+loJKGeV9flcCgIK37cCXRh+L1bd3iBHlynerhQ7BhCkn2BPbQUL+rGqFg==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@babel/helper-validator-identifier": "^7.27.1",
+ "js-tokens": "^4.0.0",
+ "picocolors": "^1.1.1"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/compat-data": {
+ "version": "7.28.0",
+ "resolved": "https://registry.npmjs.org/@babel/compat-data/-/compat-data-7.28.0.tgz",
+ "integrity": "sha512-60X7qkglvrap8mn1lh2ebxXdZYtUcpd7gsmy9kLaBJ4i/WdY8PqTSdxyA8qraikqKQK5C1KRBKXqznrVapyNaw==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/core": {
+ "version": "7.28.3",
+ "resolved": "https://registry.npmjs.org/@babel/core/-/core-7.28.3.tgz",
+ "integrity": "sha512-yDBHV9kQNcr2/sUr9jghVyz9C3Y5G2zUM2H2lo+9mKv4sFgbA8s8Z9t8D1jiTkGoO/NoIfKMyKWr4s6CN23ZwQ==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@ampproject/remapping": "^2.2.0",
+ "@babel/code-frame": "^7.27.1",
+ "@babel/generator": "^7.28.3",
+ "@babel/helper-compilation-targets": "^7.27.2",
+ "@babel/helper-module-transforms": "^7.28.3",
+ "@babel/helpers": "^7.28.3",
+ "@babel/parser": "^7.28.3",
+ "@babel/template": "^7.27.2",
+ "@babel/traverse": "^7.28.3",
+ "@babel/types": "^7.28.2",
+ "convert-source-map": "^2.0.0",
+ "debug": "^4.1.0",
+ "gensync": "^1.0.0-beta.2",
+ "json5": "^2.2.3",
+ "semver": "^6.3.1"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/babel"
+ }
+ },
+ "node_modules/@babel/generator": {
+ "version": "7.28.3",
+ "resolved": "https://registry.npmjs.org/@babel/generator/-/generator-7.28.3.tgz",
+ "integrity": "sha512-3lSpxGgvnmZznmBkCRnVREPUFJv2wrv9iAoFDvADJc0ypmdOxdUtcLeBgBJ6zE0PMeTKnxeQzyk0xTBq4Ep7zw==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@babel/parser": "^7.28.3",
+ "@babel/types": "^7.28.2",
+ "@jridgewell/gen-mapping": "^0.3.12",
+ "@jridgewell/trace-mapping": "^0.3.28",
+ "jsesc": "^3.0.2"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-compilation-targets": {
+ "version": "7.27.2",
+ "resolved": "https://registry.npmjs.org/@babel/helper-compilation-targets/-/helper-compilation-targets-7.27.2.tgz",
+ "integrity": "sha512-2+1thGUUWWjLTYTHZWK1n8Yga0ijBz1XAhUXcKy81rd5g6yh7hGqMp45v7cadSbEHc9G3OTv45SyneRN3ps4DQ==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@babel/compat-data": "^7.27.2",
+ "@babel/helper-validator-option": "^7.27.1",
+ "browserslist": "^4.24.0",
+ "lru-cache": "^5.1.1",
+ "semver": "^6.3.1"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-globals": {
+ "version": "7.28.0",
+ "resolved": "https://registry.npmjs.org/@babel/helper-globals/-/helper-globals-7.28.0.tgz",
+ "integrity": "sha512-+W6cISkXFa1jXsDEdYA8HeevQT/FULhxzR99pxphltZcVaugps53THCeiWA8SguxxpSp3gKPiuYfSWopkLQ4hw==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-module-imports": {
+ "version": "7.27.1",
+ "resolved": "https://registry.npmjs.org/@babel/helper-module-imports/-/helper-module-imports-7.27.1.tgz",
+ "integrity": "sha512-0gSFWUPNXNopqtIPQvlD5WgXYI5GY2kP2cCvoT8kczjbfcfuIljTbcWrulD1CIPIX2gt1wghbDy08yE1p+/r3w==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@babel/traverse": "^7.27.1",
+ "@babel/types": "^7.27.1"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-module-transforms": {
+ "version": "7.28.3",
+ "resolved": "https://registry.npmjs.org/@babel/helper-module-transforms/-/helper-module-transforms-7.28.3.tgz",
+ "integrity": "sha512-gytXUbs8k2sXS9PnQptz5o0QnpLL51SwASIORY6XaBKF88nsOT0Zw9szLqlSGQDP/4TljBAD5y98p2U1fqkdsw==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@babel/helper-module-imports": "^7.27.1",
+ "@babel/helper-validator-identifier": "^7.27.1",
+ "@babel/traverse": "^7.28.3"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0"
+ }
+ },
+ "node_modules/@babel/helper-plugin-utils": {
+ "version": "7.27.1",
+ "resolved": "https://registry.npmjs.org/@babel/helper-plugin-utils/-/helper-plugin-utils-7.27.1.tgz",
+ "integrity": "sha512-1gn1Up5YXka3YYAHGKpbideQ5Yjf1tDa9qYcgysz+cNCXukyLl6DjPXhD3VRwSb8c0J9tA4b2+rHEZtc6R0tlw==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-string-parser": {
+ "version": "7.27.1",
+ "resolved": "https://registry.npmjs.org/@babel/helper-string-parser/-/helper-string-parser-7.27.1.tgz",
+ "integrity": "sha512-qMlSxKbpRlAridDExk92nSobyDdpPijUq2DW6oDnUqd0iOGxmQjyqhMIihI9+zv4LPyZdRje2cavWPbCbWm3eA==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-validator-identifier": {
+ "version": "7.27.1",
+ "resolved": "https://registry.npmjs.org/@babel/helper-validator-identifier/-/helper-validator-identifier-7.27.1.tgz",
+ "integrity": "sha512-D2hP9eA+Sqx1kBZgzxZh0y1trbuU+JoDkiEwqhQ36nodYqJwyEIhPSdMNd7lOm/4io72luTPWH20Yda0xOuUow==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-validator-option": {
+ "version": "7.27.1",
+ "resolved": "https://registry.npmjs.org/@babel/helper-validator-option/-/helper-validator-option-7.27.1.tgz",
+ "integrity": "sha512-YvjJow9FxbhFFKDSuFnVCe2WxXk1zWc22fFePVNEaWJEu8IrZVlda6N0uHwzZrUM1il7NC9Mlp4MaJYbYd9JSg==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helpers": {
+ "version": "7.28.3",
+ "resolved": "https://registry.npmjs.org/@babel/helpers/-/helpers-7.28.3.tgz",
+ "integrity": "sha512-PTNtvUQihsAsDHMOP5pfobP8C6CM4JWXmP8DrEIt46c3r2bf87Ua1zoqevsMo9g+tWDwgWrFP5EIxuBx5RudAw==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@babel/template": "^7.27.2",
+ "@babel/types": "^7.28.2"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/parser": {
+ "version": "7.28.3",
+ "resolved": "https://registry.npmjs.org/@babel/parser/-/parser-7.28.3.tgz",
+ "integrity": "sha512-7+Ey1mAgYqFAx2h0RuoxcQT5+MlG3GTV0TQrgr7/ZliKsm/MNDxVVutlWaziMq7wJNAz8MTqz55XLpWvva6StA==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@babel/types": "^7.28.2"
+ },
+ "bin": {
+ "parser": "bin/babel-parser.js"
+ },
+ "engines": {
+ "node": ">=6.0.0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-react-jsx-self": {
+ "version": "7.27.1",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-react-jsx-self/-/plugin-transform-react-jsx-self-7.27.1.tgz",
+ "integrity": "sha512-6UzkCs+ejGdZ5mFFC/OCUrv028ab2fp1znZmCZjAOBKiBK2jXD1O+BPSfX8X2qjJ75fZBMSnQn3Rq2mrBJK2mw==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.27.1"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-react-jsx-source": {
+ "version": "7.27.1",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-react-jsx-source/-/plugin-transform-react-jsx-source-7.27.1.tgz",
+ "integrity": "sha512-zbwoTsBruTeKB9hSq73ha66iFeJHuaFkUbwvqElnygoNbj/jHRsSeokowZFN3CZ64IvEqcmmkVe89OPXc7ldAw==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.27.1"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/runtime": {
+ "version": "7.28.4",
+ "resolved": "https://registry.npmjs.org/@babel/runtime/-/runtime-7.28.4.tgz",
+ "integrity": "sha512-Q/N6JNWvIvPnLDvjlE1OUBLPQHH6l3CltCEsHIujp45zQUSSh8K+gHnaEX45yAT1nyngnINhvWtzN+Nb9D8RAQ==",
+ "license": "MIT",
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/template": {
+ "version": "7.27.2",
+ "resolved": "https://registry.npmjs.org/@babel/template/-/template-7.27.2.tgz",
+ "integrity": "sha512-LPDZ85aEJyYSd18/DkjNh4/y1ntkE5KwUHWTiqgRxruuZL2F1yuHligVHLvcHY2vMHXttKFpJn6LwfI7cw7ODw==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@babel/code-frame": "^7.27.1",
+ "@babel/parser": "^7.27.2",
+ "@babel/types": "^7.27.1"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/traverse": {
+ "version": "7.28.3",
+ "resolved": "https://registry.npmjs.org/@babel/traverse/-/traverse-7.28.3.tgz",
+ "integrity": "sha512-7w4kZYHneL3A6NP2nxzHvT3HCZ7puDZZjFMqDpBPECub79sTtSO5CGXDkKrTQq8ksAwfD/XI2MRFX23njdDaIQ==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@babel/code-frame": "^7.27.1",
+ "@babel/generator": "^7.28.3",
+ "@babel/helper-globals": "^7.28.0",
+ "@babel/parser": "^7.28.3",
+ "@babel/template": "^7.27.2",
+ "@babel/types": "^7.28.2",
+ "debug": "^4.3.1"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/types": {
+ "version": "7.28.2",
+ "resolved": "https://registry.npmjs.org/@babel/types/-/types-7.28.2.tgz",
+ "integrity": "sha512-ruv7Ae4J5dUYULmeXw1gmb7rYRz57OWCPM57pHojnLq/3Z1CK2lNSLTCVjxVk1F/TZHwOZZrOWi0ur95BbLxNQ==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@babel/helper-string-parser": "^7.27.1",
+ "@babel/helper-validator-identifier": "^7.27.1"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@esbuild/aix-ppc64": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/aix-ppc64/-/aix-ppc64-0.25.9.tgz",
+ "integrity": "sha512-OaGtL73Jck6pBKjNIe24BnFE6agGl+6KxDtTfHhy1HmhthfKouEcOhqpSL64K4/0WCtbKFLOdzD/44cJ4k9opA==",
+ "cpu": [
+ "ppc64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "aix"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/android-arm": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/android-arm/-/android-arm-0.25.9.tgz",
+ "integrity": "sha512-5WNI1DaMtxQ7t7B6xa572XMXpHAaI/9Hnhk8lcxF4zVN4xstUgTlvuGDorBguKEnZO70qwEcLpfifMLoxiPqHQ==",
+ "cpu": [
+ "arm"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "android"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/android-arm64": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/android-arm64/-/android-arm64-0.25.9.tgz",
+ "integrity": "sha512-IDrddSmpSv51ftWslJMvl3Q2ZT98fUSL2/rlUXuVqRXHCs5EUF1/f+jbjF5+NG9UffUDMCiTyh8iec7u8RlTLg==",
+ "cpu": [
+ "arm64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "android"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/android-x64": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/android-x64/-/android-x64-0.25.9.tgz",
+ "integrity": "sha512-I853iMZ1hWZdNllhVZKm34f4wErd4lMyeV7BLzEExGEIZYsOzqDWDf+y082izYUE8gtJnYHdeDpN/6tUdwvfiw==",
+ "cpu": [
+ "x64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "android"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/darwin-arm64": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/darwin-arm64/-/darwin-arm64-0.25.9.tgz",
+ "integrity": "sha512-XIpIDMAjOELi/9PB30vEbVMs3GV1v2zkkPnuyRRURbhqjyzIINwj+nbQATh4H9GxUgH1kFsEyQMxwiLFKUS6Rg==",
+ "cpu": [
+ "arm64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "darwin"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/darwin-x64": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/darwin-x64/-/darwin-x64-0.25.9.tgz",
+ "integrity": "sha512-jhHfBzjYTA1IQu8VyrjCX4ApJDnH+ez+IYVEoJHeqJm9VhG9Dh2BYaJritkYK3vMaXrf7Ogr/0MQ8/MeIefsPQ==",
+ "cpu": [
+ "x64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "darwin"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/freebsd-arm64": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/freebsd-arm64/-/freebsd-arm64-0.25.9.tgz",
+ "integrity": "sha512-z93DmbnY6fX9+KdD4Ue/H6sYs+bhFQJNCPZsi4XWJoYblUqT06MQUdBCpcSfuiN72AbqeBFu5LVQTjfXDE2A6Q==",
+ "cpu": [
+ "arm64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "freebsd"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/freebsd-x64": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/freebsd-x64/-/freebsd-x64-0.25.9.tgz",
+ "integrity": "sha512-mrKX6H/vOyo5v71YfXWJxLVxgy1kyt1MQaD8wZJgJfG4gq4DpQGpgTB74e5yBeQdyMTbgxp0YtNj7NuHN0PoZg==",
+ "cpu": [
+ "x64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "freebsd"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/linux-arm": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/linux-arm/-/linux-arm-0.25.9.tgz",
+ "integrity": "sha512-HBU2Xv78SMgaydBmdor38lg8YDnFKSARg1Q6AT0/y2ezUAKiZvc211RDFHlEZRFNRVhcMamiToo7bDx3VEOYQw==",
+ "cpu": [
+ "arm"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "linux"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/linux-arm64": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/linux-arm64/-/linux-arm64-0.25.9.tgz",
+ "integrity": "sha512-BlB7bIcLT3G26urh5Dmse7fiLmLXnRlopw4s8DalgZ8ef79Jj4aUcYbk90g8iCa2467HX8SAIidbL7gsqXHdRw==",
+ "cpu": [
+ "arm64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "linux"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/linux-ia32": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/linux-ia32/-/linux-ia32-0.25.9.tgz",
+ "integrity": "sha512-e7S3MOJPZGp2QW6AK6+Ly81rC7oOSerQ+P8L0ta4FhVi+/j/v2yZzx5CqqDaWjtPFfYz21Vi1S0auHrap3Ma3A==",
+ "cpu": [
+ "ia32"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "linux"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/linux-loong64": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/linux-loong64/-/linux-loong64-0.25.9.tgz",
+ "integrity": "sha512-Sbe10Bnn0oUAB2AalYztvGcK+o6YFFA/9829PhOCUS9vkJElXGdphz0A3DbMdP8gmKkqPmPcMJmJOrI3VYB1JQ==",
+ "cpu": [
+ "loong64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "linux"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/linux-mips64el": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/linux-mips64el/-/linux-mips64el-0.25.9.tgz",
+ "integrity": "sha512-YcM5br0mVyZw2jcQeLIkhWtKPeVfAerES5PvOzaDxVtIyZ2NUBZKNLjC5z3/fUlDgT6w89VsxP2qzNipOaaDyA==",
+ "cpu": [
+ "mips64el"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "linux"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/linux-ppc64": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/linux-ppc64/-/linux-ppc64-0.25.9.tgz",
+ "integrity": "sha512-++0HQvasdo20JytyDpFvQtNrEsAgNG2CY1CLMwGXfFTKGBGQT3bOeLSYE2l1fYdvML5KUuwn9Z8L1EWe2tzs1w==",
+ "cpu": [
+ "ppc64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "linux"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/linux-riscv64": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/linux-riscv64/-/linux-riscv64-0.25.9.tgz",
+ "integrity": "sha512-uNIBa279Y3fkjV+2cUjx36xkx7eSjb8IvnL01eXUKXez/CBHNRw5ekCGMPM0BcmqBxBcdgUWuUXmVWwm4CH9kg==",
+ "cpu": [
+ "riscv64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "linux"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/linux-s390x": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/linux-s390x/-/linux-s390x-0.25.9.tgz",
+ "integrity": "sha512-Mfiphvp3MjC/lctb+7D287Xw1DGzqJPb/J2aHHcHxflUo+8tmN/6d4k6I2yFR7BVo5/g7x2Monq4+Yew0EHRIA==",
+ "cpu": [
+ "s390x"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "linux"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/linux-x64": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/linux-x64/-/linux-x64-0.25.9.tgz",
+ "integrity": "sha512-iSwByxzRe48YVkmpbgoxVzn76BXjlYFXC7NvLYq+b+kDjyyk30J0JY47DIn8z1MO3K0oSl9fZoRmZPQI4Hklzg==",
+ "cpu": [
+ "x64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "linux"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/netbsd-arm64": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/netbsd-arm64/-/netbsd-arm64-0.25.9.tgz",
+ "integrity": "sha512-9jNJl6FqaUG+COdQMjSCGW4QiMHH88xWbvZ+kRVblZsWrkXlABuGdFJ1E9L7HK+T0Yqd4akKNa/lO0+jDxQD4Q==",
+ "cpu": [
+ "arm64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "netbsd"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/netbsd-x64": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/netbsd-x64/-/netbsd-x64-0.25.9.tgz",
+ "integrity": "sha512-RLLdkflmqRG8KanPGOU7Rpg829ZHu8nFy5Pqdi9U01VYtG9Y0zOG6Vr2z4/S+/3zIyOxiK6cCeYNWOFR9QP87g==",
+ "cpu": [
+ "x64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "netbsd"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/openbsd-arm64": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/openbsd-arm64/-/openbsd-arm64-0.25.9.tgz",
+ "integrity": "sha512-YaFBlPGeDasft5IIM+CQAhJAqS3St3nJzDEgsgFixcfZeyGPCd6eJBWzke5piZuZ7CtL656eOSYKk4Ls2C0FRQ==",
+ "cpu": [
+ "arm64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "openbsd"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/openbsd-x64": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/openbsd-x64/-/openbsd-x64-0.25.9.tgz",
+ "integrity": "sha512-1MkgTCuvMGWuqVtAvkpkXFmtL8XhWy+j4jaSO2wxfJtilVCi0ZE37b8uOdMItIHz4I6z1bWWtEX4CJwcKYLcuA==",
+ "cpu": [
+ "x64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "openbsd"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/openharmony-arm64": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/openharmony-arm64/-/openharmony-arm64-0.25.9.tgz",
+ "integrity": "sha512-4Xd0xNiMVXKh6Fa7HEJQbrpP3m3DDn43jKxMjxLLRjWnRsfxjORYJlXPO4JNcXtOyfajXorRKY9NkOpTHptErg==",
+ "cpu": [
+ "arm64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "openharmony"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/sunos-x64": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/sunos-x64/-/sunos-x64-0.25.9.tgz",
+ "integrity": "sha512-WjH4s6hzo00nNezhp3wFIAfmGZ8U7KtrJNlFMRKxiI9mxEK1scOMAaa9i4crUtu+tBr+0IN6JCuAcSBJZfnphw==",
+ "cpu": [
+ "x64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "sunos"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/win32-arm64": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/win32-arm64/-/win32-arm64-0.25.9.tgz",
+ "integrity": "sha512-mGFrVJHmZiRqmP8xFOc6b84/7xa5y5YvR1x8djzXpJBSv/UsNK6aqec+6JDjConTgvvQefdGhFDAs2DLAds6gQ==",
+ "cpu": [
+ "arm64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "win32"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/win32-ia32": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/win32-ia32/-/win32-ia32-0.25.9.tgz",
+ "integrity": "sha512-b33gLVU2k11nVx1OhX3C8QQP6UHQK4ZtN56oFWvVXvz2VkDoe6fbG8TOgHFxEvqeqohmRnIHe5A1+HADk4OQww==",
+ "cpu": [
+ "ia32"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "win32"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@esbuild/win32-x64": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/@esbuild/win32-x64/-/win32-x64-0.25.9.tgz",
+ "integrity": "sha512-PPOl1mi6lpLNQxnGoyAfschAodRFYXJ+9fs6WHXz7CSWKbOqiMZsubC+BQsVKuul+3vKLuwTHsS2c2y9EoKwxQ==",
+ "cpu": [
+ "x64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "win32"
+ ],
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@eslint-community/eslint-utils": {
+ "version": "4.7.0",
+ "resolved": "https://registry.npmjs.org/@eslint-community/eslint-utils/-/eslint-utils-4.7.0.tgz",
+ "integrity": "sha512-dyybb3AcajC7uha6CvhdVRJqaKyn7w2YKqKyAN37NKYgZT36w+iRb0Dymmc5qEJ549c/S31cMMSFd75bteCpCw==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "eslint-visitor-keys": "^3.4.3"
+ },
+ "engines": {
+ "node": "^12.22.0 || ^14.17.0 || >=16.0.0"
+ },
+ "funding": {
+ "url": "https://opencollective.com/eslint"
+ },
+ "peerDependencies": {
+ "eslint": "^6.0.0 || ^7.0.0 || >=8.0.0"
+ }
+ },
+ "node_modules/@eslint-community/eslint-utils/node_modules/eslint-visitor-keys": {
+ "version": "3.4.3",
+ "resolved": "https://registry.npmjs.org/eslint-visitor-keys/-/eslint-visitor-keys-3.4.3.tgz",
+ "integrity": "sha512-wpc+LXeiyiisxPlEkUzU6svyS1frIO3Mgxj1fdy7Pm8Ygzguax2N3Fa/D/ag1WqbOprdI+uY6wMUl8/a2G+iag==",
+ "dev": true,
+ "license": "Apache-2.0",
+ "engines": {
+ "node": "^12.22.0 || ^14.17.0 || >=16.0.0"
+ },
+ "funding": {
+ "url": "https://opencollective.com/eslint"
+ }
+ },
+ "node_modules/@eslint-community/regexpp": {
+ "version": "4.12.1",
+ "resolved": "https://registry.npmjs.org/@eslint-community/regexpp/-/regexpp-4.12.1.tgz",
+ "integrity": "sha512-CCZCDJuduB9OUkFkY2IgppNZMi2lBQgD2qzwXkEia16cge2pijY/aXi96CJMquDMn3nJdlPV1A5KrJEXwfLNzQ==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": "^12.0.0 || ^14.0.0 || >=16.0.0"
+ }
+ },
+ "node_modules/@eslint/config-array": {
+ "version": "0.21.0",
+ "resolved": "https://registry.npmjs.org/@eslint/config-array/-/config-array-0.21.0.tgz",
+ "integrity": "sha512-ENIdc4iLu0d93HeYirvKmrzshzofPw6VkZRKQGe9Nv46ZnWUzcF1xV01dcvEg/1wXUR61OmmlSfyeyO7EvjLxQ==",
+ "dev": true,
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@eslint/object-schema": "^2.1.6",
+ "debug": "^4.3.1",
+ "minimatch": "^3.1.2"
+ },
+ "engines": {
+ "node": "^18.18.0 || ^20.9.0 || >=21.1.0"
+ }
+ },
+ "node_modules/@eslint/config-helpers": {
+ "version": "0.3.1",
+ "resolved": "https://registry.npmjs.org/@eslint/config-helpers/-/config-helpers-0.3.1.tgz",
+ "integrity": "sha512-xR93k9WhrDYpXHORXpxVL5oHj3Era7wo6k/Wd8/IsQNnZUTzkGS29lyn3nAT05v6ltUuTFVCCYDEGfy2Or/sPA==",
+ "dev": true,
+ "license": "Apache-2.0",
+ "engines": {
+ "node": "^18.18.0 || ^20.9.0 || >=21.1.0"
+ }
+ },
+ "node_modules/@eslint/core": {
+ "version": "0.15.2",
+ "resolved": "https://registry.npmjs.org/@eslint/core/-/core-0.15.2.tgz",
+ "integrity": "sha512-78Md3/Rrxh83gCxoUc0EiciuOHsIITzLy53m3d9UyiW8y9Dj2D29FeETqyKA+BRK76tnTp6RXWb3pCay8Oyomg==",
+ "dev": true,
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@types/json-schema": "^7.0.15"
+ },
+ "engines": {
+ "node": "^18.18.0 || ^20.9.0 || >=21.1.0"
+ }
+ },
+ "node_modules/@eslint/eslintrc": {
+ "version": "3.3.1",
+ "resolved": "https://registry.npmjs.org/@eslint/eslintrc/-/eslintrc-3.3.1.tgz",
+ "integrity": "sha512-gtF186CXhIl1p4pJNGZw8Yc6RlshoePRvE0X91oPGb3vZ8pM3qOS9W9NGPat9LziaBV7XrJWGylNQXkGcnM3IQ==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "ajv": "^6.12.4",
+ "debug": "^4.3.2",
+ "espree": "^10.0.1",
+ "globals": "^14.0.0",
+ "ignore": "^5.2.0",
+ "import-fresh": "^3.2.1",
+ "js-yaml": "^4.1.0",
+ "minimatch": "^3.1.2",
+ "strip-json-comments": "^3.1.1"
+ },
+ "engines": {
+ "node": "^18.18.0 || ^20.9.0 || >=21.1.0"
+ },
+ "funding": {
+ "url": "https://opencollective.com/eslint"
+ }
+ },
+ "node_modules/@eslint/eslintrc/node_modules/globals": {
+ "version": "14.0.0",
+ "resolved": "https://registry.npmjs.org/globals/-/globals-14.0.0.tgz",
+ "integrity": "sha512-oahGvuMGQlPw/ivIYBjVSrWAfWLBeku5tpPE2fOPLi+WHffIWbuh2tCjhyQhTBPMf5E9jDEH4FOmTYgYwbKwtQ==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=18"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/@eslint/js": {
+ "version": "9.34.0",
+ "resolved": "https://registry.npmjs.org/@eslint/js/-/js-9.34.0.tgz",
+ "integrity": "sha512-EoyvqQnBNsV1CWaEJ559rxXL4c8V92gxirbawSmVUOWXlsRxxQXl6LmCpdUblgxgSkDIqKnhzba2SjRTI/A5Rw==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": "^18.18.0 || ^20.9.0 || >=21.1.0"
+ },
+ "funding": {
+ "url": "https://eslint.org/donate"
+ }
+ },
+ "node_modules/@eslint/object-schema": {
+ "version": "2.1.6",
+ "resolved": "https://registry.npmjs.org/@eslint/object-schema/-/object-schema-2.1.6.tgz",
+ "integrity": "sha512-RBMg5FRL0I0gs51M/guSAj5/e14VQ4tpZnQNWwuDT66P14I43ItmPfIZRhO9fUVIPOAQXU47atlywZ/czoqFPA==",
+ "dev": true,
+ "license": "Apache-2.0",
+ "engines": {
+ "node": "^18.18.0 || ^20.9.0 || >=21.1.0"
+ }
+ },
+ "node_modules/@eslint/plugin-kit": {
+ "version": "0.3.5",
+ "resolved": "https://registry.npmjs.org/@eslint/plugin-kit/-/plugin-kit-0.3.5.tgz",
+ "integrity": "sha512-Z5kJ+wU3oA7MMIqVR9tyZRtjYPr4OC004Q4Rw7pgOKUOKkJfZ3O24nz3WYfGRpMDNmcOi3TwQOmgm7B7Tpii0w==",
+ "dev": true,
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@eslint/core": "^0.15.2",
+ "levn": "^0.4.1"
+ },
+ "engines": {
+ "node": "^18.18.0 || ^20.9.0 || >=21.1.0"
+ }
+ },
+ "node_modules/@firebase/ai": {
+ "version": "2.2.1",
+ "resolved": "https://registry.npmjs.org/@firebase/ai/-/ai-2.2.1.tgz",
+ "integrity": "sha512-0VWlkGB18oDhwMqsgxpt/usMsyjnH3a7hTvQPcAbk7VhFg0QZMDX60mQKfLTFKrB5VwmlaIdVsSZznsTY2S0wA==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/app-check-interop-types": "0.3.3",
+ "@firebase/component": "0.7.0",
+ "@firebase/logger": "0.5.0",
+ "@firebase/util": "1.13.0",
+ "tslib": "^2.1.0"
+ },
+ "engines": {
+ "node": ">=20.0.0"
+ },
+ "peerDependencies": {
+ "@firebase/app": "0.x",
+ "@firebase/app-types": "0.x"
+ }
+ },
+ "node_modules/@firebase/analytics": {
+ "version": "0.10.18",
+ "resolved": "https://registry.npmjs.org/@firebase/analytics/-/analytics-0.10.18.tgz",
+ "integrity": "sha512-iN7IgLvM06iFk8BeFoWqvVpRFW3Z70f+Qe2PfCJ7vPIgLPjHXDE774DhCT5Y2/ZU/ZbXPDPD60x/XPWEoZLNdg==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/component": "0.7.0",
+ "@firebase/installations": "0.6.19",
+ "@firebase/logger": "0.5.0",
+ "@firebase/util": "1.13.0",
+ "tslib": "^2.1.0"
+ },
+ "peerDependencies": {
+ "@firebase/app": "0.x"
+ }
+ },
+ "node_modules/@firebase/analytics-compat": {
+ "version": "0.2.24",
+ "resolved": "https://registry.npmjs.org/@firebase/analytics-compat/-/analytics-compat-0.2.24.tgz",
+ "integrity": "sha512-jE+kJnPG86XSqGQGhXXYt1tpTbCTED8OQJ/PQ90SEw14CuxRxx/H+lFbWA1rlFtFSsTCptAJtgyRBwr/f00vsw==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/analytics": "0.10.18",
+ "@firebase/analytics-types": "0.8.3",
+ "@firebase/component": "0.7.0",
+ "@firebase/util": "1.13.0",
+ "tslib": "^2.1.0"
+ },
+ "peerDependencies": {
+ "@firebase/app-compat": "0.x"
+ }
+ },
+ "node_modules/@firebase/analytics-types": {
+ "version": "0.8.3",
+ "resolved": "https://registry.npmjs.org/@firebase/analytics-types/-/analytics-types-0.8.3.tgz",
+ "integrity": "sha512-VrIp/d8iq2g501qO46uGz3hjbDb8xzYMrbu8Tp0ovzIzrvJZ2fvmj649gTjge/b7cCCcjT0H37g1gVtlNhnkbg==",
+ "license": "Apache-2.0"
+ },
+ "node_modules/@firebase/app": {
+ "version": "0.14.2",
+ "resolved": "https://registry.npmjs.org/@firebase/app/-/app-0.14.2.tgz",
+ "integrity": "sha512-Ecx2ig/JLC9ayIQwZHqm41Tzlf4c1WUuFhFUZB1y+JIJqDRE579x7Uil7tKT8MwDpOPwrK5ZtpxdSsrfy/LF8Q==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/component": "0.7.0",
+ "@firebase/logger": "0.5.0",
+ "@firebase/util": "1.13.0",
+ "idb": "7.1.1",
+ "tslib": "^2.1.0"
+ },
+ "engines": {
+ "node": ">=20.0.0"
+ }
+ },
+ "node_modules/@firebase/app-check": {
+ "version": "0.11.0",
+ "resolved": "https://registry.npmjs.org/@firebase/app-check/-/app-check-0.11.0.tgz",
+ "integrity": "sha512-XAvALQayUMBJo58U/rxW02IhsesaxxfWVmVkauZvGEz3vOAjMEQnzFlyblqkc2iAaO82uJ2ZVyZv9XzPfxjJ6w==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/component": "0.7.0",
+ "@firebase/logger": "0.5.0",
+ "@firebase/util": "1.13.0",
+ "tslib": "^2.1.0"
+ },
+ "engines": {
+ "node": ">=20.0.0"
+ },
+ "peerDependencies": {
+ "@firebase/app": "0.x"
+ }
+ },
+ "node_modules/@firebase/app-check-compat": {
+ "version": "0.4.0",
+ "resolved": "https://registry.npmjs.org/@firebase/app-check-compat/-/app-check-compat-0.4.0.tgz",
+ "integrity": "sha512-UfK2Q8RJNjYM/8MFORltZRG9lJj11k0nW84rrffiKvcJxLf1jf6IEjCIkCamykHE73C6BwqhVfhIBs69GXQV0g==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/app-check": "0.11.0",
+ "@firebase/app-check-types": "0.5.3",
+ "@firebase/component": "0.7.0",
+ "@firebase/logger": "0.5.0",
+ "@firebase/util": "1.13.0",
+ "tslib": "^2.1.0"
+ },
+ "engines": {
+ "node": ">=20.0.0"
+ },
+ "peerDependencies": {
+ "@firebase/app-compat": "0.x"
+ }
+ },
+ "node_modules/@firebase/app-check-interop-types": {
+ "version": "0.3.3",
+ "resolved": "https://registry.npmjs.org/@firebase/app-check-interop-types/-/app-check-interop-types-0.3.3.tgz",
+ "integrity": "sha512-gAlxfPLT2j8bTI/qfe3ahl2I2YcBQ8cFIBdhAQA4I2f3TndcO+22YizyGYuttLHPQEpWkhmpFW60VCFEPg4g5A==",
+ "license": "Apache-2.0"
+ },
+ "node_modules/@firebase/app-check-types": {
+ "version": "0.5.3",
+ "resolved": "https://registry.npmjs.org/@firebase/app-check-types/-/app-check-types-0.5.3.tgz",
+ "integrity": "sha512-hyl5rKSj0QmwPdsAxrI5x1otDlByQ7bvNvVt8G/XPO2CSwE++rmSVf3VEhaeOR4J8ZFaF0Z0NDSmLejPweZ3ng==",
+ "license": "Apache-2.0"
+ },
+ "node_modules/@firebase/app-compat": {
+ "version": "0.5.2",
+ "resolved": "https://registry.npmjs.org/@firebase/app-compat/-/app-compat-0.5.2.tgz",
+ "integrity": "sha512-cn+U27GDaBS/irsbvrfnPZdcCzeZPRGKieSlyb7vV6LSOL6mdECnB86PgYjYGxSNg8+U48L/NeevTV1odU+mOQ==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/app": "0.14.2",
+ "@firebase/component": "0.7.0",
+ "@firebase/logger": "0.5.0",
+ "@firebase/util": "1.13.0",
+ "tslib": "^2.1.0"
+ },
+ "engines": {
+ "node": ">=20.0.0"
+ }
+ },
+ "node_modules/@firebase/app-types": {
+ "version": "0.9.3",
+ "resolved": "https://registry.npmjs.org/@firebase/app-types/-/app-types-0.9.3.tgz",
+ "integrity": "sha512-kRVpIl4vVGJ4baogMDINbyrIOtOxqhkZQg4jTq3l8Lw6WSk0xfpEYzezFu+Kl4ve4fbPl79dvwRtaFqAC/ucCw==",
+ "license": "Apache-2.0"
+ },
+ "node_modules/@firebase/auth": {
+ "version": "1.11.0",
+ "resolved": "https://registry.npmjs.org/@firebase/auth/-/auth-1.11.0.tgz",
+ "integrity": "sha512-5j7+ua93X+IRcJ1oMDTClTo85l7Xe40WSkoJ+shzPrX7OISlVWLdE1mKC57PSD+/LfAbdhJmvKixINBw2ESK6w==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/component": "0.7.0",
+ "@firebase/logger": "0.5.0",
+ "@firebase/util": "1.13.0",
+ "tslib": "^2.1.0"
+ },
+ "engines": {
+ "node": ">=20.0.0"
+ },
+ "peerDependencies": {
+ "@firebase/app": "0.x",
+ "@react-native-async-storage/async-storage": "^1.18.1"
+ },
+ "peerDependenciesMeta": {
+ "@react-native-async-storage/async-storage": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/@firebase/auth-compat": {
+ "version": "0.6.0",
+ "resolved": "https://registry.npmjs.org/@firebase/auth-compat/-/auth-compat-0.6.0.tgz",
+ "integrity": "sha512-J0lGSxXlG/lYVi45wbpPhcWiWUMXevY4fvLZsN1GHh+po7TZVng+figdHBVhFheaiipU8HZyc7ljw1jNojM2nw==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/auth": "1.11.0",
+ "@firebase/auth-types": "0.13.0",
+ "@firebase/component": "0.7.0",
+ "@firebase/util": "1.13.0",
+ "tslib": "^2.1.0"
+ },
+ "engines": {
+ "node": ">=20.0.0"
+ },
+ "peerDependencies": {
+ "@firebase/app-compat": "0.x"
+ }
+ },
+ "node_modules/@firebase/auth-interop-types": {
+ "version": "0.2.4",
+ "resolved": "https://registry.npmjs.org/@firebase/auth-interop-types/-/auth-interop-types-0.2.4.tgz",
+ "integrity": "sha512-JPgcXKCuO+CWqGDnigBtvo09HeBs5u/Ktc2GaFj2m01hLarbxthLNm7Fk8iOP1aqAtXV+fnnGj7U28xmk7IwVA==",
+ "license": "Apache-2.0"
+ },
+ "node_modules/@firebase/auth-types": {
+ "version": "0.13.0",
+ "resolved": "https://registry.npmjs.org/@firebase/auth-types/-/auth-types-0.13.0.tgz",
+ "integrity": "sha512-S/PuIjni0AQRLF+l9ck0YpsMOdE8GO2KU6ubmBB7P+7TJUCQDa3R1dlgYm9UzGbbePMZsp0xzB93f2b/CgxMOg==",
+ "license": "Apache-2.0",
+ "peerDependencies": {
+ "@firebase/app-types": "0.x",
+ "@firebase/util": "1.x"
+ }
+ },
+ "node_modules/@firebase/component": {
+ "version": "0.7.0",
+ "resolved": "https://registry.npmjs.org/@firebase/component/-/component-0.7.0.tgz",
+ "integrity": "sha512-wR9En2A+WESUHexjmRHkqtaVH94WLNKt6rmeqZhSLBybg4Wyf0Umk04SZsS6sBq4102ZsDBFwoqMqJYj2IoDSg==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/util": "1.13.0",
+ "tslib": "^2.1.0"
+ },
+ "engines": {
+ "node": ">=20.0.0"
+ }
+ },
+ "node_modules/@firebase/data-connect": {
+ "version": "0.3.11",
+ "resolved": "https://registry.npmjs.org/@firebase/data-connect/-/data-connect-0.3.11.tgz",
+ "integrity": "sha512-G258eLzAD6im9Bsw+Qm1Z+P4x0PGNQ45yeUuuqe5M9B1rn0RJvvsQCRHXgE52Z+n9+WX1OJd/crcuunvOGc7Vw==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/auth-interop-types": "0.2.4",
+ "@firebase/component": "0.7.0",
+ "@firebase/logger": "0.5.0",
+ "@firebase/util": "1.13.0",
+ "tslib": "^2.1.0"
+ },
+ "peerDependencies": {
+ "@firebase/app": "0.x"
+ }
+ },
+ "node_modules/@firebase/database": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/@firebase/database/-/database-1.1.0.tgz",
+ "integrity": "sha512-gM6MJFae3pTyNLoc9VcJNuaUDej0ctdjn3cVtILo3D5lpp0dmUHHLFN/pUKe7ImyeB1KAvRlEYxvIHNF04Filg==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/app-check-interop-types": "0.3.3",
+ "@firebase/auth-interop-types": "0.2.4",
+ "@firebase/component": "0.7.0",
+ "@firebase/logger": "0.5.0",
+ "@firebase/util": "1.13.0",
+ "faye-websocket": "0.11.4",
+ "tslib": "^2.1.0"
+ },
+ "engines": {
+ "node": ">=20.0.0"
+ }
+ },
+ "node_modules/@firebase/database-compat": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/@firebase/database-compat/-/database-compat-2.1.0.tgz",
+ "integrity": "sha512-8nYc43RqxScsePVd1qe1xxvWNf0OBnbwHxmXJ7MHSuuTVYFO3eLyLW3PiCKJ9fHnmIz4p4LbieXwz+qtr9PZDg==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/component": "0.7.0",
+ "@firebase/database": "1.1.0",
+ "@firebase/database-types": "1.0.16",
+ "@firebase/logger": "0.5.0",
+ "@firebase/util": "1.13.0",
+ "tslib": "^2.1.0"
+ },
+ "engines": {
+ "node": ">=20.0.0"
+ }
+ },
+ "node_modules/@firebase/database-types": {
+ "version": "1.0.16",
+ "resolved": "https://registry.npmjs.org/@firebase/database-types/-/database-types-1.0.16.tgz",
+ "integrity": "sha512-xkQLQfU5De7+SPhEGAXFBnDryUWhhlFXelEg2YeZOQMCdoe7dL64DDAd77SQsR+6uoXIZY5MB4y/inCs4GTfcw==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/app-types": "0.9.3",
+ "@firebase/util": "1.13.0"
+ }
+ },
+ "node_modules/@firebase/firestore": {
+ "version": "4.9.1",
+ "resolved": "https://registry.npmjs.org/@firebase/firestore/-/firestore-4.9.1.tgz",
+ "integrity": "sha512-PYVUTkhC9y8pydrqC3O1Oc4AMfkGSWdmuH9xgPJjiEbpUIUPQ4J8wJhyuash+o2u+axmyNRFP8ULNUKb+WzBzQ==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/component": "0.7.0",
+ "@firebase/logger": "0.5.0",
+ "@firebase/util": "1.13.0",
+ "@firebase/webchannel-wrapper": "1.0.4",
+ "@grpc/grpc-js": "~1.9.0",
+ "@grpc/proto-loader": "^0.7.8",
+ "tslib": "^2.1.0"
+ },
+ "engines": {
+ "node": ">=20.0.0"
+ },
+ "peerDependencies": {
+ "@firebase/app": "0.x"
+ }
+ },
+ "node_modules/@firebase/firestore-compat": {
+ "version": "0.4.1",
+ "resolved": "https://registry.npmjs.org/@firebase/firestore-compat/-/firestore-compat-0.4.1.tgz",
+ "integrity": "sha512-BjalPTDh/K0vmR/M/DE148dpIqbcfvtFVTietbUDWDWYIl9YH0TTVp/EwXRbZwswPxyjx4GdHW61GB2AYVz1SQ==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/component": "0.7.0",
+ "@firebase/firestore": "4.9.1",
+ "@firebase/firestore-types": "3.0.3",
+ "@firebase/util": "1.13.0",
+ "tslib": "^2.1.0"
+ },
+ "engines": {
+ "node": ">=20.0.0"
+ },
+ "peerDependencies": {
+ "@firebase/app-compat": "0.x"
+ }
+ },
+ "node_modules/@firebase/firestore-types": {
+ "version": "3.0.3",
+ "resolved": "https://registry.npmjs.org/@firebase/firestore-types/-/firestore-types-3.0.3.tgz",
+ "integrity": "sha512-hD2jGdiWRxB/eZWF89xcK9gF8wvENDJkzpVFb4aGkzfEaKxVRD1kjz1t1Wj8VZEp2LCB53Yx1zD8mrhQu87R6Q==",
+ "license": "Apache-2.0",
+ "peerDependencies": {
+ "@firebase/app-types": "0.x",
+ "@firebase/util": "1.x"
+ }
+ },
+ "node_modules/@firebase/functions": {
+ "version": "0.13.1",
+ "resolved": "https://registry.npmjs.org/@firebase/functions/-/functions-0.13.1.tgz",
+ "integrity": "sha512-sUeWSb0rw5T+6wuV2o9XNmh9yHxjFI9zVGFnjFi+n7drTEWpl7ZTz1nROgGrSu472r+LAaj+2YaSicD4R8wfbw==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/app-check-interop-types": "0.3.3",
+ "@firebase/auth-interop-types": "0.2.4",
+ "@firebase/component": "0.7.0",
+ "@firebase/messaging-interop-types": "0.2.3",
+ "@firebase/util": "1.13.0",
+ "tslib": "^2.1.0"
+ },
+ "engines": {
+ "node": ">=20.0.0"
+ },
+ "peerDependencies": {
+ "@firebase/app": "0.x"
+ }
+ },
+ "node_modules/@firebase/functions-compat": {
+ "version": "0.4.1",
+ "resolved": "https://registry.npmjs.org/@firebase/functions-compat/-/functions-compat-0.4.1.tgz",
+ "integrity": "sha512-AxxUBXKuPrWaVNQ8o1cG1GaCAtXT8a0eaTDfqgS5VsRYLAR0ALcfqDLwo/QyijZj1w8Qf8n3Qrfy/+Im245hOQ==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/component": "0.7.0",
+ "@firebase/functions": "0.13.1",
+ "@firebase/functions-types": "0.6.3",
+ "@firebase/util": "1.13.0",
+ "tslib": "^2.1.0"
+ },
+ "engines": {
+ "node": ">=20.0.0"
+ },
+ "peerDependencies": {
+ "@firebase/app-compat": "0.x"
+ }
+ },
+ "node_modules/@firebase/functions-types": {
+ "version": "0.6.3",
+ "resolved": "https://registry.npmjs.org/@firebase/functions-types/-/functions-types-0.6.3.tgz",
+ "integrity": "sha512-EZoDKQLUHFKNx6VLipQwrSMh01A1SaL3Wg6Hpi//x6/fJ6Ee4hrAeswK99I5Ht8roiniKHw4iO0B1Oxj5I4plg==",
+ "license": "Apache-2.0"
+ },
+ "node_modules/@firebase/installations": {
+ "version": "0.6.19",
+ "resolved": "https://registry.npmjs.org/@firebase/installations/-/installations-0.6.19.tgz",
+ "integrity": "sha512-nGDmiwKLI1lerhwfwSHvMR9RZuIH5/8E3kgUWnVRqqL7kGVSktjLTWEMva7oh5yxQ3zXfIlIwJwMcaM5bK5j8Q==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/component": "0.7.0",
+ "@firebase/util": "1.13.0",
+ "idb": "7.1.1",
+ "tslib": "^2.1.0"
+ },
+ "peerDependencies": {
+ "@firebase/app": "0.x"
+ }
+ },
+ "node_modules/@firebase/installations-compat": {
+ "version": "0.2.19",
+ "resolved": "https://registry.npmjs.org/@firebase/installations-compat/-/installations-compat-0.2.19.tgz",
+ "integrity": "sha512-khfzIY3EI5LePePo7vT19/VEIH1E3iYsHknI/6ek9T8QCozAZshWT9CjlwOzZrKvTHMeNcbpo/VSOSIWDSjWdQ==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/component": "0.7.0",
+ "@firebase/installations": "0.6.19",
+ "@firebase/installations-types": "0.5.3",
+ "@firebase/util": "1.13.0",
+ "tslib": "^2.1.0"
+ },
+ "peerDependencies": {
+ "@firebase/app-compat": "0.x"
+ }
+ },
+ "node_modules/@firebase/installations-types": {
+ "version": "0.5.3",
+ "resolved": "https://registry.npmjs.org/@firebase/installations-types/-/installations-types-0.5.3.tgz",
+ "integrity": "sha512-2FJI7gkLqIE0iYsNQ1P751lO3hER+Umykel+TkLwHj6plzWVxqvfclPUZhcKFVQObqloEBTmpi2Ozn7EkCABAA==",
+ "license": "Apache-2.0",
+ "peerDependencies": {
+ "@firebase/app-types": "0.x"
+ }
+ },
+ "node_modules/@firebase/logger": {
+ "version": "0.5.0",
+ "resolved": "https://registry.npmjs.org/@firebase/logger/-/logger-0.5.0.tgz",
+ "integrity": "sha512-cGskaAvkrnh42b3BA3doDWeBmuHFO/Mx5A83rbRDYakPjO9bJtRL3dX7javzc2Rr/JHZf4HlterTW2lUkfeN4g==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "tslib": "^2.1.0"
+ },
+ "engines": {
+ "node": ">=20.0.0"
+ }
+ },
+ "node_modules/@firebase/messaging": {
+ "version": "0.12.23",
+ "resolved": "https://registry.npmjs.org/@firebase/messaging/-/messaging-0.12.23.tgz",
+ "integrity": "sha512-cfuzv47XxqW4HH/OcR5rM+AlQd1xL/VhuaeW/wzMW1LFrsFcTn0GND/hak1vkQc2th8UisBcrkVcQAnOnKwYxg==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/component": "0.7.0",
+ "@firebase/installations": "0.6.19",
+ "@firebase/messaging-interop-types": "0.2.3",
+ "@firebase/util": "1.13.0",
+ "idb": "7.1.1",
+ "tslib": "^2.1.0"
+ },
+ "peerDependencies": {
+ "@firebase/app": "0.x"
+ }
+ },
+ "node_modules/@firebase/messaging-compat": {
+ "version": "0.2.23",
+ "resolved": "https://registry.npmjs.org/@firebase/messaging-compat/-/messaging-compat-0.2.23.tgz",
+ "integrity": "sha512-SN857v/kBUvlQ9X/UjAqBoQ2FEaL1ZozpnmL1ByTe57iXkmnVVFm9KqAsTfmf+OEwWI4kJJe9NObtN/w22lUgg==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/component": "0.7.0",
+ "@firebase/messaging": "0.12.23",
+ "@firebase/util": "1.13.0",
+ "tslib": "^2.1.0"
+ },
+ "peerDependencies": {
+ "@firebase/app-compat": "0.x"
+ }
+ },
+ "node_modules/@firebase/messaging-interop-types": {
+ "version": "0.2.3",
+ "resolved": "https://registry.npmjs.org/@firebase/messaging-interop-types/-/messaging-interop-types-0.2.3.tgz",
+ "integrity": "sha512-xfzFaJpzcmtDjycpDeCUj0Ge10ATFi/VHVIvEEjDNc3hodVBQADZ7BWQU7CuFpjSHE+eLuBI13z5F/9xOoGX8Q==",
+ "license": "Apache-2.0"
+ },
+ "node_modules/@firebase/performance": {
+ "version": "0.7.9",
+ "resolved": "https://registry.npmjs.org/@firebase/performance/-/performance-0.7.9.tgz",
+ "integrity": "sha512-UzybENl1EdM2I1sjYm74xGt/0JzRnU/0VmfMAKo2LSpHJzaj77FCLZXmYQ4oOuE+Pxtt8Wy2BVJEENiZkaZAzQ==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/component": "0.7.0",
+ "@firebase/installations": "0.6.19",
+ "@firebase/logger": "0.5.0",
+ "@firebase/util": "1.13.0",
+ "tslib": "^2.1.0",
+ "web-vitals": "^4.2.4"
+ },
+ "peerDependencies": {
+ "@firebase/app": "0.x"
+ }
+ },
+ "node_modules/@firebase/performance-compat": {
+ "version": "0.2.22",
+ "resolved": "https://registry.npmjs.org/@firebase/performance-compat/-/performance-compat-0.2.22.tgz",
+ "integrity": "sha512-xLKxaSAl/FVi10wDX/CHIYEUP13jXUjinL+UaNXT9ByIvxII5Ne5150mx6IgM8G6Q3V+sPiw9C8/kygkyHUVxg==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/component": "0.7.0",
+ "@firebase/logger": "0.5.0",
+ "@firebase/performance": "0.7.9",
+ "@firebase/performance-types": "0.2.3",
+ "@firebase/util": "1.13.0",
+ "tslib": "^2.1.0"
+ },
+ "peerDependencies": {
+ "@firebase/app-compat": "0.x"
+ }
+ },
+ "node_modules/@firebase/performance-types": {
+ "version": "0.2.3",
+ "resolved": "https://registry.npmjs.org/@firebase/performance-types/-/performance-types-0.2.3.tgz",
+ "integrity": "sha512-IgkyTz6QZVPAq8GSkLYJvwSLr3LS9+V6vNPQr0x4YozZJiLF5jYixj0amDtATf1X0EtYHqoPO48a9ija8GocxQ==",
+ "license": "Apache-2.0"
+ },
+ "node_modules/@firebase/remote-config": {
+ "version": "0.6.6",
+ "resolved": "https://registry.npmjs.org/@firebase/remote-config/-/remote-config-0.6.6.tgz",
+ "integrity": "sha512-Yelp5xd8hM4NO1G1SuWrIk4h5K42mNwC98eWZ9YLVu6Z0S6hFk1mxotAdCRmH2luH8FASlYgLLq6OQLZ4nbnCA==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/component": "0.7.0",
+ "@firebase/installations": "0.6.19",
+ "@firebase/logger": "0.5.0",
+ "@firebase/util": "1.13.0",
+ "tslib": "^2.1.0"
+ },
+ "peerDependencies": {
+ "@firebase/app": "0.x"
+ }
+ },
+ "node_modules/@firebase/remote-config-compat": {
+ "version": "0.2.19",
+ "resolved": "https://registry.npmjs.org/@firebase/remote-config-compat/-/remote-config-compat-0.2.19.tgz",
+ "integrity": "sha512-y7PZAb0l5+5oIgLJr88TNSelxuASGlXyAKj+3pUc4fDuRIdPNBoONMHaIUa9rlffBR5dErmaD2wUBJ7Z1a513Q==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/component": "0.7.0",
+ "@firebase/logger": "0.5.0",
+ "@firebase/remote-config": "0.6.6",
+ "@firebase/remote-config-types": "0.4.0",
+ "@firebase/util": "1.13.0",
+ "tslib": "^2.1.0"
+ },
+ "peerDependencies": {
+ "@firebase/app-compat": "0.x"
+ }
+ },
+ "node_modules/@firebase/remote-config-types": {
+ "version": "0.4.0",
+ "resolved": "https://registry.npmjs.org/@firebase/remote-config-types/-/remote-config-types-0.4.0.tgz",
+ "integrity": "sha512-7p3mRE/ldCNYt8fmWMQ/MSGRmXYlJ15Rvs9Rk17t8p0WwZDbeK7eRmoI1tvCPaDzn9Oqh+yD6Lw+sGLsLg4kKg==",
+ "license": "Apache-2.0"
+ },
+ "node_modules/@firebase/storage": {
+ "version": "0.14.0",
+ "resolved": "https://registry.npmjs.org/@firebase/storage/-/storage-0.14.0.tgz",
+ "integrity": "sha512-xWWbb15o6/pWEw8H01UQ1dC5U3rf8QTAzOChYyCpafV6Xki7KVp3Yaw2nSklUwHEziSWE9KoZJS7iYeyqWnYFA==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/component": "0.7.0",
+ "@firebase/util": "1.13.0",
+ "tslib": "^2.1.0"
+ },
+ "engines": {
+ "node": ">=20.0.0"
+ },
+ "peerDependencies": {
+ "@firebase/app": "0.x"
+ }
+ },
+ "node_modules/@firebase/storage-compat": {
+ "version": "0.4.0",
+ "resolved": "https://registry.npmjs.org/@firebase/storage-compat/-/storage-compat-0.4.0.tgz",
+ "integrity": "sha512-vDzhgGczr1OfcOy285YAPur5pWDEvD67w4thyeCUh6Ys0izN9fNYtA1MJERmNBfqjqu0lg0FM5GLbw0Il21M+g==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/component": "0.7.0",
+ "@firebase/storage": "0.14.0",
+ "@firebase/storage-types": "0.8.3",
+ "@firebase/util": "1.13.0",
+ "tslib": "^2.1.0"
+ },
+ "engines": {
+ "node": ">=20.0.0"
+ },
+ "peerDependencies": {
+ "@firebase/app-compat": "0.x"
+ }
+ },
+ "node_modules/@firebase/storage-types": {
+ "version": "0.8.3",
+ "resolved": "https://registry.npmjs.org/@firebase/storage-types/-/storage-types-0.8.3.tgz",
+ "integrity": "sha512-+Muk7g9uwngTpd8xn9OdF/D48uiQ7I1Fae7ULsWPuKoCH3HU7bfFPhxtJYzyhjdniowhuDpQcfPmuNRAqZEfvg==",
+ "license": "Apache-2.0",
+ "peerDependencies": {
+ "@firebase/app-types": "0.x",
+ "@firebase/util": "1.x"
+ }
+ },
+ "node_modules/@firebase/util": {
+ "version": "1.13.0",
+ "resolved": "https://registry.npmjs.org/@firebase/util/-/util-1.13.0.tgz",
+ "integrity": "sha512-0AZUyYUfpMNcztR5l09izHwXkZpghLgCUaAGjtMwXnCg3bj4ml5VgiwqOMOxJ+Nw4qN/zJAaOQBcJ7KGkWStqQ==",
+ "hasInstallScript": true,
+ "license": "Apache-2.0",
+ "dependencies": {
+ "tslib": "^2.1.0"
+ },
+ "engines": {
+ "node": ">=20.0.0"
+ }
+ },
+ "node_modules/@firebase/webchannel-wrapper": {
+ "version": "1.0.4",
+ "resolved": "https://registry.npmjs.org/@firebase/webchannel-wrapper/-/webchannel-wrapper-1.0.4.tgz",
+ "integrity": "sha512-6m8+P+dE/RPl4OPzjTxcTbQ0rGeRyeTvAi9KwIffBVCiAMKrfXfLZaqD1F+m8t4B5/Q5aHsMozOgirkH1F5oMQ==",
+ "license": "Apache-2.0"
+ },
+ "node_modules/@grpc/grpc-js": {
+ "version": "1.9.15",
+ "resolved": "https://registry.npmjs.org/@grpc/grpc-js/-/grpc-js-1.9.15.tgz",
+ "integrity": "sha512-nqE7Hc0AzI+euzUwDAy0aY5hCp10r734gMGRdU+qOPX0XSceI2ULrcXB5U2xSc5VkWwalCj4M7GzCAygZl2KoQ==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@grpc/proto-loader": "^0.7.8",
+ "@types/node": ">=12.12.47"
+ },
+ "engines": {
+ "node": "^8.13.0 || >=10.10.0"
+ }
+ },
+ "node_modules/@grpc/proto-loader": {
+ "version": "0.7.15",
+ "resolved": "https://registry.npmjs.org/@grpc/proto-loader/-/proto-loader-0.7.15.tgz",
+ "integrity": "sha512-tMXdRCfYVixjuFK+Hk0Q1s38gV9zDiDJfWL3h1rv4Qc39oILCu1TRTDt7+fGUI8K4G1Fj125Hx/ru3azECWTyQ==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "lodash.camelcase": "^4.3.0",
+ "long": "^5.0.0",
+ "protobufjs": "^7.2.5",
+ "yargs": "^17.7.2"
+ },
+ "bin": {
+ "proto-loader-gen-types": "build/bin/proto-loader-gen-types.js"
+ },
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/@humanfs/core": {
+ "version": "0.19.1",
+ "resolved": "https://registry.npmjs.org/@humanfs/core/-/core-0.19.1.tgz",
+ "integrity": "sha512-5DyQ4+1JEUzejeK1JGICcideyfUbGixgS9jNgex5nqkW+cY7WZhxBigmieN5Qnw9ZosSNVC9KQKyb+GUaGyKUA==",
+ "dev": true,
+ "license": "Apache-2.0",
+ "engines": {
+ "node": ">=18.18.0"
+ }
+ },
+ "node_modules/@humanfs/node": {
+ "version": "0.16.6",
+ "resolved": "https://registry.npmjs.org/@humanfs/node/-/node-0.16.6.tgz",
+ "integrity": "sha512-YuI2ZHQL78Q5HbhDiBA1X4LmYdXCKCMQIfw0pw7piHJwyREFebJUvrQN4cMssyES6x+vfUbx1CIpaQUKYdQZOw==",
+ "dev": true,
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@humanfs/core": "^0.19.1",
+ "@humanwhocodes/retry": "^0.3.0"
+ },
+ "engines": {
+ "node": ">=18.18.0"
+ }
+ },
+ "node_modules/@humanfs/node/node_modules/@humanwhocodes/retry": {
+ "version": "0.3.1",
+ "resolved": "https://registry.npmjs.org/@humanwhocodes/retry/-/retry-0.3.1.tgz",
+ "integrity": "sha512-JBxkERygn7Bv/GbN5Rv8Ul6LVknS+5Bp6RgDC/O8gEBU/yeH5Ui5C/OlWrTb6qct7LjjfT6Re2NxB0ln0yYybA==",
+ "dev": true,
+ "license": "Apache-2.0",
+ "engines": {
+ "node": ">=18.18"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/nzakas"
+ }
+ },
+ "node_modules/@humanwhocodes/module-importer": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/@humanwhocodes/module-importer/-/module-importer-1.0.1.tgz",
+ "integrity": "sha512-bxveV4V8v5Yb4ncFTT3rPSgZBOpCkjfK0y4oVVVJwIuDVBRMDXrPyXRL988i5ap9m9bnyEEjWfm5WkBmtffLfA==",
+ "dev": true,
+ "license": "Apache-2.0",
+ "engines": {
+ "node": ">=12.22"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/nzakas"
+ }
+ },
+ "node_modules/@humanwhocodes/retry": {
+ "version": "0.4.3",
+ "resolved": "https://registry.npmjs.org/@humanwhocodes/retry/-/retry-0.4.3.tgz",
+ "integrity": "sha512-bV0Tgo9K4hfPCek+aMAn81RppFKv2ySDQeMoSZuvTASywNTnVJCArCZE2FWqpvIatKu7VMRLWlR1EazvVhDyhQ==",
+ "dev": true,
+ "license": "Apache-2.0",
+ "engines": {
+ "node": ">=18.18"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/nzakas"
+ }
+ },
+ "node_modules/@jridgewell/gen-mapping": {
+ "version": "0.3.13",
+ "resolved": "https://registry.npmjs.org/@jridgewell/gen-mapping/-/gen-mapping-0.3.13.tgz",
+ "integrity": "sha512-2kkt/7niJ6MgEPxF0bYdQ6etZaA+fQvDcLKckhy1yIQOzaoKjBBjSj63/aLVjYE3qhRt5dvM+uUyfCg6UKCBbA==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@jridgewell/sourcemap-codec": "^1.5.0",
+ "@jridgewell/trace-mapping": "^0.3.24"
+ }
+ },
+ "node_modules/@jridgewell/resolve-uri": {
+ "version": "3.1.2",
+ "resolved": "https://registry.npmjs.org/@jridgewell/resolve-uri/-/resolve-uri-3.1.2.tgz",
+ "integrity": "sha512-bRISgCIjP20/tbWSPWMEi54QVPRZExkuD9lJL+UIxUKtwVJA8wW1Trb1jMs1RFXo1CBTNZ/5hpC9QvmKWdopKw==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=6.0.0"
+ }
+ },
+ "node_modules/@jridgewell/sourcemap-codec": {
+ "version": "1.5.5",
+ "resolved": "https://registry.npmjs.org/@jridgewell/sourcemap-codec/-/sourcemap-codec-1.5.5.tgz",
+ "integrity": "sha512-cYQ9310grqxueWbl+WuIUIaiUaDcj7WOq5fVhEljNVgRfOUhY9fy2zTvfoqWsnebh8Sl70VScFbICvJnLKB0Og==",
+ "dev": true,
+ "license": "MIT"
+ },
+ "node_modules/@jridgewell/trace-mapping": {
+ "version": "0.3.30",
+ "resolved": "https://registry.npmjs.org/@jridgewell/trace-mapping/-/trace-mapping-0.3.30.tgz",
+ "integrity": "sha512-GQ7Nw5G2lTu/BtHTKfXhKHok2WGetd4XYcVKGx00SjAk8GMwgJM3zr6zORiPGuOE+/vkc90KtTosSSvaCjKb2Q==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@jridgewell/resolve-uri": "^3.1.0",
+ "@jridgewell/sourcemap-codec": "^1.4.14"
+ }
+ },
+ "node_modules/@kurkle/color": {
+ "version": "0.3.4",
+ "resolved": "https://registry.npmjs.org/@kurkle/color/-/color-0.3.4.tgz",
+ "integrity": "sha512-M5UknZPHRu3DEDWoipU6sE8PdkZ6Z/S+v4dD+Ke8IaNlpdSQah50lz1KtcFBa2vsdOnwbbnxJwVM4wty6udA5w==",
+ "license": "MIT"
+ },
+ "node_modules/@protobufjs/aspromise": {
+ "version": "1.1.2",
+ "resolved": "https://registry.npmjs.org/@protobufjs/aspromise/-/aspromise-1.1.2.tgz",
+ "integrity": "sha512-j+gKExEuLmKwvz3OgROXtrJ2UG2x8Ch2YZUxahh+s1F2HZ+wAceUNLkvy6zKCPVRkU++ZWQrdxsUeQXmcg4uoQ==",
+ "license": "BSD-3-Clause"
+ },
+ "node_modules/@protobufjs/base64": {
+ "version": "1.1.2",
+ "resolved": "https://registry.npmjs.org/@protobufjs/base64/-/base64-1.1.2.tgz",
+ "integrity": "sha512-AZkcAA5vnN/v4PDqKyMR5lx7hZttPDgClv83E//FMNhR2TMcLUhfRUBHCmSl0oi9zMgDDqRUJkSxO3wm85+XLg==",
+ "license": "BSD-3-Clause"
+ },
+ "node_modules/@protobufjs/codegen": {
+ "version": "2.0.4",
+ "resolved": "https://registry.npmjs.org/@protobufjs/codegen/-/codegen-2.0.4.tgz",
+ "integrity": "sha512-YyFaikqM5sH0ziFZCN3xDC7zeGaB/d0IUb9CATugHWbd1FRFwWwt4ld4OYMPWu5a3Xe01mGAULCdqhMlPl29Jg==",
+ "license": "BSD-3-Clause"
+ },
+ "node_modules/@protobufjs/eventemitter": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/@protobufjs/eventemitter/-/eventemitter-1.1.0.tgz",
+ "integrity": "sha512-j9ednRT81vYJ9OfVuXG6ERSTdEL1xVsNgqpkxMsbIabzSo3goCjDIveeGv5d03om39ML71RdmrGNjG5SReBP/Q==",
+ "license": "BSD-3-Clause"
+ },
+ "node_modules/@protobufjs/fetch": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/@protobufjs/fetch/-/fetch-1.1.0.tgz",
+ "integrity": "sha512-lljVXpqXebpsijW71PZaCYeIcE5on1w5DlQy5WH6GLbFryLUrBD4932W/E2BSpfRJWseIL4v/KPgBFxDOIdKpQ==",
+ "license": "BSD-3-Clause",
+ "dependencies": {
+ "@protobufjs/aspromise": "^1.1.1",
+ "@protobufjs/inquire": "^1.1.0"
+ }
+ },
+ "node_modules/@protobufjs/float": {
+ "version": "1.0.2",
+ "resolved": "https://registry.npmjs.org/@protobufjs/float/-/float-1.0.2.tgz",
+ "integrity": "sha512-Ddb+kVXlXst9d+R9PfTIxh1EdNkgoRe5tOX6t01f1lYWOvJnSPDBlG241QLzcyPdoNTsblLUdujGSE4RzrTZGQ==",
+ "license": "BSD-3-Clause"
+ },
+ "node_modules/@protobufjs/inquire": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/@protobufjs/inquire/-/inquire-1.1.0.tgz",
+ "integrity": "sha512-kdSefcPdruJiFMVSbn801t4vFK7KB/5gd2fYvrxhuJYg8ILrmn9SKSX2tZdV6V+ksulWqS7aXjBcRXl3wHoD9Q==",
+ "license": "BSD-3-Clause"
+ },
+ "node_modules/@protobufjs/path": {
+ "version": "1.1.2",
+ "resolved": "https://registry.npmjs.org/@protobufjs/path/-/path-1.1.2.tgz",
+ "integrity": "sha512-6JOcJ5Tm08dOHAbdR3GrvP+yUUfkjG5ePsHYczMFLq3ZmMkAD98cDgcT2iA1lJ9NVwFd4tH/iSSoe44YWkltEA==",
+ "license": "BSD-3-Clause"
+ },
+ "node_modules/@protobufjs/pool": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/@protobufjs/pool/-/pool-1.1.0.tgz",
+ "integrity": "sha512-0kELaGSIDBKvcgS4zkjz1PeddatrjYcmMWOlAuAPwAeccUrPHdUqo/J6LiymHHEiJT5NrF1UVwxY14f+fy4WQw==",
+ "license": "BSD-3-Clause"
+ },
+ "node_modules/@protobufjs/utf8": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/@protobufjs/utf8/-/utf8-1.1.0.tgz",
+ "integrity": "sha512-Vvn3zZrhQZkkBE8LSuW3em98c0FwgO4nxzv6OdSxPKJIEKY2bGbHn+mhGIPerzI4twdxaP8/0+06HBpwf345Lw==",
+ "license": "BSD-3-Clause"
+ },
+ "node_modules/@rolldown/pluginutils": {
+ "version": "1.0.0-beta.34",
+ "resolved": "https://registry.npmjs.org/@rolldown/pluginutils/-/pluginutils-1.0.0-beta.34.tgz",
+ "integrity": "sha512-LyAREkZHP5pMom7c24meKmJCdhf2hEyvam2q0unr3or9ydwDL+DJ8chTF6Av/RFPb3rH8UFBdMzO5MxTZW97oA==",
+ "dev": true,
+ "license": "MIT"
+ },
+ "node_modules/@rollup/rollup-android-arm-eabi": {
+ "version": "4.49.0",
+ "resolved": "https://registry.npmjs.org/@rollup/rollup-android-arm-eabi/-/rollup-android-arm-eabi-4.49.0.tgz",
+ "integrity": "sha512-rlKIeL854Ed0e09QGYFlmDNbka6I3EQFw7iZuugQjMb11KMpJCLPFL4ZPbMfaEhLADEL1yx0oujGkBQ7+qW3eA==",
+ "cpu": [
+ "arm"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "android"
+ ]
+ },
+ "node_modules/@rollup/rollup-android-arm64": {
+ "version": "4.49.0",
+ "resolved": "https://registry.npmjs.org/@rollup/rollup-android-arm64/-/rollup-android-arm64-4.49.0.tgz",
+ "integrity": "sha512-cqPpZdKUSQYRtLLr6R4X3sD4jCBO1zUmeo3qrWBCqYIeH8Q3KRL4F3V7XJ2Rm8/RJOQBZuqzQGWPjjvFUcYa/w==",
+ "cpu": [
+ "arm64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "android"
+ ]
+ },
+ "node_modules/@rollup/rollup-darwin-arm64": {
+ "version": "4.49.0",
+ "resolved": "https://registry.npmjs.org/@rollup/rollup-darwin-arm64/-/rollup-darwin-arm64-4.49.0.tgz",
+ "integrity": "sha512-99kMMSMQT7got6iYX3yyIiJfFndpojBmkHfTc1rIje8VbjhmqBXE+nb7ZZP3A5skLyujvT0eIUCUsxAe6NjWbw==",
+ "cpu": [
+ "arm64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "darwin"
+ ]
+ },
+ "node_modules/@rollup/rollup-darwin-x64": {
+ "version": "4.49.0",
+ "resolved": "https://registry.npmjs.org/@rollup/rollup-darwin-x64/-/rollup-darwin-x64-4.49.0.tgz",
+ "integrity": "sha512-y8cXoD3wdWUDpjOLMKLx6l+NFz3NlkWKcBCBfttUn+VGSfgsQ5o/yDUGtzE9HvsodkP0+16N0P4Ty1VuhtRUGg==",
+ "cpu": [
+ "x64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "darwin"
+ ]
+ },
+ "node_modules/@rollup/rollup-freebsd-arm64": {
+ "version": "4.49.0",
+ "resolved": "https://registry.npmjs.org/@rollup/rollup-freebsd-arm64/-/rollup-freebsd-arm64-4.49.0.tgz",
+ "integrity": "sha512-3mY5Pr7qv4GS4ZvWoSP8zha8YoiqrU+e0ViPvB549jvliBbdNLrg2ywPGkgLC3cmvN8ya3za+Q2xVyT6z+vZqA==",
+ "cpu": [
+ "arm64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "freebsd"
+ ]
+ },
+ "node_modules/@rollup/rollup-freebsd-x64": {
+ "version": "4.49.0",
+ "resolved": "https://registry.npmjs.org/@rollup/rollup-freebsd-x64/-/rollup-freebsd-x64-4.49.0.tgz",
+ "integrity": "sha512-C9KzzOAQU5gU4kG8DTk+tjdKjpWhVWd5uVkinCwwFub2m7cDYLOdtXoMrExfeBmeRy9kBQMkiyJ+HULyF1yj9w==",
+ "cpu": [
+ "x64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "freebsd"
+ ]
+ },
+ "node_modules/@rollup/rollup-linux-arm-gnueabihf": {
+ "version": "4.49.0",
+ "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm-gnueabihf/-/rollup-linux-arm-gnueabihf-4.49.0.tgz",
+ "integrity": "sha512-OVSQgEZDVLnTbMq5NBs6xkmz3AADByCWI4RdKSFNlDsYXdFtlxS59J+w+LippJe8KcmeSSM3ba+GlsM9+WwC1w==",
+ "cpu": [
+ "arm"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "linux"
+ ]
+ },
+ "node_modules/@rollup/rollup-linux-arm-musleabihf": {
+ "version": "4.49.0",
+ "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm-musleabihf/-/rollup-linux-arm-musleabihf-4.49.0.tgz",
+ "integrity": "sha512-ZnfSFA7fDUHNa4P3VwAcfaBLakCbYaxCk0jUnS3dTou9P95kwoOLAMlT3WmEJDBCSrOEFFV0Y1HXiwfLYJuLlA==",
+ "cpu": [
+ "arm"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "linux"
+ ]
+ },
+ "node_modules/@rollup/rollup-linux-arm64-gnu": {
+ "version": "4.49.0",
+ "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm64-gnu/-/rollup-linux-arm64-gnu-4.49.0.tgz",
+ "integrity": "sha512-Z81u+gfrobVK2iV7GqZCBfEB1y6+I61AH466lNK+xy1jfqFLiQ9Qv716WUM5fxFrYxwC7ziVdZRU9qvGHkYIJg==",
+ "cpu": [
+ "arm64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "linux"
+ ]
+ },
+ "node_modules/@rollup/rollup-linux-arm64-musl": {
+ "version": "4.49.0",
+ "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm64-musl/-/rollup-linux-arm64-musl-4.49.0.tgz",
+ "integrity": "sha512-zoAwS0KCXSnTp9NH/h9aamBAIve0DXeYpll85shf9NJ0URjSTzzS+Z9evmolN+ICfD3v8skKUPyk2PO0uGdFqg==",
+ "cpu": [
+ "arm64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "linux"
+ ]
+ },
+ "node_modules/@rollup/rollup-linux-loongarch64-gnu": {
+ "version": "4.49.0",
+ "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-loongarch64-gnu/-/rollup-linux-loongarch64-gnu-4.49.0.tgz",
+ "integrity": "sha512-2QyUyQQ1ZtwZGiq0nvODL+vLJBtciItC3/5cYN8ncDQcv5avrt2MbKt1XU/vFAJlLta5KujqyHdYtdag4YEjYQ==",
+ "cpu": [
+ "loong64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "linux"
+ ]
+ },
+ "node_modules/@rollup/rollup-linux-ppc64-gnu": {
+ "version": "4.49.0",
+ "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-ppc64-gnu/-/rollup-linux-ppc64-gnu-4.49.0.tgz",
+ "integrity": "sha512-k9aEmOWt+mrMuD3skjVJSSxHckJp+SiFzFG+v8JLXbc/xi9hv2icSkR3U7uQzqy+/QbbYY7iNB9eDTwrELo14g==",
+ "cpu": [
+ "ppc64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "linux"
+ ]
+ },
+ "node_modules/@rollup/rollup-linux-riscv64-gnu": {
+ "version": "4.49.0",
+ "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-riscv64-gnu/-/rollup-linux-riscv64-gnu-4.49.0.tgz",
+ "integrity": "sha512-rDKRFFIWJ/zJn6uk2IdYLc09Z7zkE5IFIOWqpuU0o6ZpHcdniAyWkwSUWE/Z25N/wNDmFHHMzin84qW7Wzkjsw==",
+ "cpu": [
+ "riscv64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "linux"
+ ]
+ },
+ "node_modules/@rollup/rollup-linux-riscv64-musl": {
+ "version": "4.49.0",
+ "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-riscv64-musl/-/rollup-linux-riscv64-musl-4.49.0.tgz",
+ "integrity": "sha512-FkkhIY/hYFVnOzz1WeV3S9Bd1h0hda/gRqvZCMpHWDHdiIHn6pqsY3b5eSbvGccWHMQ1uUzgZTKS4oGpykf8Tw==",
+ "cpu": [
+ "riscv64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "linux"
+ ]
+ },
+ "node_modules/@rollup/rollup-linux-s390x-gnu": {
+ "version": "4.49.0",
+ "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-s390x-gnu/-/rollup-linux-s390x-gnu-4.49.0.tgz",
+ "integrity": "sha512-gRf5c+A7QiOG3UwLyOOtyJMD31JJhMjBvpfhAitPAoqZFcOeK3Kc1Veg1z/trmt+2P6F/biT02fU19GGTS529A==",
+ "cpu": [
+ "s390x"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "linux"
+ ]
+ },
+ "node_modules/@rollup/rollup-linux-x64-gnu": {
+ "version": "4.49.0",
+ "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-x64-gnu/-/rollup-linux-x64-gnu-4.49.0.tgz",
+ "integrity": "sha512-BR7+blScdLW1h/2hB/2oXM+dhTmpW3rQt1DeSiCP9mc2NMMkqVgjIN3DDsNpKmezffGC9R8XKVOLmBkRUcK/sA==",
+ "cpu": [
+ "x64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "linux"
+ ]
+ },
+ "node_modules/@rollup/rollup-linux-x64-musl": {
+ "version": "4.49.0",
+ "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-x64-musl/-/rollup-linux-x64-musl-4.49.0.tgz",
+ "integrity": "sha512-hDMOAe+6nX3V5ei1I7Au3wcr9h3ktKzDvF2ne5ovX8RZiAHEtX1A5SNNk4zt1Qt77CmnbqT+upb/umzoPMWiPg==",
+ "cpu": [
+ "x64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "linux"
+ ]
+ },
+ "node_modules/@rollup/rollup-win32-arm64-msvc": {
+ "version": "4.49.0",
+ "resolved": "https://registry.npmjs.org/@rollup/rollup-win32-arm64-msvc/-/rollup-win32-arm64-msvc-4.49.0.tgz",
+ "integrity": "sha512-wkNRzfiIGaElC9kXUT+HLx17z7D0jl+9tGYRKwd8r7cUqTL7GYAvgUY++U2hK6Ar7z5Z6IRRoWC8kQxpmM7TDA==",
+ "cpu": [
+ "arm64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "win32"
+ ]
+ },
+ "node_modules/@rollup/rollup-win32-ia32-msvc": {
+ "version": "4.49.0",
+ "resolved": "https://registry.npmjs.org/@rollup/rollup-win32-ia32-msvc/-/rollup-win32-ia32-msvc-4.49.0.tgz",
+ "integrity": "sha512-gq5aW/SyNpjp71AAzroH37DtINDcX1Qw2iv9Chyz49ZgdOP3NV8QCyKZUrGsYX9Yyggj5soFiRCgsL3HwD8TdA==",
+ "cpu": [
+ "ia32"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "win32"
+ ]
+ },
+ "node_modules/@rollup/rollup-win32-x64-msvc": {
+ "version": "4.49.0",
+ "resolved": "https://registry.npmjs.org/@rollup/rollup-win32-x64-msvc/-/rollup-win32-x64-msvc-4.49.0.tgz",
+ "integrity": "sha512-gEtqFbzmZLFk2xKh7g0Rlo8xzho8KrEFEkzvHbfUGkrgXOpZ4XagQ6n+wIZFNh1nTb8UD16J4nFSFKXYgnbdBg==",
+ "cpu": [
+ "x64"
+ ],
+ "dev": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "win32"
+ ]
+ },
+ "node_modules/@types/babel__core": {
+ "version": "7.20.5",
+ "resolved": "https://registry.npmjs.org/@types/babel__core/-/babel__core-7.20.5.tgz",
+ "integrity": "sha512-qoQprZvz5wQFJwMDqeseRXWv3rqMvhgpbXFfVyWhbx9X47POIA6i/+dXefEmZKoAgOaTdaIgNSMqMIU61yRyzA==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@babel/parser": "^7.20.7",
+ "@babel/types": "^7.20.7",
+ "@types/babel__generator": "*",
+ "@types/babel__template": "*",
+ "@types/babel__traverse": "*"
+ }
+ },
+ "node_modules/@types/babel__generator": {
+ "version": "7.27.0",
+ "resolved": "https://registry.npmjs.org/@types/babel__generator/-/babel__generator-7.27.0.tgz",
+ "integrity": "sha512-ufFd2Xi92OAVPYsy+P4n7/U7e68fex0+Ee8gSG9KX7eo084CWiQ4sdxktvdl0bOPupXtVJPY19zk6EwWqUQ8lg==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@babel/types": "^7.0.0"
+ }
+ },
+ "node_modules/@types/babel__template": {
+ "version": "7.4.4",
+ "resolved": "https://registry.npmjs.org/@types/babel__template/-/babel__template-7.4.4.tgz",
+ "integrity": "sha512-h/NUaSyG5EyxBIp8YRxo4RMe2/qQgvyowRwVMzhYhBCONbW8PUsg4lkFMrhgZhUe5z3L3MiLDuvyJ/CaPa2A8A==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@babel/parser": "^7.1.0",
+ "@babel/types": "^7.0.0"
+ }
+ },
+ "node_modules/@types/babel__traverse": {
+ "version": "7.28.0",
+ "resolved": "https://registry.npmjs.org/@types/babel__traverse/-/babel__traverse-7.28.0.tgz",
+ "integrity": "sha512-8PvcXf70gTDZBgt9ptxJ8elBeBjcLOAcOtoO/mPJjtji1+CdGbHgm77om1GrsPxsiE+uXIpNSK64UYaIwQXd4Q==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@babel/types": "^7.28.2"
+ }
+ },
+ "node_modules/@types/estree": {
+ "version": "1.0.8",
+ "resolved": "https://registry.npmjs.org/@types/estree/-/estree-1.0.8.tgz",
+ "integrity": "sha512-dWHzHa2WqEXI/O1E9OjrocMTKJl2mSrEolh1Iomrv6U+JuNwaHXsXx9bLu5gG7BUWFIN0skIQJQ/L1rIex4X6w==",
+ "dev": true,
+ "license": "MIT"
+ },
+ "node_modules/@types/json-schema": {
+ "version": "7.0.15",
+ "resolved": "https://registry.npmjs.org/@types/json-schema/-/json-schema-7.0.15.tgz",
+ "integrity": "sha512-5+fP8P8MFNC+AyZCDxrB2pkZFPGzqQWUzpSeuuVLvm8VMcorNYavBqoFcxK8bQz4Qsbn4oUEEem4wDLfcysGHA==",
+ "dev": true,
+ "license": "MIT"
+ },
+ "node_modules/@types/node": {
+ "version": "24.3.0",
+ "resolved": "https://registry.npmjs.org/@types/node/-/node-24.3.0.tgz",
+ "integrity": "sha512-aPTXCrfwnDLj4VvXrm+UUCQjNEvJgNA8s5F1cvwQU+3KNltTOkBm1j30uNLyqqPNe7gE3KFzImYoZEfLhp4Yow==",
+ "license": "MIT",
+ "dependencies": {
+ "undici-types": "~7.10.0"
+ }
+ },
+ "node_modules/@types/react": {
+ "version": "19.1.12",
+ "resolved": "https://registry.npmjs.org/@types/react/-/react-19.1.12.tgz",
+ "integrity": "sha512-cMoR+FoAf/Jyq6+Df2/Z41jISvGZZ2eTlnsaJRptmZ76Caldwy1odD4xTr/gNV9VLj0AWgg/nmkevIyUfIIq5w==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "csstype": "^3.0.2"
+ }
+ },
+ "node_modules/@types/react-dom": {
+ "version": "19.1.9",
+ "resolved": "https://registry.npmjs.org/@types/react-dom/-/react-dom-19.1.9.tgz",
+ "integrity": "sha512-qXRuZaOsAdXKFyOhRBg6Lqqc0yay13vN7KrIg4L7N4aaHN68ma9OK3NE1BoDFgFOTfM7zg+3/8+2n8rLUH3OKQ==",
+ "dev": true,
+ "license": "MIT",
+ "peerDependencies": {
+ "@types/react": "^19.0.0"
+ }
+ },
+ "node_modules/@vitejs/plugin-react": {
+ "version": "5.0.2",
+ "resolved": "https://registry.npmjs.org/@vitejs/plugin-react/-/plugin-react-5.0.2.tgz",
+ "integrity": "sha512-tmyFgixPZCx2+e6VO9TNITWcCQl8+Nl/E8YbAyPVv85QCc7/A3JrdfG2A8gIzvVhWuzMOVrFW1aReaNxrI6tbw==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@babel/core": "^7.28.3",
+ "@babel/plugin-transform-react-jsx-self": "^7.27.1",
+ "@babel/plugin-transform-react-jsx-source": "^7.27.1",
+ "@rolldown/pluginutils": "1.0.0-beta.34",
+ "@types/babel__core": "^7.20.5",
+ "react-refresh": "^0.17.0"
+ },
+ "engines": {
+ "node": "^20.19.0 || >=22.12.0"
+ },
+ "peerDependencies": {
+ "vite": "^4.2.0 || ^5.0.0 || ^6.0.0 || ^7.0.0"
+ }
+ },
+ "node_modules/acorn": {
+ "version": "8.15.0",
+ "resolved": "https://registry.npmjs.org/acorn/-/acorn-8.15.0.tgz",
+ "integrity": "sha512-NZyJarBfL7nWwIq+FDL6Zp/yHEhePMNnnJ0y3qfieCrmNvYct8uvtiV41UvlSe6apAfk0fY1FbWx+NwfmpvtTg==",
+ "dev": true,
+ "license": "MIT",
+ "bin": {
+ "acorn": "bin/acorn"
+ },
+ "engines": {
+ "node": ">=0.4.0"
+ }
+ },
+ "node_modules/acorn-jsx": {
+ "version": "5.3.2",
+ "resolved": "https://registry.npmjs.org/acorn-jsx/-/acorn-jsx-5.3.2.tgz",
+ "integrity": "sha512-rq9s+JNhf0IChjtDXxllJ7g41oZk5SlXtp0LHwyA5cejwn7vKmKp4pPri6YEePv2PU65sAsegbXtIinmDFDXgQ==",
+ "dev": true,
+ "license": "MIT",
+ "peerDependencies": {
+ "acorn": "^6.0.0 || ^7.0.0 || ^8.0.0"
+ }
+ },
+ "node_modules/ajv": {
+ "version": "6.12.6",
+ "resolved": "https://registry.npmjs.org/ajv/-/ajv-6.12.6.tgz",
+ "integrity": "sha512-j3fVLgvTo527anyYyJOGTYJbG+vnnQYvE0m5mmkc1TK+nxAppkCLMIL0aZ4dblVCNoGShhm+kzE4ZUykBoMg4g==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "fast-deep-equal": "^3.1.1",
+ "fast-json-stable-stringify": "^2.0.0",
+ "json-schema-traverse": "^0.4.1",
+ "uri-js": "^4.2.2"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/epoberezkin"
+ }
+ },
+ "node_modules/ansi-regex": {
+ "version": "5.0.1",
+ "resolved": "https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.1.tgz",
+ "integrity": "sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ==",
+ "license": "MIT",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/ansi-styles": {
+ "version": "4.3.0",
+ "resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.3.0.tgz",
+ "integrity": "sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==",
+ "license": "MIT",
+ "dependencies": {
+ "color-convert": "^2.0.1"
+ },
+ "engines": {
+ "node": ">=8"
+ },
+ "funding": {
+ "url": "https://github.com/chalk/ansi-styles?sponsor=1"
+ }
+ },
+ "node_modules/argparse": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/argparse/-/argparse-2.0.1.tgz",
+ "integrity": "sha512-8+9WqebbFzpX9OR+Wa6O29asIogeRMzcGtAINdpMHHyAg10f05aSFVBbcEqGf/PXw1EjAZ+q2/bEBg3DvurK3Q==",
+ "dev": true,
+ "license": "Python-2.0"
+ },
+ "node_modules/asynckit": {
+ "version": "0.4.0",
+ "resolved": "https://registry.npmjs.org/asynckit/-/asynckit-0.4.0.tgz",
+ "integrity": "sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q==",
+ "license": "MIT"
+ },
+ "node_modules/axios": {
+ "version": "1.11.0",
+ "resolved": "https://registry.npmjs.org/axios/-/axios-1.11.0.tgz",
+ "integrity": "sha512-1Lx3WLFQWm3ooKDYZD1eXmoGO9fxYQjrycfHFC8P0sCfQVXyROp0p9PFWBehewBOdCwHc+f/b8I0fMto5eSfwA==",
+ "license": "MIT",
+ "dependencies": {
+ "follow-redirects": "^1.15.6",
+ "form-data": "^4.0.4",
+ "proxy-from-env": "^1.1.0"
+ }
+ },
+ "node_modules/balanced-match": {
+ "version": "1.0.2",
+ "resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-1.0.2.tgz",
+ "integrity": "sha512-3oSeUO0TMV67hN1AmbXsK4yaqU7tjiHlbxRDZOpH0KW9+CeX4bRAaX0Anxt0tx2MrpRpWwQaPwIlISEJhYU5Pw==",
+ "dev": true,
+ "license": "MIT"
+ },
+ "node_modules/brace-expansion": {
+ "version": "1.1.12",
+ "resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.12.tgz",
+ "integrity": "sha512-9T9UjW3r0UW5c1Q7GTwllptXwhvYmEzFhzMfZ9H7FQWt+uZePjZPjBP/W1ZEyZ1twGWom5/56TF4lPcqjnDHcg==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "balanced-match": "^1.0.0",
+ "concat-map": "0.0.1"
+ }
+ },
+ "node_modules/browserslist": {
+ "version": "4.25.4",
+ "resolved": "https://registry.npmjs.org/browserslist/-/browserslist-4.25.4.tgz",
+ "integrity": "sha512-4jYpcjabC606xJ3kw2QwGEZKX0Aw7sgQdZCvIK9dhVSPh76BKo+C+btT1RRofH7B+8iNpEbgGNVWiLki5q93yg==",
+ "dev": true,
+ "funding": [
+ {
+ "type": "opencollective",
+ "url": "https://opencollective.com/browserslist"
+ },
+ {
+ "type": "tidelift",
+ "url": "https://tidelift.com/funding/github/npm/browserslist"
+ },
+ {
+ "type": "github",
+ "url": "https://github.com/sponsors/ai"
+ }
+ ],
+ "license": "MIT",
+ "dependencies": {
+ "caniuse-lite": "^1.0.30001737",
+ "electron-to-chromium": "^1.5.211",
+ "node-releases": "^2.0.19",
+ "update-browserslist-db": "^1.1.3"
+ },
+ "bin": {
+ "browserslist": "cli.js"
+ },
+ "engines": {
+ "node": "^6 || ^7 || ^8 || ^9 || ^10 || ^11 || ^12 || >=13.7"
+ }
+ },
+ "node_modules/call-bind-apply-helpers": {
+ "version": "1.0.2",
+ "resolved": "https://registry.npmjs.org/call-bind-apply-helpers/-/call-bind-apply-helpers-1.0.2.tgz",
+ "integrity": "sha512-Sp1ablJ0ivDkSzjcaJdxEunN5/XvksFJ2sMBFfq6x0ryhQV/2b/KwFe21cMpmHtPOSij8K99/wSfoEuTObmuMQ==",
+ "license": "MIT",
+ "dependencies": {
+ "es-errors": "^1.3.0",
+ "function-bind": "^1.1.2"
+ },
+ "engines": {
+ "node": ">= 0.4"
+ }
+ },
+ "node_modules/callsites": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/callsites/-/callsites-3.1.0.tgz",
+ "integrity": "sha512-P8BjAsXvZS+VIDUI11hHCQEv74YT67YUi5JJFNWIqL235sBmjX4+qx9Muvls5ivyNENctx46xQLQ3aTuE7ssaQ==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/caniuse-lite": {
+ "version": "1.0.30001737",
+ "resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001737.tgz",
+ "integrity": "sha512-BiloLiXtQNrY5UyF0+1nSJLXUENuhka2pzy2Fx5pGxqavdrxSCW4U6Pn/PoG3Efspi2frRbHpBV2XsrPE6EDlw==",
+ "dev": true,
+ "funding": [
+ {
+ "type": "opencollective",
+ "url": "https://opencollective.com/browserslist"
+ },
+ {
+ "type": "tidelift",
+ "url": "https://tidelift.com/funding/github/npm/caniuse-lite"
+ },
+ {
+ "type": "github",
+ "url": "https://github.com/sponsors/ai"
+ }
+ ],
+ "license": "CC-BY-4.0"
+ },
+ "node_modules/chalk": {
+ "version": "4.1.2",
+ "resolved": "https://registry.npmjs.org/chalk/-/chalk-4.1.2.tgz",
+ "integrity": "sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "ansi-styles": "^4.1.0",
+ "supports-color": "^7.1.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/chalk/chalk?sponsor=1"
+ }
+ },
+ "node_modules/chart.js": {
+ "version": "4.5.0",
+ "resolved": "https://registry.npmjs.org/chart.js/-/chart.js-4.5.0.tgz",
+ "integrity": "sha512-aYeC/jDgSEx8SHWZvANYMioYMZ2KX02W6f6uVfyteuCGcadDLcYVHdfdygsTQkQ4TKn5lghoojAsPj5pu0SnvQ==",
+ "license": "MIT",
+ "dependencies": {
+ "@kurkle/color": "^0.3.0"
+ },
+ "engines": {
+ "pnpm": ">=8"
+ }
+ },
+ "node_modules/chartjs-adapter-date-fns": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/chartjs-adapter-date-fns/-/chartjs-adapter-date-fns-3.0.0.tgz",
+ "integrity": "sha512-Rs3iEB3Q5pJ973J93OBTpnP7qoGwvq3nUnoMdtxO+9aoJof7UFcRbWcIDteXuYd1fgAvct/32T9qaLyLuZVwCg==",
+ "license": "MIT",
+ "peerDependencies": {
+ "chart.js": ">=2.8.0",
+ "date-fns": ">=2.0.0"
+ }
+ },
+ "node_modules/cliui": {
+ "version": "8.0.1",
+ "resolved": "https://registry.npmjs.org/cliui/-/cliui-8.0.1.tgz",
+ "integrity": "sha512-BSeNnyus75C4//NQ9gQt1/csTXyo/8Sb+afLAkzAptFuMsod9HFokGNudZpi/oQV73hnVK+sR+5PVRMd+Dr7YQ==",
+ "license": "ISC",
+ "dependencies": {
+ "string-width": "^4.2.0",
+ "strip-ansi": "^6.0.1",
+ "wrap-ansi": "^7.0.0"
+ },
+ "engines": {
+ "node": ">=12"
+ }
+ },
+ "node_modules/color-convert": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz",
+ "integrity": "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==",
+ "license": "MIT",
+ "dependencies": {
+ "color-name": "~1.1.4"
+ },
+ "engines": {
+ "node": ">=7.0.0"
+ }
+ },
+ "node_modules/color-name": {
+ "version": "1.1.4",
+ "resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz",
+ "integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==",
+ "license": "MIT"
+ },
+ "node_modules/combined-stream": {
+ "version": "1.0.8",
+ "resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0.8.tgz",
+ "integrity": "sha512-FQN4MRfuJeHf7cBbBMJFXhKSDq+2kAArBlmRBvcvFE5BB1HZKXtSFASDhdlz9zOYwxh8lDdnvmMOe/+5cdoEdg==",
+ "license": "MIT",
+ "dependencies": {
+ "delayed-stream": "~1.0.0"
+ },
+ "engines": {
+ "node": ">= 0.8"
+ }
+ },
+ "node_modules/concat-map": {
+ "version": "0.0.1",
+ "resolved": "https://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz",
+ "integrity": "sha512-/Srv4dswyQNBfohGpz9o6Yb3Gz3SrUDqBH5rTuhGR7ahtlbYKnVxw2bCFMRljaA7EXHaXZ8wsHdodFvbkhKmqg==",
+ "dev": true,
+ "license": "MIT"
+ },
+ "node_modules/convert-source-map": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/convert-source-map/-/convert-source-map-2.0.0.tgz",
+ "integrity": "sha512-Kvp459HrV2FEJ1CAsi1Ku+MY3kasH19TFykTz2xWmMeq6bk2NU3XXvfJ+Q61m0xktWwt+1HSYf3JZsTms3aRJg==",
+ "dev": true,
+ "license": "MIT"
+ },
+ "node_modules/cookie": {
+ "version": "1.0.2",
+ "resolved": "https://registry.npmjs.org/cookie/-/cookie-1.0.2.tgz",
+ "integrity": "sha512-9Kr/j4O16ISv8zBBhJoi4bXOYNTkFLOqSL3UDB0njXxCXNezjeyVrJyGOWtgfs/q2km1gwBcfH8q1yEGoMYunA==",
+ "license": "MIT",
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/cross-spawn": {
+ "version": "7.0.6",
+ "resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-7.0.6.tgz",
+ "integrity": "sha512-uV2QOWP2nWzsy2aMp8aRibhi9dlzF5Hgh5SHaB9OiTGEyDTiJJyx0uy51QXdyWbtAHNua4XJzUKca3OzKUd3vA==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "path-key": "^3.1.0",
+ "shebang-command": "^2.0.0",
+ "which": "^2.0.1"
+ },
+ "engines": {
+ "node": ">= 8"
+ }
+ },
+ "node_modules/csstype": {
+ "version": "3.1.3",
+ "resolved": "https://registry.npmjs.org/csstype/-/csstype-3.1.3.tgz",
+ "integrity": "sha512-M1uQkMl8rQK/szD0LNhtqxIPLpimGm8sOBwU7lLnCpSbTyY3yeU1Vc7l4KT5zT4s/yOxHH5O7tIuuLOCnLADRw==",
+ "dev": true,
+ "license": "MIT"
+ },
+ "node_modules/date-fns": {
+ "version": "2.30.0",
+ "resolved": "https://registry.npmjs.org/date-fns/-/date-fns-2.30.0.tgz",
+ "integrity": "sha512-fnULvOpxnC5/Vg3NCiWelDsLiUc9bRwAPs/+LfTLNvetFCtCTN+yQz15C/fs4AwX1R9K5GLtLfn8QW+dWisaAw==",
+ "license": "MIT",
+ "dependencies": {
+ "@babel/runtime": "^7.21.0"
+ },
+ "engines": {
+ "node": ">=0.11"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/date-fns"
+ }
+ },
+ "node_modules/debug": {
+ "version": "4.4.1",
+ "resolved": "https://registry.npmjs.org/debug/-/debug-4.4.1.tgz",
+ "integrity": "sha512-KcKCqiftBJcZr++7ykoDIEwSa3XWowTfNPo92BYxjXiyYEVrUQh2aLyhxBCwww+heortUFxEJYcRzosstTEBYQ==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "ms": "^2.1.3"
+ },
+ "engines": {
+ "node": ">=6.0"
+ },
+ "peerDependenciesMeta": {
+ "supports-color": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/deep-is": {
+ "version": "0.1.4",
+ "resolved": "https://registry.npmjs.org/deep-is/-/deep-is-0.1.4.tgz",
+ "integrity": "sha512-oIPzksmTg4/MriiaYGO+okXDT7ztn/w3Eptv/+gSIdMdKsJo0u4CfYNFJPy+4SKMuCqGw2wxnA+URMg3t8a/bQ==",
+ "dev": true,
+ "license": "MIT"
+ },
+ "node_modules/delayed-stream": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/delayed-stream/-/delayed-stream-1.0.0.tgz",
+ "integrity": "sha512-ZySD7Nf91aLB0RxL4KGrKHBXl7Eds1DAmEdcoVawXnLD7SDhpNgtuII2aAkg7a7QS41jxPSZ17p4VdGnMHk3MQ==",
+ "license": "MIT",
+ "engines": {
+ "node": ">=0.4.0"
+ }
+ },
+ "node_modules/dunder-proto": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/dunder-proto/-/dunder-proto-1.0.1.tgz",
+ "integrity": "sha512-KIN/nDJBQRcXw0MLVhZE9iQHmG68qAVIBg9CqmUYjmQIhgij9U5MFvrqkUL5FbtyyzZuOeOt0zdeRe4UY7ct+A==",
+ "license": "MIT",
+ "dependencies": {
+ "call-bind-apply-helpers": "^1.0.1",
+ "es-errors": "^1.3.0",
+ "gopd": "^1.2.0"
+ },
+ "engines": {
+ "node": ">= 0.4"
+ }
+ },
+ "node_modules/electron-to-chromium": {
+ "version": "1.5.211",
+ "resolved": "https://registry.npmjs.org/electron-to-chromium/-/electron-to-chromium-1.5.211.tgz",
+ "integrity": "sha512-IGBvimJkotaLzFnwIVgW9/UD/AOJ2tByUmeOrtqBfACSbAw5b1G0XpvdaieKyc7ULmbwXVx+4e4Be8pOPBrYkw==",
+ "dev": true,
+ "license": "ISC"
+ },
+ "node_modules/emoji-regex": {
+ "version": "8.0.0",
+ "resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz",
+ "integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A==",
+ "license": "MIT"
+ },
+ "node_modules/es-define-property": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/es-define-property/-/es-define-property-1.0.1.tgz",
+ "integrity": "sha512-e3nRfgfUZ4rNGL232gUgX06QNyyez04KdjFrF+LTRoOXmrOgFKDg4BCdsjW8EnT69eqdYGmRpJwiPVYNrCaW3g==",
+ "license": "MIT",
+ "engines": {
+ "node": ">= 0.4"
+ }
+ },
+ "node_modules/es-errors": {
+ "version": "1.3.0",
+ "resolved": "https://registry.npmjs.org/es-errors/-/es-errors-1.3.0.tgz",
+ "integrity": "sha512-Zf5H2Kxt2xjTvbJvP2ZWLEICxA6j+hAmMzIlypy4xcBg1vKVnx89Wy0GbS+kf5cwCVFFzdCFh2XSCFNULS6csw==",
+ "license": "MIT",
+ "engines": {
+ "node": ">= 0.4"
+ }
+ },
+ "node_modules/es-object-atoms": {
+ "version": "1.1.1",
+ "resolved": "https://registry.npmjs.org/es-object-atoms/-/es-object-atoms-1.1.1.tgz",
+ "integrity": "sha512-FGgH2h8zKNim9ljj7dankFPcICIK9Cp5bm+c2gQSYePhpaG5+esrLODihIorn+Pe6FGJzWhXQotPv73jTaldXA==",
+ "license": "MIT",
+ "dependencies": {
+ "es-errors": "^1.3.0"
+ },
+ "engines": {
+ "node": ">= 0.4"
+ }
+ },
+ "node_modules/es-set-tostringtag": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/es-set-tostringtag/-/es-set-tostringtag-2.1.0.tgz",
+ "integrity": "sha512-j6vWzfrGVfyXxge+O0x5sh6cvxAog0a/4Rdd2K36zCMV5eJ+/+tOAngRO8cODMNWbVRdVlmGZQL2YS3yR8bIUA==",
+ "license": "MIT",
+ "dependencies": {
+ "es-errors": "^1.3.0",
+ "get-intrinsic": "^1.2.6",
+ "has-tostringtag": "^1.0.2",
+ "hasown": "^2.0.2"
+ },
+ "engines": {
+ "node": ">= 0.4"
+ }
+ },
+ "node_modules/esbuild": {
+ "version": "0.25.9",
+ "resolved": "https://registry.npmjs.org/esbuild/-/esbuild-0.25.9.tgz",
+ "integrity": "sha512-CRbODhYyQx3qp7ZEwzxOk4JBqmD/seJrzPa/cGjY1VtIn5E09Oi9/dB4JwctnfZ8Q8iT7rioVv5k/FNT/uf54g==",
+ "dev": true,
+ "hasInstallScript": true,
+ "license": "MIT",
+ "bin": {
+ "esbuild": "bin/esbuild"
+ },
+ "engines": {
+ "node": ">=18"
+ },
+ "optionalDependencies": {
+ "@esbuild/aix-ppc64": "0.25.9",
+ "@esbuild/android-arm": "0.25.9",
+ "@esbuild/android-arm64": "0.25.9",
+ "@esbuild/android-x64": "0.25.9",
+ "@esbuild/darwin-arm64": "0.25.9",
+ "@esbuild/darwin-x64": "0.25.9",
+ "@esbuild/freebsd-arm64": "0.25.9",
+ "@esbuild/freebsd-x64": "0.25.9",
+ "@esbuild/linux-arm": "0.25.9",
+ "@esbuild/linux-arm64": "0.25.9",
+ "@esbuild/linux-ia32": "0.25.9",
+ "@esbuild/linux-loong64": "0.25.9",
+ "@esbuild/linux-mips64el": "0.25.9",
+ "@esbuild/linux-ppc64": "0.25.9",
+ "@esbuild/linux-riscv64": "0.25.9",
+ "@esbuild/linux-s390x": "0.25.9",
+ "@esbuild/linux-x64": "0.25.9",
+ "@esbuild/netbsd-arm64": "0.25.9",
+ "@esbuild/netbsd-x64": "0.25.9",
+ "@esbuild/openbsd-arm64": "0.25.9",
+ "@esbuild/openbsd-x64": "0.25.9",
+ "@esbuild/openharmony-arm64": "0.25.9",
+ "@esbuild/sunos-x64": "0.25.9",
+ "@esbuild/win32-arm64": "0.25.9",
+ "@esbuild/win32-ia32": "0.25.9",
+ "@esbuild/win32-x64": "0.25.9"
+ }
+ },
+ "node_modules/escalade": {
+ "version": "3.2.0",
+ "resolved": "https://registry.npmjs.org/escalade/-/escalade-3.2.0.tgz",
+ "integrity": "sha512-WUj2qlxaQtO4g6Pq5c29GTcWGDyd8itL8zTlipgECz3JesAiiOKotd8JU6otB3PACgG6xkJUyVhboMS+bje/jA==",
+ "license": "MIT",
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/escape-string-regexp": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/escape-string-regexp/-/escape-string-regexp-4.0.0.tgz",
+ "integrity": "sha512-TtpcNJ3XAzx3Gq8sWRzJaVajRs0uVxA2YAkdb1jm2YkPz4G6egUFAyA3n5vtEIZefPk5Wa4UXbKuS5fKkJWdgA==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/eslint": {
+ "version": "9.34.0",
+ "resolved": "https://registry.npmjs.org/eslint/-/eslint-9.34.0.tgz",
+ "integrity": "sha512-RNCHRX5EwdrESy3Jc9o8ie8Bog+PeYvvSR8sDGoZxNFTvZ4dlxUB3WzQ3bQMztFrSRODGrLLj8g6OFuGY/aiQg==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@eslint-community/eslint-utils": "^4.2.0",
+ "@eslint-community/regexpp": "^4.12.1",
+ "@eslint/config-array": "^0.21.0",
+ "@eslint/config-helpers": "^0.3.1",
+ "@eslint/core": "^0.15.2",
+ "@eslint/eslintrc": "^3.3.1",
+ "@eslint/js": "9.34.0",
+ "@eslint/plugin-kit": "^0.3.5",
+ "@humanfs/node": "^0.16.6",
+ "@humanwhocodes/module-importer": "^1.0.1",
+ "@humanwhocodes/retry": "^0.4.2",
+ "@types/estree": "^1.0.6",
+ "@types/json-schema": "^7.0.15",
+ "ajv": "^6.12.4",
+ "chalk": "^4.0.0",
+ "cross-spawn": "^7.0.6",
+ "debug": "^4.3.2",
+ "escape-string-regexp": "^4.0.0",
+ "eslint-scope": "^8.4.0",
+ "eslint-visitor-keys": "^4.2.1",
+ "espree": "^10.4.0",
+ "esquery": "^1.5.0",
+ "esutils": "^2.0.2",
+ "fast-deep-equal": "^3.1.3",
+ "file-entry-cache": "^8.0.0",
+ "find-up": "^5.0.0",
+ "glob-parent": "^6.0.2",
+ "ignore": "^5.2.0",
+ "imurmurhash": "^0.1.4",
+ "is-glob": "^4.0.0",
+ "json-stable-stringify-without-jsonify": "^1.0.1",
+ "lodash.merge": "^4.6.2",
+ "minimatch": "^3.1.2",
+ "natural-compare": "^1.4.0",
+ "optionator": "^0.9.3"
+ },
+ "bin": {
+ "eslint": "bin/eslint.js"
+ },
+ "engines": {
+ "node": "^18.18.0 || ^20.9.0 || >=21.1.0"
+ },
+ "funding": {
+ "url": "https://eslint.org/donate"
+ },
+ "peerDependencies": {
+ "jiti": "*"
+ },
+ "peerDependenciesMeta": {
+ "jiti": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/eslint-plugin-react-hooks": {
+ "version": "5.2.0",
+ "resolved": "https://registry.npmjs.org/eslint-plugin-react-hooks/-/eslint-plugin-react-hooks-5.2.0.tgz",
+ "integrity": "sha512-+f15FfK64YQwZdJNELETdn5ibXEUQmW1DZL6KXhNnc2heoy/sg9VJJeT7n8TlMWouzWqSWavFkIhHyIbIAEapg==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=10"
+ },
+ "peerDependencies": {
+ "eslint": "^3.0.0 || ^4.0.0 || ^5.0.0 || ^6.0.0 || ^7.0.0 || ^8.0.0-0 || ^9.0.0"
+ }
+ },
+ "node_modules/eslint-plugin-react-refresh": {
+ "version": "0.4.20",
+ "resolved": "https://registry.npmjs.org/eslint-plugin-react-refresh/-/eslint-plugin-react-refresh-0.4.20.tgz",
+ "integrity": "sha512-XpbHQ2q5gUF8BGOX4dHe+71qoirYMhApEPZ7sfhF/dNnOF1UXnCMGZf79SFTBO7Bz5YEIT4TMieSlJBWhP9WBA==",
+ "dev": true,
+ "license": "MIT",
+ "peerDependencies": {
+ "eslint": ">=8.40"
+ }
+ },
+ "node_modules/eslint-scope": {
+ "version": "8.4.0",
+ "resolved": "https://registry.npmjs.org/eslint-scope/-/eslint-scope-8.4.0.tgz",
+ "integrity": "sha512-sNXOfKCn74rt8RICKMvJS7XKV/Xk9kA7DyJr8mJik3S7Cwgy3qlkkmyS2uQB3jiJg6VNdZd/pDBJu0nvG2NlTg==",
+ "dev": true,
+ "license": "BSD-2-Clause",
+ "dependencies": {
+ "esrecurse": "^4.3.0",
+ "estraverse": "^5.2.0"
+ },
+ "engines": {
+ "node": "^18.18.0 || ^20.9.0 || >=21.1.0"
+ },
+ "funding": {
+ "url": "https://opencollective.com/eslint"
+ }
+ },
+ "node_modules/eslint-visitor-keys": {
+ "version": "4.2.1",
+ "resolved": "https://registry.npmjs.org/eslint-visitor-keys/-/eslint-visitor-keys-4.2.1.tgz",
+ "integrity": "sha512-Uhdk5sfqcee/9H/rCOJikYz67o0a2Tw2hGRPOG2Y1R2dg7brRe1uG0yaNQDHu+TO/uQPF/5eCapvYSmHUjt7JQ==",
+ "dev": true,
+ "license": "Apache-2.0",
+ "engines": {
+ "node": "^18.18.0 || ^20.9.0 || >=21.1.0"
+ },
+ "funding": {
+ "url": "https://opencollective.com/eslint"
+ }
+ },
+ "node_modules/espree": {
+ "version": "10.4.0",
+ "resolved": "https://registry.npmjs.org/espree/-/espree-10.4.0.tgz",
+ "integrity": "sha512-j6PAQ2uUr79PZhBjP5C5fhl8e39FmRnOjsD5lGnWrFU8i2G776tBK7+nP8KuQUTTyAZUwfQqXAgrVH5MbH9CYQ==",
+ "dev": true,
+ "license": "BSD-2-Clause",
+ "dependencies": {
+ "acorn": "^8.15.0",
+ "acorn-jsx": "^5.3.2",
+ "eslint-visitor-keys": "^4.2.1"
+ },
+ "engines": {
+ "node": "^18.18.0 || ^20.9.0 || >=21.1.0"
+ },
+ "funding": {
+ "url": "https://opencollective.com/eslint"
+ }
+ },
+ "node_modules/esquery": {
+ "version": "1.6.0",
+ "resolved": "https://registry.npmjs.org/esquery/-/esquery-1.6.0.tgz",
+ "integrity": "sha512-ca9pw9fomFcKPvFLXhBKUK90ZvGibiGOvRJNbjljY7s7uq/5YO4BOzcYtJqExdx99rF6aAcnRxHmcUHcz6sQsg==",
+ "dev": true,
+ "license": "BSD-3-Clause",
+ "dependencies": {
+ "estraverse": "^5.1.0"
+ },
+ "engines": {
+ "node": ">=0.10"
+ }
+ },
+ "node_modules/esrecurse": {
+ "version": "4.3.0",
+ "resolved": "https://registry.npmjs.org/esrecurse/-/esrecurse-4.3.0.tgz",
+ "integrity": "sha512-KmfKL3b6G+RXvP8N1vr3Tq1kL/oCFgn2NYXEtqP8/L3pKapUA4G8cFVaoF3SU323CD4XypR/ffioHmkti6/Tag==",
+ "dev": true,
+ "license": "BSD-2-Clause",
+ "dependencies": {
+ "estraverse": "^5.2.0"
+ },
+ "engines": {
+ "node": ">=4.0"
+ }
+ },
+ "node_modules/estraverse": {
+ "version": "5.3.0",
+ "resolved": "https://registry.npmjs.org/estraverse/-/estraverse-5.3.0.tgz",
+ "integrity": "sha512-MMdARuVEQziNTeJD8DgMqmhwR11BRQ/cBP+pLtYdSTnf3MIO8fFeiINEbX36ZdNlfU/7A9f3gUw49B3oQsvwBA==",
+ "dev": true,
+ "license": "BSD-2-Clause",
+ "engines": {
+ "node": ">=4.0"
+ }
+ },
+ "node_modules/esutils": {
+ "version": "2.0.3",
+ "resolved": "https://registry.npmjs.org/esutils/-/esutils-2.0.3.tgz",
+ "integrity": "sha512-kVscqXk4OCp68SZ0dkgEKVi6/8ij300KBWTJq32P/dYeWTSwK41WyTxalN1eRmA5Z9UU/LX9D7FWSmV9SAYx6g==",
+ "dev": true,
+ "license": "BSD-2-Clause",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/fast-deep-equal": {
+ "version": "3.1.3",
+ "resolved": "https://registry.npmjs.org/fast-deep-equal/-/fast-deep-equal-3.1.3.tgz",
+ "integrity": "sha512-f3qQ9oQy9j2AhBe/H9VC91wLmKBCCU/gDOnKNAYG5hswO7BLKj09Hc5HYNz9cGI++xlpDCIgDaitVs03ATR84Q==",
+ "dev": true,
+ "license": "MIT"
+ },
+ "node_modules/fast-json-stable-stringify": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/fast-json-stable-stringify/-/fast-json-stable-stringify-2.1.0.tgz",
+ "integrity": "sha512-lhd/wF+Lk98HZoTCtlVraHtfh5XYijIjalXck7saUtuanSDyLMxnHhSXEDJqHxD7msR8D0uCmqlkwjCV8xvwHw==",
+ "dev": true,
+ "license": "MIT"
+ },
+ "node_modules/fast-levenshtein": {
+ "version": "2.0.6",
+ "resolved": "https://registry.npmjs.org/fast-levenshtein/-/fast-levenshtein-2.0.6.tgz",
+ "integrity": "sha512-DCXu6Ifhqcks7TZKY3Hxp3y6qphY5SJZmrWMDrKcERSOXWQdMhU9Ig/PYrzyw/ul9jOIyh0N4M0tbC5hodg8dw==",
+ "dev": true,
+ "license": "MIT"
+ },
+ "node_modules/faye-websocket": {
+ "version": "0.11.4",
+ "resolved": "https://registry.npmjs.org/faye-websocket/-/faye-websocket-0.11.4.tgz",
+ "integrity": "sha512-CzbClwlXAuiRQAlUyfqPgvPoNKTckTPGfwZV4ZdAhVcP2lh9KUxJg2b5GkE7XbjKQ3YJnQ9z6D9ntLAlB+tP8g==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "websocket-driver": ">=0.5.1"
+ },
+ "engines": {
+ "node": ">=0.8.0"
+ }
+ },
+ "node_modules/fdir": {
+ "version": "6.5.0",
+ "resolved": "https://registry.npmjs.org/fdir/-/fdir-6.5.0.tgz",
+ "integrity": "sha512-tIbYtZbucOs0BRGqPJkshJUYdL+SDH7dVM8gjy+ERp3WAUjLEFJE+02kanyHtwjWOnwrKYBiwAmM0p4kLJAnXg==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=12.0.0"
+ },
+ "peerDependencies": {
+ "picomatch": "^3 || ^4"
+ },
+ "peerDependenciesMeta": {
+ "picomatch": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/file-entry-cache": {
+ "version": "8.0.0",
+ "resolved": "https://registry.npmjs.org/file-entry-cache/-/file-entry-cache-8.0.0.tgz",
+ "integrity": "sha512-XXTUwCvisa5oacNGRP9SfNtYBNAMi+RPwBFmblZEF7N7swHYQS6/Zfk7SRwx4D5j3CH211YNRco1DEMNVfZCnQ==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "flat-cache": "^4.0.0"
+ },
+ "engines": {
+ "node": ">=16.0.0"
+ }
+ },
+ "node_modules/find-up": {
+ "version": "5.0.0",
+ "resolved": "https://registry.npmjs.org/find-up/-/find-up-5.0.0.tgz",
+ "integrity": "sha512-78/PXT1wlLLDgTzDs7sjq9hzz0vXD+zn+7wypEe4fXQxCmdmqfGsEPQxmiCSQI3ajFV91bVSsvNtrJRiW6nGng==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "locate-path": "^6.0.0",
+ "path-exists": "^4.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/firebase": {
+ "version": "12.2.1",
+ "resolved": "https://registry.npmjs.org/firebase/-/firebase-12.2.1.tgz",
+ "integrity": "sha512-UkuW2ZYaq/QuOQ24bfaqmkVqoBFhkA/ptATfPuRtc5vdm+zhwc3mfZBwFe6LqH9yrCN/6rAblgxKz2/0tDvA7w==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "@firebase/ai": "2.2.1",
+ "@firebase/analytics": "0.10.18",
+ "@firebase/analytics-compat": "0.2.24",
+ "@firebase/app": "0.14.2",
+ "@firebase/app-check": "0.11.0",
+ "@firebase/app-check-compat": "0.4.0",
+ "@firebase/app-compat": "0.5.2",
+ "@firebase/app-types": "0.9.3",
+ "@firebase/auth": "1.11.0",
+ "@firebase/auth-compat": "0.6.0",
+ "@firebase/data-connect": "0.3.11",
+ "@firebase/database": "1.1.0",
+ "@firebase/database-compat": "2.1.0",
+ "@firebase/firestore": "4.9.1",
+ "@firebase/firestore-compat": "0.4.1",
+ "@firebase/functions": "0.13.1",
+ "@firebase/functions-compat": "0.4.1",
+ "@firebase/installations": "0.6.19",
+ "@firebase/installations-compat": "0.2.19",
+ "@firebase/messaging": "0.12.23",
+ "@firebase/messaging-compat": "0.2.23",
+ "@firebase/performance": "0.7.9",
+ "@firebase/performance-compat": "0.2.22",
+ "@firebase/remote-config": "0.6.6",
+ "@firebase/remote-config-compat": "0.2.19",
+ "@firebase/storage": "0.14.0",
+ "@firebase/storage-compat": "0.4.0",
+ "@firebase/util": "1.13.0"
+ }
+ },
+ "node_modules/flat-cache": {
+ "version": "4.0.1",
+ "resolved": "https://registry.npmjs.org/flat-cache/-/flat-cache-4.0.1.tgz",
+ "integrity": "sha512-f7ccFPK3SXFHpx15UIGyRJ/FJQctuKZ0zVuN3frBo4HnK3cay9VEW0R6yPYFHC0AgqhukPzKjq22t5DmAyqGyw==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "flatted": "^3.2.9",
+ "keyv": "^4.5.4"
+ },
+ "engines": {
+ "node": ">=16"
+ }
+ },
+ "node_modules/flatted": {
+ "version": "3.3.3",
+ "resolved": "https://registry.npmjs.org/flatted/-/flatted-3.3.3.tgz",
+ "integrity": "sha512-GX+ysw4PBCz0PzosHDepZGANEuFCMLrnRTiEy9McGjmkCQYwRq4A/X786G/fjM/+OjsWSU1ZrY5qyARZmO/uwg==",
+ "dev": true,
+ "license": "ISC"
+ },
+ "node_modules/follow-redirects": {
+ "version": "1.15.11",
+ "resolved": "https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.15.11.tgz",
+ "integrity": "sha512-deG2P0JfjrTxl50XGCDyfI97ZGVCxIpfKYmfyrQ54n5FO/0gfIES8C/Psl6kWVDolizcaaxZJnTS0QSMxvnsBQ==",
+ "funding": [
+ {
+ "type": "individual",
+ "url": "https://github.com/sponsors/RubenVerborgh"
+ }
+ ],
+ "license": "MIT",
+ "engines": {
+ "node": ">=4.0"
+ },
+ "peerDependenciesMeta": {
+ "debug": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/form-data": {
+ "version": "4.0.4",
+ "resolved": "https://registry.npmjs.org/form-data/-/form-data-4.0.4.tgz",
+ "integrity": "sha512-KrGhL9Q4zjj0kiUt5OO4Mr/A/jlI2jDYs5eHBpYHPcBEVSiipAvn2Ko2HnPe20rmcuuvMHNdZFp+4IlGTMF0Ow==",
+ "license": "MIT",
+ "dependencies": {
+ "asynckit": "^0.4.0",
+ "combined-stream": "^1.0.8",
+ "es-set-tostringtag": "^2.1.0",
+ "hasown": "^2.0.2",
+ "mime-types": "^2.1.12"
+ },
+ "engines": {
+ "node": ">= 6"
+ }
+ },
+ "node_modules/fsevents": {
+ "version": "2.3.3",
+ "resolved": "https://registry.npmjs.org/fsevents/-/fsevents-2.3.3.tgz",
+ "integrity": "sha512-5xoDfX+fL7faATnagmWPpbFtwh/R77WmMMqqHGS65C3vvB0YHrgF+B1YmZ3441tMj5n63k0212XNoJwzlhffQw==",
+ "dev": true,
+ "hasInstallScript": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "darwin"
+ ],
+ "engines": {
+ "node": "^8.16.0 || ^10.6.0 || >=11.0.0"
+ }
+ },
+ "node_modules/function-bind": {
+ "version": "1.1.2",
+ "resolved": "https://registry.npmjs.org/function-bind/-/function-bind-1.1.2.tgz",
+ "integrity": "sha512-7XHNxH7qX9xG5mIwxkhumTox/MIRNcOgDrxWsMt2pAr23WHp6MrRlN7FBSFpCpr+oVO0F744iUgR82nJMfG2SA==",
+ "license": "MIT",
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/gensync": {
+ "version": "1.0.0-beta.2",
+ "resolved": "https://registry.npmjs.org/gensync/-/gensync-1.0.0-beta.2.tgz",
+ "integrity": "sha512-3hN7NaskYvMDLQY55gnW3NQ+mesEAepTqlg+VEbj7zzqEMBVNhzcGYYeqFo/TlYz6eQiFcp1HcsCZO+nGgS8zg==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/get-caller-file": {
+ "version": "2.0.5",
+ "resolved": "https://registry.npmjs.org/get-caller-file/-/get-caller-file-2.0.5.tgz",
+ "integrity": "sha512-DyFP3BM/3YHTQOCUL/w0OZHR0lpKeGrxotcHWcqNEdnltqFwXVfhEBQ94eIo34AfQpo0rGki4cyIiftY06h2Fg==",
+ "license": "ISC",
+ "engines": {
+ "node": "6.* || 8.* || >= 10.*"
+ }
+ },
+ "node_modules/get-intrinsic": {
+ "version": "1.3.0",
+ "resolved": "https://registry.npmjs.org/get-intrinsic/-/get-intrinsic-1.3.0.tgz",
+ "integrity": "sha512-9fSjSaos/fRIVIp+xSJlE6lfwhES7LNtKaCBIamHsjr2na1BiABJPo0mOjjz8GJDURarmCPGqaiVg5mfjb98CQ==",
+ "license": "MIT",
+ "dependencies": {
+ "call-bind-apply-helpers": "^1.0.2",
+ "es-define-property": "^1.0.1",
+ "es-errors": "^1.3.0",
+ "es-object-atoms": "^1.1.1",
+ "function-bind": "^1.1.2",
+ "get-proto": "^1.0.1",
+ "gopd": "^1.2.0",
+ "has-symbols": "^1.1.0",
+ "hasown": "^2.0.2",
+ "math-intrinsics": "^1.1.0"
+ },
+ "engines": {
+ "node": ">= 0.4"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/get-proto": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/get-proto/-/get-proto-1.0.1.tgz",
+ "integrity": "sha512-sTSfBjoXBp89JvIKIefqw7U2CCebsc74kiY6awiGogKtoSGbgjYE/G/+l9sF3MWFPNc9IcoOC4ODfKHfxFmp0g==",
+ "license": "MIT",
+ "dependencies": {
+ "dunder-proto": "^1.0.1",
+ "es-object-atoms": "^1.0.0"
+ },
+ "engines": {
+ "node": ">= 0.4"
+ }
+ },
+ "node_modules/glob-parent": {
+ "version": "6.0.2",
+ "resolved": "https://registry.npmjs.org/glob-parent/-/glob-parent-6.0.2.tgz",
+ "integrity": "sha512-XxwI8EOhVQgWp6iDL+3b0r86f4d6AX6zSU55HfB4ydCEuXLXc5FcYeOu+nnGftS4TEju/11rt4KJPTMgbfmv4A==",
+ "dev": true,
+ "license": "ISC",
+ "dependencies": {
+ "is-glob": "^4.0.3"
+ },
+ "engines": {
+ "node": ">=10.13.0"
+ }
+ },
+ "node_modules/globals": {
+ "version": "16.3.0",
+ "resolved": "https://registry.npmjs.org/globals/-/globals-16.3.0.tgz",
+ "integrity": "sha512-bqWEnJ1Nt3neqx2q5SFfGS8r/ahumIakg3HcwtNlrVlwXIeNumWn/c7Pn/wKzGhf6SaW6H6uWXLqC30STCMchQ==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=18"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/gopd": {
+ "version": "1.2.0",
+ "resolved": "https://registry.npmjs.org/gopd/-/gopd-1.2.0.tgz",
+ "integrity": "sha512-ZUKRh6/kUFoAiTAtTYPZJ3hw9wNxx+BIBOijnlG9PnrJsCcSjs1wyyD6vJpaYtgnzDrKYRSqf3OO6Rfa93xsRg==",
+ "license": "MIT",
+ "engines": {
+ "node": ">= 0.4"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/has-flag": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/has-flag/-/has-flag-4.0.0.tgz",
+ "integrity": "sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/has-symbols": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/has-symbols/-/has-symbols-1.1.0.tgz",
+ "integrity": "sha512-1cDNdwJ2Jaohmb3sg4OmKaMBwuC48sYni5HUw2DvsC8LjGTLK9h+eb1X6RyuOHe4hT0ULCW68iomhjUoKUqlPQ==",
+ "license": "MIT",
+ "engines": {
+ "node": ">= 0.4"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/has-tostringtag": {
+ "version": "1.0.2",
+ "resolved": "https://registry.npmjs.org/has-tostringtag/-/has-tostringtag-1.0.2.tgz",
+ "integrity": "sha512-NqADB8VjPFLM2V0VvHUewwwsw0ZWBaIdgo+ieHtK3hasLz4qeCRjYcqfB6AQrBggRKppKF8L52/VqdVsO47Dlw==",
+ "license": "MIT",
+ "dependencies": {
+ "has-symbols": "^1.0.3"
+ },
+ "engines": {
+ "node": ">= 0.4"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/hasown": {
+ "version": "2.0.2",
+ "resolved": "https://registry.npmjs.org/hasown/-/hasown-2.0.2.tgz",
+ "integrity": "sha512-0hJU9SCPvmMzIBdZFqNPXWa6dqh7WdH0cII9y+CyS8rG3nL48Bclra9HmKhVVUHyPWNH5Y7xDwAB7bfgSjkUMQ==",
+ "license": "MIT",
+ "dependencies": {
+ "function-bind": "^1.1.2"
+ },
+ "engines": {
+ "node": ">= 0.4"
+ }
+ },
+ "node_modules/http-parser-js": {
+ "version": "0.5.10",
+ "resolved": "https://registry.npmjs.org/http-parser-js/-/http-parser-js-0.5.10.tgz",
+ "integrity": "sha512-Pysuw9XpUq5dVc/2SMHpuTY01RFl8fttgcyunjL7eEMhGM3cI4eOmiCycJDVCo/7O7ClfQD3SaI6ftDzqOXYMA==",
+ "license": "MIT"
+ },
+ "node_modules/idb": {
+ "version": "7.1.1",
+ "resolved": "https://registry.npmjs.org/idb/-/idb-7.1.1.tgz",
+ "integrity": "sha512-gchesWBzyvGHRO9W8tzUWFDycow5gwjvFKfyV9FF32Y7F50yZMp7mP+T2mJIWFx49zicqyC4uefHM17o6xKIVQ==",
+ "license": "ISC"
+ },
+ "node_modules/ignore": {
+ "version": "5.3.2",
+ "resolved": "https://registry.npmjs.org/ignore/-/ignore-5.3.2.tgz",
+ "integrity": "sha512-hsBTNUqQTDwkWtcdYI2i06Y/nUBEsNEDJKjWdigLvegy8kDuJAS8uRlpkkcQpyEXL0Z/pjDy5HBmMjRCJ2gq+g==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">= 4"
+ }
+ },
+ "node_modules/import-fresh": {
+ "version": "3.3.1",
+ "resolved": "https://registry.npmjs.org/import-fresh/-/import-fresh-3.3.1.tgz",
+ "integrity": "sha512-TR3KfrTZTYLPB6jUjfx6MF9WcWrHL9su5TObK4ZkYgBdWKPOFoSoQIdEuTuR82pmtxH2spWG9h6etwfr1pLBqQ==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "parent-module": "^1.0.0",
+ "resolve-from": "^4.0.0"
+ },
+ "engines": {
+ "node": ">=6"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/imurmurhash": {
+ "version": "0.1.4",
+ "resolved": "https://registry.npmjs.org/imurmurhash/-/imurmurhash-0.1.4.tgz",
+ "integrity": "sha512-JmXMZ6wuvDmLiHEml9ykzqO6lwFbof0GG4IkcGaENdCRDDmMVnny7s5HsIgHCbaq0w2MyPhDqkhTUgS2LU2PHA==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=0.8.19"
+ }
+ },
+ "node_modules/is-extglob": {
+ "version": "2.1.1",
+ "resolved": "https://registry.npmjs.org/is-extglob/-/is-extglob-2.1.1.tgz",
+ "integrity": "sha512-SbKbANkN603Vi4jEZv49LeVJMn4yGwsbzZworEoyEiutsN3nJYdbO36zfhGJ6QEDpOZIFkDtnq5JRxmvl3jsoQ==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/is-fullwidth-code-point": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/is-fullwidth-code-point/-/is-fullwidth-code-point-3.0.0.tgz",
+ "integrity": "sha512-zymm5+u+sCsSWyD9qNaejV3DFvhCKclKdizYaJUuHA83RLjb7nSuGnddCHGv0hk+KY7BMAlsWeK4Ueg6EV6XQg==",
+ "license": "MIT",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/is-glob": {
+ "version": "4.0.3",
+ "resolved": "https://registry.npmjs.org/is-glob/-/is-glob-4.0.3.tgz",
+ "integrity": "sha512-xelSayHH36ZgE7ZWhli7pW34hNbNl8Ojv5KVmkJD4hBdD3th8Tfk9vYasLM+mXWOZhFkgZfxhLSnrwRr4elSSg==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "is-extglob": "^2.1.1"
+ },
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/isexe": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/isexe/-/isexe-2.0.0.tgz",
+ "integrity": "sha512-RHxMLp9lnKHGHRng9QFhRCMbYAcVpn69smSGcq3f36xjgVVWThj4qqLbTLlq7Ssj8B+fIQ1EuCEGI2lKsyQeIw==",
+ "dev": true,
+ "license": "ISC"
+ },
+ "node_modules/js-tokens": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/js-tokens/-/js-tokens-4.0.0.tgz",
+ "integrity": "sha512-RdJUflcE3cUzKiMqQgsCu06FPu9UdIJO0beYbPhHN4k6apgJtifcoCtT9bcxOpYBtpD2kCM6Sbzg4CausW/PKQ==",
+ "dev": true,
+ "license": "MIT"
+ },
+ "node_modules/js-yaml": {
+ "version": "4.1.0",
+ "resolved": "https://registry.npmjs.org/js-yaml/-/js-yaml-4.1.0.tgz",
+ "integrity": "sha512-wpxZs9NoxZaJESJGIZTyDEaYpl0FKSA+FB9aJiyemKhMwkxQg63h4T1KJgUGHpTqPDNRcmmYLugrRjJlBtWvRA==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "argparse": "^2.0.1"
+ },
+ "bin": {
+ "js-yaml": "bin/js-yaml.js"
+ }
+ },
+ "node_modules/jsesc": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/jsesc/-/jsesc-3.1.0.tgz",
+ "integrity": "sha512-/sM3dO2FOzXjKQhJuo0Q173wf2KOo8t4I8vHy6lF9poUp7bKT0/NHE8fPX23PwfhnykfqnC2xRxOnVw5XuGIaA==",
+ "dev": true,
+ "license": "MIT",
+ "bin": {
+ "jsesc": "bin/jsesc"
+ },
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/json-buffer": {
+ "version": "3.0.1",
+ "resolved": "https://registry.npmjs.org/json-buffer/-/json-buffer-3.0.1.tgz",
+ "integrity": "sha512-4bV5BfR2mqfQTJm+V5tPPdf+ZpuhiIvTuAB5g8kcrXOZpTT/QwwVRWBywX1ozr6lEuPdbHxwaJlm9G6mI2sfSQ==",
+ "dev": true,
+ "license": "MIT"
+ },
+ "node_modules/json-schema-traverse": {
+ "version": "0.4.1",
+ "resolved": "https://registry.npmjs.org/json-schema-traverse/-/json-schema-traverse-0.4.1.tgz",
+ "integrity": "sha512-xbbCH5dCYU5T8LcEhhuh7HJ88HXuW3qsI3Y0zOZFKfZEHcpWiHU/Jxzk629Brsab/mMiHQti9wMP+845RPe3Vg==",
+ "dev": true,
+ "license": "MIT"
+ },
+ "node_modules/json-stable-stringify-without-jsonify": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/json-stable-stringify-without-jsonify/-/json-stable-stringify-without-jsonify-1.0.1.tgz",
+ "integrity": "sha512-Bdboy+l7tA3OGW6FjyFHWkP5LuByj1Tk33Ljyq0axyzdk9//JSi2u3fP1QSmd1KNwq6VOKYGlAu87CisVir6Pw==",
+ "dev": true,
+ "license": "MIT"
+ },
+ "node_modules/json5": {
+ "version": "2.2.3",
+ "resolved": "https://registry.npmjs.org/json5/-/json5-2.2.3.tgz",
+ "integrity": "sha512-XmOWe7eyHYH14cLdVPoyg+GOH3rYX++KpzrylJwSW98t3Nk+U8XOl8FWKOgwtzdb8lXGf6zYwDUzeHMWfxasyg==",
+ "dev": true,
+ "license": "MIT",
+ "bin": {
+ "json5": "lib/cli.js"
+ },
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/keyv": {
+ "version": "4.5.4",
+ "resolved": "https://registry.npmjs.org/keyv/-/keyv-4.5.4.tgz",
+ "integrity": "sha512-oxVHkHR/EJf2CNXnWxRLW6mg7JyCCUcG0DtEGmL2ctUo1PNTin1PUil+r/+4r5MpVgC/fn1kjsx7mjSujKqIpw==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "json-buffer": "3.0.1"
+ }
+ },
+ "node_modules/levn": {
+ "version": "0.4.1",
+ "resolved": "https://registry.npmjs.org/levn/-/levn-0.4.1.tgz",
+ "integrity": "sha512-+bT2uH4E5LGE7h/n3evcS/sQlJXCpIp6ym8OWJ5eV6+67Dsql/LaaT7qJBAt2rzfoa/5QBGBhxDix1dMt2kQKQ==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "prelude-ls": "^1.2.1",
+ "type-check": "~0.4.0"
+ },
+ "engines": {
+ "node": ">= 0.8.0"
+ }
+ },
+ "node_modules/locate-path": {
+ "version": "6.0.0",
+ "resolved": "https://registry.npmjs.org/locate-path/-/locate-path-6.0.0.tgz",
+ "integrity": "sha512-iPZK6eYjbxRu3uB4/WZ3EsEIMJFMqAoopl3R+zuq0UjcAm/MO6KCweDgPfP3elTztoKP3KtnVHxTn2NHBSDVUw==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "p-locate": "^5.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/lodash.camelcase": {
+ "version": "4.3.0",
+ "resolved": "https://registry.npmjs.org/lodash.camelcase/-/lodash.camelcase-4.3.0.tgz",
+ "integrity": "sha512-TwuEnCnxbc3rAvhf/LbG7tJUDzhqXyFnv3dtzLOPgCG/hODL7WFnsbwktkD7yUV0RrreP/l1PALq/YSg6VvjlA==",
+ "license": "MIT"
+ },
+ "node_modules/lodash.merge": {
+ "version": "4.6.2",
+ "resolved": "https://registry.npmjs.org/lodash.merge/-/lodash.merge-4.6.2.tgz",
+ "integrity": "sha512-0KpjqXRVvrYyCsX1swR/XTK0va6VQkQM6MNo7PqW77ByjAhoARA8EfrP1N4+KlKj8YS0ZUCtRT/YUuhyYDujIQ==",
+ "dev": true,
+ "license": "MIT"
+ },
+ "node_modules/long": {
+ "version": "5.3.2",
+ "resolved": "https://registry.npmjs.org/long/-/long-5.3.2.tgz",
+ "integrity": "sha512-mNAgZ1GmyNhD7AuqnTG3/VQ26o760+ZYBPKjPvugO8+nLbYfX6TVpJPseBvopbdY+qpZ/lKUnmEc1LeZYS3QAA==",
+ "license": "Apache-2.0"
+ },
+ "node_modules/lru-cache": {
+ "version": "5.1.1",
+ "resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-5.1.1.tgz",
+ "integrity": "sha512-KpNARQA3Iwv+jTA0utUVVbrh+Jlrr1Fv0e56GGzAFOXN7dk/FviaDW8LHmK52DlcH4WP2n6gI8vN1aesBFgo9w==",
+ "dev": true,
+ "license": "ISC",
+ "dependencies": {
+ "yallist": "^3.0.2"
+ }
+ },
+ "node_modules/lucide-react": {
+ "version": "0.544.0",
+ "resolved": "https://registry.npmjs.org/lucide-react/-/lucide-react-0.544.0.tgz",
+ "integrity": "sha512-t5tS44bqd825zAW45UQxpG2CvcC4urOwn2TrwSH8u+MjeE+1NnWl6QqeQ/6NdjMqdOygyiT9p3Ev0p1NJykxjw==",
+ "license": "ISC",
+ "peerDependencies": {
+ "react": "^16.5.1 || ^17.0.0 || ^18.0.0 || ^19.0.0"
+ }
+ },
+ "node_modules/math-intrinsics": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/math-intrinsics/-/math-intrinsics-1.1.0.tgz",
+ "integrity": "sha512-/IXtbwEk5HTPyEwyKX6hGkYXxM9nbj64B+ilVJnC/R6B0pH5G4V3b0pVbL7DBj4tkhBAppbQUlf6F6Xl9LHu1g==",
+ "license": "MIT",
+ "engines": {
+ "node": ">= 0.4"
+ }
+ },
+ "node_modules/mime-db": {
+ "version": "1.52.0",
+ "resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.52.0.tgz",
+ "integrity": "sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg==",
+ "license": "MIT",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/mime-types": {
+ "version": "2.1.35",
+ "resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.35.tgz",
+ "integrity": "sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw==",
+ "license": "MIT",
+ "dependencies": {
+ "mime-db": "1.52.0"
+ },
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/minimatch": {
+ "version": "3.1.2",
+ "resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.1.2.tgz",
+ "integrity": "sha512-J7p63hRiAjw1NDEww1W7i37+ByIrOWO5XQQAzZ3VOcL0PNybwpfmV/N05zFAzwQ9USyEcX6t3UO+K5aqBQOIHw==",
+ "dev": true,
+ "license": "ISC",
+ "dependencies": {
+ "brace-expansion": "^1.1.7"
+ },
+ "engines": {
+ "node": "*"
+ }
+ },
+ "node_modules/ms": {
+ "version": "2.1.3",
+ "resolved": "https://registry.npmjs.org/ms/-/ms-2.1.3.tgz",
+ "integrity": "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==",
+ "dev": true,
+ "license": "MIT"
+ },
+ "node_modules/nanoid": {
+ "version": "3.3.11",
+ "resolved": "https://registry.npmjs.org/nanoid/-/nanoid-3.3.11.tgz",
+ "integrity": "sha512-N8SpfPUnUp1bK+PMYW8qSWdl9U+wwNWI4QKxOYDy9JAro3WMX7p2OeVRF9v+347pnakNevPmiHhNmZ2HbFA76w==",
+ "dev": true,
+ "funding": [
+ {
+ "type": "github",
+ "url": "https://github.com/sponsors/ai"
+ }
+ ],
+ "license": "MIT",
+ "bin": {
+ "nanoid": "bin/nanoid.cjs"
+ },
+ "engines": {
+ "node": "^10 || ^12 || ^13.7 || ^14 || >=15.0.1"
+ }
+ },
+ "node_modules/natural-compare": {
+ "version": "1.4.0",
+ "resolved": "https://registry.npmjs.org/natural-compare/-/natural-compare-1.4.0.tgz",
+ "integrity": "sha512-OWND8ei3VtNC9h7V60qff3SVobHr996CTwgxubgyQYEpg290h9J0buyECNNJexkFm5sOajh5G116RYA1c8ZMSw==",
+ "dev": true,
+ "license": "MIT"
+ },
+ "node_modules/node-releases": {
+ "version": "2.0.19",
+ "resolved": "https://registry.npmjs.org/node-releases/-/node-releases-2.0.19.tgz",
+ "integrity": "sha512-xxOWJsBKtzAq7DY0J+DTzuz58K8e7sJbdgwkbMWQe8UYB6ekmsQ45q0M/tJDsGaZmbC+l7n57UV8Hl5tHxO9uw==",
+ "dev": true,
+ "license": "MIT"
+ },
+ "node_modules/optionator": {
+ "version": "0.9.4",
+ "resolved": "https://registry.npmjs.org/optionator/-/optionator-0.9.4.tgz",
+ "integrity": "sha512-6IpQ7mKUxRcZNLIObR0hz7lxsapSSIYNZJwXPGeF0mTVqGKFIXj1DQcMoT22S3ROcLyY/rz0PWaWZ9ayWmad9g==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "deep-is": "^0.1.3",
+ "fast-levenshtein": "^2.0.6",
+ "levn": "^0.4.1",
+ "prelude-ls": "^1.2.1",
+ "type-check": "^0.4.0",
+ "word-wrap": "^1.2.5"
+ },
+ "engines": {
+ "node": ">= 0.8.0"
+ }
+ },
+ "node_modules/p-limit": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/p-limit/-/p-limit-3.1.0.tgz",
+ "integrity": "sha512-TYOanM3wGwNGsZN2cVTYPArw454xnXj5qmWF1bEoAc4+cU/ol7GVh7odevjp1FNHduHc3KZMcFduxU5Xc6uJRQ==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "yocto-queue": "^0.1.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/p-locate": {
+ "version": "5.0.0",
+ "resolved": "https://registry.npmjs.org/p-locate/-/p-locate-5.0.0.tgz",
+ "integrity": "sha512-LaNjtRWUBY++zB5nE/NwcaoMylSPk+S+ZHNB1TzdbMJMny6dynpAGt7X/tl/QYq3TIeE6nxHppbo2LGymrG5Pw==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "p-limit": "^3.0.2"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/parent-module": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/parent-module/-/parent-module-1.0.1.tgz",
+ "integrity": "sha512-GQ2EWRpQV8/o+Aw8YqtfZZPfNRWZYkbidE9k5rpl/hC3vtHHBfGm2Ifi6qWV+coDGkrUKZAxE3Lot5kcsRlh+g==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "callsites": "^3.0.0"
+ },
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/path-exists": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/path-exists/-/path-exists-4.0.0.tgz",
+ "integrity": "sha512-ak9Qy5Q7jYb2Wwcey5Fpvg2KoAc/ZIhLSLOSBmRmygPsGwkVVt0fZa0qrtMz+m6tJTAHfZQ8FnmB4MG4LWy7/w==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/path-key": {
+ "version": "3.1.1",
+ "resolved": "https://registry.npmjs.org/path-key/-/path-key-3.1.1.tgz",
+ "integrity": "sha512-ojmeN0qd+y0jszEtoY48r0Peq5dwMEkIlCOu6Q5f41lfkswXuKtYrhgoTpLnyIcHm24Uhqx+5Tqm2InSwLhE6Q==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/picocolors": {
+ "version": "1.1.1",
+ "resolved": "https://registry.npmjs.org/picocolors/-/picocolors-1.1.1.tgz",
+ "integrity": "sha512-xceH2snhtb5M9liqDsmEw56le376mTZkEX/jEb/RxNFyegNul7eNslCXP9FDj/Lcu0X8KEyMceP2ntpaHrDEVA==",
+ "dev": true,
+ "license": "ISC"
+ },
+ "node_modules/picomatch": {
+ "version": "4.0.3",
+ "resolved": "https://registry.npmjs.org/picomatch/-/picomatch-4.0.3.tgz",
+ "integrity": "sha512-5gTmgEY/sqK6gFXLIsQNH19lWb4ebPDLA4SdLP7dsWkIXHWlG66oPuVvXSGFPppYZz8ZDZq0dYYrbHfBCVUb1Q==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=12"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/jonschlinkert"
+ }
+ },
+ "node_modules/postcss": {
+ "version": "8.5.6",
+ "resolved": "https://registry.npmjs.org/postcss/-/postcss-8.5.6.tgz",
+ "integrity": "sha512-3Ybi1tAuwAP9s0r1UQ2J4n5Y0G05bJkpUIO0/bI9MhwmD70S5aTWbXGBwxHrelT+XM1k6dM0pk+SwNkpTRN7Pg==",
+ "dev": true,
+ "funding": [
+ {
+ "type": "opencollective",
+ "url": "https://opencollective.com/postcss/"
+ },
+ {
+ "type": "tidelift",
+ "url": "https://tidelift.com/funding/github/npm/postcss"
+ },
+ {
+ "type": "github",
+ "url": "https://github.com/sponsors/ai"
+ }
+ ],
+ "license": "MIT",
+ "dependencies": {
+ "nanoid": "^3.3.11",
+ "picocolors": "^1.1.1",
+ "source-map-js": "^1.2.1"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14"
+ }
+ },
+ "node_modules/prelude-ls": {
+ "version": "1.2.1",
+ "resolved": "https://registry.npmjs.org/prelude-ls/-/prelude-ls-1.2.1.tgz",
+ "integrity": "sha512-vkcDPrRZo1QZLbn5RLGPpg/WmIQ65qoWWhcGKf/b5eplkkarX0m9z8ppCat4mlOqUsWpyNuYgO3VRyrYHSzX5g==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">= 0.8.0"
+ }
+ },
+ "node_modules/protobufjs": {
+ "version": "7.5.4",
+ "resolved": "https://registry.npmjs.org/protobufjs/-/protobufjs-7.5.4.tgz",
+ "integrity": "sha512-CvexbZtbov6jW2eXAvLukXjXUW1TzFaivC46BpWc/3BpcCysb5Vffu+B3XHMm8lVEuy2Mm4XGex8hBSg1yapPg==",
+ "hasInstallScript": true,
+ "license": "BSD-3-Clause",
+ "dependencies": {
+ "@protobufjs/aspromise": "^1.1.2",
+ "@protobufjs/base64": "^1.1.2",
+ "@protobufjs/codegen": "^2.0.4",
+ "@protobufjs/eventemitter": "^1.1.0",
+ "@protobufjs/fetch": "^1.1.0",
+ "@protobufjs/float": "^1.0.2",
+ "@protobufjs/inquire": "^1.1.0",
+ "@protobufjs/path": "^1.1.2",
+ "@protobufjs/pool": "^1.1.0",
+ "@protobufjs/utf8": "^1.1.0",
+ "@types/node": ">=13.7.0",
+ "long": "^5.0.0"
+ },
+ "engines": {
+ "node": ">=12.0.0"
+ }
+ },
+ "node_modules/proxy-from-env": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/proxy-from-env/-/proxy-from-env-1.1.0.tgz",
+ "integrity": "sha512-D+zkORCbA9f1tdWRK0RaCR3GPv50cMxcrz4X8k5LTSUD1Dkw47mKJEZQNunItRTkWwgtaUSo1RVFRIG9ZXiFYg==",
+ "license": "MIT"
+ },
+ "node_modules/punycode": {
+ "version": "2.3.1",
+ "resolved": "https://registry.npmjs.org/punycode/-/punycode-2.3.1.tgz",
+ "integrity": "sha512-vYt7UD1U9Wg6138shLtLOvdAu+8DsC/ilFtEVHcH+wydcSpNE20AfSOduf6MkRFahL5FY7X1oU7nKVZFtfq8Fg==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/react": {
+ "version": "19.1.1",
+ "resolved": "https://registry.npmjs.org/react/-/react-19.1.1.tgz",
+ "integrity": "sha512-w8nqGImo45dmMIfljjMwOGtbmC/mk4CMYhWIicdSflH91J9TyCyczcPFXJzrZ/ZXcgGRFeP6BU0BEJTw6tZdfQ==",
+ "license": "MIT",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/react-chartjs-2": {
+ "version": "5.3.0",
+ "resolved": "https://registry.npmjs.org/react-chartjs-2/-/react-chartjs-2-5.3.0.tgz",
+ "integrity": "sha512-UfZZFnDsERI3c3CZGxzvNJd02SHjaSJ8kgW1djn65H1KK8rehwTjyrRKOG3VTMG8wtHZ5rgAO5oTHtHi9GCCmw==",
+ "license": "MIT",
+ "peerDependencies": {
+ "chart.js": "^4.1.1",
+ "react": "^16.8.0 || ^17.0.0 || ^18.0.0 || ^19.0.0"
+ }
+ },
+ "node_modules/react-dom": {
+ "version": "19.1.1",
+ "resolved": "https://registry.npmjs.org/react-dom/-/react-dom-19.1.1.tgz",
+ "integrity": "sha512-Dlq/5LAZgF0Gaz6yiqZCf6VCcZs1ghAJyrsu84Q/GT0gV+mCxbfmKNoGRKBYMJ8IEdGPqu49YWXD02GCknEDkw==",
+ "license": "MIT",
+ "dependencies": {
+ "scheduler": "^0.26.0"
+ },
+ "peerDependencies": {
+ "react": "^19.1.1"
+ }
+ },
+ "node_modules/react-refresh": {
+ "version": "0.17.0",
+ "resolved": "https://registry.npmjs.org/react-refresh/-/react-refresh-0.17.0.tgz",
+ "integrity": "sha512-z6F7K9bV85EfseRCp2bzrpyQ0Gkw1uLoCel9XBVWPg/TjRj94SkJzUTGfOa4bs7iJvBWtQG0Wq7wnI0syw3EBQ==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/react-router": {
+ "version": "7.8.2",
+ "resolved": "https://registry.npmjs.org/react-router/-/react-router-7.8.2.tgz",
+ "integrity": "sha512-7M2fR1JbIZ/jFWqelpvSZx+7vd7UlBTfdZqf6OSdF9g6+sfdqJDAWcak6ervbHph200ePlu+7G8LdoiC3ReyAQ==",
+ "license": "MIT",
+ "dependencies": {
+ "cookie": "^1.0.1",
+ "set-cookie-parser": "^2.6.0"
+ },
+ "engines": {
+ "node": ">=20.0.0"
+ },
+ "peerDependencies": {
+ "react": ">=18",
+ "react-dom": ">=18"
+ },
+ "peerDependenciesMeta": {
+ "react-dom": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/react-router-dom": {
+ "version": "7.8.2",
+ "resolved": "https://registry.npmjs.org/react-router-dom/-/react-router-dom-7.8.2.tgz",
+ "integrity": "sha512-Z4VM5mKDipal2jQ385H6UBhiiEDlnJPx6jyWsTYoZQdl5TrjxEV2a9yl3Fi60NBJxYzOTGTTHXPi0pdizvTwow==",
+ "license": "MIT",
+ "dependencies": {
+ "react-router": "7.8.2"
+ },
+ "engines": {
+ "node": ">=20.0.0"
+ },
+ "peerDependencies": {
+ "react": ">=18",
+ "react-dom": ">=18"
+ }
+ },
+ "node_modules/require-directory": {
+ "version": "2.1.1",
+ "resolved": "https://registry.npmjs.org/require-directory/-/require-directory-2.1.1.tgz",
+ "integrity": "sha512-fGxEI7+wsG9xrvdjsrlmL22OMTTiHRwAMroiEeMgq8gzoLC/PQr7RsRDSTLUg/bZAZtF+TVIkHc6/4RIKrui+Q==",
+ "license": "MIT",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/resolve-from": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/resolve-from/-/resolve-from-4.0.0.tgz",
+ "integrity": "sha512-pb/MYmXstAkysRFx8piNI1tGFNQIFA3vkE3Gq4EuA1dF6gHp/+vgZqsCGJapvy8N3Q+4o7FwvquPJcnZ7RYy4g==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/rollup": {
+ "version": "4.49.0",
+ "resolved": "https://registry.npmjs.org/rollup/-/rollup-4.49.0.tgz",
+ "integrity": "sha512-3IVq0cGJ6H7fKXXEdVt+RcYvRCt8beYY9K1760wGQwSAHZcS9eot1zDG5axUbcp/kWRi5zKIIDX8MoKv/TzvZA==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "@types/estree": "1.0.8"
+ },
+ "bin": {
+ "rollup": "dist/bin/rollup"
+ },
+ "engines": {
+ "node": ">=18.0.0",
+ "npm": ">=8.0.0"
+ },
+ "optionalDependencies": {
+ "@rollup/rollup-android-arm-eabi": "4.49.0",
+ "@rollup/rollup-android-arm64": "4.49.0",
+ "@rollup/rollup-darwin-arm64": "4.49.0",
+ "@rollup/rollup-darwin-x64": "4.49.0",
+ "@rollup/rollup-freebsd-arm64": "4.49.0",
+ "@rollup/rollup-freebsd-x64": "4.49.0",
+ "@rollup/rollup-linux-arm-gnueabihf": "4.49.0",
+ "@rollup/rollup-linux-arm-musleabihf": "4.49.0",
+ "@rollup/rollup-linux-arm64-gnu": "4.49.0",
+ "@rollup/rollup-linux-arm64-musl": "4.49.0",
+ "@rollup/rollup-linux-loongarch64-gnu": "4.49.0",
+ "@rollup/rollup-linux-ppc64-gnu": "4.49.0",
+ "@rollup/rollup-linux-riscv64-gnu": "4.49.0",
+ "@rollup/rollup-linux-riscv64-musl": "4.49.0",
+ "@rollup/rollup-linux-s390x-gnu": "4.49.0",
+ "@rollup/rollup-linux-x64-gnu": "4.49.0",
+ "@rollup/rollup-linux-x64-musl": "4.49.0",
+ "@rollup/rollup-win32-arm64-msvc": "4.49.0",
+ "@rollup/rollup-win32-ia32-msvc": "4.49.0",
+ "@rollup/rollup-win32-x64-msvc": "4.49.0",
+ "fsevents": "~2.3.2"
+ }
+ },
+ "node_modules/safe-buffer": {
+ "version": "5.2.1",
+ "resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.2.1.tgz",
+ "integrity": "sha512-rp3So07KcdmmKbGvgaNxQSJr7bGVSVk5S9Eq1F+ppbRo70+YeaDxkw5Dd8NPN+GD6bjnYm2VuPuCXmpuYvmCXQ==",
+ "funding": [
+ {
+ "type": "github",
+ "url": "https://github.com/sponsors/feross"
+ },
+ {
+ "type": "patreon",
+ "url": "https://www.patreon.com/feross"
+ },
+ {
+ "type": "consulting",
+ "url": "https://feross.org/support"
+ }
+ ],
+ "license": "MIT"
+ },
+ "node_modules/scheduler": {
+ "version": "0.26.0",
+ "resolved": "https://registry.npmjs.org/scheduler/-/scheduler-0.26.0.tgz",
+ "integrity": "sha512-NlHwttCI/l5gCPR3D1nNXtWABUmBwvZpEQiD4IXSbIDq8BzLIK/7Ir5gTFSGZDUu37K5cMNp0hFtzO38sC7gWA==",
+ "license": "MIT"
+ },
+ "node_modules/semver": {
+ "version": "6.3.1",
+ "resolved": "https://registry.npmjs.org/semver/-/semver-6.3.1.tgz",
+ "integrity": "sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA==",
+ "dev": true,
+ "license": "ISC",
+ "bin": {
+ "semver": "bin/semver.js"
+ }
+ },
+ "node_modules/set-cookie-parser": {
+ "version": "2.7.1",
+ "resolved": "https://registry.npmjs.org/set-cookie-parser/-/set-cookie-parser-2.7.1.tgz",
+ "integrity": "sha512-IOc8uWeOZgnb3ptbCURJWNjWUPcO3ZnTTdzsurqERrP6nPyv+paC55vJM0LpOlT2ne+Ix+9+CRG1MNLlyZ4GjQ==",
+ "license": "MIT"
+ },
+ "node_modules/shebang-command": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/shebang-command/-/shebang-command-2.0.0.tgz",
+ "integrity": "sha512-kHxr2zZpYtdmrN1qDjrrX/Z1rR1kG8Dx+gkpK1G4eXmvXswmcE1hTWBWYUzlraYw1/yZp6YuDY77YtvbN0dmDA==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "shebang-regex": "^3.0.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/shebang-regex": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/shebang-regex/-/shebang-regex-3.0.0.tgz",
+ "integrity": "sha512-7++dFhtcx3353uBaq8DDR4NuxBetBzC7ZQOhmTQInHEd6bSrXdiEyzCvG07Z44UYdLShWUyXt5M/yhz8ekcb1A==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/source-map-js": {
+ "version": "1.2.1",
+ "resolved": "https://registry.npmjs.org/source-map-js/-/source-map-js-1.2.1.tgz",
+ "integrity": "sha512-UXWMKhLOwVKb728IUtQPXxfYU+usdybtUrK/8uGE8CQMvrhOpwvzDBwj0QhSL7MQc7vIsISBG8VQ8+IDQxpfQA==",
+ "dev": true,
+ "license": "BSD-3-Clause",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/string-width": {
+ "version": "4.2.3",
+ "resolved": "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz",
+ "integrity": "sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==",
+ "license": "MIT",
+ "dependencies": {
+ "emoji-regex": "^8.0.0",
+ "is-fullwidth-code-point": "^3.0.0",
+ "strip-ansi": "^6.0.1"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/strip-ansi": {
+ "version": "6.0.1",
+ "resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-6.0.1.tgz",
+ "integrity": "sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==",
+ "license": "MIT",
+ "dependencies": {
+ "ansi-regex": "^5.0.1"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/strip-json-comments": {
+ "version": "3.1.1",
+ "resolved": "https://registry.npmjs.org/strip-json-comments/-/strip-json-comments-3.1.1.tgz",
+ "integrity": "sha512-6fPc+R4ihwqP6N/aIv2f1gMH8lOVtWQHoqC4yK6oSDVVocumAsfCqjkXnqiYMhmMwS/mEHLp7Vehlt3ql6lEig==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=8"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/supports-color": {
+ "version": "7.2.0",
+ "resolved": "https://registry.npmjs.org/supports-color/-/supports-color-7.2.0.tgz",
+ "integrity": "sha512-qpCAvRl9stuOHveKsn7HncJRvv501qIacKzQlO/+Lwxc9+0q2wLyv4Dfvt80/DPn2pqOBsJdDiogXGR9+OvwRw==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "has-flag": "^4.0.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/tinyglobby": {
+ "version": "0.2.14",
+ "resolved": "https://registry.npmjs.org/tinyglobby/-/tinyglobby-0.2.14.tgz",
+ "integrity": "sha512-tX5e7OM1HnYr2+a2C/4V0htOcSQcoSTH9KgJnVvNm5zm/cyEWKJ7j7YutsH9CxMdtOkkLFy2AHrMci9IM8IPZQ==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "fdir": "^6.4.4",
+ "picomatch": "^4.0.2"
+ },
+ "engines": {
+ "node": ">=12.0.0"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/SuperchupuDev"
+ }
+ },
+ "node_modules/tslib": {
+ "version": "2.8.1",
+ "resolved": "https://registry.npmjs.org/tslib/-/tslib-2.8.1.tgz",
+ "integrity": "sha512-oJFu94HQb+KVduSUQL7wnpmqnfmLsOA/nAh6b6EH0wCEoK0/mPeXU6c3wKDV83MkOuHPRHtSXKKU99IBazS/2w==",
+ "license": "0BSD"
+ },
+ "node_modules/type-check": {
+ "version": "0.4.0",
+ "resolved": "https://registry.npmjs.org/type-check/-/type-check-0.4.0.tgz",
+ "integrity": "sha512-XleUoc9uwGXqjWwXaUTZAmzMcFZ5858QA2vvx1Ur5xIcixXIP+8LnFDgRplU30us6teqdlskFfu+ae4K79Ooew==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "prelude-ls": "^1.2.1"
+ },
+ "engines": {
+ "node": ">= 0.8.0"
+ }
+ },
+ "node_modules/undici-types": {
+ "version": "7.10.0",
+ "resolved": "https://registry.npmjs.org/undici-types/-/undici-types-7.10.0.tgz",
+ "integrity": "sha512-t5Fy/nfn+14LuOc2KNYg75vZqClpAiqscVvMygNnlsHBFpSXdJaYtXMcdNLpl/Qvc3P2cB3s6lOV51nqsFq4ag==",
+ "license": "MIT"
+ },
+ "node_modules/update-browserslist-db": {
+ "version": "1.1.3",
+ "resolved": "https://registry.npmjs.org/update-browserslist-db/-/update-browserslist-db-1.1.3.tgz",
+ "integrity": "sha512-UxhIZQ+QInVdunkDAaiazvvT/+fXL5Osr0JZlJulepYu6Jd7qJtDZjlur0emRlT71EN3ScPoE7gvsuIKKNavKw==",
+ "dev": true,
+ "funding": [
+ {
+ "type": "opencollective",
+ "url": "https://opencollective.com/browserslist"
+ },
+ {
+ "type": "tidelift",
+ "url": "https://tidelift.com/funding/github/npm/browserslist"
+ },
+ {
+ "type": "github",
+ "url": "https://github.com/sponsors/ai"
+ }
+ ],
+ "license": "MIT",
+ "dependencies": {
+ "escalade": "^3.2.0",
+ "picocolors": "^1.1.1"
+ },
+ "bin": {
+ "update-browserslist-db": "cli.js"
+ },
+ "peerDependencies": {
+ "browserslist": ">= 4.21.0"
+ }
+ },
+ "node_modules/uri-js": {
+ "version": "4.4.1",
+ "resolved": "https://registry.npmjs.org/uri-js/-/uri-js-4.4.1.tgz",
+ "integrity": "sha512-7rKUyy33Q1yc98pQ1DAmLtwX109F7TIfWlW1Ydo8Wl1ii1SeHieeh0HHfPeL2fMXK6z0s8ecKs9frCuLJvndBg==",
+ "dev": true,
+ "license": "BSD-2-Clause",
+ "dependencies": {
+ "punycode": "^2.1.0"
+ }
+ },
+ "node_modules/vite": {
+ "version": "7.1.3",
+ "resolved": "https://registry.npmjs.org/vite/-/vite-7.1.3.tgz",
+ "integrity": "sha512-OOUi5zjkDxYrKhTV3V7iKsoS37VUM7v40+HuwEmcrsf11Cdx9y3DIr2Px6liIcZFwt3XSRpQvFpL3WVy7ApkGw==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "esbuild": "^0.25.0",
+ "fdir": "^6.5.0",
+ "picomatch": "^4.0.3",
+ "postcss": "^8.5.6",
+ "rollup": "^4.43.0",
+ "tinyglobby": "^0.2.14"
+ },
+ "bin": {
+ "vite": "bin/vite.js"
+ },
+ "engines": {
+ "node": "^20.19.0 || >=22.12.0"
+ },
+ "funding": {
+ "url": "https://github.com/vitejs/vite?sponsor=1"
+ },
+ "optionalDependencies": {
+ "fsevents": "~2.3.3"
+ },
+ "peerDependencies": {
+ "@types/node": "^20.19.0 || >=22.12.0",
+ "jiti": ">=1.21.0",
+ "less": "^4.0.0",
+ "lightningcss": "^1.21.0",
+ "sass": "^1.70.0",
+ "sass-embedded": "^1.70.0",
+ "stylus": ">=0.54.8",
+ "sugarss": "^5.0.0",
+ "terser": "^5.16.0",
+ "tsx": "^4.8.1",
+ "yaml": "^2.4.2"
+ },
+ "peerDependenciesMeta": {
+ "@types/node": {
+ "optional": true
+ },
+ "jiti": {
+ "optional": true
+ },
+ "less": {
+ "optional": true
+ },
+ "lightningcss": {
+ "optional": true
+ },
+ "sass": {
+ "optional": true
+ },
+ "sass-embedded": {
+ "optional": true
+ },
+ "stylus": {
+ "optional": true
+ },
+ "sugarss": {
+ "optional": true
+ },
+ "terser": {
+ "optional": true
+ },
+ "tsx": {
+ "optional": true
+ },
+ "yaml": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/web-vitals": {
+ "version": "4.2.4",
+ "resolved": "https://registry.npmjs.org/web-vitals/-/web-vitals-4.2.4.tgz",
+ "integrity": "sha512-r4DIlprAGwJ7YM11VZp4R884m0Vmgr6EAKe3P+kO0PPj3Unqyvv59rczf6UiGcb9Z8QxZVcqKNwv/g0WNdWwsw==",
+ "license": "Apache-2.0"
+ },
+ "node_modules/websocket-driver": {
+ "version": "0.7.4",
+ "resolved": "https://registry.npmjs.org/websocket-driver/-/websocket-driver-0.7.4.tgz",
+ "integrity": "sha512-b17KeDIQVjvb0ssuSDF2cYXSg2iztliJ4B9WdsuB6J952qCPKmnVq4DyW5motImXHDC1cBT/1UezrJVsKw5zjg==",
+ "license": "Apache-2.0",
+ "dependencies": {
+ "http-parser-js": ">=0.5.1",
+ "safe-buffer": ">=5.1.0",
+ "websocket-extensions": ">=0.1.1"
+ },
+ "engines": {
+ "node": ">=0.8.0"
+ }
+ },
+ "node_modules/websocket-extensions": {
+ "version": "0.1.4",
+ "resolved": "https://registry.npmjs.org/websocket-extensions/-/websocket-extensions-0.1.4.tgz",
+ "integrity": "sha512-OqedPIGOfsDlo31UNwYbCFMSaO9m9G/0faIHj5/dZFDMFqPTcx6UwqyOy3COEaEOg/9VsGIpdqn62W5KhoKSpg==",
+ "license": "Apache-2.0",
+ "engines": {
+ "node": ">=0.8.0"
+ }
+ },
+ "node_modules/which": {
+ "version": "2.0.2",
+ "resolved": "https://registry.npmjs.org/which/-/which-2.0.2.tgz",
+ "integrity": "sha512-BLI3Tl1TW3Pvl70l3yq3Y64i+awpwXqsGBYWkkqMtnbXgrMD+yj7rhW0kuEDxzJaYXGjEW5ogapKNMEKNMjibA==",
+ "dev": true,
+ "license": "ISC",
+ "dependencies": {
+ "isexe": "^2.0.0"
+ },
+ "bin": {
+ "node-which": "bin/node-which"
+ },
+ "engines": {
+ "node": ">= 8"
+ }
+ },
+ "node_modules/word-wrap": {
+ "version": "1.2.5",
+ "resolved": "https://registry.npmjs.org/word-wrap/-/word-wrap-1.2.5.tgz",
+ "integrity": "sha512-BN22B5eaMMI9UMtjrGd5g5eCYPpCPDUy0FJXbYsaT5zYxjFOckS53SQDE3pWkVoWpHXVb3BrYcEN4Twa55B5cA==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/wrap-ansi": {
+ "version": "7.0.0",
+ "resolved": "https://registry.npmjs.org/wrap-ansi/-/wrap-ansi-7.0.0.tgz",
+ "integrity": "sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q==",
+ "license": "MIT",
+ "dependencies": {
+ "ansi-styles": "^4.0.0",
+ "string-width": "^4.1.0",
+ "strip-ansi": "^6.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/chalk/wrap-ansi?sponsor=1"
+ }
+ },
+ "node_modules/y18n": {
+ "version": "5.0.8",
+ "resolved": "https://registry.npmjs.org/y18n/-/y18n-5.0.8.tgz",
+ "integrity": "sha512-0pfFzegeDWJHJIAmTLRP2DwHjdF5s7jo9tuztdQxAhINCdvS+3nGINqPd00AphqJR/0LhANUS6/+7SCb98YOfA==",
+ "license": "ISC",
+ "engines": {
+ "node": ">=10"
+ }
+ },
+ "node_modules/yallist": {
+ "version": "3.1.1",
+ "resolved": "https://registry.npmjs.org/yallist/-/yallist-3.1.1.tgz",
+ "integrity": "sha512-a4UGQaWPH59mOXUYnAG2ewncQS4i4F43Tv3JoAM+s2VDAmS9NsK8GpDMLrCHPksFT7h3K6TOoUNn2pb7RoXx4g==",
+ "dev": true,
+ "license": "ISC"
+ },
+ "node_modules/yargs": {
+ "version": "17.7.2",
+ "resolved": "https://registry.npmjs.org/yargs/-/yargs-17.7.2.tgz",
+ "integrity": "sha512-7dSzzRQ++CKnNI/krKnYRV7JKKPUXMEh61soaHKg9mrWEhzFWhFnxPxGl+69cD1Ou63C13NUPCnmIcrvqCuM6w==",
+ "license": "MIT",
+ "dependencies": {
+ "cliui": "^8.0.1",
+ "escalade": "^3.1.1",
+ "get-caller-file": "^2.0.5",
+ "require-directory": "^2.1.1",
+ "string-width": "^4.2.3",
+ "y18n": "^5.0.5",
+ "yargs-parser": "^21.1.1"
+ },
+ "engines": {
+ "node": ">=12"
+ }
+ },
+ "node_modules/yargs-parser": {
+ "version": "21.1.1",
+ "resolved": "https://registry.npmjs.org/yargs-parser/-/yargs-parser-21.1.1.tgz",
+ "integrity": "sha512-tVpsJW7DdjecAiFpbIB1e3qxIQsE6NoPc5/eTdrbbIC4h0LVsWhnoa3g+m2HclBIujHzsxZ4VJVA+GUuc2/LBw==",
+ "license": "ISC",
+ "engines": {
+ "node": ">=12"
+ }
+ },
+ "node_modules/yocto-queue": {
+ "version": "0.1.0",
+ "resolved": "https://registry.npmjs.org/yocto-queue/-/yocto-queue-0.1.0.tgz",
+ "integrity": "sha512-rVksvsnNCdJ/ohGc6xgPwyN8eheCxsiLM8mxuE/t/mOVqJewPuO1miLpTHQiRgTKCLexL4MeAFVagts7HmNZ2Q==",
+ "dev": true,
+ "license": "MIT",
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ }
+ }
+}
diff --git a/frontend/package.json b/frontend/package.json
new file mode 100644
index 0000000..f61394d
--- /dev/null
+++ b/frontend/package.json
@@ -0,0 +1,35 @@
+{
+ "name": "frontend",
+ "private": true,
+ "version": "0.0.0",
+ "type": "module",
+ "scripts": {
+ "dev": "vite",
+ "build": "vite build",
+ "lint": "eslint .",
+ "preview": "vite preview"
+ },
+ "dependencies": {
+ "axios": "^1.11.0",
+ "chart.js": "^4.5.0",
+ "chartjs-adapter-date-fns": "^3.0.0",
+ "date-fns": "^2.30.0",
+ "firebase": "^12.2.1",
+ "lucide-react": "^0.544.0",
+ "react": "^19.1.1",
+ "react-chartjs-2": "^5.3.0",
+ "react-dom": "^19.1.1",
+ "react-router-dom": "^7.8.2"
+ },
+ "devDependencies": {
+ "@eslint/js": "^9.33.0",
+ "@types/react": "^19.1.10",
+ "@types/react-dom": "^19.1.7",
+ "@vitejs/plugin-react": "^5.0.0",
+ "eslint": "^9.33.0",
+ "eslint-plugin-react-hooks": "^5.2.0",
+ "eslint-plugin-react-refresh": "^0.4.20",
+ "globals": "^16.3.0",
+ "vite": "^7.1.2"
+ }
+}
diff --git a/frontend/public/_redirects b/frontend/public/_redirects
new file mode 100644
index 0000000..50a4633
--- /dev/null
+++ b/frontend/public/_redirects
@@ -0,0 +1 @@
+/* /index.html 200
\ No newline at end of file
diff --git a/frontend/src/App.jsx b/frontend/src/App.jsx
new file mode 100644
index 0000000..c79cbfe
--- /dev/null
+++ b/frontend/src/App.jsx
@@ -0,0 +1,40 @@
+import React from 'react';
+import { BrowserRouter as Router, Routes, Route, Navigate } from 'react-router-dom';
+import { AuthProvider } from './contexts/AuthContext';
+import { ToastProvider } from './components/ui/ToastProvider';
+import ProtectedRoute from './components/layout/ProtectedRoute';
+import Login from './pages/Login';
+import Dashboard from './pages/Dashboard';
+
+function App() {
+  return (
+    <Router>
+      <AuthProvider>
+        <ToastProvider>
+          <Routes>
+            {/* Public route */}
+            <Route path="/login" element={<Login />} />
+
+            {/* Protected routes */}
+            <Route
+              path="/dashboard"
+              element={
+                <ProtectedRoute>
+                  <Dashboard />
+                </ProtectedRoute>
+              }
+            />
+
+            {/* Redirect root to dashboard */}
+            <Route path="/" element={<Navigate to="/dashboard" replace />} />
+
+            {/* Catch all - redirect to dashboard */}
+            <Route path="*" element={<Navigate to="/dashboard" replace />} />
+          </Routes>
+        </ToastProvider>
+      </AuthProvider>
+    </Router>
+  );
+}
+
+export default App;
diff --git a/frontend/src/assets/logo.png b/frontend/src/assets/logo.png
new file mode 100644
index 0000000..4383b48
Binary files /dev/null and b/frontend/src/assets/logo.png differ
diff --git a/frontend/src/components/analysis/AnalysisDashboard.jsx b/frontend/src/components/analysis/AnalysisDashboard.jsx
new file mode 100644
index 0000000..18c4a4b
--- /dev/null
+++ b/frontend/src/components/analysis/AnalysisDashboard.jsx
@@ -0,0 +1,529 @@
+import React, { useState, useEffect } from 'react';
+
+ const AnalysisDashboard = ({ stock, onClose, analysisData, loading }) => {
+ const [modalKey, setModalKey] = useState(0);
+ console.log('AnalysisDashboard props - stock:', stock, 'loading:', loading, 'analysisData:', analysisData);
+
+ // Force remount when new data arrives. Hooks must run on every render,
+ // so this effect is declared before any early return below.
+ useEffect(() => {
+ if (analysisData) {
+ setModalKey(prev => prev + 1);
+ }
+ }, [analysisData]);
+
+ if (!stock) return null;
+
+ if (loading) {
+ console.log('Rendering loading state for AnalysisDashboard');
+ return (
+ e.stopPropagation()}
+ style={{
+ zIndex: 99999,
+ backdropFilter: 'blur(10px)',
+ background: 'rgba(0, 0, 0, 0.7)'
+ }}
+ >
+
+ {/* Header */}
+
+
+
+
AI Stock Analysis
+
Analysis in Progress
+
+
+
+
+
+
+
+
+
+ {/* Loading Animation */}
+
+
+
+
+
+
+ Analyzing {stock?.symbol}...
+
+
+
+
+
+ );
+ }
+
+ if (!analysisData || !analysisData.analysis) {
+ // If not loading and no data, show error state
+ if (!loading) {
+ return (
+ e.stopPropagation()}
+ style={{
+ zIndex: 99999,
+ backdropFilter: 'blur(10px)',
+ background: 'rgba(0, 0, 0, 0.7)'
+ }}
+ >
+
+
+
+
+ {stock?.symbol}
+
+
+
Analysis Failed
+
Unable to complete analysis
+
+
+
+
+
+
+
+
+
+
+
+
+ Unable to analyze {stock?.symbol}
+
+
+ Please check your connection and try again
+
+
+ Close
+
+
+
+
+ );
+ }
+
+ return (
+ e.stopPropagation()}
+ style={{ zIndex: 99999 }}
+ >
+
+
+
+
Loading analysis data...
+
+
+
+ );
+ }
+
+ const formatSummaryText = (text) => {
+ if (!text || typeof text !== 'string') {
+ return 'No analysis summary available';
+ }
+
+ // Split into paragraphs
+ let paragraphs = text.split('\n\n').filter(p => p.trim());
+
+ // Remove unwanted sections
+ paragraphs = paragraphs.filter(paragraph => {
+ const lowerParagraph = paragraph.toLowerCase();
+ // Remove analysis completion messages
+ if (lowerParagraph.includes('analysis completed using') ||
+ lowerParagraph.includes('tools across') ||
+ lowerParagraph.includes('reasoning iterations')) {
+ return false;
+ }
+ return true;
+ });
+
+ let finalRecommendationFound = false;
+ paragraphs = paragraphs.filter(paragraph => {
+ const isRecommendation = paragraph.toLowerCase().includes('final recommendation') ||
+ paragraph.toLowerCase().includes('recommendation:');
+
+ if (isRecommendation) {
+ if (finalRecommendationFound) {
+ return false; // Skip duplicate recommendations
+ }
+ finalRecommendationFound = true;
+ }
+ return true;
+ });
+
+ // Enhanced formatting with structure detection
+ paragraphs = paragraphs.map(paragraph => {
+ // Remove excessive line breaks
+ let cleaned = paragraph.replace(/\n{3,}/g, '\n\n').trim();
+
+ // Preserve markdown formatting for better styling
+ cleaned = cleaned
+ .replace(/\*\*(.*?)\*\*/g, '<strong>$1</strong>') // Bold text
+ .replace(/\*(.*?)\*/g, '<em>$1</em>') // Italic text
+ .replace(/#{1,6}\s(.+)/g, '<h4>$1</h4>') // Headers
+ .replace(/^\d+\.\s/gm, '') // Remove numbered list markers
+ .replace(/^[-•]\s/gm, '• '); // Standardize bullet points
+
+ return cleaned;
+ }).filter(paragraph => {
+ // Remove empty or nearly empty paragraphs
+ const textContent = paragraph.replace(/<[^>]*>/g, '').trim();
+ return textContent.length > 5; // Only keep paragraphs with meaningful content
+ });
+
+ return paragraphs;
+ };
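The duplicate-recommendation filter is the subtle part of `formatSummaryText`. A minimal standalone sketch of that step (the helper name `dedupeRecommendations` is illustrative, not part of the component):

```javascript
// Keep only the first paragraph that looks like a recommendation;
// later duplicates are dropped, mirroring the filter in formatSummaryText.
const dedupeRecommendations = (paragraphs) => {
  let finalRecommendationFound = false;
  return paragraphs.filter(paragraph => {
    const isRecommendation =
      paragraph.toLowerCase().includes('final recommendation') ||
      paragraph.toLowerCase().includes('recommendation:');
    if (isRecommendation) {
      if (finalRecommendationFound) return false; // skip duplicates
      finalRecommendationFound = true;
    }
    return true;
  });
};

console.log(dedupeRecommendations(['Intro', 'Recommendation: BUY', 'Recommendation: BUY']));
// [ 'Intro', 'Recommendation: BUY' ]
```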
+
+ // Helper function to clean markdown formatting from text
+ const cleanMarkdownText = (text) => {
+ if (!text || typeof text !== 'string') return text;
+
+ return text
+ .replace(/\*\*/g, '') // Remove bold markers
+ .replace(/\*/g, '') // Remove italic markers
+ .replace(/#{1,6}\s/g, '') // Remove heading markers
+ .replace(/`{1,3}/g, '') // Remove code markers
+ .trim();
+ };
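`cleanMarkdownText` can be exercised on its own; this standalone copy shows what the regexes actually strip:

```javascript
// Strip common Markdown markers so AI-generated text renders as plain text.
const cleanMarkdownText = (text) => {
  if (!text || typeof text !== 'string') return text;
  return text
    .replace(/\*\*/g, '')     // remove bold markers
    .replace(/\*/g, '')       // remove italic markers
    .replace(/#{1,6}\s/g, '') // remove heading markers
    .replace(/`{1,3}/g, '')   // remove code markers
    .trim();
};

console.log(cleanMarkdownText('**Strong** *buy* at `$150`')); // "Strong buy at $150"
console.log(cleanMarkdownText('## Final Recommendation'));    // "Final Recommendation"
```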
+
+ const renderSentimentContent = (content) => {
+ if (!content) return null;
+
+ // Handle case where content is an object with a "content" property
+ let textContent = content;
+ if (typeof content === 'object') {
+ if (content.content) {
+ textContent = content.content;
+ } else {
+ // If it's a complex object, try to extract meaningful text
+ try {
+ textContent = JSON.stringify(content, null, 2);
+ } catch (e) {
+ return (
+
+
+ Error: Unable to display sentiment content - invalid data type
+
+ );
+ }
+ }
+ }
+
+ // Ensure content is a string
+ if (typeof textContent !== 'string') {
+ textContent = String(textContent || '');
+ }
+
+ // Split content into sections and format accordingly
+ const sections = textContent.split('\n\n');
+
+ return (
+
+ {sections.map((section, index) => {
+ // Skip empty sections
+ if (!section.trim()) return null;
+
+ // Handle headlines with bullet points and sentiment analysis
+ if (section.includes('Headline:') || section.includes('* Sentiment:')) {
+ return (
+
+ {section.split('\n').map((line, lineIndex) => {
+ if (line.trim().startsWith('Headline:')) {
+ return (
+
+ {cleanMarkdownText(line.replace('Headline:', ''))}
+
+ );
+ } else if (line.trim().startsWith('* Sentiment:')) {
+ const sentiment = line.replace('* Sentiment:', '').trim();
+ const sentimentClass = sentiment.toLowerCase().includes('positive') ? 'positive' :
+ sentiment.toLowerCase().includes('negative') ? 'negative' : 'neutral';
+ return (
+
+
+ {sentimentClass === 'positive' ? '+' : sentimentClass === 'negative' ? '-' : '~'}
+
+ Sentiment: {sentiment}
+
+ );
+ } else if (line.trim().startsWith('* Justification:')) {
+ return (
+
+ {cleanMarkdownText(line.replace('* Justification:', '').trim())}
+
+ );
+ } else if (line.trim()) {
+ // Remove numbering and bullet points from the line
+ const cleanLine = line.replace(/^\d+\.\s*/, '').replace(/^[•\-*]\s*/, '').trim();
+ if (cleanLine) {
+ return (
+
+ {cleanMarkdownText(cleanLine)}
+
+ );
+ }
+ }
+ return null;
+ })}
+
+ );
+ }
+
+ // Handle section headers (like "Overall Market Sentiment Summary:")
+ if (section.split('\n')[0].endsWith(':')) {
+ const lines = section.split('\n');
+ const header = lines[0];
+ const content = lines.slice(1).join('\n');
+
+ return (
+
+
+ {header.replace(':', '')}
+
+
+ {content.split('\n').map((line, lineIdx) => {
+ if (!line.trim()) return null;
+ if (line.startsWith('*')) {
+ // Remove bullet points and numbering
+ const cleanLine = line.replace(/^\*\s*/, '').replace(/^\d+\.\s*/, '').replace(/^[•\-*]\s*/, '').trim();
+ return (
+
+ {cleanMarkdownText(cleanLine)}
+
+ );
+ }
+ return (
+
+ {cleanMarkdownText(line)}
+
+ );
+ })}
+
+
+ );
+ }
+
+ // Handle list items with bullets
+ if (section.includes('•') || section.match(/^\d+\./)) {
+ const items = section.split('\n').filter(item => item.trim());
+ return (
+
+ {items.map((item, itemIndex) => {
+ // Remove numbering and bullet points from each item
+ const cleanItem = item.replace(/^\d+\.\s*/, '').replace(/^[•\-*]\s*/, '').trim();
+ return (
+
+ {cleanMarkdownText(cleanItem)}
+
+ );
+ })}
+
+ );
+ }
+
+ // Handle regular paragraphs
+ if (section.trim()) {
+ return (
+
+ {cleanMarkdownText(section)}
+
+ );
+ }
+
+ return null;
+ }).filter(Boolean)}
+
+ );
+ };
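The sentiment-line handling inside `renderSentimentContent` reduces to a small pure function; a sketch (the name `classifySentimentLine` is illustrative):

```javascript
// Map a "* Sentiment: ..." line to the text plus a CSS-class bucket,
// mirroring the ternary used when rendering sentiment lines.
const classifySentimentLine = (line) => {
  const sentiment = line.replace('* Sentiment:', '').trim();
  const lower = sentiment.toLowerCase();
  const sentimentClass = lower.includes('positive') ? 'positive'
    : lower.includes('negative') ? 'negative' : 'neutral';
  return { sentiment, sentimentClass };
};

console.log(classifySentimentLine('* Sentiment: Strongly Positive'));
// { sentiment: 'Strongly Positive', sentimentClass: 'positive' }
```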
+
+ return (
+ e.stopPropagation()}
+ style={{
+ zIndex: 99999,
+ backdropFilter: 'blur(10px)',
+ background: 'rgba(0, 0, 0, 0.7)'
+ }}
+ >
+
+ {/* Header - Always at top */}
+
+ {/* Scrollable Content */}
+
+
+ {/* Investment Verdict Card */}
+
+
+
+
+ Investment Verdict
+
+
+
+ {analysisData?.analysis?.recommendation || 'ANALYZING'}
+
+ {analysisData?.analysis?.confidence && (
+
+
+ Confidence:
+
+ {analysisData.analysis.confidence.toUpperCase()}
+
+
+ )}
+
+
+
+
+ {/* AI Analysis Summary Card - Moved to top */}
+ {analysisData?.analysis?.summary && (
+
+
+
+
+ AI Analysis Summary
+
+
+ {(() => {
+ const formattedSummary = formatSummaryText(analysisData.analysis.summary);
+ return Array.isArray(formattedSummary)
+ ? formattedSummary.map((paragraph, index) => (
+
+ ))
+ : {formattedSummary};
+ })()}
+
+
+
+ )}
+
+ {/* Market Sentiment Card */}
+ {analysisData?.analysis?.sentiment_report && (
+
+
+
+
+ Market Sentiment
+
+
+ {renderSentimentContent(analysisData.analysis.sentiment_report)}
+
+
+
+ )}
+
+ {/* Recent Headlines Card */}
+ {analysisData?.data_sources?.news_headlines?.headlines && (
+
+
+
+
+ Recent News Headlines
+
+
+ {(() => {
+ try {
+ const headlines = analysisData.data_sources.news_headlines.headlines;
+ if (!Array.isArray(headlines)) return null;
+
+ return headlines.slice(0, 4).map((headline, index) => (
+
+
+ {headline?.headline || headline}
+
+ {headline?.url && (
+
+ Read More →
+
+ )}
+
+ ));
+ } catch (error) {
+ return
+ Error loading headlines
+ ;
+ }
+ })()}
+
+
+
+ )}
+
+ {/* Footer Actions */}
+
+
+
+
+
+ Back to Dashboard
+
+
+
+
+
+
+
+ );
+};
+
+export default AnalysisDashboard;
\ No newline at end of file
diff --git a/frontend/src/components/analysis/SimpleAnalysisTest.jsx b/frontend/src/components/analysis/SimpleAnalysisTest.jsx
new file mode 100644
index 0000000..48a96c5
--- /dev/null
+++ b/frontend/src/components/analysis/SimpleAnalysisTest.jsx
@@ -0,0 +1,41 @@
+import React from 'react';
+
+const SimpleAnalysisTest = ({ onClose }) => {
+ return (
+
+
+
+ Test Analysis Modal
+
+ This is a simple test to see if modals work.
+
+ Close Test
+
+
+
+ );
+};
+
+export default SimpleAnalysisTest;
diff --git a/frontend/src/components/analysis/StockChart.jsx b/frontend/src/components/analysis/StockChart.jsx
new file mode 100644
index 0000000..e42442b
--- /dev/null
+++ b/frontend/src/components/analysis/StockChart.jsx
@@ -0,0 +1,188 @@
+import React, { useEffect, useRef, useState } from 'react';
+
+const StockChart = ({ ticker }) => {
+ const chartContainerRef = useRef(null);
+ const chartRef = useRef(null);
+ const [isLoading, setIsLoading] = useState(false);
+ const [scriptLoaded, setScriptLoaded] = useState(false);
+
+ // Check if TradingView script is loaded
+ useEffect(() => {
+ console.log('Checking TradingView script availability...');
+
+ const checkTradingView = () => {
+ if (window.TradingView) {
+ console.log('TradingView script found immediately');
+ setScriptLoaded(true);
+ } else {
+ console.log('TradingView script not found, polling...');
+ // Poll for TradingView availability
+ const pollInterval = setInterval(() => {
+ if (window.TradingView) {
+ console.log('TradingView script loaded via polling');
+ setScriptLoaded(true);
+ clearInterval(pollInterval);
+ }
+ }, 100);
+
+ // Cleanup after 10 seconds
+ const timeout = setTimeout(() => {
+ console.log('TradingView script polling timeout after 10 seconds');
+ clearInterval(pollInterval);
+ }, 10000);
+
+ return () => {
+ clearInterval(pollInterval);
+ clearTimeout(timeout);
+ };
+ }
+ };
+
+ return checkTradingView(); // propagate the polling cleanup so its timers are cleared on unmount
+ }, []);
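The poll-for-a-global pattern above (waiting for `window.TradingView` injected by an external script tag) generalizes to a small promise-based helper; a sketch under the assumption that the global appears at some point (names here are illustrative, not part of the component):

```javascript
// Resolve once globalThis[name] exists, polling at a fixed interval,
// and reject after a timeout — the same structure as the effect above.
const waitForGlobal = (name, { intervalMs = 100, timeoutMs = 10000 } = {}) =>
  new Promise((resolve, reject) => {
    if (globalThis[name]) return resolve(globalThis[name]); // already loaded
    const poll = setInterval(() => {
      if (globalThis[name]) {
        clearInterval(poll);
        clearTimeout(timer);
        resolve(globalThis[name]);
      }
    }, intervalMs);
    const timer = setTimeout(() => {
      clearInterval(poll);
      reject(new Error(`${name} not loaded after ${timeoutMs}ms`));
    }, timeoutMs);
  });

// Simulate an external script that loads after 50ms:
setTimeout(() => { globalThis.TradingView = { widget: function () {} }; }, 50);
waitForGlobal('TradingView', { intervalMs: 10 }).then(tv => console.log(typeof tv.widget)); // logs "function"
```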
+
+ useEffect(() => {
+ console.log('Chart creation useEffect triggered:', { ticker, scriptLoaded, containerReady: !!chartContainerRef.current });
+
+ if (!ticker || !chartContainerRef.current || !scriptLoaded) {
+ console.log('Conditions not met for chart creation:', { ticker, scriptLoaded, containerReady: !!chartContainerRef.current });
+ return;
+ }
+
+ console.log('Setting loading to true for ticker:', ticker);
+ setIsLoading(true);
+
+ if (chartRef.current) {
+ try {
+ chartRef.current.remove();
+ } catch (error) {
+ console.warn('Error removing previous chart:', error);
+ }
+ chartRef.current = null;
+ }
+
+ chartContainerRef.current.innerHTML = '';
+
+ const containerId = `tradingview-chart-${ticker}-${Date.now()}`;
+ chartContainerRef.current.id = containerId;
+
+ console.log('Creating TradingView widget for:', ticker, 'in container:', containerId);
+
+ // Create new chart with a slight delay to ensure container is ready
+ const timeoutId = setTimeout(() => {
+ try {
+ chartRef.current = new window.TradingView.widget({
+ // Essential settings
+ width: '100%',
+ height: 400,
+ symbol: `NASDAQ:${ticker}`,
+ interval: 'D',
+ timezone: 'Etc/UTC',
+ theme: 'dark',
+ style: '1', // Candlestick chart
+ locale: 'en',
+ container_id: containerId,
+
+ // Clean UI settings
+ hide_top_toolbar: true,
+ hide_legend: false,
+ hide_side_toolbar: true,
+ toolbar_bg: '#1f2937',
+ enable_publishing: false,
+ allow_symbol_change: false,
+ save_image: false,
+ show_popup_button: false,
+
+ // Simplified appearance
+ hide_volume: false,
+ backgroundColor: '#1f2937',
+ gridColor: '#374151',
+
+ // Chart ready callback
+ onChartReady: () => {
+ console.log('Chart ready callback triggered for:', ticker);
+ setIsLoading(false);
+ }
+ });
+ console.log('TradingView widget created successfully');
+
+ // Fallback timeout in case onChartReady doesn't fire
+ setTimeout(() => {
+ console.log('Fallback timeout triggered, setting loading to false');
+ setIsLoading(false);
+ }, 5000);
+
+ } catch (error) {
+ console.error('Error creating TradingView chart:', error);
+ setIsLoading(false);
+ }
+ }, 100);
+
+ // Cleanup function
+ return () => {
+ console.log('Cleanup function called for ticker:', ticker);
+ clearTimeout(timeoutId);
+ if (chartRef.current) {
+ try {
+ chartRef.current.remove();
+ } catch (error) {
+ console.warn('Error during cleanup:', error);
+ }
+ chartRef.current = null;
+ }
+ setIsLoading(false);
+ };
+ }, [ticker, scriptLoaded]);
+
+ if (!scriptLoaded) {
+ return (
+
+
+
+
+
+
+ Loading TradingView...
+
+
+
+ );
+ }
+
+ return (
+
+
+
+ {ticker || 'Select a stock to view chart'}
+
+ {ticker && isLoading && (
+
+ Loading chart data...
+ )}
+
+
+
+ {isLoading && ticker && (
+
+
+
+
+ Loading {ticker} chart...
+
+
+ )}
+
+
+
+ {!ticker && (
+
+
+ Please select a ticker symbol to display the chart
+
+ )}
+
+
+ );
+};
+
+export default StockChart;
\ No newline at end of file
diff --git a/frontend/src/components/analysis/StockChartDashboard.jsx b/frontend/src/components/analysis/StockChartDashboard.jsx
new file mode 100644
index 0000000..0b2d987
--- /dev/null
+++ b/frontend/src/components/analysis/StockChartDashboard.jsx
@@ -0,0 +1,329 @@
+import React, { useState, useEffect, useRef } from 'react';
+import StockChart from './StockChart';
+
+const StockChartDashboard = () => {
+ const [selectedTicker, setSelectedTicker] = useState('AAPL'); // Default to AAPL
+ const [searchTerm, setSearchTerm] = useState('');
+ const [isDropdownOpen, setIsDropdownOpen] = useState(false);
+ const searchRef = useRef(null);
+
+ // Close dropdown when clicking outside
+ useEffect(() => {
+ const handleClickOutside = (event) => {
+ if (searchRef.current && !searchRef.current.contains(event.target)) {
+ setIsDropdownOpen(false);
+ }
+ };
+
+ document.addEventListener('mousedown', handleClickOutside);
+ return () => {
+ document.removeEventListener('mousedown', handleClickOutside);
+ };
+ }, []);
+
+ // Complete list of NASDAQ stocks from the CSV file
+ const tickers = [
+ { value: 'AAPL', label: 'Apple Inc. (AAPL)' },
+ { value: 'ABNB', label: 'Airbnb Inc. (ABNB)' },
+ { value: 'ADBE', label: 'Adobe Inc. (ADBE)' },
+ { value: 'ADI', label: 'Analog Devices Inc. (ADI)' },
+ { value: 'ADP', label: 'Automatic Data Processing Inc. (ADP)' },
+ { value: 'ADSK', label: 'Autodesk Inc. (ADSK)' },
+ { value: 'AEP', label: 'American Electric Power Company Inc. (AEP)' },
+ { value: 'AFRM', label: 'Affirm Holdings Inc. (AFRM)' },
+ { value: 'AGNC', label: 'AGNC Investment Corp. (AGNC)' },
+ { value: 'AKAM', label: 'Akamai Technologies Inc. (AKAM)' },
+ { value: 'ALAB', label: 'Astera Labs Inc. (ALAB)' },
+ { value: 'ALGN', label: 'Align Technology Inc. (ALGN)' },
+ { value: 'ALNY', label: 'Alnylam Pharmaceuticals Inc. (ALNY)' },
+ { value: 'AMAT', label: 'Applied Materials Inc. (AMAT)' },
+ { value: 'AMD', label: 'Advanced Micro Devices Inc. (AMD)' },
+ { value: 'AMGN', label: 'Amgen Inc. (AMGN)' },
+ { value: 'AMZN', label: 'Amazon.com Inc. (AMZN)' },
+ { value: 'APP', label: 'Applovin Corporation (APP)' },
+ { value: 'ARCC', label: 'Ares Capital Corporation (ARCC)' },
+ { value: 'ASTS', label: 'AST SpaceMobile Inc. (ASTS)' },
+ { value: 'AUR', label: 'Aurora Innovation Inc. (AUR)' },
+ { value: 'AVAV', label: 'AeroVironment Inc. (AVAV)' },
+ { value: 'AVGO', label: 'Broadcom Inc. (AVGO)' },
+ { value: 'AXON', label: 'Axon Enterprise Inc. (AXON)' },
+ { value: 'BIIB', label: 'Biogen Inc. (BIIB)' },
+ { value: 'BKNG', label: 'Booking Holdings Inc. (BKNG)' },
+ { value: 'BKR', label: 'Baker Hughes Company (BKR)' },
+ { value: 'BMRN', label: 'BioMarin Pharmaceutical Inc. (BMRN)' },
+ { value: 'BSY', label: 'Bentley Systems Incorporated (BSY)' },
+ { value: 'CAI', label: 'Caris Life Sciences Inc. (CAI)' },
+ { value: 'CART', label: 'Maplebear Inc. (CART)' },
+ { value: 'CASY', label: 'Casey\'s General Stores Inc. (CASY)' },
+ { value: 'CDNS', label: 'Cadence Design Systems Inc. (CDNS)' },
+ { value: 'CDW', label: 'CDW Corporation (CDW)' },
+ { value: 'CEG', label: 'Constellation Energy Corporation (CEG)' },
+ { value: 'CELH', label: 'Celsius Holdings Inc. (CELH)' },
+ { value: 'CG', label: 'The Carlyle Group Inc. (CG)' },
+ { value: 'CHRW', label: 'C.H. Robinson Worldwide Inc. (CHRW)' },
+ { value: 'CHTR', label: 'Charter Communications Inc. (CHTR)' },
+ { value: 'CINF', label: 'Cincinnati Financial Corporation (CINF)' },
+ { value: 'CMCSA', label: 'Comcast Corporation (CMCSA)' },
+ { value: 'CME', label: 'CME Group Inc. (CME)' },
+ { value: 'COIN', label: 'Coinbase Global Inc. (COIN)' },
+ { value: 'COKE', label: 'Coca-Cola Consolidated Inc. (COKE)' },
+ { value: 'COO', label: 'The Cooper Companies Inc. (COO)' },
+ { value: 'COOP', label: 'Mr. Cooper Group Inc. (COOP)' },
+ { value: 'COST', label: 'Costco Wholesale Corporation (COST)' },
+ { value: 'CPRT', label: 'Copart Inc. (CPRT)' },
+ { value: 'CRDO', label: 'Credo Technology Group Holding Ltd (CRDO)' },
+ { value: 'CRWD', label: 'CrowdStrike Holdings Inc. (CRWD)' },
+ { value: 'CSCO', label: 'Cisco Systems Inc. (CSCO)' },
+ { value: 'CSGP', label: 'CoStar Group Inc. (CSGP)' },
+ { value: 'CSX', label: 'CSX Corporation (CSX)' },
+ { value: 'CTAS', label: 'Cintas Corporation (CTAS)' },
+ { value: 'CTSH', label: 'Cognizant Technology Solutions Corporation (CTSH)' },
+ { value: 'DASH', label: 'DoorDash Inc. (DASH)' },
+ { value: 'DDOG', label: 'Datadog Inc. (DDOG)' },
+ { value: 'DKNG', label: 'DraftKings Inc. (DKNG)' },
+ { value: 'DLTR', label: 'Dollar Tree Inc. (DLTR)' },
+ { value: 'DOCU', label: 'DocuSign Inc. (DOCU)' },
+ { value: 'DPZ', label: 'Domino\'s Pizza Inc (DPZ)' },
+ { value: 'DUOL', label: 'Duolingo Inc. (DUOL)' },
+ { value: 'DXCM', label: 'DexCom Inc. (DXCM)' },
+ { value: 'EA', label: 'Electronic Arts Inc. (EA)' },
+ { value: 'EBAY', label: 'eBay Inc. (EBAY)' },
+ { value: 'ENTG', label: 'Entegris Inc. (ENTG)' },
+ { value: 'EQIX', label: 'Equinix Inc. (EQIX)' },
+ { value: 'ERIE', label: 'Erie Indemnity Company (ERIE)' },
+ { value: 'EVRG', label: 'Evergy Inc. (EVRG)' },
+ { value: 'EWBC', label: 'East West Bancorp Inc. (EWBC)' },
+ { value: 'EXC', label: 'Exelon Corporation (EXC)' },
+ { value: 'EXE', label: 'Expand Energy Corporation (EXE)' },
+ { value: 'EXEL', label: 'Exelixis Inc. (EXEL)' },
+ { value: 'EXPE', label: 'Expedia Group Inc. (EXPE)' },
+ { value: 'FANG', label: 'Diamondback Energy Inc. (FANG)' },
+ { value: 'FAST', label: 'Fastenal Company (FAST)' },
+ { value: 'FCNCA', label: 'First Citizens BancShares Inc. (FCNCA)' },
+ { value: 'FFIV', label: 'F5 Inc. (FFIV)' },
+ { value: 'FITB', label: 'Fifth Third Bancorp (FITB)' },
+ { value: 'FOX', label: 'Fox Corporation (FOX)' },
+ { value: 'FOXA', label: 'Fox Corporation (FOXA)' },
+ { value: 'FSLR', label: 'First Solar Inc. (FSLR)' },
+ { value: 'FTAI', label: 'FTAI Aviation Ltd. (FTAI)' },
+ { value: 'FTNT', label: 'Fortinet Inc. (FTNT)' },
+ { value: 'GEHC', label: 'GE HealthCare Technologies Inc. (GEHC)' },
+ { value: 'GEN', label: 'Gen Digital Inc. (GEN)' },
+ { value: 'GFS', label: 'GlobalFoundries Inc. (GFS)' },
+ { value: 'GILD', label: 'Gilead Sciences Inc. (GILD)' },
+ { value: 'GLPI', label: 'Gaming and Leisure Properties Inc. (GLPI)' },
+ { value: 'GOOG', label: 'Alphabet Inc. Class C (GOOG)' },
+ { value: 'GOOGL', label: 'Alphabet Inc. Class A (GOOGL)' },
+ { value: 'HAS', label: 'Hasbro Inc. (HAS)' },
+ { value: 'HBAN', label: 'Huntington Bancshares Incorporated (HBAN)' },
+ { value: 'HOLX', label: 'Hologic Inc. (HOLX)' },
+ { value: 'HON', label: 'Honeywell International Inc. (HON)' },
+ { value: 'HOOD', label: 'Robinhood Markets Inc. (HOOD)' },
+ { value: 'HST', label: 'Host Hotels & Resorts Inc. (HST)' },
+ { value: 'IBKR', label: 'Interactive Brokers Group Inc. (IBKR)' },
+ { value: 'IDXX', label: 'IDEXX Laboratories Inc. (IDXX)' },
+ { value: 'ILMN', label: 'Illumina Inc. (ILMN)' },
+ { value: 'INCY', label: 'Incyte Corp. (INCY)' },
+ { value: 'INSM', label: 'Insmed Incorporated (INSM)' },
+ { value: 'INTC', label: 'Intel Corporation (INTC)' },
+ { value: 'INTU', label: 'Intuit Inc. (INTU)' },
+ { value: 'ISRG', label: 'Intuitive Surgical Inc. (ISRG)' },
+ { value: 'JBHT', label: 'J.B. Hunt Transport Services Inc. (JBHT)' },
+ { value: 'JKHY', label: 'Jack Henry & Associates Inc. (JKHY)' },
+ { value: 'KDP', label: 'Keurig Dr Pepper Inc. (KDP)' },
+ { value: 'KHC', label: 'The Kraft Heinz Company (KHC)' },
+ { value: 'KLAC', label: 'KLA Corporation (KLAC)' },
+ { value: 'KMB', label: 'Kimberly-Clark Corporation (KMB)' },
+ { value: 'KTOS', label: 'Kratos Defense & Security Solutions Inc. (KTOS)' },
+ { value: 'LAMR', label: 'Lamar Advertising Company (LAMR)' },
+ { value: 'LECO', label: 'Lincoln Electric Holdings Inc. (LECO)' },
+ { value: 'LIN', label: 'Linde plc (LIN)' },
+ { value: 'LNT', label: 'Alliant Energy Corporation (LNT)' },
+ { value: 'LOGI', label: 'Logitech International S.A. (LOGI)' },
+ { value: 'LPLA', label: 'LPL Financial Holdings Inc. (LPLA)' },
+ { value: 'LRCX', label: 'Lam Research Corporation (LRCX)' },
+ { value: 'MANH', label: 'Manhattan Associates Inc. (MANH)' },
+ { value: 'MAR', label: 'Marriott International (MAR)' },
+ { value: 'MCHP', label: 'Microchip Technology Incorporated (MCHP)' },
+ { value: 'MDB', label: 'MongoDB Inc. (MDB)' },
+ { value: 'MDLZ', label: 'Mondelez International Inc. (MDLZ)' },
+ { value: 'MEDP', label: 'Medpace Holdings Inc. (MEDP)' },
+ { value: 'META', label: 'Meta Platforms Inc. (META)' },
+ { value: 'MNST', label: 'Monster Beverage Corporation (MNST)' },
+ { value: 'MORN', label: 'Morningstar Inc. (MORN)' },
+ { value: 'MPWR', label: 'Monolithic Power Systems Inc. (MPWR)' },
+ { value: 'MRVL', label: 'Marvell Technology Inc. (MRVL)' },
+ { value: 'MSFT', label: 'Microsoft Corporation (MSFT)' },
+ { value: 'MSTR', label: 'MicroStrategy Inc (MSTR)' },
+ { value: 'MU', label: 'Micron Technology Inc. (MU)' },
+ { value: 'NBIX', label: 'Neurocrine Biosciences Inc. (NBIX)' },
+ { value: 'NDAQ', label: 'Nasdaq Inc. (NDAQ)' },
+ { value: 'NDSN', label: 'Nordson Corporation (NDSN)' },
+ { value: 'NFLX', label: 'Netflix Inc. (NFLX)' },
+ { value: 'NTAP', label: 'NetApp Inc. (NTAP)' },
+ { value: 'NTNX', label: 'Nutanix Inc. (NTNX)' },
+ { value: 'NTRA', label: 'Natera Inc. (NTRA)' },
+ { value: 'NTRS', label: 'Northern Trust Corporation (NTRS)' },
+ { value: 'NVDA', label: 'NVIDIA Corporation (NVDA)' },
+ { value: 'NWS', label: 'News Corporation (NWS)' },
+ { value: 'NWSA', label: 'News Corporation (NWSA)' },
+ { value: 'ODFL', label: 'Old Dominion Freight Line Inc. (ODFL)' },
+ { value: 'OKTA', label: 'Okta Inc. (OKTA)' },
+ { value: 'ON', label: 'ON Semiconductor Corporation (ON)' },
+ { value: 'ORLY', label: 'O\'Reilly Automotive Inc. (ORLY)' },
+ { value: 'PAA', label: 'Plains All American Pipeline L.P. (PAA)' },
+ { value: 'PANW', label: 'Palo Alto Networks Inc. (PANW)' },
+ { value: 'PAYX', label: 'Paychex Inc. (PAYX)' },
+ { value: 'PCAR', label: 'PACCAR Inc. (PCAR)' },
+ { value: 'PEP', label: 'PepsiCo Inc. (PEP)' },
+ { value: 'PFG', label: 'Principal Financial Group Inc (PFG)' },
+ { value: 'PLTR', label: 'Palantir Technologies Inc. (PLTR)' },
+ { value: 'PODD', label: 'Insulet Corporation (PODD)' },
+ { value: 'POOL', label: 'Pool Corporation (POOL)' },
+ { value: 'PPC', label: 'Pilgrim\'s Pride Corporation (PPC)' },
+ { value: 'PTC', label: 'PTC Inc. (PTC)' },
+ { value: 'PYPL', label: 'PayPal Holdings Inc. (PYPL)' },
+ { value: 'QCOM', label: 'QUALCOMM Incorporated (QCOM)' },
+ { value: 'REG', label: 'Regency Centers Corporation (REG)' },
+ { value: 'REGN', label: 'Regeneron Pharmaceuticals Inc. (REGN)' },
+ { value: 'RGLD', label: 'Royal Gold Inc. (RGLD)' },
+ { value: 'RIVN', label: 'Rivian Automotive Inc. (RIVN)' },
+ { value: 'RKLB', label: 'Rocket Lab Corporation (RKLB)' },
+ { value: 'ROKU', label: 'Roku Inc. (ROKU)' },
+ { value: 'ROP', label: 'Roper Technologies Inc. (ROP)' },
+ { value: 'ROST', label: 'Ross Stores Inc. (ROST)' },
+ { value: 'RPRX', label: 'Royalty Pharma plc (RPRX)' },
+ { value: 'SATS', label: 'EchoStar Corporation (SATS)' },
+ { value: 'SBAC', label: 'SBA Communications Corporation (SBAC)' },
+ { value: 'SBUX', label: 'Starbucks Corporation (SBUX)' },
+ { value: 'SEIC', label: 'SEI Investments Company (SEIC)' },
+ { value: 'SFM', label: 'Sprouts Farmers Market Inc. (SFM)' },
+ { value: 'SMCI', label: 'Super Micro Computer Inc. (SMCI)' },
+ { value: 'SNPS', label: 'Synopsys Inc. (SNPS)' },
+ { value: 'SOFI', label: 'SoFi Technologies Inc. (SOFI)' },
+ { value: 'SSNC', label: 'SS&C Technologies Holdings Inc. (SSNC)' },
+ { value: 'STLD', label: 'Steel Dynamics Inc. (STLD)' },
+ { value: 'SWKS', label: 'Skyworks Solutions Inc. (SWKS)' },
+ { value: 'SYM', label: 'Symbotic Inc. (SYM)' },
+ { value: 'TEM', label: 'Tempus AI Inc. (TEM)' },
+ { value: 'TER', label: 'Teradyne Inc. (TER)' },
+ { value: 'TLN', label: 'Talen Energy Corporation (TLN)' },
+ { value: 'TMUS', label: 'T-Mobile US Inc. (TMUS)' },
+ { value: 'TPG', label: 'TPG Inc. (TPG)' },
+ { value: 'TRI', label: 'Thomson Reuters Corporation (TRI)' },
+ { value: 'TRMB', label: 'Trimble Inc. (TRMB)' },
+ { value: 'TROW', label: 'T. Rowe Price Group Inc. (TROW)' },
+ { value: 'TSCO', label: 'Tractor Supply Company (TSCO)' },
+ { value: 'TSLA', label: 'Tesla Inc. (TSLA)' },
+ { value: 'TTD', label: 'The Trade Desk Inc. (TTD)' },
+ { value: 'TTWO', label: 'Take-Two Interactive Software Inc. (TTWO)' },
+ { value: 'TW', label: 'Tradeweb Markets Inc. (TW)' },
+ { value: 'TXN', label: 'Texas Instruments Incorporated (TXN)' },
+ { value: 'TXRH', label: 'Texas Roadhouse Inc. (TXRH)' },
+ { value: 'UAL', label: 'United Airlines Holdings Inc. (UAL)' },
+ { value: 'ULTA', label: 'Ulta Beauty Inc. (ULTA)' },
+ { value: 'UTHR', label: 'United Therapeutics Corporation (UTHR)' },
+ { value: 'VNOM', label: 'Viper Energy Inc. (VNOM)' },
+ { value: 'VRSK', label: 'Verisk Analytics Inc. (VRSK)' },
+ { value: 'VRSN', label: 'VeriSign Inc. (VRSN)' },
+ { value: 'VRTX', label: 'Vertex Pharmaceuticals Incorporated (VRTX)' },
+ { value: 'VTRS', label: 'Viatris Inc. (VTRS)' },
+ { value: 'WBD', label: 'Warner Bros. Discovery Inc. (WBD)' },
+ { value: 'WDAY', label: 'Workday Inc. (WDAY)' },
+ { value: 'WDC', label: 'Western Digital Corporation (WDC)' },
+ { value: 'WMG', label: 'Warner Music Group Corp. (WMG)' },
+ { value: 'WWD', label: 'Woodward Inc. (WWD)' },
+ { value: 'WYNN', label: 'Wynn Resorts Limited (WYNN)' },
+ { value: 'XEL', label: 'Xcel Energy Inc. (XEL)' },
+ { value: 'Z', label: 'Zillow Group Inc. Class C (Z)' },
+ { value: 'ZBRA', label: 'Zebra Technologies Corporation (ZBRA)' },
+ { value: 'ZG', label: 'Zillow Group Inc. Class A (ZG)' },
+ { value: 'ZM', label: 'Zoom Communications Inc. (ZM)' },
+ { value: 'ZS', label: 'Zscaler Inc. (ZS)' }
+ ];
+
+ const handleTickerChange = (event) => {
+ setSelectedTicker(event.target.value);
+ };
+
+ const handleSearchChange = (event) => {
+ setSearchTerm(event.target.value);
+ setIsDropdownOpen(true);
+ };
+
+ const handleTickerSelect = (ticker) => {
+ setSelectedTicker(ticker);
+ setSearchTerm('');
+ setIsDropdownOpen(false);
+ };
+
+ // Filter tickers based on search term
+ const filteredTickers = tickers.filter(ticker =>
+ ticker.label.toLowerCase().includes(searchTerm.toLowerCase()) ||
+ ticker.value.toLowerCase().includes(searchTerm.toLowerCase())
+ );
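The search filter above matches on either the symbol or the company label, case-insensitively; a standalone sketch with a trimmed ticker list:

```javascript
// Filter a ticker list by search term, matching symbol or label,
// mirroring the filteredTickers expression in StockChartDashboard.
const tickers = [
  { value: 'AAPL', label: 'Apple Inc. (AAPL)' },
  { value: 'AMD',  label: 'Advanced Micro Devices Inc. (AMD)' },
  { value: 'MSFT', label: 'Microsoft Corporation (MSFT)' },
];

const filterTickers = (list, searchTerm) => {
  const term = searchTerm.toLowerCase();
  return list.filter(t =>
    t.label.toLowerCase().includes(term) ||
    t.value.toLowerCase().includes(term)
  );
};

console.log(filterTickers(tickers, 'micro').map(t => t.value)); // [ 'AMD', 'MSFT' ]
console.log(filterTickers(tickers, 'aapl').map(t => t.value));  // [ 'AAPL' ]
```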
+
+ // Get the label for the selected ticker
+ const selectedTickerLabel = tickers.find(t => t.value === selectedTicker)?.label || selectedTicker;
+
+ return (
+
+
+
+
+
+ {/* Search Input */}
+
+
+ onFocus={() => setIsDropdownOpen(true)}
+ placeholder="Search stocks..."
+ className="w-full px-4 py-3 bg-gray-800 border border-white/30 rounded-lg text-white placeholder-white/50 focus:outline-none focus:ring-2 focus:ring-blue-500 focus:border-blue-500"
+ />
+
+ {/* Search Icon */}
+
+
+ {/* Dropdown Results */}
+ {isDropdownOpen && searchTerm && (
+
+ {filteredTickers.length > 0 ? (
+ filteredTickers.slice(0, 10).map((ticker) => (
+
+ onClick={() => handleTickerSelect(ticker.value)}
+ className="w-full px-4 py-2 text-left text-white hover:bg-gray-700 focus:bg-gray-700 focus:outline-none first:rounded-t-lg last:rounded-b-lg"
+ >
+ {ticker.value}
+ - {ticker.label.split('(')[0].trim()}
+
+ ))
+ ) : (
+
+ No stocks found matching "{searchTerm}"
+
+ )}
+
+ )}
+
+
+
+
+
+
+
+
+ {/* Stock Chart Component */}
+
+
+ );
+};
+
+export default StockChartDashboard;
\ No newline at end of file
diff --git a/frontend/src/components/analysis/index.js b/frontend/src/components/analysis/index.js
new file mode 100644
index 0000000..90aee8d
--- /dev/null
+++ b/frontend/src/components/analysis/index.js
@@ -0,0 +1,5 @@
+// Analysis Components
+export { default as AnalysisDashboard } from './AnalysisDashboard';
+export { default as SimpleAnalysisTest } from './SimpleAnalysisTest';
+export { default as StockChart } from './StockChart';
+export { default as StockChartDashboard } from './StockChartDashboard';
\ No newline at end of file
diff --git a/frontend/src/components/index.js b/frontend/src/components/index.js
new file mode 100644
index 0000000..41f1e1d
--- /dev/null
+++ b/frontend/src/components/index.js
@@ -0,0 +1,16 @@
+// Component exports organized by functionality
+
+// UI Components
+export * from './ui';
+
+// Layout Components
+export * from './layout';
+
+// Watchlist Components
+export * from './watchlist';
+
+// News Components
+export * from './news';
+
+// Analysis Components
+export * from './analysis';
\ No newline at end of file
diff --git a/frontend/src/components/layout/Header.jsx b/frontend/src/components/layout/Header.jsx
new file mode 100644
index 0000000..8771fe4
--- /dev/null
+++ b/frontend/src/components/layout/Header.jsx
@@ -0,0 +1,56 @@
+import React from 'react';
+import { useAuth } from '../../contexts/AuthContext';
+import Button from '../ui/Button';
+import logo from '../../assets/logo.png';
+
+const Header = ({ title = "StockSense", subtitle = "Your Personal Stock Assistant" }) => {
+ const { currentUser, logout } = useAuth();
+
+ const handleLogout = async () => {
+ try {
+ await logout();
+ } catch (error) {
+ console.error('Logout error:', error);
+ }
+ };
+
+ return (
+
+ {/* Left Section - Logo and Title */}
+
+
+ {/* Logo */}
+
+
+
+
+ {/* Title and Subtitle */}
+
+ {title}
+ {subtitle}
+
+
+
+
+
+
+ );
+};
+
+export default Header;
+
diff --git a/frontend/src/components/layout/ProtectedRoute.jsx b/frontend/src/components/layout/ProtectedRoute.jsx
new file mode 100644
index 0000000..15e7c7a
--- /dev/null
+++ b/frontend/src/components/layout/ProtectedRoute.jsx
@@ -0,0 +1,43 @@
+import React from 'react';
+import { Navigate, useLocation } from 'react-router-dom';
+import { useAuth } from '../../contexts/AuthContext';
+
+const ProtectedRoute = ({ children }) => {
+ const { currentUser, loading } = useAuth();
+ const location = useLocation();
+
+ // Show loading spinner while checking auth state
+ if (loading) {
+ return (
+
+ );
+ }
+
+ // If not authenticated, redirect to login with current location
+ if (!currentUser) {
+ return <Navigate to="/login" state={{ from: location }} replace />;
+ }
+
+ // If authenticated, render the protected component
+ return children;
+};
+
+export default ProtectedRoute;
diff --git a/frontend/src/components/layout/index.js b/frontend/src/components/layout/index.js
new file mode 100644
index 0000000..ebdd30c
--- /dev/null
+++ b/frontend/src/components/layout/index.js
@@ -0,0 +1,3 @@
+// Layout Components
+export { default as Header } from './Header';
+export { default as ProtectedRoute } from './ProtectedRoute';
\ No newline at end of file
diff --git a/frontend/src/components/news/NewsCard.jsx b/frontend/src/components/news/NewsCard.jsx
new file mode 100644
index 0000000..c36e5f9
--- /dev/null
+++ b/frontend/src/components/news/NewsCard.jsx
@@ -0,0 +1,181 @@
+import React, { useState, useEffect } from 'react';
+import { newsAPI } from '../../services/api/newsAPI';
+import NewsList from './NewsList';
+import { RefreshCcw } from 'lucide-react';
+
+
+const NewsCard = () => {
+ const [newsData, setNewsData] = useState(null);
+ const [loading, setLoading] = useState(true);
+ const [error, setError] = useState(null);
+
+ useEffect(() => {
+ fetchWatchlistNews();
+ }, []);
+
+ const fetchWatchlistNews = async () => {
+ try {
+ setLoading(true);
+ setError(null);
+
+ const result = await newsAPI.getWatchlistNews(5);
+
+ if (result.success) {
+ setNewsData(result);
+ } else {
+ setError(result.error || 'Failed to fetch news');
+ }
+ } catch (err) {
+ setError(err.message || 'An error occurred while fetching news');
+ console.error('Error fetching watchlist news:', err);
+ } finally {
+ setLoading(false);
+ }
+ };
+
+ const getArticleDate = (article) => {
+ const possibleDateFields = [
+ 'published_at',
+ 'published_on',
+ 'publishedAt',
+ 'date',
+ 'pubDate',
+ 'created_at',
+ 'createdAt'
+ ];
+
+ for (const field of possibleDateFields) {
+ if (article[field]) {
+ return formatDate(article[field]);
+ }
+ }
+
+ console.warn('No valid date field found in article:', Object.keys(article));
+ return 'Date not available';
+ };
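The field-probing loop in `getArticleDate` exists because different news providers name the publish date differently; the lookup itself reduces to this sketch (returning the raw value rather than the formatted string):

```javascript
// Probe known date field names in priority order and return the first
// present value, as getArticleDate does before formatting.
const possibleDateFields = [
  'published_at', 'published_on', 'publishedAt',
  'date', 'pubDate', 'created_at', 'createdAt',
];

const pickArticleDate = (article) => {
  for (const field of possibleDateFields) {
    if (article[field]) return article[field];
  }
  return null; // no recognized date field
};

console.log(pickArticleDate({ publishedAt: '2025-10-01T03:30:00Z', title: 'x' }));
// "2025-10-01T03:30:00Z"
```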
+
+ const formatDate = (dateString) => {
+ if (!dateString) return 'Unknown date';
+
+ try {
+ // Handle different date formats
+ let date;
+
+ // If it's already a Date object
+ if (dateString instanceof Date) {
+ date = dateString;
+ }
+ // If it's a timestamp (number)
+ else if (typeof dateString === 'number') {
+ date = new Date(dateString * 1000); // assume a seconds-based Unix timestamp, convert to ms
+ }
+ // If it's a string (like ISO 8601: "2025-10-01T03:30:00.000000Z")
+ else {
+ // Handle ISO format or other string formats
+ date = new Date(dateString);
+ }
+
+ // Check if date is valid
+ if (isNaN(date.getTime())) {
+ console.warn('Invalid date:', dateString);
+ return 'Invalid date';
+ }
+
+ const now = new Date();
+ const diffInHours = Math.abs(now - date) / (1000 * 60 * 60);
+
+ if (diffInHours < 24) {
+ const diffInMinutes = Math.floor(diffInHours * 60);
+ if (diffInMinutes < 60) {
+ return `${diffInMinutes}m ago`;
+ } else {
+ return `${Math.floor(diffInHours)}h ago`;
+ }
+ }
+ // If less than 7 days, show days ago
+ else if (diffInHours < 168) {
+ const days = Math.floor(diffInHours / 24);
+ return `${days}d ago`;
+ }
+ // Otherwise show formatted date
+ else {
+ return date.toLocaleDateString('en-US', {
+ month: 'short',
+ day: 'numeric',
+ year: date.getFullYear() !== now.getFullYear() ? 'numeric' : undefined
+ });
+ }
+ } catch (error) {
+ console.error('Error formatting date:', error, dateString);
+ return 'Date unavailable';
+ }
+ };
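The relative-time buckets in `formatDate` (minutes under an hour, hours under a day, days under a week, then a calendar date) can be sketched as a pure function; `relativeTime` is an illustrative name, and a fixed `now` is injected for determinism:

```javascript
// Bucket a date into "Nm ago" / "Nh ago" / "Nd ago" / "Mon D",
// following the thresholds used in formatDate.
const relativeTime = (date, now = new Date()) => {
  const diffInHours = Math.abs(now - date) / (1000 * 60 * 60);
  if (diffInHours < 1)   return `${Math.floor(diffInHours * 60)}m ago`;
  if (diffInHours < 24)  return `${Math.floor(diffInHours)}h ago`;
  if (diffInHours < 168) return `${Math.floor(diffInHours / 24)}d ago`; // under 7 days
  return date.toLocaleDateString('en-US', { month: 'short', day: 'numeric' });
};

const now = new Date('2025-10-02T12:00:00Z');
console.log(relativeTime(new Date('2025-10-02T11:30:00Z'), now)); // "30m ago"
console.log(relativeTime(new Date('2025-09-30T12:00:00Z'), now)); // "2d ago"
```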
+
+ if (loading) {
+ return (
+
+
+ News
+
+
+
+ Loading latest news...
+
+
+ );
+ }
+
+ if (error) {
+ return (
+
+
+ News
+
+ {error}
+
+ Try Again
+
+
+
+ );
+ }
+
+ if (!newsData || newsData.articles.length === 0) {
+ return (
+
+
News
+
+
No news available for your watchlist stocks
+ {newsData?.symbols?.length === 0 && (
+
+ Add some stocks to your watchlist to see related news
+
+ )}
+
+
+ );
+ }
+
+ return (
+
+ );
+};
+
+export default NewsCard;
\ No newline at end of file
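The relative-time branching in `formatDate` above can be sketched as a standalone helper (a minimal sketch with the same thresholds; `relativeTime` is a hypothetical name, not part of the diff):

```javascript
// Sketch of formatDate's relative-time logic: <1h -> minutes, <24h -> hours,
// <7 days (168h) -> days, otherwise a short locale date string.
const relativeTime = (date, now = new Date()) => {
  const diffInHours = Math.abs(now - date) / (1000 * 60 * 60);
  if (diffInHours < 1) return `${Math.floor(diffInHours * 60)}m ago`;
  if (diffInHours < 24) return `${Math.floor(diffInHours)}h ago`;
  if (diffInHours < 168) return `${Math.floor(diffInHours / 24)}d ago`;
  return date.toLocaleDateString('en-US', { month: 'short', day: 'numeric' });
};
```

Extracting the pure date math this way would also make it unit-testable without rendering the component.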
diff --git a/frontend/src/components/news/NewsItem.jsx b/frontend/src/components/news/NewsItem.jsx
new file mode 100644
index 0000000..a9a7bd8
--- /dev/null
+++ b/frontend/src/components/news/NewsItem.jsx
@@ -0,0 +1,72 @@
+import React from 'react';
+
+const NewsItem = ({ article, getArticleDate }) => {
+ const truncateDescription = (text, maxLength = 150) => {
+ if (!text) return '';
+ return text.length > maxLength ? text.substring(0, maxLength) + '...' : text;
+ };
+
+ return (
+
+
+
+
+ {article.title}
+
+
+ {truncateDescription(article.description)}
+
+
+
+
+ {article.source}
+
+ •
+
+ {getArticleDate(article)}
+
+
+
+
+
+
+ Read More
+
+
+
+ {article.sentiment && (
+
+ {article.sentiment}
+
+ )}
+
+
+
+
+ {article.image_url && (
+
+
{
+ e.target.style.display = 'none';
+ }}
+ />
+
+ )}
+
+
+ );
+};
+
+export default NewsItem;
\ No newline at end of file
diff --git a/frontend/src/components/news/NewsList.jsx b/frontend/src/components/news/NewsList.jsx
new file mode 100644
index 0000000..4924d0d
--- /dev/null
+++ b/frontend/src/components/news/NewsList.jsx
@@ -0,0 +1,26 @@
+import React from 'react';
+import NewsItem from './NewsItem';
+
+const NewsList = ({ articles, getArticleDate }) => {
+ if (!articles || articles.length === 0) {
+ return (
+
+
No news articles available
+
+ );
+ }
+
+ return (
+
+ {articles.map((article, index) => (
+
+ ))}
+
+ );
+};
+
+export default NewsList;
\ No newline at end of file
diff --git a/frontend/src/components/news/index.js b/frontend/src/components/news/index.js
new file mode 100644
index 0000000..e9d518a
--- /dev/null
+++ b/frontend/src/components/news/index.js
@@ -0,0 +1,4 @@
+// News Components
+export { default as NewsCard } from './NewsCard';
+export { default as NewsList } from './NewsList';
+export { default as NewsItem } from './NewsItem';
\ No newline at end of file
diff --git a/frontend/src/components/ui/Button.jsx b/frontend/src/components/ui/Button.jsx
new file mode 100644
index 0000000..e710ccf
--- /dev/null
+++ b/frontend/src/components/ui/Button.jsx
@@ -0,0 +1,21 @@
+import React from "react";
+
+
+const Button = ({
+ children = "",
+ className = "",
+ onClick = () => {},
+ ...props
+}) => {
+ return (
+
+ {children}
+
+ );
+};
+
+export default Button;
diff --git a/frontend/src/components/ui/Toast.jsx b/frontend/src/components/ui/Toast.jsx
new file mode 100644
index 0000000..1978b69
--- /dev/null
+++ b/frontend/src/components/ui/Toast.jsx
@@ -0,0 +1,43 @@
+import React, { useState, useEffect } from 'react';
+
+const Toast = ({ message, type = 'info', duration = 4000, onClose }) => {
+ useEffect(() => {
+ const timer = setTimeout(() => {
+ onClose();
+ }, duration);
+
+ return () => clearTimeout(timer);
+ }, [duration, onClose]);
+
+ const getIcon = () => {
+ switch (type) {
+ case 'success':
+ return '';
+ case 'error':
+ return '';
+ case 'warning':
+ return '';
+ case 'info':
+ default:
+ return '';
+ }
+ };
+
+ return (
+
+
+ {getIcon()}
+ {message}
+
+
+ ×
+
+
+ );
+};
+
+export default Toast;
diff --git a/frontend/src/components/ui/ToastProvider.jsx b/frontend/src/components/ui/ToastProvider.jsx
new file mode 100644
index 0000000..9a8147b
--- /dev/null
+++ b/frontend/src/components/ui/ToastProvider.jsx
@@ -0,0 +1,64 @@
+import React, { createContext, useContext, useState, useCallback } from 'react';
+import Toast from './Toast';
+
+const ToastContext = createContext();
+
+export const useToast = () => {
+ const context = useContext(ToastContext);
+ if (!context) {
+ throw new Error('useToast must be used within a ToastProvider');
+ }
+ return context;
+};
+
+export const ToastProvider = ({ children }) => {
+ const [toasts, setToasts] = useState([]);
+
+ const addToast = useCallback((message, type = 'info', duration = 4000) => {
+ const id = Date.now() + Math.random();
+ const toast = { id, message, type, duration };
+
+ setToasts(prev => [...prev, toast]);
+
+ return id;
+ }, []);
+
+ const removeToast = useCallback((id) => {
+ setToasts(prev => prev.filter(toast => toast.id !== id));
+ }, []);
+
+ const showSuccess = useCallback((message) => addToast(message, 'success'), [addToast]);
+ const showError = useCallback((message) => addToast(message, 'error', 6000), [addToast]);
+ const showWarning = useCallback((message) => addToast(message, 'warning'), [addToast]);
+ const showInfo = useCallback((message) => addToast(message, 'info'), [addToast]);
+
+ const value = {
+ showSuccess,
+ showError,
+ showWarning,
+ showInfo,
+ addToast,
+ removeToast
+ };
+
+ return (
+
+ {children}
+
+ {/* Toast Container */}
+
+ {toasts.map(toast => (
+ removeToast(toast.id)}
+ />
+ ))}
+
+
+ );
+};
+
+export default ToastProvider;
diff --git a/frontend/src/components/ui/index.js b/frontend/src/components/ui/index.js
new file mode 100644
index 0000000..613800a
--- /dev/null
+++ b/frontend/src/components/ui/index.js
@@ -0,0 +1,4 @@
+// UI Components
+export { default as Button } from './Button';
+export { default as Toast } from './Toast';
+export { default as ToastProvider, useToast } from './ToastProvider';
\ No newline at end of file
diff --git a/frontend/src/components/watchlist/AddStockModal.jsx b/frontend/src/components/watchlist/AddStockModal.jsx
new file mode 100644
index 0000000..426224f
--- /dev/null
+++ b/frontend/src/components/watchlist/AddStockModal.jsx
@@ -0,0 +1,100 @@
+import React from 'react';
+
+const AddStockModal = ({
+ isOpen,
+ onClose,
+ stockSearch,
+ setStockSearch,
+ searchResults,
+ onSearchStocks,
+ onAddStock,
+ actionLoading
+}) => {
+ if (!isOpen) return null;
+
+ return (
+
+
+
+
+
Add Stock to Watchlist
+
Search and add stocks to track their performance
+
+
+ ✕
+
+
+
+
+
+
+ Search Stocks
+
+ {
+ setStockSearch(e.target.value);
+ onSearchStocks(e.target.value);
+ }}
+ placeholder="Search by symbol or name"
+ autoFocus
+ className="input input-warning bg-gray-800 text-white w-full"
+ />
+
+
+ {searchResults.length > 0 && (
+
+
Search Results
+
+ {searchResults.slice(0, 10).map(stock => (
+
+
+
+
+
{stock.symbol}
+
{stock.name}
+
+
onAddStock(stock)}
+ disabled={actionLoading}
+ >
+ {actionLoading ? (
+
+ ) : (
+ <>
+ Add
+ >
+ )}
+
+
+
+
+ ))}
+
+
+ )}
+
+
+
+
+ Cancel
+
+
+
+
+ );
+};
+
+export default AddStockModal;
\ No newline at end of file
diff --git a/frontend/src/components/watchlist/EmptyWatchlist.jsx b/frontend/src/components/watchlist/EmptyWatchlist.jsx
new file mode 100644
index 0000000..abd2c6b
--- /dev/null
+++ b/frontend/src/components/watchlist/EmptyWatchlist.jsx
@@ -0,0 +1,28 @@
+import React from 'react';
+import Button from '../ui/Button';
+
+const EmptyWatchlist = ({ onAddStock, actionLoading }) => {
+ return (
+
+
+
+
No stocks added yet
+
Start building your watchlist by adding some stocks to track
+
+
+
+ Add Stocks
+
+
+
+
+
+ );
+};
+
+export default EmptyWatchlist;
\ No newline at end of file
diff --git a/frontend/src/components/watchlist/StockCard.jsx b/frontend/src/components/watchlist/StockCard.jsx
new file mode 100644
index 0000000..d7951f9
--- /dev/null
+++ b/frontend/src/components/watchlist/StockCard.jsx
@@ -0,0 +1,161 @@
+import React, { useState, useEffect } from 'react';
+import Button from '../ui/Button';
+import { X, TrendingUp, TrendingDown, RefreshCw } from 'lucide-react';
+import { stockAPI } from '../../services/api/stockAPI';
+
+const StockCard = ({
+ stockItem,
+ onAnalyze,
+ onRemove,
+ actionLoading
+}) => {
+ const [priceData, setPriceData] = useState(null);
+ const [loading, setLoading] = useState(true);
+ const [error, setError] = useState(null);
+
+ useEffect(() => {
+ fetchPriceData();
+
+ const interval = setInterval(() => {
+ fetchPriceData();
+ }, 300000); // 5 minutes = 300,000 milliseconds
+
+ return () => clearInterval(interval);
+ }, [stockItem.stock.symbol]);
+
+ const fetchPriceData = async () => {
+ try {
+ setLoading(true);
+ setError(null);
+ const data = await stockAPI.getPriceData(stockItem.stock.symbol);
+
+ // Check for API errors
+ if (data['Error Message']) {
+ throw new Error(data['Error Message']);
+ }
+
+ if (data['Note']) {
+ throw new Error('API call frequency limit reached. Please try again later.');
+ }
+
+ if (data && data['Time Series (1min)']) {
+ const timeSeries = data['Time Series (1min)'];
+ const timestamps = Object.keys(timeSeries).sort((a, b) => new Date(b) - new Date(a));
+
+ if (timestamps.length >= 2) {
+ const currentPrice = parseFloat(timeSeries[timestamps[0]]['4. close']);
+ const previousPrice = parseFloat(timeSeries[timestamps[1]]['4. close']);
+ const change = currentPrice - previousPrice;
+ const changePercent = ((change / previousPrice) * 100);
+
+ setPriceData({
+ currentPrice,
+ previousPrice,
+ change,
+ changePercent,
+ isUp: change >= 0,
+ timestamp: timestamps[0],
+ volume: parseInt(timeSeries[timestamps[0]]['5. volume'])
+ });
+ } else {
+ throw new Error('Insufficient price data available');
+ }
+ } else {
+ throw new Error('No price data available for this symbol');
+ }
+ } catch (err) {
+ console.error('Error fetching price data for', stockItem.stock.symbol, err);
+ setError(err.message || 'Failed to load price data');
+ } finally {
+ setLoading(false);
+ }
+ };
+
+ const handleRemove = () => {
+ if (confirm(`Remove ${stockItem.stock.symbol} from the watchlist?`)) {
+ onRemove(stockItem.stock.id);
+ }
+ };
+
+ return (
+
+
+
+
+
+ {/* Stock Info */}
+
+
{stockItem.stock.symbol}
+
{stockItem.stock.name}
+
+
+ {/* Price Data */}
+
+ {loading ? (
+
+
+ Loading...
+
+ ) : error ? (
+
+ Price unavailable
+
+ ) : priceData ? (
+
+ {/* Current Price */}
+
+ ${priceData.currentPrice.toFixed(2)}
+
+
+ {/* Price Change */}
+
+ {priceData.isUp ? (
+
+ ) : (
+
+ )}
+
+ {priceData.isUp ? '+' : ''}${priceData.change.toFixed(2)}
+ ({priceData.isUp ? '+' : ''}{priceData.changePercent.toFixed(2)}%)
+
+
+
+
+ ) : (
+
No data
+ )}
+
+
+
+
+
+ {/* Action Buttons */}
+
+ onAnalyze(stockItem.stock)}
+ disabled={actionLoading}
+ title={`Analyze ${stockItem.stock.symbol} with AI`}
+ aria-label={`Analyze ${stockItem.stock.symbol}`}
+ >
+ Analyze
+
+
+
+
+
+
+
+
+ );
+};
+
+export default StockCard;
\ No newline at end of file
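The price-change math buried in `fetchPriceData` can be isolated as a pure helper (a sketch assuming an Alpha Vantage-style `Time Series (1min)` payload; `computeChange` is a hypothetical name, not part of the diff):

```javascript
// Sketch: derive current price, absolute change, and percent change from the
// two most recent entries of a "Time Series (1min)" object.
const computeChange = (timeSeries) => {
  // Newest timestamp first
  const timestamps = Object.keys(timeSeries).sort((a, b) => new Date(b) - new Date(a));
  if (timestamps.length < 2) return null; // mirrors the "Insufficient price data" branch
  const currentPrice = parseFloat(timeSeries[timestamps[0]]['4. close']);
  const previousPrice = parseFloat(timeSeries[timestamps[1]]['4. close']);
  const change = currentPrice - previousPrice;
  return {
    currentPrice,
    previousPrice,
    change,
    changePercent: (change / previousPrice) * 100,
    isUp: change >= 0
  };
};
```

Keeping this separate from the fetch/loading state would let StockCard test the arithmetic without mocking the network.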
diff --git a/frontend/src/components/watchlist/WatchlistCard.jsx b/frontend/src/components/watchlist/WatchlistCard.jsx
new file mode 100644
index 0000000..79f323b
--- /dev/null
+++ b/frontend/src/components/watchlist/WatchlistCard.jsx
@@ -0,0 +1,151 @@
+import React, { useState } from 'react';
+import { useToast } from '../ui/ToastProvider';
+import { watchlistAPI } from '../../services/api/watchlistAPI';
+import { stockAPI } from '../../services/api/stockAPI';
+import WatchlistHeader from './WatchlistHeader';
+import EmptyWatchlist from './EmptyWatchlist';
+import StockCard from './StockCard';
+import AddStockModal from './AddStockModal';
+
+const WatchlistCard = ({
+ watchlist,
+ onDelete,
+ onAnalyzeStock,
+ onWatchlistUpdate,
+ showDeleteButton = true,
+ isLoading = false
+}) => {
+ const { showSuccess, showError } = useToast();
+ const [actionLoading, setActionLoading] = useState(false);
+ const [showAddStockModal, setShowAddStockModal] = useState(false);
+ const [stockSearch, setStockSearch] = useState('');
+ const [searchResults, setSearchResults] = useState([]);
+
+ const stockCount = watchlist.stocks ? watchlist.stocks.length : 0;
+
+ const handleSearchStocks = async (query) => {
+ if (!query.trim()) {
+ setSearchResults([]);
+ return;
+ }
+
+ try {
+ const results = await stockAPI.getStocks(query);
+ setSearchResults(results);
+ } catch (err) {
+ console.error('Failed to search stocks:', err);
+ setSearchResults([]);
+ }
+ };
+
+ const handleAddStock = async (stock) => {
+ try {
+ setActionLoading(true);
+ await watchlistAPI.addStock({
+ stock_id: stock.id
+ });
+
+ setStockSearch('');
+ setSearchResults([]);
+ setShowAddStockModal(false);
+ showSuccess(`${stock.symbol} added to watchlist!`);
+
+ if (onWatchlistUpdate) {
+ await onWatchlistUpdate();
+ }
+
+ } catch (err) {
+ showError(`Failed to add stock: ${err.message}`);
+ console.error('Failed to add stock:', err);
+ } finally {
+ setTimeout(() => {
+ setActionLoading(false);
+ }, 500);
+ }
+ };
+
+ const handleRemoveStock = async (stockId) => {
+ try {
+ setActionLoading(true);
+ await watchlistAPI.removeStock(stockId);
+
+ showSuccess('Stock removed from watchlist!');
+
+ if (onWatchlistUpdate) {
+ await onWatchlistUpdate();
+ }
+
+ } catch (err) {
+ showError(`Failed to remove stock: ${err.message}`);
+ console.error('Failed to remove stock:', err);
+ } finally {
+ setTimeout(() => {
+ setActionLoading(false);
+ }, 500);
+ }
+ };
+
+ const openAddStockModal = () => {
+ setShowAddStockModal(true);
+ setStockSearch('');
+ setSearchResults([]);
+ };
+
+ return (
+ <>
+
+
+
+
+ {isLoading ? (
+
+
+
+
Updating watchlist...
+
Fetching latest data
+
+
+ ) : !watchlist.stocks || watchlist.stocks.length === 0 ? (
+
+ ) : (
+
+ {watchlist.stocks.map(stockItem => (
+
+ ))}
+
+ )}
+
+
+
+ {/* Add Stock Modal */}
+ setShowAddStockModal(false)}
+ stockSearch={stockSearch}
+ setStockSearch={setStockSearch}
+ searchResults={searchResults}
+ onSearchStocks={handleSearchStocks}
+ onAddStock={handleAddStock}
+ actionLoading={actionLoading}
+ />
+ >
+ );
+};
+
+export default WatchlistCard;
diff --git a/frontend/src/components/watchlist/WatchlistHeader.jsx b/frontend/src/components/watchlist/WatchlistHeader.jsx
new file mode 100644
index 0000000..2a6d683
--- /dev/null
+++ b/frontend/src/components/watchlist/WatchlistHeader.jsx
@@ -0,0 +1,53 @@
+import React from 'react';
+import Button from '../ui/Button';
+import { Plus } from 'lucide-react';
+
+
+const WatchlistHeader = ({
+ watchlist,
+ onAddStock,
+ onDelete,
+ actionLoading,
+ showDeleteButton
+}) => {
+ const handleDelete = () => {
+ if (confirm('Are you sure you want to delete this watchlist?')) {
+ onDelete(watchlist.id);
+ }
+ };
+
+ return (
+
+
+
{watchlist.name}
+
+
+
+
+
+
+
+
+ {showDeleteButton && (
+
+ Delete
+
+ )}
+
+
+ );
+};
+
+export default WatchlistHeader;
\ No newline at end of file
diff --git a/frontend/src/components/watchlist/index.js b/frontend/src/components/watchlist/index.js
new file mode 100644
index 0000000..8f49480
--- /dev/null
+++ b/frontend/src/components/watchlist/index.js
@@ -0,0 +1,6 @@
+// Watchlist Components
+export { default as WatchlistCard } from './WatchlistCard';
+export { default as StockCard } from './StockCard';
+export { default as WatchlistHeader } from './WatchlistHeader';
+export { default as EmptyWatchlist } from './EmptyWatchlist';
+export { default as AddStockModal } from './AddStockModal';
\ No newline at end of file
diff --git a/frontend/src/contexts/AuthContext.jsx b/frontend/src/contexts/AuthContext.jsx
new file mode 100644
index 0000000..b8061cc
--- /dev/null
+++ b/frontend/src/contexts/AuthContext.jsx
@@ -0,0 +1,118 @@
+import React, { createContext, useContext, useEffect, useState } from 'react';
+import { auth, googleProvider } from '../firebase';
+import { signInWithPopup, signOut, onAuthStateChanged } from 'firebase/auth';
+import { authAPI } from '../services/api/authAPI';
+
+const AuthContext = createContext();
+
+export const useAuth = () => {
+ const context = useContext(AuthContext);
+ if (!context) {
+ throw new Error('useAuth must be used within an AuthProvider');
+ }
+ return context;
+};
+
+export const AuthProvider = ({ children }) => {
+ const [currentUser, setCurrentUser] = useState(null);
+ const [loading, setLoading] = useState(true);
+ const [error, setError] = useState(null);
+
+ // Sign in with Google
+ const signInWithGoogle = async () => {
+ try {
+ setError(null);
+ setLoading(true);
+
+ const result = await signInWithPopup(auth, googleProvider);
+ const idToken = await result.user.getIdToken();
+
+ // Store token in localStorage
+ localStorage.setItem('firebase_token', idToken);
+
+ // Send token to backend
+ const backendResponse = await authAPI.login(idToken);
+
+
+ console.log('Backend login successful:', backendResponse);
+
+ // Set the user after successful backend verification
+ setCurrentUser(result.user);
+ console.log('Signed in user:', result.user);
+
+ return result;
+ } catch (error) {
+ setError(error.message);
+ console.error('Login error:', error);
+
+ // Clean up on error
+ localStorage.removeItem('firebase_token');
+ setCurrentUser(null);
+
+ throw error;
+ } finally {
+ setLoading(false);
+ }
+ };
+
+ // Sign out
+ const logout = async () => {
+ try {
+ setError(null);
+ await signOut(auth);
+ localStorage.removeItem('firebase_token');
+ setCurrentUser(null);
+ } catch (error) {
+ setError(error.message);
+ console.error('Logout error:', error);
+ throw error;
+ }
+ };
+
+ // Monitor auth state changes
+ useEffect(() => {
+ const unsubscribe = onAuthStateChanged(auth, async (user) => {
+ try {
+ if (user) {
+ // User is signed in
+ const idToken = await user.getIdToken();
+ localStorage.setItem('firebase_token', idToken);
+
+ // Only set user if we have a valid token
+ if (idToken) {
+ setCurrentUser(user);
+ }
+ } else {
+ // User is signed out
+ localStorage.removeItem('firebase_token');
+ setCurrentUser(null);
+ }
+ } catch (error) {
+ console.error('Auth state change error:', error);
+ setError(error.message);
+ // On error, clear everything
+ localStorage.removeItem('firebase_token');
+ setCurrentUser(null);
+ } finally {
+ setLoading(false);
+ }
+ });
+
+ return unsubscribe;
+ }, []);
+
+ const value = {
+ currentUser,
+ signInWithGoogle,
+ logout,
+ loading,
+ error,
+ setError
+ };
+
+ return (
+
+ {children}
+
+ );
+};
diff --git a/frontend/src/firebase.js b/frontend/src/firebase.js
new file mode 100644
index 0000000..64a1a42
--- /dev/null
+++ b/frontend/src/firebase.js
@@ -0,0 +1,26 @@
+// Firebase configuration - uses environment variables
+import { initializeApp } from 'firebase/app';
+import { getAuth, GoogleAuthProvider } from 'firebase/auth';
+
+const firebaseConfig = {
+ apiKey: import.meta.env.VITE_FIREBASE_API_KEY,
+ authDomain: import.meta.env.VITE_FIREBASE_AUTH_DOMAIN,
+ projectId: import.meta.env.VITE_FIREBASE_PROJECT_ID,
+ storageBucket: import.meta.env.VITE_FIREBASE_STORAGE_BUCKET,
+ messagingSenderId: import.meta.env.VITE_FIREBASE_MESSAGING_SENDER_ID,
+ appId: import.meta.env.VITE_FIREBASE_APP_ID
+};
+
+// Check if Firebase config is properly loaded
+const missingEnvVars = Object.entries(firebaseConfig)
+ .filter(([key, value]) => !value)
+ .map(([key]) => key);
+
+if (missingEnvVars.length > 0) {
+ console.error('Missing Firebase environment variables:', missingEnvVars);
+}
+
+const app = initializeApp(firebaseConfig);
+
+export const auth = getAuth(app);
+export const googleProvider = new GoogleAuthProvider();
diff --git a/frontend/src/main.jsx b/frontend/src/main.jsx
new file mode 100644
index 0000000..3d9da8a
--- /dev/null
+++ b/frontend/src/main.jsx
@@ -0,0 +1,9 @@
+import { StrictMode } from 'react'
+import { createRoot } from 'react-dom/client'
+import App from './App.jsx'
+
+createRoot(document.getElementById('root')).render(
+
+
+ ,
+)
diff --git a/frontend/src/pages/Dashboard.jsx b/frontend/src/pages/Dashboard.jsx
new file mode 100644
index 0000000..b7961cc
--- /dev/null
+++ b/frontend/src/pages/Dashboard.jsx
@@ -0,0 +1,165 @@
+import React, { useState, useEffect } from 'react';
+import { useAuth } from '../contexts/AuthContext';
+import { useToast } from '../components/ui/ToastProvider';
+import { watchlistAPI } from '../services/api/watchlistAPI';
+import { analysisAPI } from '../services/api/analysisAPI';
+import Header from '../components/layout/Header';
+import WatchlistCard from '../components/watchlist/WatchlistCard';
+import AnalysisDashboard from '../components/analysis/AnalysisDashboard';
+import NewsCard from '../components/news/NewsCard';
+import StockChartDashboard from '../components/analysis/StockChartDashboard';
+
+const Dashboard = () => {
+ const { currentUser } = useAuth();
+ const { showSuccess, showError } = useToast();
+ const [watchlist, setWatchlist] = useState(null);
+ const [loading, setLoading] = useState(true);
+ const [showAnalysis, setShowAnalysis] = useState(false);
+ const [analysisData, setAnalysisData] = useState(null);
+ const [analyzingStock, setAnalyzingStock] = useState(null);
+ const [analysisLoading, setAnalysisLoading] = useState(false);
+
+ useEffect(() => {
+ fetchWatchlist();
+ }, []);
+
+ const fetchWatchlist = async () => {
+ try {
+ setLoading(true);
+ const data = await watchlistAPI.getWatchlist();
+ console.log('Fetched watchlist data:', data);
+ setWatchlist(data);
+ } catch (err) {
+ showError(`Failed to load watchlist: ${err.message}`);
+ console.error('Failed to fetch watchlist:', err);
+ } finally {
+ setLoading(false);
+ }
+ };
+
+ const handleAnalyzeStock = async (stock) => {
+ try {
+ console.log('Starting analysis for stock:', stock);
+
+ // Clear any previous analysis data first
+ setAnalysisData(null);
+
+ // Set loading states immediately
+ setAnalyzingStock(stock);
+ setAnalysisLoading(true);
+ setShowAnalysis(true);
+
+ console.log('Analysis states set - loading:', true, 'showAnalysis:', true, 'stock:', stock);
+
+ showSuccess(`Starting AI analysis for ${stock.symbol}...`);
+
+ const analysis = await analysisAPI.analyzeStock(stock.symbol);
+ console.log('Analysis response received:', analysis);
+ setAnalysisData(analysis);
+
+ showSuccess(`Analysis completed for ${stock.symbol}!`);
+
+ } catch (err) {
+ console.error('Analysis error:', err);
+ showError(`Failed to analyze ${stock.symbol}: ${err.message}`);
+ } finally {
+ console.log('Setting analysis loading to false');
+ setAnalysisLoading(false);
+ }
+ };
+
+ const closeAnalysis = () => {
+ console.log('Closing analysis modal');
+ setShowAnalysis(false);
+ setAnalysisData(null);
+ setAnalyzingStock(null);
+ setAnalysisLoading(false);
+
+ console.log('Analysis modal closed, should show dashboard');
+ };
+
+ // Ensure body has proper styling (in case modal messed it up)
+ useEffect(() => {
+ document.body.style.overflow = (showAnalysis || analysisLoading) ? 'hidden' : 'auto';
+ document.body.style.backgroundColor = '#121212';
+
+ return () => {
+ document.body.style.overflow = 'auto';
+ };
+ }, [showAnalysis, analysisLoading]);
+
+ return (
+
+
+
+ {/* Main Content - Always visible */}
+
+
+
+
+
+
+
+
+
+
+
+ {!watchlist ? (
+ loading ? (
+
+
+
Loading your watchlist...
+
+ ) : (
+
+
Watchlist Not Found
+ Try Again
+
+ )
+ ) : (
+ <>
+
+
+ >
+ )}
+
+
+ {/* News Section */}
+ {!loading && watchlist && (
+
+ )}
+ {/* Show placeholder if no watchlist */}
+ {!loading && !watchlist && (
+
+
News
+
+
Add stocks to your watchlist to see related news
+
+
+ )}
+
+
+
+
+
+ {/* Analysis Dashboard Modal */}
+ {analyzingStock && (showAnalysis || analysisLoading) && (
+
+ )}
+
+ );
+};
+
+export default Dashboard;
diff --git a/frontend/src/pages/Login.jsx b/frontend/src/pages/Login.jsx
new file mode 100644
index 0000000..a914b65
--- /dev/null
+++ b/frontend/src/pages/Login.jsx
@@ -0,0 +1,51 @@
+import React, { useState } from 'react';
+import { useAuth } from '../contexts/AuthContext';
+import { useNavigate } from 'react-router-dom';
+import Button from '../components/ui/Button';
+
+const Login = () => {
+ const { signInWithGoogle, loading } = useAuth();
+ const navigate = useNavigate();
+ const [isLoggingIn, setIsLoggingIn] = useState(false);
+
+ const handleGoogleLogin = async () => {
+ try {
+ setIsLoggingIn(true);
+ await signInWithGoogle();
+ navigate('/dashboard');
+ } catch (error) {
+ console.error('Login failed:', error);
+ } finally {
+ setIsLoggingIn(false);
+ }
+ };
+
+ return (
+
+
+
+
StockSense
+
+ AI-Powered Stock Analysis for Smarter Investments
+
+
+
+
+ Login With Google
+
+
+
+
+ );
+};
+
+export default Login;
+
diff --git a/frontend/src/services/api/analysisAPI.js b/frontend/src/services/api/analysisAPI.js
new file mode 100644
index 0000000..3be1155
--- /dev/null
+++ b/frontend/src/services/api/analysisAPI.js
@@ -0,0 +1,76 @@
+import axios from 'axios';
+
+// Analysis API calls (for StockSense AI)
+export const analysisAPI = {
+ // Analyze stock using AI
+ analyzeStock: async (ticker) => {
+ try {
+ // Use the StockSense AI API endpoint
+ const aiApiUrl = import.meta.env.VITE_AI_API_URL;
+ const response = await axios.get(`${aiApiUrl}/analyze/${ticker}`);
+ return response.data;
+ } catch (error) {
+ console.error('Analysis API Error:', error);
+ if (error.response?.status === 429) {
+ throw new Error('API rate limit reached. Please try again later.');
+ }
+ if (error.response?.status === 404) {
+ throw new Error(`Stock symbol ${ticker} not found. Please verify the symbol and try again.`);
+ }
+ if (error.code === 'ECONNREFUSED' || error.code === 'ERR_NETWORK') {
+ throw new Error('AI analysis service is currently unavailable. Please try again later.');
+ }
+ throw new Error(error.response?.data?.detail || error.message || 'Failed to analyze stock');
+ }
+ },
+
+ // Get historical analysis
+ getAnalysisHistory: async (ticker, limit = 10) => {
+ try {
+ const aiApiUrl = import.meta.env.VITE_AI_API_URL;
+ const response = await axios.get(`${aiApiUrl}/analysis/history/${ticker}?limit=${limit}`);
+ return response.data;
+ } catch (error) {
+ console.error('Analysis History API Error:', error);
+ throw new Error(error.response?.data?.detail || 'Failed to get analysis history');
+ }
+ },
+
+ // Get analysis status
+ getAnalysisStatus: async (analysisId) => {
+ try {
+ const aiApiUrl = import.meta.env.VITE_AI_API_URL;
+ const response = await axios.get(`${aiApiUrl}/analysis/status/${analysisId}`);
+ return response.data;
+ } catch (error) {
+ console.error('Analysis Status API Error:', error);
+ throw new Error(error.response?.data?.detail || 'Failed to get analysis status');
+ }
+ },
+
+ // Cancel ongoing analysis
+ cancelAnalysis: async (analysisId) => {
+ try {
+ const aiApiUrl = import.meta.env.VITE_AI_API_URL;
+ const response = await axios.post(`${aiApiUrl}/analysis/cancel/${analysisId}`);
+ return response.data;
+ } catch (error) {
+ console.error('Cancel Analysis API Error:', error);
+ throw new Error(error.response?.data?.detail || 'Failed to cancel analysis');
+ }
+ },
+
+ // Get market summary analysis
+ getMarketSummary: async () => {
+ try {
+ const aiApiUrl = import.meta.env.VITE_AI_API_URL;
+ const response = await axios.get(`${aiApiUrl}/market/summary`);
+ return response.data;
+ } catch (error) {
+ console.error('Market Summary API Error:', error);
+ throw new Error(error.response?.data?.detail || 'Failed to get market summary');
+ }
+ }
+};
+
+export default analysisAPI;
\ No newline at end of file
diff --git a/frontend/src/services/api/authAPI.js b/frontend/src/services/api/authAPI.js
new file mode 100644
index 0000000..2b2233b
--- /dev/null
+++ b/frontend/src/services/api/authAPI.js
@@ -0,0 +1,39 @@
+import apiClient from './client.js';
+
+// Auth API calls
+export const authAPI = {
+ login: async (idToken) => {
+ try {
+ const response = await apiClient.post('/users/login/', {
+ idToken: idToken
+ });
+ console.log('Login response:', response.data);
+ return response.data;
+ } catch (error) {
+ throw new Error(error.response?.data?.detail || 'Login failed');
+ }
+ },
+
+ // Additional auth methods can be added here
+ logout: async () => {
+ try {
+ // Clear local storage
+ localStorage.removeItem('firebase_token');
+ return { success: true };
+ } catch (error) {
+ throw new Error('Logout failed');
+ }
+ },
+
+ // Verify token
+ verifyToken: async () => {
+ try {
+ const response = await apiClient.get('/users/verify/');
+ return response.data;
+ } catch (error) {
+ throw new Error(error.response?.data?.detail || 'Token verification failed');
+ }
+ }
+};
+
+export default authAPI;
\ No newline at end of file
diff --git a/frontend/src/services/api/client.js b/frontend/src/services/api/client.js
new file mode 100644
index 0000000..25565f2
--- /dev/null
+++ b/frontend/src/services/api/client.js
@@ -0,0 +1,41 @@
+import axios from 'axios';
+
+const API_BASE_URL = import.meta.env.VITE_API_BASE_URL;
+
+const apiClient = axios.create({
+ baseURL: API_BASE_URL,
+ headers: {
+ 'Content-Type': 'application/json',
+ },
+ timeout: 30000,
+});
+
+apiClient.interceptors.request.use(
+ (config) => {
+ const token = localStorage.getItem('firebase_token');
+ if (token) {
+ config.headers.Authorization = `Bearer ${token}`;
+ }
+
+ return config;
+ },
+ (error) => {
+ return Promise.reject(error);
+ }
+);
+
+apiClient.interceptors.response.use(
+ (response) => {
+ return response;
+ },
+ (error) => {
+ // If token is invalid, clear it
+ if (error.response?.status === 401) {
+ localStorage.removeItem('firebase_token');
+ }
+
+ return Promise.reject(error);
+ }
+);
+
+export default apiClient;
\ No newline at end of file
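The request interceptor's token handling reduces to a small pure function (a sketch; `withAuthHeader` and the plain-object `storage` standing in for `localStorage` are illustrative assumptions, not part of the diff):

```javascript
// Sketch of the request interceptor above: attach a Bearer token to the
// request config when one is present in storage, otherwise pass it through.
const withAuthHeader = (config, storage) => {
  const token = storage['firebase_token'];
  if (token) {
    config.headers = { ...config.headers, Authorization: `Bearer ${token}` };
  }
  return config;
};
```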
diff --git a/frontend/src/services/api/index.js b/frontend/src/services/api/index.js
new file mode 100644
index 0000000..69daf34
--- /dev/null
+++ b/frontend/src/services/api/index.js
@@ -0,0 +1,7 @@
+// Export all API modules for easy importing
+export { authAPI } from './authAPI.js';
+export { watchlistAPI } from './watchlistAPI.js';
+export { stockAPI } from './stockAPI.js';
+export { analysisAPI } from './analysisAPI.js';
+export { newsAPI } from './newsAPI.js';
+export { default as apiClient } from './client.js';
\ No newline at end of file
diff --git a/frontend/src/services/api/newsAPI.js b/frontend/src/services/api/newsAPI.js
new file mode 100644
index 0000000..ebfc729
--- /dev/null
+++ b/frontend/src/services/api/newsAPI.js
@@ -0,0 +1,125 @@
+import axios from 'axios';
+import { watchlistAPI } from './watchlistAPI.js';
+
+const BASE_URL = import.meta.env.VITE_NEWS_API_URL;
+const API_KEY = import.meta.env.VITE_NEWS_API_KEY;
+
+// Get user's watchlist stock symbols
+const getUserWatchlistSymbols = async () => {
+ try {
+ const watchlist = await watchlistAPI.getWatchlist();
+
+ if (!watchlist || !watchlist.stocks || watchlist.stocks.length === 0) {
+ console.log('No stocks found in watchlist');
+ return [];
+ }
+
+ // Extract symbols from the watchlist stocks
+ return watchlist.stocks.map(stockItem => stockItem.stock.symbol);
+ } catch (error) {
+ console.error('Error fetching watchlist symbols:', error);
+ return [];
+ }
+};
+
+// Fetch news for specific symbols
+const getNewsForSymbols = async (symbols, limit = 5) => {
+ try {
+ if (!symbols || symbols.length === 0) {
+ throw new Error('No symbols provided');
+ }
+
+ // Join symbols with comma for API
+ const symbolsString = symbols.join(',');
+
+ const params = {
+ symbols: symbolsString,
+ filter_entities: true,
+ language: 'en',
+ api_token: API_KEY,
+ limit: limit,
+ sort: 'published_on', // Sort by most recent
+ sort_order: 'desc'
+ };
+
+ const response = await axios.get(BASE_URL, { params });
+
+ if (response.data && response.data.data) {
+ return response.data.data;
+ } else {
+ throw new Error('Invalid response format from news API');
+ }
+ } catch (error) {
+ console.error('Error fetching news:', error);
+ throw new Error(error.response?.data?.message || 'Failed to fetch news');
+ }
+};
+
+// Main function to get news for user's watchlist
+const getWatchlistNews = async (limit = 5) => {
+ try {
+ // Step 1: Get user's watchlist symbols
+ const symbols = await getUserWatchlistSymbols();
+
+ if (symbols.length === 0) {
+ return {
+ success: true,
+ message: 'No stocks in watchlist',
+ articles: [],
+ symbols: []
+ };
+ }
+
+ console.log('Fetching news for symbols:', symbols);
+
+ // Step 2: Fetch news for those symbols
+ const articles = await getNewsForSymbols(symbols, limit);
+
+ return {
+ success: true,
+ articles: articles,
+ symbols: symbols,
+ totalSymbols: symbols.length,
+ totalArticles: articles.length
+ };
+ } catch (error) {
+ console.error('Error in getWatchlistNews:', error);
+ return {
+ success: false,
+ error: error.message,
+ articles: [],
+ symbols: []
+ };
+ }
+};
+
+// Get news for specific stock symbol
+const getNewsForStock = async (symbol, limit = 5) => {
+ try {
+ const articles = await getNewsForSymbols([symbol], limit);
+ return {
+ success: true,
+ symbol: symbol,
+ articles: articles,
+ totalArticles: articles.length
+ };
+ } catch (error) {
+ console.error(`Error fetching news for ${symbol}:`, error);
+ return {
+ success: false,
+ symbol: symbol,
+ error: error.message,
+ articles: []
+ };
+ }
+};
+
+// Export the news API functions
+export const newsAPI = {
+ getUserWatchlistSymbols,
+ getNewsForSymbols,
+ getWatchlistNews,
+ getNewsForStock
+};
+
+export default newsAPI;
\ No newline at end of file
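The symbol list assembled by `getUserWatchlistSymbols` can contain duplicates or mixed-case entries when the backend stores several watchlists. A small normalization helper (hypothetical, not part of this diff) keeps the comma-joined `symbols` parameter clean before `getNewsForSymbols` builds its request:

```javascript
// Hypothetical helper: dedupe and normalize ticker symbols before they are
// comma-joined into the news API's `symbols` query parameter.
const normalizeSymbols = (symbols) =>
  [...new Set(symbols.map((s) => String(s).trim().toUpperCase()))].filter(Boolean);

console.log(normalizeSymbols(['aapl', 'AAPL ', 'msft'])); // → ['AAPL', 'MSFT']
```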
diff --git a/frontend/src/services/api/stockAPI.js b/frontend/src/services/api/stockAPI.js
new file mode 100644
index 0000000..e186bd0
--- /dev/null
+++ b/frontend/src/services/api/stockAPI.js
@@ -0,0 +1,49 @@
+import apiClient from './client.js';
+
+
+const BASE_URL = import.meta.env.VITE_STOCK_API_URL || 'https://www.alphavantage.co/query';
+const API_KEY = import.meta.env.VITE_ALPHAVANTAGE_API_KEY; // Vite only exposes env vars prefixed with VITE_ to client code
+
+// Stock API calls
+export const stockAPI = {
+ // Get all stocks with optional search
+ getStocks: async (search = '') => {
+ try {
+ const url = search ? `/stocks/?search=${encodeURIComponent(search)}` : '/stocks/';
+ const response = await apiClient.get(url);
+ return response.data;
+ } catch (error) {
+ throw new Error(error.response?.data?.detail || 'Failed to get stocks');
+ }
+ },
+
+ // Get stock by ID
+ getStock: async (stockId) => {
+ try {
+ const response = await apiClient.get(`/stocks/${stockId}/`);
+ return response.data;
+ } catch (error) {
+ throw new Error(error.response?.data?.detail || 'Failed to get stock');
+ }
+ },
+
+
+  getPriceData: async (symbol, interval = '1min') => {
+    try {
+      // Alpha Vantage requires a `function` parameter, and BASE_URL carries no
+      // query string yet, so start the query with `?` rather than `&`
+      const url = `${BASE_URL}?function=TIME_SERIES_INTRADAY&symbol=${encodeURIComponent(symbol)}&interval=${interval}&apikey=${API_KEY}`;
+      const response = await fetch(url);
+      if (!response.ok) {
+        throw new Error(`Price API responded with status ${response.status}`);
+      }
+      return await response.json();
+    } catch (error) {
+      // fetch errors carry no axios-style `error.response`; surface the message directly
+      console.error('Error fetching price data:', error);
+      throw new Error(error.message || 'Failed to fetch price data');
+    }
+  }
+
+}
+export default stockAPI;
\ No newline at end of file
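`getPriceData` returns Alpha Vantage's raw JSON, whose intraday candles live under a `Time Series (<interval>)` key with string-valued OHLCV fields. A sketch of flattening that payload into chart-friendly rows (the sample object mirrors Alpha Vantage's documented response shape; the helper itself is illustrative):

```javascript
// Flatten an Alpha Vantage TIME_SERIES_INTRADAY payload into sorted OHLCV rows.
const parseIntraday = (data, interval = '1min') => {
  const series = data[`Time Series (${interval})`] || {};
  return Object.entries(series)
    .map(([time, bar]) => ({
      time,
      open: Number(bar['1. open']),
      high: Number(bar['2. high']),
      low: Number(bar['3. low']),
      close: Number(bar['4. close']),
      volume: Number(bar['5. volume']),
    }))
    .sort((a, b) => a.time.localeCompare(b.time)); // oldest first, for charting
};

const sample = {
  'Time Series (1min)': {
    '2024-01-02 16:00:00': { '1. open': '185.0', '2. high': '185.5', '3. low': '184.8', '4. close': '185.2', '5. volume': '1200' },
    '2024-01-02 15:59:00': { '1. open': '184.9', '2. high': '185.1', '3. low': '184.7', '4. close': '185.0', '5. volume': '900' },
  },
};
console.log(parseIntraday(sample)[0].close); // → 185
```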
diff --git a/frontend/src/services/api/watchlistAPI.js b/frontend/src/services/api/watchlistAPI.js
new file mode 100644
index 0000000..801bafe
--- /dev/null
+++ b/frontend/src/services/api/watchlistAPI.js
@@ -0,0 +1,76 @@
+import apiClient from './client.js';
+
+// Watchlist API calls
+export const watchlistAPI = {
+ getWatchlist: async () => {
+ try {
+ const response = await apiClient.get('/watchlists/');
+ console.log('Watchlist response:', response.data);
+ return response.data;
+ } catch (error) {
+ throw new Error(error.response?.data?.detail || 'Failed to get watchlist');
+ }
+ },
+
+ // Create a new watchlist
+ createWatchlist: async (watchlistData) => {
+ try {
+ const response = await apiClient.post('/watchlists/', watchlistData);
+ return response.data;
+ } catch (error) {
+ throw new Error(error.response?.data?.detail || 'Failed to create watchlist');
+ }
+ },
+
+ // Update watchlist
+ updateWatchlist: async (watchlistId, watchlistData) => {
+ try {
+ const response = await apiClient.put(`/watchlists/${watchlistId}/`, watchlistData);
+ return response.data;
+ } catch (error) {
+ throw new Error(error.response?.data?.detail || 'Failed to update watchlist');
+ }
+ },
+
+ // Delete watchlist
+ deleteWatchlist: async (watchlistId) => {
+ try {
+ await apiClient.delete(`/watchlists/${watchlistId}/`);
+ return { success: true };
+ } catch (error) {
+ throw new Error(error.response?.data?.detail || 'Failed to delete watchlist');
+ }
+ },
+
+ // Add stock to watchlist
+ addStock: async (stockData) => {
+ try {
+ const response = await apiClient.post('/watchlists/stocks/', stockData);
+ return response.data;
+ } catch (error) {
+ throw new Error(error.response?.data?.detail || 'Failed to add stock to watchlist');
+ }
+ },
+
+ // Remove stock from watchlist
+ removeStock: async (stockId) => {
+ try {
+ await apiClient.delete(`/watchlists/stocks/${stockId}/`);
+ return { success: true };
+ } catch (error) {
+ throw new Error(error.response?.data?.detail || 'Failed to remove stock from watchlist');
+ }
+ },
+
+ // Get watchlist stocks
+ getWatchlistStocks: async (watchlistId) => {
+ try {
+ const response = await apiClient.get(`/watchlists/${watchlistId}/stocks/`);
+ return response.data;
+ } catch (error) {
+ throw new Error(error.response?.data?.detail || 'Failed to get watchlist stocks');
+ }
+ }
+};
+
+export default watchlistAPI;
\ No newline at end of file
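Since `removeStock` resolves only `{ success: true }`, a component will typically update its local state optimistically while the request is in flight. A minimal sketch of the pure state helpers such a component might use (the `{ id, stock: { symbol } }` item shape matches what `getUserWatchlistSymbols` in newsAPI.js expects, but is otherwise an assumption about the backend serializer):

```javascript
// Hypothetical pure helpers for optimistic watchlist updates in React state.
const withoutStock = (stocks, stockId) =>
  stocks.filter((item) => item.id !== stockId);

const hasSymbol = (stocks, symbol) =>
  stocks.some((item) => item.stock.symbol === symbol);

const stocks = [
  { id: 1, stock: { symbol: 'AAPL' } },
  { id: 2, stock: { symbol: 'MSFT' } },
];
console.log(withoutStock(stocks, 1)); // → [{ id: 2, stock: { symbol: 'MSFT' } }]
console.log(hasSymbol(stocks, 'AAPL')); // → true
```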
diff --git a/frontend/vite.config.js b/frontend/vite.config.js
new file mode 100644
index 0000000..a45e8cf
--- /dev/null
+++ b/frontend/vite.config.js
@@ -0,0 +1,19 @@
+import { defineConfig } from 'vite'
+import react from '@vitejs/plugin-react'
+
+// https://vite.dev/config/
+export default defineConfig({
+ plugins: [react()],
+ base: '/',
+ build: {
+ outDir: 'dist',
+ assetsDir: 'assets',
+ sourcemap: false
+ },
+ server: {
+ port: 3000
+ },
+ preview: {
+ port: 3000
+ }
+})
diff --git a/nasdaq_screener.csv b/nasdaq_screener.csv
deleted file mode 100644
index 119fd9c..0000000
--- a/nasdaq_screener.csv
+++ /dev/null
@@ -1,21 +0,0 @@
-Symbol,Name,Last Sale,Net Change,% Change,Market Cap,Country,IPO Year,Volume,Sector,Industry
-AAPL,Apple Inc.,231.7,1.23,0.53%,3543210000000,United States,1980,45123000,Technology,Consumer Electronics
-MSFT,Microsoft Corporation,421.45,2.87,0.69%,3123450000000,United States,1986,32145000,Technology,Software - Infrastructure
-GOOGL,Alphabet Inc.,171.23,-0.87,-0.50%,2156780000000,United States,2004,28945000,Communication Services,Internet Content & Information
-AMZN,Amazon.com Inc.,185.92,1.45,0.79%,1943210000000,United States,1997,34567000,Consumer Cyclical,Internet Retail
-TSLA,Tesla Inc.,247.86,-2.34,-0.93%,788450000000,United States,2010,67891000,Consumer Cyclical,Auto Manufacturers
-NVDA,NVIDIA Corporation,129.37,3.21,2.54%,3189670000000,United States,1999,89456000,Technology,Semiconductors
-META,Meta Platforms Inc.,542.81,4.56,0.85%,1378900000000,United States,2012,23456000,Communication Services,Internet Content & Information
-NFLX,Netflix Inc.,702.33,-1.23,-0.17%,311890000000,United States,2002,12345000,Communication Services,Entertainment
-AMD,Advanced Micro Devices Inc.,164.52,2.87,1.78%,265780000000,United States,1972,45678000,Technology,Semiconductors
-INTC,Intel Corporation,22.89,-0.45,-1.93%,97340000000,United States,1971,67890000,Technology,Semiconductors
-JPM,JPMorgan Chase & Co.,221.45,1.23,0.56%,642310000000,United States,1980,23456000,Financial Services,Banks - Diversified
-JNJ,Johnson & Johnson,161.78,-0.87,-0.53%,389670000000,United States,1944,12345000,Healthcare,Drug Manufacturers - General
-V,Visa Inc.,289.45,2.34,0.82%,567890000000,United States,2008,34567000,Financial Services,Credit Services
-PG,Procter & Gamble Company,166.23,0.78,0.47%,389120000000,United States,1946,23456000,Consumer Defensive,Household & Personal Products
-HD,Home Depot Inc.,346.78,1.45,0.42%,345670000000,United States,1981,34567000,Consumer Cyclical,Home Improvement Retail
-UNH,UnitedHealth Group Incorporated,524.67,3.21,0.62%,485930000000,United States,1984,12345000,Healthcare,Healthcare Plans
-BAC,Bank of America Corporation,44.56,-0.23,-0.51%,345670000000,United States,1968,67890000,Financial Services,Banks - Diversified
-XOM,Exxon Mobil Corporation,119.78,2.34,1.99%,456780000000,United States,1928,45678000,Energy,Oil & Gas Integrated
-KO,Coca-Cola Company,62.45,0.45,0.73%,268900000000,United States,1919,23456000,Consumer Defensive,Beverages - Non-Alcoholic
-PFE,Pfizer Inc.,28.67,-0.56,-1.91%,161230000000,United States,1942,34567000,Healthcare,Drug Manufacturers - General
diff --git a/pytest.ini b/pytest.ini
deleted file mode 100644
index 41e7f13..0000000
--- a/pytest.ini
+++ /dev/null
@@ -1,38 +0,0 @@
-# pytest configuration file
-[tool:pytest]
-# Test discovery
-testpaths = tests
-python_files = test_*.py
-python_classes = Test*
-python_functions = test_*
-
-# Output configuration
-addopts =
- -v
- --tb=short
- --strict-markers
- --disable-warnings
- --color=yes
-
-# Async configuration
-asyncio_mode = auto
-asyncio_default_fixture_loop_scope = function
-
-# Markers for test categorization
-markers =
- unit: Unit tests for individual components
- integration: Integration tests requiring external services
- api: API endpoint tests
- slow: Tests that take longer to run
- network: Tests that require network access
-
-# Timeout settings
-timeout = 300
-
-# Coverage settings (optional)
-# Uncomment to enable coverage reporting
-# addopts =
-# -v
-# --cov=stocksense
-# --cov-report=html
-# --cov-report=term-missing
diff --git a/readme.md b/readme.md
index 57a6db5..4624a59 100644
--- a/readme.md
+++ b/readme.md
@@ -1,311 +1,160 @@
-# StockSense Agent
+# StockSense
-**AI-Powered Autonomous Stock Market Research System Using Advanced ReAct Pattern**
+**AI-Powered Stock Analysis Platform**
-An intelligent stock analysis platform that leverages the **ReAct (Reasoning + Action)** design pattern to conduct autonomous market research through sophisticated AI agent capabilities, dynamic tool selection, and adaptive strategy formation.
-
-[](https://www.python.org/downloads/)
-[](https://fastapi.tiangolo.com/)
-[](https://streamlit.io/)
-[](https://langchain-ai.github.io/langgraph/)
-
-## Overview
-
-StockSense demonstrates advanced AI agent architecture through autonomous reasoning, self-correction, and intelligent tool orchestration. The system combines real-time market data with AI-powered sentiment analysis to provide comprehensive stock market insights.
-
-### Key Achievements
-
-- **Autonomous ReAct Agent**: Self-guided analysis with dynamic tool selection and iterative reasoning
-- **Production-Ready Architecture**: Full-stack application with FastAPI backend and Streamlit frontend
-- **Advanced AI Integration**: Google Gemini 2.5 Flash with optimized prompting strategies
-- **Enterprise-Grade Features**: Comprehensive error handling, caching, and health monitoring
-
-## Architecture
-
-### Technology Stack
-
-| Layer | Technology | Purpose |
-| ------------------- | ----------------------------------- | ------------------------------------------ |
-| **AI Layer** | Google Gemini 2.5 Flash + LangGraph | ReAct reasoning and sentiment analysis |
-| **Agent Framework** | LangChain Tools + StateGraph | Tool orchestration and state management |
-| **Backend** | FastAPI + Uvicorn | High-performance async REST API |
-| **Frontend** | Streamlit | Interactive dashboard and visualizations |
-| **Database** | SQLite + Peewee ORM | Persistent storage and intelligent caching |
-| **Data Sources** | NewsAPI + Yahoo Finance | Real-time market data integration |
-
-### ReAct Agent Workflow
-
-```mermaid
-graph TD
- A[Stock Ticker Input] --> B[Initialize ReAct Agent]
- B --> C[Reasoning Phase]
- C --> D{Analysis Complete?}
- D -->|No| E[Select Appropriate Tool]
- E --> F[Execute Action]
- F --> G[Observe Results]
- G --> C
- D -->|Yes| H[Generate Summary]
- H --> I[Save Results]
- I --> J[Return Analysis]
-```
-
-### Core Components
-
-```
-StockSense-Agent/
-├── app.py # Streamlit frontend application
-├── docker-compose.yml # Multi-container orchestration
-├── Dockerfile.backend # Backend containerization
-├── Dockerfile.frontend # Frontend containerization
-├── nasdaq_screener.csv # Stock market data file
-├── pytest.ini # Pytest configuration
-├── stocksense.db # SQLite database file
-├── stocksense/
-│ ├── react_agent.py # ReAct agent implementation (LangGraph)
-│ ├── main.py # FastAPI backend server
-│ ├── data_collectors.py # News & market data fetching
-│ ├── analyzer.py # AI sentiment analysis engine
-│ ├── database.py # Database operations & ORM
-│ └── config.py # Configuration management
-├── tests/
-│ ├── README.md # Test documentation
-│ ├── test_api.py # API integration test suite
-│ └── test_tools.py # Unit tests for agent tools
-└── requirements.txt # Dependencies
-```
+A modern full-stack application that combines AI analysis with real-time market data to provide comprehensive stock insights.
## Features
-### Autonomous AI Agent
-
-- **Self-Guided Decision Making**: Agent independently determines optimal analysis strategy
-- **Dynamic Tool Selection**: Context-aware selection of appropriate data collection tools
-- **Iterative Reasoning**: Multi-step analysis with observation and adaptation
-- **Error Recovery**: Graceful handling of API failures and data quality issues
-
-### Comprehensive Market Analysis
-
-- **Multi-Source Intelligence**: Combines news sentiment with historical price movements
-- **AI-Powered Insights**: Advanced sentiment classification using Google Gemini
-- **Visual Analytics**: Interactive charts for price trends and sentiment distribution
-- **Risk Assessment**: Identification of market opportunities and potential risks
-
-### Production-Ready Infrastructure
-
-- **Scalable API Design**: RESTful endpoints with comprehensive error handling
-- **Intelligent Caching**: Optimized result storage to minimize API usage
-- **Health Monitoring**: Real-time system status and dependency verification
-- **Multi-Access Patterns**: Web UI, REST API, and Docker deployment
-- **Containerized Deployment**: Full Docker support with multi-service orchestration
-
-## Quick Start
-
-### Prerequisites
-
-- Python 3.10+
-- [Google Gemini API Key](https://aistudio.google.com/app/apikey)
-- [NewsAPI Key](https://newsapi.org/register)
-
-### Installation
-
-````bash
-```bash
-# Clone repository
-git clone https://github.com/Spkap/StockSense-Agent.git
-cd StockSense-Agent
-
-# Setup environment
-python -m venv venv
-source venv/bin/activate # Windows: venv\Scripts\activate
-
-# Install dependencies
-pip install -r requirements.txt
-
-# Configure environment variables
-echo "GOOGLE_API_KEY=your_api_key_here" > .env
-echo "NEWSAPI_KEY=your_api_key_here" >> .env
-
-# Initialize database
-python -c "from stocksense.database import init_db; init_db()"
-````
-
-### Usage Options
-
-#### Full-Stack Application
-
-```bash
-# Start backend server
-python -m stocksense.main # http://localhost:8000
-
-# Launch frontend (new terminal)
-streamlit run app.py # http://localhost:8501
-```
-
-#### Docker Deployment
-
-```bash
-# Quick start with Docker Compose
-docker-compose up -d
-
-# Access services
-# Frontend: http://localhost:8501
-# Backend API: http://localhost:8000
-
-# View logs
-docker-compose logs -f
-
-# Stop services
-docker-compose down
-```
-
-#### REST API
-
-```bash
-# Trigger ReAct agent analysis
-curl -X POST "http://localhost:8000/analyze/AAPL"
-
-# Retrieve cached results
-curl "http://localhost:8000/results/AAPL"
-
-# System health check
-curl "http://localhost:8000/health"
-
-# Get all cached tickers
-curl "http://localhost:8000/cached-tickers"
-```
-
-#### Command Line Interface
-
-```bash
-# Use the run task to start the ReAct agent directly
-python -m stocksense.main
-```
-
-### Example Analysis Output
-
-```json
-{
- "ticker": "AAPL",
- "summary": "Apple Inc. demonstrates strong market sentiment with positive outlook...",
- "sentiment_report": "Overall Sentiment: POSITIVE (78% bullish)",
- "headlines_count": 18,
- "reasoning_steps": [
- "Analyzing request for AAPL stock",
- "Fetching recent news headlines (7 days)",
- "Collecting historical price data (30 days)",
- "Performing AI sentiment analysis",
- "Generating comprehensive summary"
- ],
- "tools_used": [
- "fetch_news_headlines",
- "fetch_price_data",
- "analyze_sentiment"
- ],
- "iterations": 3,
- "agent_type": "ReAct"
-}
-```
-
-## API Reference
-
-### Core Endpoints
+- **AI Stock Analysis** - Uses Google Gemini for intelligent stock analysis
+- **Real-time Data** - Fetches live stock prices and market news
+- **User Dashboard** - Clean React interface for managing watchlists
+- **Firebase Auth** - Secure user authentication and data storage
+- **Django REST API** - Robust backend with comprehensive endpoints
+- **CI/CD Pipeline** - Automated testing and deployment with GitHub Actions
-- **POST `/analyze/{ticker}`** - Trigger autonomous ReAct agent analysis
-- **GET `/results/{ticker}`** - Retrieve cached analysis results
-- **GET `/health`** - System health and dependency status
-- **GET `/cached-tickers`** - List all available cached analyses
-- **GET `/`** - API welcome message and endpoint directory
-- **GET `/docs`** - Interactive Swagger API documentation
-- **GET `/redoc`** - Alternative ReDoc API documentation
+## Architecture
-### Python Integration
+StockSense follows a **microservices architecture** with three main services working together to deliver comprehensive stock analysis capabilities.
-```python
-from stocksense.react_agent import run_react_analysis
+### System Overview
-# Direct agent usage
-result = run_react_analysis("AAPL")
-print(f"Analysis: {result['summary']}")
-print(f"Tools used: {result['tools_used']}")
```
-
-## Testing & Quality Assurance
-
-```bash
-# Run comprehensive test suite
-python -m pytest tests/ -v
-
-# Run specific test modules
-python -m pytest tests/test_api.py -v # API integration tests
-python -m pytest tests/test_tools.py -v # Agent tool unit tests
-
-# Run tests with coverage
-python -m pytest tests/ --cov=stocksense --cov-report=html
-
-# Individual component testing
-python -m stocksense.react_agent # Test ReAct agent
-python -m stocksense.analyzer # Test AI analysis
+┌─────────────────┐ ┌─────────────────┐ ┌─────────────────┐
+│ Frontend │ │ Backend API │ │ AI Service │
+│ (React) │◄──►│ (Django) │◄──►│ (FastAPI) │
+│ │ │ │ │ │
+│ • React 18 │ │ • Django REST │ │ • Google Gemini │
+│ • Vite │ │ • PostgreSQL │ │ • LangGraph │
+│ • TailwindCSS │ │ • Firebase Auth │ │ • NewsAPI │
+│ • Axios │ │ • CORS │ │ • Yahoo Finance │
+└─────────────────┘ └─────────────────┘ └─────────────────┘
+ │ │ │
+ │ │ │
+┌─────────────────┐ ┌─────────────────┐ ┌─────────────────┐
+│ Render │ │ AWS EC2 │ │ AWS EC2 │
+│ (Frontend) │ │ (Backend) │ │ (AI Agent) │
+└─────────────────┘ └─────────────────┘ └─────────────────┘
```
-### Test Coverage
-
-- ✅ ReAct agent workflows and reasoning cycles
-- ✅ All REST API endpoints with error scenarios
-- ✅ Database operations and caching mechanisms
-- ✅ External API integrations (NewsAPI, Yahoo Finance)
-- ✅ Error handling and graceful degradation
-- ✅ Agent tool functionality and data validation
+### Architecture Components
+
+**Frontend Layer (React + Vite)**
+- **User Interface**: Modern React 18 application with responsive design
+- **State Management**: Context API for global state and user authentication
+- **Routing**: React Router for single-page application navigation
+- **Styling**: TailwindCSS for utility-first styling and dark/light themes
+- **API Communication**: Axios for HTTP requests to backend services
+- **Authentication**: Firebase SDK for user management and session handling
+
+**Backend API Layer (Django REST Framework)**
+- **RESTful API**: Comprehensive endpoints for user data, watchlists, and stock management
+- **Database**: PostgreSQL for reliable data persistence and ACID compliance
+- **Authentication**: Firebase Admin SDK for token verification and user validation
+- **CORS Configuration**: Cross-origin resource sharing for frontend communication
+- **Data Models**: User profiles, watchlists, stock symbols, and analysis history
+- **Business Logic**: Portfolio management, user preferences, and data aggregation
+
+**AI Service Layer (FastAPI + LangGraph)**
+- **ReAct Agent**: Sophisticated reasoning and acting pattern for stock analysis
+- **LLM Integration**: Google Gemini 2.5 Flash for natural language processing
+- **Data Sources**:
+ - NewsAPI for real-time financial news
+ - Yahoo Finance for historical stock data
+ - Sentiment analysis for market sentiment
+- **Workflow Engine**: LangGraph for managing complex analysis workflows
+- **Tool Orchestration**: Coordinated execution of data collection and analysis tools
+
+
+### Deployment Architecture
+
+**Multi-Environment Deployment**
+- **Frontend**: Deployed on Render with automatic GitHub integration
+- **Backend**: Containerized Django app on AWS EC2 with nginx reverse proxy
+- **AI Service**: Containerized FastAPI app on separate AWS EC2 instance
+- **Database**: AWS RDS PostgreSQL for production data storage
+- **Storage**: AWS ECR for Docker image management
+
+**Infrastructure Features**
+- **Load Balancing**: nginx for HTTPS termination and request routing
+- **Containerization**: Docker for consistent deployment environments
+- **CI/CD Integration**: GitHub Actions for automated testing and deployment
+- **Security**: Environment variables and secrets management via GitHub Actions
+- **Monitoring**: Health checks and logging across all services
+
+
+
+## CI/CD Pipeline
+
+This project uses **GitHub Actions** for automated CI/CD with comprehensive testing and deployment:
+
+### Backend CI/CD Pipeline
+- **Automated Testing**: Runs on every push/PR to `main` branch affecting `backend/**`
+- **Environment Setup**: Configures Python 3.11 and PostgreSQL test database
+- **Dependency Management**: Installs requirements and runs database migrations
+- **Test Execution**: Runs Django test suite with coverage reporting
+- **Docker Containerization**: Builds Docker images and pushes to Amazon ECR
+- **EC2 Deployment**: Automatically deploys containerized backend to AWS EC2 instances
+- **Health Checks**: Validates deployment with automated health endpoint testing
+- **HTTPS Security**: nginx reverse proxy terminates TLS so the API is served over HTTPS
+- **Environment Configuration**: Manages Firebase and database secrets through GitHub Actions
+
+### Frontend CI/CD Pipeline
+- **Automated Deployment**: Deploys on every push to `main` branch affecting `frontend/**`
+- **Build Process**: Uses Vite for optimized production builds with asset bundling
+- **Deploy Hook Automation**: GitHub Actions workflow automatically triggers Render deployment using secure deploy hooks
+- **Environment Configuration**: Manages Firebase and API keys through Render environment variables
+
+### AI-Service CI/CD Pipeline
+- **Automated Deployment**: Triggers on changes under the `stocksense/**` directory
+- **FastAPI Integration**: Deploys Python-based AI microservice
+- **EC2 Deployment**: Automatically deploys to AWS EC2 instance for AI processing workloads
+
+### Infrastructure & Security
+- **nginx Reverse Proxy**: Terminates TLS and routes HTTPS requests to the backend services
+- **Docker Images**: All services run as containerized applications for consistency
+- **AWS Integration**: Uses ECR for image storage and EC2 for compute resources
+- **Secrets Management**: Secure handling of API keys and credentials via GitHub Secrets
+
+## AI Agent Architecture
+
+The StockSense AI Agent is built using the **ReAct (Reasoning + Acting) pattern** with **LangGraph** for sophisticated stock analysis workflows.
-## Configuration
-
-### Environment Variables
-
-```bash
-# Required
-GOOGLE_API_KEY=your_google_gemini_api_key
-NEWSAPI_KEY=your_newsapi_key
-
-# Optional
-STOCKSENSE_DB_PATH=./stocksense.db
-STOCKSENSE_LOG_LEVEL=INFO
-API_BASE_URL=http://127.0.0.1:8000 # For frontend connection
-```
-
-### Docker Configuration
-
-```bash
-# Create environment file for Docker
-cp .env.example .env
-# Edit .env with your API keys
+### Core Components
-# Docker environment variables
-GOOGLE_API_KEY=your_api_key_here
-NEWSAPI_KEY=your_api_key_here
-```
+**ReAct Agent Engine**
+- **LangGraph State Management**: Maintains conversation state and tool execution history
+- **Google Gemini 2.5 Flash**: Primary LLM for reasoning and analysis
+- **Tool Orchestration**: Systematically executes analysis tools in logical sequence
+- **Iterative Reasoning**: Up to 8 reasoning cycles for comprehensive analysis
-**Note**: The docker-compose.yml includes an optional nginx service, but the nginx.conf file is not included in the repository. You can either remove the nginx service from docker-compose.yml or create your own nginx.conf file for reverse proxy setup.
+**Data Collection Tools**
+- **News Headlines Fetcher**: Retrieves recent news articles using NewsAPI
+- **Price Data Collector**: Fetches historical stock prices via Yahoo Finance
+- **Sentiment Analyzer**: AI-powered sentiment analysis of news headlines
-### Agent Configuration
+**Analysis Workflow**
+1. **News Collection**: Fetches recent headlines (7-day lookback)
+2. **Price Data Retrieval**: Gathers historical price movements
+3. **Sentiment Analysis**: Analyzes news sentiment with detailed justifications
+4. **Comprehensive Summary**: Generates final investment recommendation
-- **Max Iterations**: 8 reasoning cycles per analysis
-- **Temperature**: 0.1 for reasoning, 0.3 for analysis
-- **Data Collection**: 7 days of news, 30 days of price history
-## Technical Highlights
+### Analysis Output
-### Advanced AI Implementation
+Each AI analysis provides:
+- **Investment Recommendation**: Clear BUY/SELL/HOLD decision
+- **Sentiment Report**: Detailed news sentiment breakdown
+- **Price Trend Analysis**: Historical price movement insights
+- **Reasoning Steps**: Transparent decision-making process
+- **Tool Usage Tracking**: Complete audit trail of analysis steps
+- **Confidence Metrics**: Analysis quality indicators
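The recommendation arrives embedded in free-form summary text, so a frontend consuming this analysis output may need to pull out the BUY/SELL/HOLD decision for display. A hedged sketch (it assumes the recommendation keyword appears verbatim in the summary, as the bullets above describe):

```javascript
// Extract the BUY/SELL/HOLD keyword from a free-text analysis summary.
const extractRecommendation = (summary) => {
  const match = /\b(BUY|SELL|HOLD)\b/i.exec(summary || '');
  return match ? match[1].toUpperCase() : 'UNKNOWN';
};

console.log(extractRecommendation('Mixed sentiment suggests a clear HOLD for now')); // → "HOLD"
console.log(extractRecommendation(undefined)); // → "UNKNOWN"
```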
-- **ReAct Pattern**: True autonomous reasoning with observation loops
-- **State Management**: Comprehensive agent state tracking across iterations
-- **Tool Orchestration**: Dynamic binding and execution of analysis tools
-- **Context Optimization**: Temperature-controlled prompting for different tasks
-### Production Engineering
+## Usage
-- **Async Architecture**: High-performance FastAPI with proper error handling
-- **Caching Strategy**: Intelligent result storage with database integration
-- **Health Monitoring**: Real-time system status and dependency checking
-- **Scalable Design**: Modular architecture supporting future enhancements
-- **Container Orchestration**: Docker Compose with health checks and service dependencies
+1. **Authentication** - Sign in with Firebase authentication
+2. **Stock Management** - Add stocks to your personalized watchlist
+3. **AI Analysis** - Get comprehensive AI-powered stock analysis and insights
+4. **Real-time Updates** - Monitor live market data and news sentiment
-## Future Enhancements
diff --git a/requirements.txt b/requirements.txt
deleted file mode 100644
index f84ca57..0000000
--- a/requirements.txt
+++ /dev/null
@@ -1,115 +0,0 @@
-altair==5.5.0
-annotated-types==0.7.0
-anyio==4.9.0
-async-timeout==4.0.3
-attrs==25.3.0
-beautifulsoup4==4.13.4
-blinker==1.9.0
-cachetools==5.5.2
-certifi==2025.6.15
-cffi==1.17.1
-charset-normalizer==3.4.2
-click==8.2.1
-curl_cffi==0.11.4
-exceptiongroup==1.3.0
-fastapi==0.115.13
-filelock==3.18.0
-filetype==1.2.0
-frozendict==2.4.6
-fsspec==2025.5.1
-gitdb==4.0.12
-GitPython==3.1.44
-google-ai-generativelanguage==0.6.15
-google-api-core==2.25.1
-google-api-python-client==2.172.0
-google-auth==2.40.3
-google-auth-httplib2==0.2.0
-google-generativeai==0.8.5
-googleapis-common-protos==1.70.0
-grpcio==1.73.0
-grpcio-status==1.71.0
-h11==0.16.0
-hf-xet==1.1.4
-httpcore==1.0.9
-httplib2==0.22.0
-httptools==0.6.4
-httpx==0.28.1
-huggingface-hub==0.33.0
-idna==3.10
-Jinja2==3.1.6
-jsonpatch==1.33
-jsonpointer==3.0.0
-jsonschema==4.24.0
-jsonschema-specifications==2025.4.1
-langchain==0.3.25
-langchain-core==0.3.65
-langchain-google-genai==2.0.10
-langchain-text-splitters==0.3.8
-langgraph==0.4.8
-langgraph-checkpoint==2.1.0
-langgraph-prebuilt==0.2.2
-langgraph-sdk==0.1.70
-langsmith==0.3.45
-MarkupSafe==3.0.2
-mpmath==1.3.0
-multitasking==0.0.11
-narwhals==1.43.1
-networkx==3.4.2
-newsapi-python==0.2.7
-numpy==2.2.6
-orjson==3.10.18
-ormsgpack==1.10.0
-packaging==24.2
-pandas==2.3.0
-peewee==3.18.1
-pillow==11.2.1
-platformdirs==4.3.8
-proto-plus==1.26.1
-protobuf==5.29.5
-pyarrow==20.0.0
-pyasn1==0.6.1
-pyasn1_modules==0.4.2
-pycparser==2.22
-pydantic==2.11.7
-pydantic_core==2.33.2
-pydeck==0.9.1
-pyparsing==3.2.3
-pytest==8.3.4
-python-dateutil==2.9.0.post0
-python-dotenv==1.1.0
-pytz==2025.2
-PyYAML==6.0.2
-referencing==0.36.2
-regex==2024.11.6
-requests==2.32.4
-requests-toolbelt==1.0.0
-rpds-py==0.25.1
-rsa==4.9.1
-safetensors==0.5.3
-six==1.17.0
-smmap==5.0.2
-sniffio==1.3.1
-soupsieve==2.7
-SQLAlchemy==2.0.41
-starlette==0.46.2
-streamlit==1.46.0
-sympy==1.14.0
-tenacity==9.1.2
-tokenizers==0.21.1
-toml==0.10.2
-torch==2.7.1
-tornado==6.5.1
-tqdm==4.67.1
-transformers==4.52.4
-typing-inspection==0.4.1
-typing_extensions==4.14.0
-tzdata==2025.2
-uritemplate==4.2.0
-urllib3==2.5.0
-uvicorn==0.34.3
-uvloop==0.21.0
-watchfiles==1.1.0
-websockets==15.0.1
-xxhash==3.5.0
-yfinance==0.2.63
-zstandard==0.23.0
\ No newline at end of file
diff --git a/stocksense/.dockerignore b/stocksense/.dockerignore
new file mode 100644
index 0000000..7d1e921
--- /dev/null
+++ b/stocksense/.dockerignore
@@ -0,0 +1,11 @@
+# .dockerignore - for development builds
+Dockerfile
+.git
+.gitignore
+README.md
+__pycache__
+*.pyc
+.pytest_cache
+node_modules
+.vscode
+.idea
\ No newline at end of file
diff --git a/stocksense/Dockerfile b/stocksense/Dockerfile
new file mode 100644
index 0000000..28744fa
--- /dev/null
+++ b/stocksense/Dockerfile
@@ -0,0 +1,20 @@
+FROM python:3.9-slim
+
+# Set the working directory inside the container
+WORKDIR /app
+
+# Copy the requirements file to the working directory
+COPY requirements.txt .
+
+# Install the Python dependencies without keeping pip's cache in the image
+RUN pip install --no-cache-dir --upgrade pip && \
+    pip install --no-cache-dir -r requirements.txt
+
+# Copy the application code to the working directory
+COPY . .
+
+# Expose the port on which the application will run
+EXPOSE 8080
+
+# Run the FastAPI application using uvicorn server
+CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8080"]
\ No newline at end of file
diff --git a/stocksense/ai/__init__.py b/stocksense/ai/__init__.py
new file mode 100644
index 0000000..8e2a790
--- /dev/null
+++ b/stocksense/ai/__init__.py
@@ -0,0 +1 @@
+# AI/ML components package
diff --git a/stocksense/ai/analyzer.py b/stocksense/ai/analyzer.py
new file mode 100644
index 0000000..b6f1286
--- /dev/null
+++ b/stocksense/ai/analyzer.py
@@ -0,0 +1,63 @@
+import os
+import sys
+from typing import List, Dict
+
+# Add the stocksense directory to the Python path so `core` and `data` resolve
+current_dir = os.path.dirname(os.path.abspath(__file__))  # .../stocksense/ai
+stocksense_dir = os.path.dirname(current_dir)             # .../stocksense
+sys.path.append(stocksense_dir)
+
+from core.config import get_chat_llm, ConfigurationError
+from data.collectors.data_collectors import get_news
+
+
+def analyze_sentiment_of_headlines(news: List[Dict]) -> str:
+ """Analyze sentiment of news headlines using Gemini LLM."""
+ try:
+ if not news:
+ return "No headlines provided for analysis."
+
+ llm = get_chat_llm(
+ model="gemini-2.5-flash",
+ temperature=0.3,
+ max_output_tokens=2048
+ )
+
+ headlines = "\n".join([f"{i+1}. {item['headline']}" for i, item in enumerate(news) if 'headline' in item])
+
+
+ prompt = f"""
+You are a financial sentiment analysis expert. Please analyze the sentiment of the following news headlines and provide insights for stock market research.
+
+Headlines to analyze:
+{headlines}
+
+For each headline, please:
+1. Classify the sentiment as 'Positive', 'Neutral', or 'Negative'
+2. Provide a brief justification (1-2 sentences)
+
+Then provide:
+- Overall market sentiment summary
+- Key themes or concerns identified
+- Potential impact on stock price (bullish/bearish/neutral)
+
+Format your response clearly with numbered items corresponding to the headlines, followed by your overall analysis.
+"""
+
+        response = llm.invoke(prompt)
+        return response.content  # unwrap the AIMessage to match the declared str return type
+
+    except ConfigurationError as e:
+        # `raise f"..."` raises a TypeError (strings are not exceptions); wrap instead
+        raise RuntimeError(f"Configuration error: {e}") from e
+
+    except Exception as e:
+        raise RuntimeError(f"Error during sentiment analysis: {e}") from e
+
+
+if __name__ == "__main__":
+ sample_ticker = "AAPL"
+ news_data = get_news(sample_ticker, days=7)
+ print("Fetched News Headlines:\n", news_data)
+ sentiment_report = analyze_sentiment_of_headlines(news_data)
+ print(f"Sentiment Analysis Report for {sample_ticker}:\n", sentiment_report)
\ No newline at end of file
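A subtlety worth calling out in the new `analyzer.py`: the headline-numbering join enumerates the full news list but filters out items without a `'headline'` key inside the comprehension, so skipped items leave gaps in the numbering. A minimal standalone sketch (the `number_headlines` helper is illustrative, not part of the module):

```python
# Sketch of the numbering logic analyzer.py applies to its news input.
# Items missing a 'headline' key are skipped, but enumerate() still
# advances, so the numbers keep the original list positions.
from typing import Dict, List

def number_headlines(news: List[Dict[str, str]]) -> str:
    return "\n".join(
        f"{i+1}. {item['headline']}"
        for i, item in enumerate(news)
        if 'headline' in item
    )

news = [
    {"headline": "Apple beats Q4 expectations", "url": "https://example.com/1"},
    {"url": "https://example.com/2"},  # no 'headline' -> skipped
    {"headline": "iPhone sales disappoint", "url": "https://example.com/3"},
]
print(number_headlines(news))
# 1. Apple beats Q4 expectations
# 3. iPhone sales disappoint
```

If contiguous numbering matters for the LLM prompt, filtering the list before enumerating would avoid the gap.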
diff --git a/stocksense/react_agent.py b/stocksense/ai/react_agent.py
similarity index 52%
rename from stocksense/react_agent.py
rename to stocksense/ai/react_agent.py
index 93c64b6..d1993da 100644
--- a/stocksense/react_agent.py
+++ b/stocksense/ai/react_agent.py
@@ -1,3 +1,6 @@
+import os
+import sys
+
from typing import Dict, List, Optional, TypedDict, Literal, Any
from datetime import datetime
@@ -5,11 +8,15 @@
from langchain_core.messages import BaseMessage, HumanMessage, ToolMessage
from langchain_core.tools import tool
-from .config import get_chat_llm
-from .data_collectors import get_news, get_price_history
-from .analyzer import analyze_sentiment_of_headlines
-from .database import save_analysis
+# Add the parent directories to Python path to find modules
+current_dir = os.path.dirname(os.path.abspath(__file__))
+ai_dir = os.path.dirname(current_dir)
+stocksense_dir = os.path.dirname(ai_dir)
+sys.path.append(stocksense_dir)
+from core.config import get_chat_llm
+from data.collectors.data_collectors import get_news, get_price_history
+from ai.analyzer import analyze_sentiment_of_headlines
class AgentState(TypedDict):
"""
@@ -17,8 +24,8 @@ class AgentState(TypedDict):
"""
messages: List[BaseMessage]
ticker: str
- headlines: List[str]
- price_data: List[Dict[str, Any]] # Updated to hold structured OHLCV data
+ headlines: List[Dict[str, str]]
+ price_data: List[Dict[str, Any]]
sentiment_report: str
summary: str
reasoning_steps: List[str]
@@ -32,24 +39,26 @@ class AgentState(TypedDict):
@tool
def fetch_news_headlines(ticker: str, days: int = 7) -> Dict:
"""
- Fetch recent news headlines for a stock ticker.
-
+ Fetch recent news headlines for a stock ticker.
+ This is STEP 1 of the mandatory analysis workflow.
+
Args:
- ticker: Stock ticker symbol
- days: Number of days to look back for news
-
+ ticker: Stock ticker symbol (e.g., AAPL, MSFT)
+ days: Number of days to look back for news (default: 7)
+
Returns:
- Dictionary with headlines and metadata
+ Dict with news data including headlines list and metadata
"""
try:
- headlines = get_news(ticker, days=days)
+ news_data = get_news(ticker, days=days)
+
return {
"success": True,
- "headlines": headlines,
- "count": len(headlines),
+ "news": news_data,
"ticker": ticker,
"days": days
}
+
except Exception as e:
return {
"success": False,
@@ -63,13 +72,14 @@ def fetch_news_headlines(ticker: str, days: int = 7) -> Dict:
def fetch_price_data(ticker: str, period: str = "1mo") -> Dict:
"""
Fetch price history for a stock ticker and return structured OHLCV data.
-
+ This is STEP 2 of the mandatory analysis workflow.
+
Args:
- ticker: Stock ticker symbol
- period: Time period (1d, 5d, 1mo, 3mo, 6mo, 1y, 2y, 5y, 10y, ytd, max)
-
+ ticker: Stock ticker symbol (e.g., AAPL, MSFT)
+ period: Time period for price data (default: "1mo")
+
Returns:
- Dictionary with structured price data and metadata
+ Dict with price data including OHLCV values and metadata
"""
try:
df = get_price_history(ticker, period=period)
@@ -81,11 +91,9 @@ def fetch_price_data(ticker: str, period: str = "1mo") -> Dict:
"price_data": []
}
- # Convert DataFrame to list of dictionaries with structured OHLCV data
df_reset = df.reset_index()
df_reset['Date'] = df_reset['Date'].dt.strftime('%Y-%m-%d')
- # Convert to records and ensure proper data types
price_data = []
for record in df_reset.to_dict(orient='records'):
price_record = {
@@ -115,78 +123,59 @@ def fetch_price_data(ticker: str, period: str = "1mo") -> Dict:
@tool
-def analyze_sentiment(headlines: List[str]) -> Dict:
+def analyze_sentiment(ticker: str) -> Dict:
"""
- Analyze sentiment of news headlines.
-
+ Analyze sentiment of recent news headlines for a stock ticker.
+ This is STEP 3 of the mandatory analysis workflow - must be called AFTER steps 1 and 2.
+
Args:
- headlines: List of news headlines
-
+ ticker: Stock ticker symbol (e.g., AAPL, MSFT)
+
Returns:
- Dictionary with sentiment analysis results
+ Dict with sentiment analysis report and success status
"""
try:
+ # Fetch fresh headlines for sentiment analysis
+ news_data = get_news(ticker, days=7)
+ headlines = news_data if news_data else []
+
if not headlines:
return {
"success": False,
- "error": "No headlines provided for analysis",
- "sentiment_report": ""
+ "error": f"No headlines found for {ticker} to analyze sentiment",
+ "sentiment_report": "",
+ "headlines_analyzed": 0
}
sentiment_report = analyze_sentiment_of_headlines(headlines)
+
return {
"success": True,
"sentiment_report": sentiment_report,
- "headlines_count": len(headlines)
- }
- except Exception as e:
- return {
- "success": False,
- "error": str(e),
- "sentiment_report": ""
- }
-
-
-@tool
-def save_analysis_results(ticker: str, summary: str, sentiment_report: str) -> Dict:
- """
- Save analysis results to database.
-
- Args:
- ticker: Stock ticker symbol
- summary: Analysis summary
- sentiment_report: Detailed sentiment report
-
- Returns:
- Dictionary with save operation results
- """
- try:
- save_analysis(ticker, summary, sentiment_report)
- return {
- "success": True,
- "message": f"Analysis saved for {ticker}",
+ "headlines_analyzed": len(headlines),
"ticker": ticker
}
+
except Exception as e:
return {
"success": False,
"error": str(e),
- "message": f"Failed to save analysis for {ticker}"
+ "sentiment_report": "",
+ "headlines_analyzed": 0
}
tools = [
fetch_news_headlines,
fetch_price_data,
- analyze_sentiment,
- save_analysis_results
+ analyze_sentiment
]
def create_react_agent() -> StateGraph:
llm = get_chat_llm(
- model="gemini-1.5-flash",
+ model="gemini-2.5-flash",
temperature=0.1,
max_output_tokens=1024
)
@@ -203,49 +192,58 @@ def agent_node(state: AgentState) -> AgentState:
max_iterations = state.get("max_iterations", 5)
if iterations >= max_iterations:
+ # Generate fallback summary with available data if max iterations reached
+ tools_used = state.get("tools_used", [])
+ required_tools = ["fetch_news_headlines", "fetch_price_data", "analyze_sentiment"]
+ missing_tools = [tool for tool in required_tools if tool not in tools_used]
+
+ fallback_summary = f"""
+Stock Analysis Summary for {ticker} (Partial - Max Iterations Reached):
+
+Analysis Status: Incomplete due to iteration limit
+Tools Used: {', '.join(tools_used) if tools_used else 'None'}
+Missing Tools: {', '.join(missing_tools) if missing_tools else 'None'}
+
+Available Data Summary:
+- News Headlines: {'✓' if 'fetch_news_headlines' in tools_used else '✗'}
+- Price Data: {'✓' if 'fetch_price_data' in tools_used else '✗'}
+- Sentiment Analysis: {'✓' if 'analyze_sentiment' in tools_used else '✗'}
+
+Final Recommendation: UNSPECIFIED (Incomplete Analysis)
+
+Note: Analysis was incomplete due to reaching maximum iteration limit ({max_iterations}).
+Please retry for complete analysis.
+ """.strip()
+
return {
**state,
"final_decision": "MAX_ITERATIONS_REACHED",
- "error": f"Reached maximum iterations ({max_iterations})"
+ "summary": fallback_summary,
+ "error": f"Reached maximum iterations ({max_iterations}) - partial analysis provided"
}
reasoning_prompt = f"""
-You are a ReAct (Reasoning + Action) agent for stock analysis. You must analyze {ticker} by reasoning about what to do next and then taking action.
-
-Current situation:
-- Ticker: {ticker}
-- Iteration: {iterations + 1}/{max_iterations}
-- Headlines collected: {len(state.get('headlines', []))}
-- Price data available: {len(state.get('price_data', [])) > 0}
-- Sentiment analyzed: {bool(state.get('sentiment_report'))}
-- Tools used so far: {state.get('tools_used', [])}
-
-Available tools:
-1. fetch_news_headlines - Get recent news headlines
-2. fetch_price_data - Get price history data
-3. analyze_sentiment - Analyze sentiment of headlines
-4. save_analysis_results - Save final analysis
-
-REASONING: Think step by step about what you should do next:
-1. What information do I have?
-2. What information do I still need?
-3. What tool should I use next?
-4. Am I ready to provide a final analysis?
-
-COMPLETION CRITERIA: You are ready to provide final analysis when you have:
-- Headlines collected (✓ if {len(state.get('headlines', []))} > 0)
-- Price data gathered (✓ if {len(state.get('price_data', [])) > 0})
-- Sentiment analysis completed (✓ if {bool(state.get('sentiment_report'))})
-
-ACTION: Based on your reasoning:
-- If ANY of the above criteria are missing, use the appropriate tool to gather that information
-- If ALL criteria are met, provide a comprehensive final summary WITHOUT calling any more tools
-- DO NOT call analyze_sentiment more than once unless the first attempt failed
-
-IMPORTANT: Once you have collected headlines, performed sentiment analysis, and gathered price data, provide a comprehensive final summary that includes:
-- Overall market sentiment conclusion
-- Key insights from the news analysis
-- Market implications and investment considerations.
+You are a ReAct (Reasoning + Action) agent for stock analysis. You must analyze {ticker} following this EXACT sequence:
+
+MANDATORY WORKFLOW:
+1. FIRST: Call fetch_news_headlines("{ticker}") to get recent news
+2. SECOND: Call fetch_price_data("{ticker}") to get price history
+3. THIRD: Call analyze_sentiment("{ticker}") to analyze news sentiment
+4. FOURTH: Provide final analysis combining ALL data
+
+IMPORTANT RULES:
+- You MUST use ALL THREE tools before providing final analysis
+- Each tool provides crucial data for comprehensive analysis
+- Do NOT skip any tools - all are required
+- After using all tools, provide your final analysis with:
+ * Market sentiment from news analysis
+ * Price trend insights from historical data
+ * Key insights combining news + price data
+ * Final Investment Recommendation: either KEEP (hold) or SELL
+
+Be explicit: Always end with 'Final Recommendation: KEEP' or 'Final Recommendation: SELL'.
+
+Start by calling the first tool: fetch_news_headlines("{ticker}")
"""
messages.append(HumanMessage(content=reasoning_prompt))
@@ -261,35 +259,47 @@ def agent_node(state: AgentState) -> AgentState:
if response.tool_calls:
new_state["final_decision"] = "CONTINUE"
else:
- new_state["final_decision"] = "COMPLETE"
-
- agent_response = response.content
-
- headlines = state.get('headlines', [])
- sentiment_report = state.get('sentiment_report', '')
- price_data = state.get('price_data', [])
+ # Check if all required tools have been used before final analysis
+ tools_used = state.get("tools_used", [])
+ required_tools = ["fetch_news_headlines", "fetch_price_data", "analyze_sentiment"]
+ missing_tools = [tool for tool in required_tools if tool not in tools_used]
+
+ if missing_tools:
+ # If tools are missing, force continuation with specific instruction
+ missing_tools_str = ", ".join(missing_tools)
+ continuation_prompt = f"""
+You have not completed all required analysis steps. You are missing: {missing_tools_str}
+
+IMPORTANT: You MUST use all three tools before providing final analysis.
+Please call the missing tools now: {missing_tools_str}
+
+Do not provide final analysis until you have used ALL required tools:
+1. fetch_news_headlines("{ticker}")
+2. fetch_price_data("{ticker}")
+3. analyze_sentiment("{ticker}")
+"""
+
+ new_state["messages"].append(HumanMessage(content=continuation_prompt))
+ new_state["final_decision"] = "CONTINUE"
+
+ else:
+ # All tools used, proceed with final analysis
+ agent_response = response.content or ""
+ decision = "KEEP" if "KEEP" in agent_response.upper() else "SELL" if "SELL" in agent_response.upper() else "UNSPECIFIED"
+ new_state["final_decision"] = decision
- if sentiment_report and headlines:
comprehensive_summary = f"""
Stock Analysis Summary for {ticker}:
-Market Sentiment: Based on analysis of {len(headlines)} recent news headlines, the overall market sentiment shows {sentiment_report[:200]}...
+{agent_response}
-Key Findings:
-- News Coverage: {len(headlines)} articles analyzed from the past 7 days
-- Price Data: {'Available' if len(price_data) > 0 else 'Not available'} ({len(price_data)} data points)
-- Agent Reasoning: {agent_response}
+Final Recommendation: {decision}
-The ReAct agent completed analysis using {len(set(state.get('tools_used', [])))} different tools across {iterations + 1} reasoning iterations.
+Analysis completed using {len(set(tools_used))} tools across {iterations + 1} reasoning iterations.
""".strip()
new_state["summary"] = comprehensive_summary
- if sentiment_report and not any('save_analysis_results' in tool for tool in state.get('tools_used', [])):
- print(f"Warning: ReAct agent didn't call save_analysis_results for {ticker}")
- else:
- new_state["summary"] = agent_response or f"Analysis completed for {ticker} but insufficient data collected"
-
return new_state
def custom_tool_node(state: AgentState) -> AgentState:
@@ -307,36 +317,16 @@ def custom_tool_node(state: AgentState) -> AgentState:
tool_name = tool_call["name"]
tool_args = tool_call["args"]
- # Prevent redundant tool calls
- if tool_name == "analyze_sentiment" and state.get("sentiment_report"):
- result = {
- "success": True,
- "sentiment_report": state.get("sentiment_report"),
- "message": "Sentiment analysis already completed"
- }
- elif tool_name == "fetch_news_headlines" and state.get("headlines"):
- result = {
- "success": True,
- "headlines": state.get("headlines"),
- "message": "Headlines already fetched"
- }
- elif tool_name == "fetch_price_data" and state.get("price_data"):
- result = {
- "success": True,
- "price_data": state.get("price_data"),
- "message": "Price data already fetched"
- }
- else:
- tool_function = None
- for tool_item in tools:
- if tool_item.name == tool_name:
- tool_function = tool_item
- break
+ tool_function = None
+ for tool_item in tools:
+ if tool_item.name == tool_name:
+ tool_function = tool_item
+ break
- if tool_function:
- result = tool_function.invoke(tool_args)
- else:
- result = {"error": f"Tool {tool_name} not found"}
+ if tool_function:
+ result = tool_function.invoke(tool_args)
+ else:
+ result = {"error": f"Tool {tool_name} not found"}
tool_message = ToolMessage(
content=str(result),
@@ -347,7 +337,7 @@ def custom_tool_node(state: AgentState) -> AgentState:
tools_used.append(tool_name)
if tool_name == "fetch_news_headlines" and result.get("success"):
- state["headlines"] = result.get("headlines", [])
+ state["headlines"] = result.get("news", [])
reasoning_steps.append(f"Fetched {len(state['headlines'])} headlines")
elif tool_name == "fetch_price_data" and result.get("success"):
@@ -359,9 +349,6 @@ def custom_tool_node(state: AgentState) -> AgentState:
state["sentiment_report"] = result.get("sentiment_report", "")
reasoning_steps.append("Completed sentiment analysis")
- elif tool_name == "save_analysis_results" and result.get("success"):
- reasoning_steps.append("Saved analysis to database")
-
return {
**state,
"messages": messages + tool_results,
@@ -370,14 +357,10 @@ def custom_tool_node(state: AgentState) -> AgentState:
}
def should_continue(state: AgentState) -> Literal["tools", "end"]:
- """
- Conditional edge to determine if agent should continue or end.
- """
final_decision = state.get("final_decision")
-
if final_decision == "CONTINUE":
return "tools"
- elif final_decision in ["COMPLETE", "MAX_ITERATIONS_REACHED"]:
+ elif final_decision in ["COMPLETE", "MAX_ITERATIONS_REACHED", "KEEP", "SELL", "UNSPECIFIED"]:
return "end"
else:
return "tools"
@@ -406,12 +389,11 @@ def should_continue(state: AgentState) -> Literal["tools", "end"]:
def run_react_analysis(ticker: str) -> Dict:
- # Initialize state
initial_state = {
"messages": [],
"ticker": ticker.upper(),
"headlines": [],
- "price_data": [], # Changed from None to empty list
+ "price_data": [],
"sentiment_report": "",
"summary": "",
"reasoning_steps": [],
@@ -423,10 +405,8 @@ def run_react_analysis(ticker: str) -> Dict:
}
try:
- # Run the ReAct agent
final_state = react_app.invoke(initial_state)
- # Extract final results
return {
"ticker": final_state["ticker"],
"summary": final_state.get("summary", "Analysis completed"),
@@ -436,32 +416,25 @@ def run_react_analysis(ticker: str) -> Dict:
"reasoning_steps": final_state.get("reasoning_steps", []),
"tools_used": final_state.get("tools_used", []),
"iterations": final_state.get("iterations", 0),
+ "final_decision": final_state.get("final_decision", "UNSPECIFIED"),
"error": final_state.get("error"),
"timestamp": datetime.now().isoformat()
}
except Exception as e:
- error_msg = f"ReAct Agent error for {ticker}: {str(e)}"
-
- # Check if it's a rate limit error and provide helpful message
- if "429" in str(e) or "quota" in str(e).lower() or "rate" in str(e).lower():
- friendly_error = f"API Rate Limit Reached: Google Gemini free tier allows 50 requests/day. Please wait for quota reset or upgrade to paid plan."
- error_msg = friendly_error
+ error_msg = str(e)
return {
"ticker": ticker.upper(),
- "summary": f"Analysis temporarily unavailable due to API limits. The data collection was successful (found real market data), but AI analysis is rate-limited.",
- "sentiment_report": f"Rate Limit Info: Google Gemini free tier quota exceeded. Try again tomorrow or upgrade for higher limits.",
+ "summary": "Analysis temporarily unavailable due to API limits.",
+ "sentiment_report": "Rate Limit Info: Gemini free tier quota exceeded.",
"headlines": [],
- "price_data": [], # Changed from None to empty list
+ "price_data": [],
"reasoning_steps": [],
"tools_used": [],
"iterations": 0,
+ "final_decision": "UNSPECIFIED",
"error": error_msg,
"timestamp": datetime.now().isoformat()
}
-
-if __name__ == '__main__':
- test_ticker = "AAPL"
- result = run_react_analysis(test_ticker)
\ No newline at end of file
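The core of the reworked `agent_node` is the gating logic: a final answer is only accepted once all three required tools appear in `tools_used`, and the KEEP/SELL recommendation is extracted from the response text. A standalone sketch of that logic under the names used in the diff (the `gate_final_answer` wrapper itself is hypothetical):

```python
# Sketch of the required-tools gate from agent_node: force CONTINUE
# until every required tool has run, then pull the recommendation
# out of the model's final text.
from typing import List, Tuple

REQUIRED_TOOLS = ["fetch_news_headlines", "fetch_price_data", "analyze_sentiment"]

def gate_final_answer(tools_used: List[str], response_text: str) -> Tuple[str, List[str]]:
    """Return (decision, missing_tools); decision is CONTINUE until all tools ran."""
    missing = [t for t in REQUIRED_TOOLS if t not in tools_used]
    if missing:
        return "CONTINUE", missing
    upper = response_text.upper()
    decision = "KEEP" if "KEEP" in upper else "SELL" if "SELL" in upper else "UNSPECIFIED"
    return decision, []

print(gate_final_answer(["fetch_news_headlines"], ""))
print(gate_final_answer(REQUIRED_TOOLS, "Final Recommendation: KEEP"))
```

Note the ordering: because the KEEP check runs first, a response mentioning both words resolves to KEEP; the prompt's "end with 'Final Recommendation: KEEP' or 'Final Recommendation: SELL'" instruction is what keeps this substring match reliable.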
diff --git a/stocksense/analyzer.py b/stocksense/analyzer.py
deleted file mode 100644
index 590f81c..0000000
--- a/stocksense/analyzer.py
+++ /dev/null
@@ -1,58 +0,0 @@
-import os
-from typing import List
-from .config import get_llm, ConfigurationError
-
-
-def analyze_sentiment_of_headlines(headlines: List[str]) -> str:
- """Analyze sentiment of news headlines using Gemini LLM."""
- try:
- if not headlines:
- return "No headlines provided for analysis."
-
- llm = get_llm(
- model="gemini-1.5-flash",
- temperature=0.3,
- max_output_tokens=2048
- )
-
- numbered_headlines = "\n".join([f"{i+1}. {headline}" for i, headline in enumerate(headlines)])
-
- prompt = f"""
-You are a financial sentiment analysis expert. Please analyze the sentiment of the following news headlines and provide insights for stock market research.
-
-Headlines to analyze:
-{numbered_headlines}
-
-For each headline, please:
-1. Classify the sentiment as 'Positive', 'Neutral', or 'Negative'
-2. Provide a brief justification (1-2 sentences)
-
-Then provide:
-- Overall market sentiment summary
-- Key themes or concerns identified
-- Potential impact on stock price (bullish/bearish/neutral)
-
-Format your response clearly with numbered items corresponding to the headlines, followed by your overall analysis.
-"""
-
- response = llm.invoke(prompt)
- return response
-
- except ConfigurationError as e:
- error_msg = f"Configuration error: {str(e)}"
- return error_msg
-
- except Exception as e:
- error_msg = f"Error during sentiment analysis: {str(e)}"
- return error_msg
-
-
-if __name__ == '__main__':
- sample_headlines = [
- "Apple Reports Record Q4 Earnings, Beats Wall Street Expectations",
- "Apple Stock Falls 3% After iPhone Sales Disappoint Analysts",
- "Apple Announces New AI Features for iPhone and iPad",
- "Regulatory Concerns Mount Over Apple's App Store Policies"
- ]
-
- result = analyze_sentiment_of_headlines(sample_headlines)
\ No newline at end of file
diff --git a/stocksense/core/__init__.py b/stocksense/core/__init__.py
new file mode 100644
index 0000000..3706f98
--- /dev/null
+++ b/stocksense/core/__init__.py
@@ -0,0 +1 @@
+# Core business logic package
diff --git a/stocksense/config.py b/stocksense/core/config.py
similarity index 94%
rename from stocksense/config.py
rename to stocksense/core/config.py
index beea842..3a43a24 100644
--- a/stocksense/config.py
+++ b/stocksense/core/config.py
@@ -5,7 +5,6 @@
load_dotenv()
-
class ConfigurationError(Exception):
pass
@@ -27,7 +26,7 @@ def get_google_api_key() -> str:
return api_key
-def get_llm(model: str = "gemini-1.5-flash",
+def get_llm(model: str = "gemini-2.5-flash",
temperature: float = 0.3,
max_output_tokens: int = 2048) -> GoogleGenerativeAI:
"""Get configured Google Generative AI LLM instance."""
@@ -41,7 +40,7 @@ def get_llm(model: str = "gemini-1.5-flash",
)
-def get_chat_llm(model: str = "gemini-1.5-flash",
+def get_chat_llm(model: str = "gemini-2.5-flash",
temperature: float = 0.1,
max_output_tokens: int = 1024) -> ChatGoogleGenerativeAI:
"""Get configured Google Generative AI Chat LLM instance."""
@@ -82,7 +81,7 @@ def validate_configuration() -> bool:
return True
-DEFAULT_MODEL = "gemini-1.5-flash"
+DEFAULT_MODEL = "gemini-2.5-flash"
DEFAULT_TEMPERATURE = 0.3
DEFAULT_MAX_TOKENS = 2048
DEFAULT_CHAT_TEMPERATURE = 0.1
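The config module's key getters follow a fail-fast pattern: read the variable, raise `ConfigurationError` if it is missing. A minimal sketch of that pattern, assuming the env-var names from the diff (the `require_env` helper and its error text are illustrative):

```python
# Sketch of the fail-fast env lookup used by get_google_api_key /
# get_newsapi_key: a missing key raises immediately rather than
# failing later inside an API call.
import os

class ConfigurationError(Exception):
    pass

def require_env(name: str) -> str:
    value = os.getenv(name)
    if not value:
        raise ConfigurationError(f"{name} is not set; add it to your .env file")
    return value

os.environ["GOOGLE_API_KEY"] = "demo-key"  # stand-in for a real key
print(require_env("GOOGLE_API_KEY"))
```

Failing at startup (via `validate_configuration` in the FastAPI lifespan) surfaces a bad `.env` before any request is served.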
diff --git a/stocksense/data/collectors/__init__.py b/stocksense/data/collectors/__init__.py
new file mode 100644
index 0000000..a4b73a1
--- /dev/null
+++ b/stocksense/data/collectors/__init__.py
@@ -0,0 +1 @@
+# Data collectors package
diff --git a/stocksense/data/collectors/data_collectors.py b/stocksense/data/collectors/data_collectors.py
new file mode 100644
index 0000000..3e76a17
--- /dev/null
+++ b/stocksense/data/collectors/data_collectors.py
@@ -0,0 +1,88 @@
+import os
+import sys
+from typing import List, Optional, Dict
+from datetime import datetime, timedelta
+import yfinance as yf
+from newsapi import NewsApiClient
+
+# Add the parent directories to Python path to find core module
+current_dir = os.path.dirname(os.path.abspath(__file__))
+collectors_dir = os.path.dirname(current_dir)
+stocksense_dir = os.path.dirname(collectors_dir)
+sys.path.append(stocksense_dir)
+
+from core.config import get_newsapi_key, ConfigurationError
+
+
+def get_news(ticker: str, days: int = 7) -> List[Dict[str, str]]:
+ """Fetch recent news headlines and URLs related to a stock ticker.
+
+ Returns:
+ List of dictionaries with 'headline' and 'url'
+ """
+ try:
+ api_key = get_newsapi_key()
+ newsapi = NewsApiClient(api_key=api_key)
+
+ to_date = datetime.now()
+ from_date = to_date - timedelta(days=days)
+
+ from_date_str = from_date.strftime('%Y-%m-%d')
+ to_date_str = to_date.strftime('%Y-%m-%d')
+
+ try:
+ results = newsapi.get_everything(
+                q=ticker,
+ language='en',
+ sort_by='publishedAt',
+ from_param=from_date_str,
+ to=to_date_str,
+                page_size=5
+            )
+
+ news_data = []
+ if results and results.get('status') == 'ok':
+ articles = results.get('articles', [])
+
+ for article in articles:
+ if article.get('title') and article.get('url'):
+ news_data.append({
+ 'headline': article['title'],
+ 'url': article['url'],
+ })
+
+ return news_data
+
+ except Exception as e:
+ print(f"Error fetching news for {ticker}: {str(e)}")
+ return []
+
+ except ConfigurationError as e:
+ print(f"Configuration error: {str(e)}")
+ return []
+ except Exception as e:
+ print(f"Unexpected error in get_news: {str(e)}")
+ return []
+
+
+def get_price_history(ticker: str, period: str = "1mo") -> Optional[object]:
+ """Fetch historical price data for a stock ticker."""
+ try:
+ stock = yf.Ticker(ticker)
+ history = stock.history(period=period)
+
+ if history.empty:
+ return None
+
+ return history
+
+ except Exception as e:
+ return None
+
+
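`get_news` builds its NewsAPI query window as a `[today - days, today]` range of `YYYY-MM-DD` strings. A standalone sketch of that computation (the `news_window` wrapper is illustrative):

```python
# Sketch of the date window get_news passes to NewsAPI as
# from_param / to: ISO-formatted day strings spanning `days` back.
from datetime import datetime, timedelta

def news_window(days: int = 7) -> tuple:
    to_date = datetime.now()
    from_date = to_date - timedelta(days=days)
    return from_date.strftime('%Y-%m-%d'), to_date.strftime('%Y-%m-%d')

frm, to = news_window(7)
print(frm, to)
```

One practical caveat: the free NewsAPI tier limits how far back `from_param` may reach (roughly a month at the time of writing), so very large `days` values can return an error rather than results.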
diff --git a/stocksense/data_collectors.py b/stocksense/data_collectors.py
deleted file mode 100644
index 6bace38..0000000
--- a/stocksense/data_collectors.py
+++ /dev/null
@@ -1,63 +0,0 @@
-import os
-from typing import List, Optional
-from datetime import datetime, timedelta
-import yfinance as yf
-from newsapi import NewsApiClient
-from .config import get_newsapi_key, ConfigurationError
-
-
-def get_news(ticker: str, days: int = 7) -> List[str]:
- """Fetch recent news headlines related to a stock ticker."""
- try:
- api_key = get_newsapi_key()
- newsapi = NewsApiClient(api_key=api_key)
-
- to_date = datetime.now()
- from_date = to_date - timedelta(days=days)
-
- from_date_str = from_date.strftime('%Y-%m-%d')
- to_date_str = to_date.strftime('%Y-%m-%d')
-
- articles = newsapi.get_everything(
- q=ticker,
- language='en',
- sort_by='publishedAt',
- from_param=from_date_str,
- to=to_date_str,
- page_size=20
- )
-
- if articles['status'] == 'ok':
- headlines = []
- for article in articles['articles']:
- if article['title'] and article['title'] != '[Removed]':
- headlines.append(article['title'])
- return headlines
- else:
- return []
-
- except ConfigurationError as e:
- return []
- except Exception as e:
- return []
-
-
-def get_price_history(ticker: str, period: str = "1mo") -> Optional[object]:
- """Fetch historical price data for a stock ticker."""
- try:
- stock = yf.Ticker(ticker)
- history = stock.history(period=period)
-
- if history.empty:
- return None
-
- return history
-
- except Exception as e:
- return None
-
-
-if __name__ == '__main__':
- test_ticker = "AAPL"
- headlines = get_news(test_ticker, days=7)
- price_data = get_price_history(test_ticker, period="1mo")
\ No newline at end of file
diff --git a/stocksense/database.py b/stocksense/database.py
deleted file mode 100644
index dc345dc..0000000
--- a/stocksense/database.py
+++ /dev/null
@@ -1,164 +0,0 @@
-import sqlite3
-import os
-from datetime import datetime
-from typing import Dict, Optional
-
-
-def init_db() -> None:
- try:
- current_dir = os.path.dirname(os.path.abspath(__file__))
- project_root = os.path.dirname(current_dir)
- db_path = os.path.join(project_root, 'stocksense.db')
-
- os.makedirs(os.path.dirname(db_path), exist_ok=True)
-
- with sqlite3.connect(db_path) as conn:
- cursor = conn.cursor()
- cursor.execute("PRAGMA journal_mode=WAL") # For better concurrency
- cursor.execute('''
- CREATE TABLE IF NOT EXISTS analysis_cache (
- id INTEGER PRIMARY KEY AUTOINCREMENT,
- ticker TEXT NOT NULL,
- analysis_summary TEXT,
- sentiment_report TEXT,
- timestamp TEXT NOT NULL
- )
- ''')
- cursor.execute('''
- CREATE INDEX IF NOT EXISTS idx_ticker_timestamp
- ON analysis_cache (ticker, timestamp DESC)
- ''')
- conn.commit()
-
- except sqlite3.Error as e:
- raise
- except Exception as e:
- raise
-
-
-def save_analysis(ticker: str, summary: str, sentiment_report: str) -> None:
- """Save analysis results to the database cache."""
- try:
- current_dir = os.path.dirname(os.path.abspath(__file__))
- project_root = os.path.dirname(current_dir)
- db_path = os.path.join(project_root, 'stocksense.db')
-
- if not os.path.exists(db_path) or os.path.getsize(db_path) == 0:
- init_db()
-
- timestamp = datetime.utcnow().isoformat()
-
- with sqlite3.connect(db_path) as conn:
- cursor = conn.cursor()
- cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='analysis_cache'")
- table_exists = cursor.fetchone()
-
- if not table_exists:
- conn.close()
- init_db()
- conn = sqlite3.connect(db_path)
- cursor = conn.cursor()
-
- cursor.execute('''
- INSERT INTO analysis_cache (ticker, analysis_summary, sentiment_report, timestamp)
- VALUES (?, ?, ?, ?)
- ''', (ticker.upper(), summary, sentiment_report, timestamp))
-
- conn.commit()
-
- except sqlite3.Error as e:
- raise
- except Exception as e:
- raise
-
-
-def get_latest_analysis(ticker: str) -> Optional[Dict[str, str]]:
- """Retrieve the most recent analysis for a given ticker."""
- try:
- current_dir = os.path.dirname(os.path.abspath(__file__))
- project_root = os.path.dirname(current_dir)
- db_path = os.path.join(project_root, 'stocksense.db')
-
- if not os.path.exists(db_path):
- return None
-
- if os.path.getsize(db_path) == 0:
- init_db()
-
- with sqlite3.connect(db_path) as conn:
- cursor = conn.cursor()
- cursor.execute("SELECT name FROM sqlite_master WHERE type='table' AND name='analysis_cache'")
- table_exists = cursor.fetchone()
-
- if not table_exists:
- conn.close()
- init_db()
- conn = sqlite3.connect(db_path)
- cursor = conn.cursor()
-
- cursor.execute('''
- SELECT id, ticker, analysis_summary, sentiment_report, timestamp
- FROM analysis_cache
- WHERE ticker = ?
- ORDER BY timestamp DESC
- LIMIT 1
- ''', (ticker.upper(),))
-
- result = cursor.fetchone()
-
- if result:
- analysis_data = {
- 'id': result[0],
- 'ticker': result[1],
- 'analysis_summary': result[2],
- 'sentiment_report': result[3],
- 'timestamp': result[4]
- }
- return analysis_data
- else:
- return None
-
- except sqlite3.Error as e:
- return None
- except Exception as e:
- return None
-
-
-def get_all_cached_tickers() -> list:
- """Get a list of all tickers that have cached analysis data."""
- try:
- current_dir = os.path.dirname(os.path.abspath(__file__))
- project_root = os.path.dirname(current_dir)
- db_path = os.path.join(project_root, 'stocksense.db')
-
- if not os.path.exists(db_path):
- return []
-
- with sqlite3.connect(db_path) as conn:
- cursor = conn.cursor()
-
- cursor.execute('''
- SELECT DISTINCT ticker, MAX(timestamp) as latest_timestamp
- FROM analysis_cache
- GROUP BY ticker
- ORDER BY latest_timestamp DESC
- ''')
-
- results = cursor.fetchall()
- tickers = [result[0] for result in results]
-
- return tickers
-
- except sqlite3.Error as e:
- return []
-
-
-if __name__ == '__main__':
- init_db()
- sample_ticker = "AAPL"
- sample_summary = "Apple showed strong performance with positive earnings."
- sample_sentiment = "Overall sentiment: Positive. Headlines show bullish outlook."
- save_analysis(sample_ticker, sample_summary, sample_sentiment)
- retrieved_data = get_latest_analysis(sample_ticker)
- non_existent = get_latest_analysis("NONEXISTENT")
- cached_tickers = get_all_cached_tickers()
\ No newline at end of file
diff --git a/stocksense/main.py b/stocksense/main.py
index cbd31f7..358e602 100644
--- a/stocksense/main.py
+++ b/stocksense/main.py
@@ -1,14 +1,18 @@
+import os
+import sys
from fastapi import FastAPI, HTTPException
-from fastapi.responses import JSONResponse
from fastapi.middleware.cors import CORSMiddleware
from contextlib import asynccontextmanager
import uvicorn
from typing import Dict, Any
-from datetime import datetime
-from .config import validate_configuration, ConfigurationError
-from .database import init_db, get_latest_analysis, get_all_cached_tickers
-from .react_agent import run_react_analysis
+# Add the parent directory to Python path to enable imports
+current_dir = os.path.dirname(os.path.abspath(__file__))
+parent_dir = os.path.dirname(current_dir)
+sys.path.append(parent_dir)
+
+from core.config import validate_configuration, ConfigurationError
+from ai.react_agent import run_react_analysis
@asynccontextmanager
@@ -24,14 +28,6 @@ async def lifespan(app: FastAPI):
print(f"Configuration error: {str(e)}")
raise
- print("Initializing database...")
- try:
- init_db()
- print("Database initialized successfully")
- except Exception as e:
- print(f"Error initializing database: {str(e)}")
- raise
-
print("StockSense ReAct Agent API ready to serve requests!")
yield
@@ -41,11 +37,8 @@ async def lifespan(app: FastAPI):
app = FastAPI(
- title="StockSense ReAct Agent API",
- description="AI-powered autonomous stock analysis using ReAct pattern",
- version="2.0.0",
- docs_url="/docs",
- redoc_url="/redoc",
+ title="StockSense AI Analysis API",
+ description="AI-powered stock analysis with data sources",
lifespan=lifespan
)
@@ -58,198 +51,114 @@ async def lifespan(app: FastAPI):
)
-@app.get("/health")
-async def health_check() -> Dict[str, str]:
- return {"status": "ok"}
-
@app.get("/")
-async def root() -> Dict[str, str]:
+async def root() -> Dict[str, Any]:
+ """Root endpoint with API information."""
return {
- "message": "Welcome to StockSense ReAct Agent API",
- "description": "AI-powered autonomous stock analysis using ReAct pattern",
- "docs": "/docs",
- "health": "/health"
+ "message": "StockSense AI Analysis API",
}
-@app.post("/analyze/{ticker}")
+@app.get("/analyze/{ticker}")
async def analyze_stock(ticker: str) -> Dict[str, Any]:
- """Analyze stock using the ReAct Agent (Reasoning + Action) pattern."""
+ """
+ Comprehensive stock analysis endpoint.
+
+ Args:
+ ticker: Stock ticker symbol (e.g., AAPL, MSFT, GOOGL)
+
+ Returns:
+ Complete analysis with data and sources
+ """
try:
- # Validate and normalize ticker
+ # Validate ticker input
ticker = ticker.upper().strip()
if not ticker:
- raise HTTPException(status_code=400, detail="Ticker cannot be empty")
-
- print(f"ReAct Agent analysis request received for ticker: {ticker}")
-
- # Check for cached analysis first
- print(f"Checking cache for existing analysis of {ticker}...")
- cached_analysis = get_latest_analysis(ticker)
-
- if cached_analysis:
- print(f"Found cached analysis for {ticker}")
- return {
- "message": "Analysis retrieved from cache",
- "ticker": ticker,
- "data": {
- "id": cached_analysis["id"],
- "ticker": cached_analysis["ticker"],
- "summary": cached_analysis["analysis_summary"],
- "sentiment_report": cached_analysis["sentiment_report"],
- "timestamp": cached_analysis["timestamp"],
- "source": "cache",
- "agent_type": "ReAct",
- "price_data": [], # Cached results don't have structured price data
- "reasoning_steps": ["Retrieved from cache - no detailed steps available"],
- "tools_used": ["cache_lookup"],
- "iterations": 0
- }
- }
-
- # No cached data found, run fresh ReAct analysis
- print(f"No cached data found for {ticker}, running fresh ReAct analysis...")
-
- # Run the ReAct agent
- try:
- final_state = run_react_analysis(ticker)
+ raise HTTPException(status_code=400, detail="Ticker symbol is required")
+
+ if not ticker.replace('.', '').replace('-', '').isalpha() or len(ticker) > 10:
+            raise HTTPException(status_code=400, detail="Invalid ticker format (letters plus '.'/'-' only, max 10 characters)")
- # Check if the ReAct agent completed successfully
- if final_state.get("error"):
- error_msg = final_state["error"]
- print(f"ReAct Agent error for {ticker}: {error_msg}")
- raise HTTPException(
- status_code=500,
- detail=f"ReAct Agent analysis failed: {error_msg}"
- )
+ print(f"Starting analysis for ticker: {ticker}")
- # Check if we have valid results
- summary = final_state.get("summary", "")
- sentiment_report = final_state.get("sentiment_report", "")
+ analysis_result = run_react_analysis(ticker)
+ print(f"Raw analysis result: {analysis_result}")
- if (not summary or summary.startswith("Analysis failed")):
+ # Check for errors
+ if analysis_result.get("error"):
+ error_msg = analysis_result["error"]
+            # Match "rate limit" rather than bare "rate", which false-positives on words like "generate"
+            if "rate limit" in error_msg.lower() or "429" in error_msg:
raise HTTPException(
- status_code=500,
- detail="ReAct Agent completed but produced insufficient data"
+ status_code=429,
+ detail="API rate limit reached. Please try again later."
)
-
- # Save the analysis results to database
- try:
- from .database import save_analysis
- save_analysis(ticker, summary, sentiment_report)
- print(f"Analysis results saved to database for {ticker}")
- except Exception as e:
- print(f"Warning: Failed to save analysis to database: {str(e)}")
- # Don't fail the request if database save fails
-
- print(f"ReAct Agent analysis completed successfully for {ticker}")
-
- return {
- "message": "ReAct Agent analysis complete and saved",
- "ticker": ticker,
- "data": {
- "ticker": final_state["ticker"],
- "summary": final_state["summary"],
- "sentiment_report": final_state["sentiment_report"],
- "price_data": final_state.get("price_data", []), # Include structured OHLCV data
- "headlines_count": len(final_state.get("headlines", [])),
- "reasoning_steps": final_state.get("reasoning_steps", []),
- "tools_used": final_state.get("tools_used", []),
- "iterations": final_state.get("iterations", 0),
- "timestamp": final_state.get("timestamp"),
- "source": "react_analysis",
- "agent_type": "ReAct"
- }
- }
-
- except HTTPException:
- # Re-raise HTTP exceptions
- raise
- except Exception as e:
- error_msg = f"ReAct Agent execution error: {str(e)}"
- print(f"{error_msg}")
- raise HTTPException(status_code=500, detail=error_msg)
-
- except HTTPException:
- # Re-raise HTTP exceptions
- raise
- except Exception as e:
- error_msg = f"Unexpected error analyzing {ticker}: {str(e)}"
- print(f"{error_msg}")
- raise HTTPException(status_code=500, detail=error_msg)
-
-
-@app.get("/results/{ticker}")
-async def get_analysis_results(ticker: str) -> Dict[str, Any]:
- try:
- # Validate and normalize ticker
- ticker = ticker.upper().strip()
- if not ticker:
- raise HTTPException(status_code=400, detail="Ticker cannot be empty")
-
- print(f"Results request for ticker: {ticker}")
-
- # Get the latest analysis from database
- analysis = get_latest_analysis(ticker)
-
- if not analysis:
- print(f"No analysis found for {ticker}")
+ raise HTTPException(status_code=500, detail=f"Analysis failed: {error_msg}")
+
+ # Extract analysis components
+ summary = analysis_result.get("summary", "")
+ sentiment_report = analysis_result.get("sentiment_report", "")
+ headlines = analysis_result.get("headlines", [])
+ price_data = analysis_result.get("price_data", [])
+ reasoning_steps = analysis_result.get("reasoning_steps", [])
+ tools_used = analysis_result.get("tools_used", [])
+ final_decision = analysis_result.get("final_decision", "UNSPECIFIED")
+
+ # Validate we have meaningful results
+ if not summary or summary.startswith("Analysis failed"):
raise HTTPException(
- status_code=404,
- detail=f"No analysis found for ticker: {ticker}"
+ status_code=500,
+ detail="Analysis completed but insufficient data generated"
)
- print(f"Analysis results retrieved for {ticker}")
return {
- "message": "Analysis results retrieved successfully",
+ "success": True,
"ticker": ticker,
- "data": {
- "id": analysis["id"],
- "ticker": analysis["ticker"],
- "summary": analysis["analysis_summary"],
- "sentiment_report": analysis["sentiment_report"],
- "timestamp": analysis["timestamp"]
+ "analysis": {
+ "summary": summary,
+ "sentiment_report": sentiment_report,
+ "recommendation": final_decision,
+ "confidence": "high" if len(tools_used) >= 3 else "medium"
+ },
+ "data_sources": {
+ "news_headlines": {
+ "count": len(headlines),
+ "headlines": headlines[:10],
+ "source": "NewsAPI"
+ },
+ "price_data": {
+ "data_points": len(price_data),
+ "latest_price": price_data[-1]["Close"] if price_data else None,
+ "price_range": {
+ "period": "30 days",
+ "high": max([p["High"] for p in price_data]) if price_data else None,
+ "low": min([p["Low"] for p in price_data]) if price_data else None
+ },
+ "source": "Yahoo Finance",
+ "chart_data": price_data
+ },
+ "ai_analysis": {
+ "model": "Google Gemini 2.5 Flash",
+ "reasoning_steps": len(reasoning_steps),
+ "tools_used": tools_used,
+ "iterations": analysis_result.get("iterations", 0),
+ "sentiment_analyzed": analysis_result.get("sentiment_analyzed", False)
+ }
+ },
+ "metadata": {
+ "analysis_type": "ReAct Agent",
+ "timestamp": analysis_result.get("timestamp"),
+ "processing_time": f"{analysis_result.get('iterations', 0)} AI iterations",
+ "data_freshness": "Real-time"
}
}
except HTTPException:
- # Re-raise HTTP exceptions
raise
except Exception as e:
- error_msg = f"Error retrieving results for {ticker}: {str(e)}"
- print(f"{error_msg}")
- raise HTTPException(status_code=500, detail=error_msg)
-
-
-@app.get("/cached-tickers")
-async def get_cached_tickers() -> Dict[str, Any]:
- try:
- print("Retrieving list of cached tickers...")
-
- cached_tickers = get_all_cached_tickers()
-
- print(f"Found {len(cached_tickers)} cached tickers")
+ error_msg = f"Unexpected error analyzing {ticker}: {str(e)}"
+ print(error_msg)
+ raise HTTPException(status_code=500, detail="Internal server error")
- return {
- "message": "Cached tickers retrieved successfully",
- "count": len(cached_tickers),
- "tickers": cached_tickers
- }
- except Exception as e:
- error_msg = f"Error retrieving cached tickers: {str(e)}"
- print(f"{error_msg}")
- raise HTTPException(status_code=500, detail=error_msg)
-
-
-if __name__ == "__main__":
- print("Starting StockSense ReAct Agent FastAPI development server...")
- uvicorn.run(
- "stocksense.main:app",
- host="0.0.0.0",
- port=8000,
- reload=True,
- log_level="info"
- )
\ No newline at end of file
diff --git a/stocksense/requirements.txt b/stocksense/requirements.txt
new file mode 100644
index 0000000..f2b916d
--- /dev/null
+++ b/stocksense/requirements.txt
@@ -0,0 +1,25 @@
+# FastAPI and web server dependencies
+fastapi==0.115.13
+uvicorn==0.34.3
+
+# Data collection dependencies
+yfinance==0.2.63
+newsapi-python==0.2.7
+
+# AI and LangChain dependencies
+langchain-google-genai==2.0.10
+langchain-core==0.3.65
+langgraph==0.4.8
+
+# Configuration and utilities
+python-dotenv==1.1.0
+
+# Google AI dependencies (required by langchain-google-genai)
+google-generativeai==0.8.5
+google-api-core==2.25.1
+google-auth==2.40.3
+
+# Transitive core dependencies (pulled in by the packages above; pinned here for reproducibility)
+pydantic==2.11.7
+typing-extensions==4.14.0
+requests==2.32.4
\ No newline at end of file
diff --git a/tests/README.md b/tests/README.md
deleted file mode 100644
index 55a8528..0000000
--- a/tests/README.md
+++ /dev/null
@@ -1,137 +0,0 @@
-# StockSense Agent Test Suite
-
-This directory contains the automated test suite for the StockSense Agent project.
-
-## Test Structure
-
-### Unit Tests (`test_tools.py`)
-
-Tests for individual agent tools without external dependencies:
-
-- `fetch_news_headlines` tool functionality
-- `fetch_price_data` tool functionality
-- Error handling for invalid tickers
-- Data structure validation
-
-### Integration Tests (`test_api.py`)
-
-Tests for the FastAPI application endpoints:
-
-- Health check endpoint (`/health`)
-- Cached tickers endpoint (`/cached-tickers`)
-- API response format validation
-- Error handling and performance
-
-## Running Tests
-
-### Prerequisites
-
-```bash
-# Install pytest if not already installed
-pip install pytest requests
-```
-
-### Quick Start
-
-```bash
-# Run all tests
-python run_tests.py all
-
-# Run only unit tests (no server required)
-python run_tests.py unit
-
-# Run only API tests (requires server running)
-python run_tests.py api
-
-# Run quick smoke test
-python run_tests.py smoke
-```
-
-### Manual pytest Commands
-
-```bash
-# Run all tests
-pytest tests/ -v
-
-# Run only unit tests
-pytest tests/test_tools.py -v
-
-# Run only API tests
-pytest tests/test_api.py -v
-
-# Run with coverage (if coverage installed)
-pytest tests/ --cov=stocksense --cov-report=html
-```
-
-## Test Requirements
-
-### For Unit Tests
-
-- No external dependencies
-- Tests the tools in isolation
-- Validates data structures and error handling
-
-### For API Tests
-
-- Requires the FastAPI server to be running
-- Start server with: `python -m stocksense.main`
-- Tests live HTTP endpoints
-
-## Test Configuration
-
-The test suite is configured via `pytest.ini` with:
-
-- Test discovery patterns
-- Output formatting
-- Timeout settings
-- Custom markers for test categorization
-
-## Adding New Tests
-
-### Unit Test Example
-
-```python
-def test_new_tool_functionality():
- """Test a new tool function"""
- result = new_tool.invoke({"param": "value"})
- assert isinstance(result, dict)
- assert "expected_key" in result
-```
-
-### API Test Example
-
-```python
-def test_new_endpoint():
- """Test a new API endpoint"""
- response = requests.get(f"{BASE_URL}/new-endpoint")
- assert response.status_code == 200
- assert "expected_field" in response.json()
-```
-
-## Continuous Integration
-
-These tests are designed to be CI/CD friendly:
-
-- Unit tests can run without external dependencies
-- API tests can be skipped if server isn't available
-- Clear exit codes for success/failure
-- Detailed output for debugging
-
-## Troubleshooting
-
-### Common Issues
-
-1. **Import Errors**: Ensure the project root is in Python path
-2. **API Tests Failing**: Make sure the FastAPI server is running
-3. **Network Tests Slow**: Use `-m "not network"` to skip network-dependent tests
-4. **Tool Tests Failing**: Check if external data sources (news, price data) are accessible
-
-### Debug Mode
-
-```bash
-# Run with more verbose output
-pytest tests/ -v -s --tb=long
-
-# Run a specific test
-pytest tests/test_tools.py::TestNewsHeadlines::test_fetch_news_headlines_success -v
-```
diff --git a/tests/test_api.py b/tests/test_api.py
deleted file mode 100644
index d552d6c..0000000
--- a/tests/test_api.py
+++ /dev/null
@@ -1,307 +0,0 @@
-#!/usr/bin/env python3
-"""
-Integration Tests for StockSense Agent API
-
-This module contains integration tests for the FastAPI endpoints
-of the StockSense ReAct Agent application.
-
-Test Coverage:
-- Health check endpoint
-- Cached tickers endpoint
-- API response format validation
-- Error handling
-"""
-
-import pytest
-import requests
-import time
-import sys
-from pathlib import Path
-
-# Add the project root to Python path for imports
-project_root = Path(__file__).parent.parent
-sys.path.insert(0, str(project_root))
-
-# Configuration
-BASE_URL = "http://127.0.0.1:8000"
-REQUEST_TIMEOUT = 10 # seconds
-
-
-class TestAPIHealthAndStatus:
- """Test cases for basic API health and status endpoints"""
-
- def test_health_check(self):
- """Test the health check endpoint"""
- try:
- response = requests.get(f"{BASE_URL}/health", timeout=REQUEST_TIMEOUT)
-
- # Assert response status
- assert response.status_code == 200, f"Health check should return 200, got {response.status_code}"
-
- # Assert response content
- response_json = response.json()
- assert response_json == {"status": "ok"}, \
- f"Health check should return {{'status': 'ok'}}, got {response_json}"
-
- except requests.exceptions.ConnectionError:
- pytest.skip("API server is not running. Start with: python -m stocksense.main")
- except requests.exceptions.Timeout:
- pytest.fail("Health check endpoint timed out")
-
- def test_health_check_response_time(self):
- """Test that health check responds quickly"""
- try:
- start_time = time.time()
- response = requests.get(f"{BASE_URL}/health", timeout=REQUEST_TIMEOUT)
- response_time = time.time() - start_time
-
- assert response.status_code == 200
- assert response_time < 2.0, f"Health check should respond quickly, took {response_time:.2f}s"
-
- except requests.exceptions.ConnectionError:
- pytest.skip("API server is not running")
-
- def test_root_endpoint(self):
- """Test the root endpoint returns welcome message"""
- try:
- response = requests.get(f"{BASE_URL}/", timeout=REQUEST_TIMEOUT)
-
- assert response.status_code == 200
- response_json = response.json()
-
- # Check for expected fields
- assert "message" in response_json
- assert "description" in response_json
- assert "docs" in response_json
- assert "health" in response_json
-
- # Verify content
- assert "StockSense ReAct Agent API" in response_json["message"]
- assert response_json["docs"] == "/docs"
- assert response_json["health"] == "/health"
-
- except requests.exceptions.ConnectionError:
- pytest.skip("API server is not running")
-
-
-class TestCachedTickersEndpoint:
- """Test cases for the cached tickers endpoint"""
-
- def test_cached_tickers_endpoint(self):
- """Test the cached tickers endpoint basic functionality"""
- try:
- response = requests.get(f"{BASE_URL}/cached-tickers", timeout=REQUEST_TIMEOUT)
-
- # Assert response status
- assert response.status_code == 200, \
- f"Cached tickers endpoint should return 200, got {response.status_code}"
-
- # Assert response structure
- response_json = response.json()
- assert isinstance(response_json, dict), "Response should be a dictionary"
- assert "tickers" in response_json, "Response should contain 'tickers' key"
- assert isinstance(response_json["tickers"], list), \
- "The 'tickers' value should be a list"
-
- except requests.exceptions.ConnectionError:
- pytest.skip("API server is not running")
- except requests.exceptions.Timeout:
- pytest.fail("Cached tickers endpoint timed out")
-
- def test_cached_tickers_response_structure(self):
- """Test the structure of cached tickers response"""
- try:
- response = requests.get(f"{BASE_URL}/cached-tickers", timeout=REQUEST_TIMEOUT)
-
- assert response.status_code == 200
- response_json = response.json()
-
- # Check for expected keys
- expected_keys = ["message", "count", "tickers"]
- for key in expected_keys:
- assert key in response_json, f"Response should contain '{key}' key"
-
- # Validate data types
- assert isinstance(response_json["message"], str)
- assert isinstance(response_json["count"], int)
- assert isinstance(response_json["tickers"], list)
-
- # Validate consistency
- assert response_json["count"] == len(response_json["tickers"]), \
- "Count should match the number of tickers in the list"
-
- except requests.exceptions.ConnectionError:
- pytest.skip("API server is not running")
-
- def test_cached_tickers_empty_cache(self):
- """Test cached tickers endpoint when cache is empty"""
- try:
- response = requests.get(f"{BASE_URL}/cached-tickers", timeout=REQUEST_TIMEOUT)
-
- assert response.status_code == 200
- response_json = response.json()
-
- # Should handle empty cache gracefully
- assert "tickers" in response_json
- assert isinstance(response_json["tickers"], list)
- assert response_json["count"] >= 0
-
- except requests.exceptions.ConnectionError:
- pytest.skip("API server is not running")
-
-
-class TestAPIErrorHandling:
- """Test cases for API error handling"""
-
- def test_nonexistent_endpoint(self):
- """Test that non-existent endpoints return 404"""
- try:
- response = requests.get(f"{BASE_URL}/nonexistent-endpoint", timeout=REQUEST_TIMEOUT)
-
- assert response.status_code == 404, \
- f"Non-existent endpoint should return 404, got {response.status_code}"
-
- except requests.exceptions.ConnectionError:
- pytest.skip("API server is not running")
-
- def test_invalid_method(self):
- """Test that invalid HTTP methods are handled properly"""
- try:
- # Try DELETE on health endpoint (should not be allowed)
- response = requests.delete(f"{BASE_URL}/health", timeout=REQUEST_TIMEOUT)
-
- # Should return 405 Method Not Allowed or 422 Unprocessable Entity
- assert response.status_code in [405, 422], \
- f"Invalid method should return 405 or 422, got {response.status_code}"
-
- except requests.exceptions.ConnectionError:
- pytest.skip("API server is not running")
-
-
-class TestAPIDocumentation:
- """Test cases for API documentation endpoints"""
-
- def test_docs_endpoint_accessible(self):
- """Test that the API docs endpoint is accessible"""
- try:
- response = requests.get(f"{BASE_URL}/docs", timeout=REQUEST_TIMEOUT)
-
- # Should return 200 and HTML content
- assert response.status_code == 200
- assert "text/html" in response.headers.get("content-type", "").lower()
-
- except requests.exceptions.ConnectionError:
- pytest.skip("API server is not running")
-
- def test_redoc_endpoint_accessible(self):
- """Test that the ReDoc endpoint is accessible"""
- try:
- response = requests.get(f"{BASE_URL}/redoc", timeout=REQUEST_TIMEOUT)
-
- # Should return 200 and HTML content
- assert response.status_code == 200
- assert "text/html" in response.headers.get("content-type", "").lower()
-
- except requests.exceptions.ConnectionError:
- pytest.skip("API server is not running")
-
-
-class TestAPIPerformance:
- """Test cases for API performance and reliability"""
-
- def test_concurrent_health_checks(self):
- """Test multiple concurrent health check requests"""
- try:
- import concurrent.futures
- import threading
-
- def make_health_request():
- response = requests.get(f"{BASE_URL}/health", timeout=REQUEST_TIMEOUT)
- return response.status_code
-
- # Make 5 concurrent requests
- with concurrent.futures.ThreadPoolExecutor(max_workers=5) as executor:
- futures = [executor.submit(make_health_request) for _ in range(5)]
- results = [future.result() for future in concurrent.futures.as_completed(futures)]
-
- # All should succeed
- assert all(status == 200 for status in results), \
- f"All concurrent requests should succeed, got {results}"
-
- except requests.exceptions.ConnectionError:
- pytest.skip("API server is not running")
- except ImportError:
- pytest.skip("concurrent.futures not available")
-
- def test_api_response_headers(self):
- """Test that API returns appropriate headers"""
- try:
- response = requests.get(f"{BASE_URL}/health", timeout=REQUEST_TIMEOUT)
-
- assert response.status_code == 200
-
- # Check for standard HTTP headers
- headers = response.headers
- assert "content-type" in headers, "Content-Type header should be present"
- assert "server" in headers, "Server header should be present"
-
- # CORS headers are optional but good to have
- has_cors = ("access-control-allow-origin" in headers or
- "Access-Control-Allow-Origin" in headers)
- # Log CORS status but don't fail the test
- print(f"CORS headers present: {has_cors}")
-
- except requests.exceptions.ConnectionError:
- pytest.skip("API server is not running")
-
-
-# Utility fixtures and helper functions
-@pytest.fixture(scope="session")
-def api_server_check():
- """Fixture to check if API server is running before tests"""
- try:
- response = requests.get(f"{BASE_URL}/health", timeout=5)
- if response.status_code != 200:
- pytest.exit("API server is not responding correctly")
- except requests.exceptions.ConnectionError:
- pytest.exit("API server is not running. Start with: python -m stocksense.main")
- return True
-
-
-@pytest.fixture
-def api_client():
- """Fixture providing a configured requests session"""
- session = requests.Session()
- session.timeout = REQUEST_TIMEOUT
- return session
-
-
-# Integration test using fixtures
-def test_api_integration_flow(api_client):
- """Test a complete integration flow"""
- try:
- # 1. Check health
- health_response = api_client.get(f"{BASE_URL}/health")
- assert health_response.status_code == 200
-
- # 2. Get root info
- root_response = api_client.get(f"{BASE_URL}/")
- assert root_response.status_code == 200
-
- # 3. Check cached tickers
- tickers_response = api_client.get(f"{BASE_URL}/cached-tickers")
- assert tickers_response.status_code == 200
-
- # All responses should be JSON
- assert all(resp.headers.get("content-type", "").startswith("application/json")
- for resp in [health_response, root_response, tickers_response])
-
- except requests.exceptions.ConnectionError:
- pytest.skip("API server is not running")
-
-
-if __name__ == "__main__":
- # Allow running this file directly for quick testing
- print(f"Testing API at: {BASE_URL}")
- print("Make sure the API server is running with: python -m stocksense.main")
- pytest.main([__file__, "-v"])
diff --git a/tests/test_tools.py b/tests/test_tools.py
deleted file mode 100644
index e8253b5..0000000
--- a/tests/test_tools.py
+++ /dev/null
@@ -1,228 +0,0 @@
-#!/usr/bin/env python3
-"""
-Unit Tests for StockSense Agent Tools
-
-This module contains unit tests for the core data collection tools
-used by the StockSense ReAct Agent.
-
-Test Coverage:
-- fetch_news_headlines tool
-- fetch_price_data tool
-- Error handling for invalid tickers
-"""
-
-import pytest
-import sys
-from pathlib import Path
-
-# Add the project root to Python path for imports
-project_root = Path(__file__).parent.parent
-sys.path.insert(0, str(project_root))
-
-from stocksense.react_agent import fetch_news_headlines, fetch_price_data
-
-
-class TestNewsHeadlines:
- """Test cases for the fetch_news_headlines tool"""
-
- def test_fetch_news_headlines_success(self):
- """Test successful news headlines retrieval with valid ticker"""
- # Use a common, stable ticker for testing
- ticker = "MSFT"
-
- # Call the tool function
- result = fetch_news_headlines.invoke({"ticker": ticker})
-
- # Assertions
- assert isinstance(result, dict), "Result should be a dictionary"
- assert "headlines" in result, "Result should contain 'headlines' key"
- assert isinstance(result["headlines"], list), "Headlines should be a list"
-
- # Check for successful retrieval
- if result.get("success", False):
- assert len(result["headlines"]) > 0, "Headlines list should not be empty for valid ticker"
-
- # Verify expected structure
- assert "ticker" in result, "Result should contain ticker information"
- assert result["ticker"] == ticker.upper(), "Ticker should be normalized to uppercase"
-
- def test_fetch_news_headlines_structure(self):
- """Test the structure of news headlines response"""
- ticker = "AAPL"
- result = fetch_news_headlines.invoke({"ticker": ticker})
-
- # Check response structure
- expected_keys = ["success", "headlines", "ticker"]
- for key in expected_keys:
- assert key in result, f"Result should contain '{key}' key"
-
- # If successful, check headlines structure
- if result.get("success") and result.get("headlines"):
- headlines = result["headlines"]
- assert all(isinstance(headline, str) for headline in headlines), \
- "All headlines should be strings"
-
- def test_fetch_news_headlines_invalid_ticker(self):
- """Test news headlines tool with invalid ticker"""
- invalid_ticker = "INVALIDTICKERXYZ"
-
- # Call the tool function
- result = fetch_news_headlines.invoke({"ticker": invalid_ticker})
-
- # Assertions for graceful error handling
- assert isinstance(result, dict), "Result should be a dictionary even for invalid ticker"
- assert "headlines" in result, "Result should contain 'headlines' key"
- assert isinstance(result["headlines"], list), "Headlines should be a list"
-
- # Should handle gracefully - either empty list or error indicator
- assert (
- len(result["headlines"]) == 0 or
- result.get("success") == False or
- any("error" in str(result).lower() for key in result.keys())
- ), "Invalid ticker should be handled gracefully"
-
-
-class TestPriceData:
- """Test cases for the fetch_price_data tool"""
-
- def test_fetch_price_data_success(self):
- """Test successful price data retrieval with valid ticker"""
- ticker = "GOOGL"
-
- # Call the tool function
- result = fetch_price_data.invoke({"ticker": ticker})
-
- # Assertions
- assert isinstance(result, dict), "Result should be a dictionary"
- assert "price_data" in result, "Result should contain 'price_data' key"
- assert isinstance(result["price_data"], list), "Price data should be a list"
-
- # Check for successful retrieval
- if result.get("success", False):
- assert len(result["price_data"]) > 0, "Price data list should not be empty for valid ticker"
-
- # Verify OHLCV structure if data is present
- if result["price_data"]:
- sample_record = result["price_data"][0]
- assert isinstance(sample_record, dict), "Each price record should be a dictionary"
-
- # Check for required OHLCV fields
- expected_fields = ["Date", "Open", "High", "Low", "Close", "Volume"]
- for field in expected_fields:
- assert field in sample_record, f"Price record should contain '{field}' field"
-
- # Verify expected metadata
- assert "ticker" in result, "Result should contain ticker information"
- assert result["ticker"] == ticker.upper(), "Ticker should be normalized to uppercase"
-
- def test_fetch_price_data_structure(self):
- """Test the structure of price data response"""
- ticker = "AAPL"
- result = fetch_price_data.invoke({"ticker": ticker, "period": "5d"})
-
- # Check response structure
- expected_keys = ["success", "price_data", "ticker", "has_data"]
- for key in expected_keys:
- assert key in result, f"Result should contain '{key}' key"
-
- # If successful, validate data structure
- if result.get("success") and result.get("price_data"):
- price_data = result["price_data"]
-
- for record in price_data[:3]: # Check first 3 records
- assert isinstance(record, dict), "Each price record should be a dictionary"
-
- # Validate data types
- if "Date" in record:
- assert isinstance(record["Date"], str), "Date should be a string"
-
- numeric_fields = ["Open", "High", "Low", "Close"]
- for field in numeric_fields:
- if field in record and record[field] is not None:
- assert isinstance(record[field], (int, float)), \
- f"{field} should be numeric"
-
- if "Volume" in record and record["Volume"] is not None:
- assert isinstance(record["Volume"], (int, float)), \
- "Volume should be numeric"
-
- def test_fetch_price_data_invalid_ticker(self):
- """Test price data tool with invalid ticker"""
- invalid_ticker = "INVALIDTICKERXYZ"
-
- # Call the tool function
- result = fetch_price_data.invoke({"ticker": invalid_ticker})
-
- # Assertions for graceful error handling
- assert isinstance(result, dict), "Result should be a dictionary even for invalid ticker"
- assert "price_data" in result, "Result should contain 'price_data' key"
- assert isinstance(result["price_data"], list), "Price data should be a list"
-
- # Should handle gracefully - either empty list or error indicator
- assert (
- len(result["price_data"]) == 0 or
- result.get("success") == False or
- "error" in str(result).lower()
- ), "Invalid ticker should be handled gracefully"
-
-
-class TestCombinedDataRetrieval:
- """Test cases for combined data retrieval scenarios"""
-
- def test_fetch_data_consistency(self):
- """Test that both tools return consistent ticker formatting"""
- ticker = "msft" # Use lowercase to test normalization
-
- news_result = fetch_news_headlines.invoke({"ticker": ticker})
- price_result = fetch_price_data.invoke({"ticker": ticker})
-
- # Both should return the ticker (case-insensitive comparison)
- assert news_result.get("ticker").upper() == ticker.upper()
- assert price_result.get("ticker").upper() == ticker.upper()
-
- def test_error_handling_consistency(self):
- """Test that both tools handle errors consistently"""
- invalid_ticker = "INVALIDTICKER123"
-
- news_result = fetch_news_headlines.invoke({"ticker": invalid_ticker})
- price_result = fetch_price_data.invoke({"ticker": invalid_ticker})
-
- # Both should return dictionary structure even on error
- assert isinstance(news_result, dict)
- assert isinstance(price_result, dict)
-
- # Both should have expected keys
- assert "headlines" in news_result
- assert "price_data" in price_result
-
-
-# Pytest fixtures for common test data
-@pytest.fixture
-def valid_tickers():
- """Fixture providing list of valid test tickers"""
- return ["AAPL", "MSFT", "GOOGL", "AMZN", "TSLA"]
-
-
-@pytest.fixture
-def invalid_tickers():
- """Fixture providing list of invalid test tickers"""
- return ["INVALIDXYZ", "NOTREAL123", "BADTICKER"]
-
-
-# Integration test using fixtures
-def test_multiple_valid_tickers(valid_tickers):
- """Test tools with multiple valid tickers"""
- for ticker in valid_tickers[:2]: # Test first 2 to avoid rate limits
- news_result = fetch_news_headlines.invoke({"ticker": ticker})
- price_result = fetch_price_data.invoke({"ticker": ticker})
-
- # Basic structure checks
- assert isinstance(news_result, dict)
- assert isinstance(price_result, dict)
- assert news_result.get("ticker") == ticker
- assert price_result.get("ticker") == ticker
-
-
-if __name__ == "__main__":
- # Allow running this file directly for quick testing
- pytest.main([__file__, "-v"])
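
Note on the change above: the patch deletes the old test suite while the rewritten `GET /analyze/{ticker}` endpoint gains inline ticker validation. As a hedged sketch outside the patch itself, that validation can be factored out and unit-tested in isolation; the helper name `is_valid_ticker` is hypothetical and simply mirrors the condition added to `stocksense/main.py`:

```python
def is_valid_ticker(ticker: str) -> bool:
    """Mirror of the inline validation in the new analyze_stock endpoint:
    letters only (with '.' and '-' allowed), at most 10 characters."""
    ticker = ticker.upper().strip()
    if not ticker:
        return False
    # Strip the separators the endpoint permits, then require pure letters
    stripped = ticker.replace('.', '').replace('-', '')
    return stripped.isalpha() and len(ticker) <= 10

print(is_valid_ticker("BRK.B"))          # True  (class-share tickers pass)
print(is_valid_ticker("aapl"))           # True  (normalized to uppercase)
print(is_valid_ticker("ABC123"))         # False (digits rejected)
print(is_valid_ticker("TOOLONGTICKER"))  # False (over 10 characters)
```

A standalone helper like this would let a replacement test suite cover the 400-error path without standing up the FastAPI server, which the deleted `tests/test_api.py` required.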