A sophisticated marketing campaign engine that leverages real-time weather data, local market dynamics, and a dual-layer AI to generate strategic, data-driven campaign recommendations for local businesses.
You can test the live API directly by building a request URL.
1. Copy the template below:
https://api.eesita.me/recommend?zipcode=YOUR_ZIP_CODE&store_type=YOUR_STORE_TYPE
2. Replace the placeholders:
- YOUR_ZIP_CODE → Enter any valid zip code (e.g., 90210).
- YOUR_STORE_TYPE → Choose a store type from the list below.
Suggested Store Types:
grocery_store, restaurant, clothing_store, book_store, cafe, gym, hardware_store, pharmacy, flower_shop, bakery
3. Test the final URL in your browser. For example: https://api.eesita.me/recommend?zipcode=90210&store_type=cafe
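The same request can be scripted. Below is a minimal sketch using only Python's standard library; the base URL and parameter names are taken from the template above, and a real call would pass the resulting URL to an HTTP client:

```python
from urllib.parse import urlencode

BASE_URL = "https://api.eesita.me/recommend"

def build_recommend_url(zipcode: str, store_type: str) -> str:
    """Build a /recommend request URL from the two required query parameters."""
    return f"{BASE_URL}?{urlencode({'zipcode': zipcode, 'store_type': store_type})}"

url = build_recommend_url("90210", "cafe")
print(url)  # https://api.eesita.me/recommend?zipcode=90210&store_type=cafe
```

Using `urlencode` rather than string concatenation keeps the parameters safely escaped if a store type ever contains special characters.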
- Dynamic Weather Integration: Ingests 7-day weather forecasts to recommend timely, weather-appropriate promotions.
- Hyper-Local Market Analysis: Analyzes local competitor density, customer ratings, and business sentiment to identify market gaps and opportunities.
- Dual-Layer AI Strategy:
- Campaign Generator: An AI model that creates initial campaign ideas based on a comprehensive feature vector of the market.
- Marketing Expert AI: A second AI model that refines, validates, and enhances the initial ideas, ensuring they are realistic, compelling, and effective.
- RESTful API: Simple and clean API for generating recommendations on demand.
- Containerized & Cloud-Ready: Fully containerized with Docker for easy local setup and architected for scalable deployment on cloud services like AWS ECS.
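Competitor density, mentioned above, can be computed from the coordinates returned by the Places API. The helper below is an illustrative sketch (not the project's actual implementation), counting competitors within a fixed radius using the haversine formula:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # Earth radius ~6371 km

def competitor_density(store, competitors, radius_km=2.0):
    """Count competitors within radius_km of the target store."""
    return sum(
        1 for c in competitors
        if haversine_km(store["lat"], store["lng"], c["lat"], c["lng"]) <= radius_km
    )
```

The 2 km default radius is an arbitrary illustrative choice; a real pipeline would tune it per store category and urban density.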
The system follows a data-processing pipeline that culminates in AI-powered analysis:
- Data Fetching: The fetch_data module retrieves store and competitor data from the Google Places API and weather forecasts from the Open-Meteo API.
- Data Processing & Feature Engineering: Raw data is cleaned and key features are extracted, including competitor density, aggregated customer ratings, and weather data processed into actionable insights (src/).
- Sentiment Analysis: Customer reviews are analyzed to generate sentiment scores for different store categories.
- AI Recommendation Engine:
  - A detailed JSON feature_vector is constructed, summarizing the entire market context.
  - This vector is fed to the first Gemini AI model to generate initial campaign recommendations.
  - A second, "marketing expert" AI model reviews the initial output, along with raw aggregated data, to improve the campaigns' realism, psychological impact, and competitive edge.
- API Server: A Flask server exposes the /recommend endpoint to deliver the final JSON-formatted recommendations.
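The pipeline above can be sketched in a few lines. The field names in the feature vector are illustrative (the project's real schema is richer), and `call_model` is a stand-in for the two Gemini API calls:

```python
import json

def build_feature_vector(weather, competitors, sentiment):
    """Assemble a market-context dict for the first model (illustrative shape)."""
    return {
        "weather_summary": weather,
        "competitor_count": len(competitors),
        "avg_competitor_rating": round(
            sum(c["rating"] for c in competitors) / len(competitors), 2
        ) if competitors else None,
        "category_sentiment": sentiment,
    }

def generate_campaigns(feature_vector, call_model):
    """Two-stage generation: draft with the generator model, refine with the expert model."""
    draft = call_model("generator", json.dumps(feature_vector))
    # The expert pass sees the generator's output and can rewrite or reject ideas
    return call_model("expert", draft)
```

Separating the two stages behind a single `call_model(role, payload)` interface makes the expert layer easy to A/B test or disable.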
- Docker and Docker Compose installed.
- AWS CLI installed and configured (for AWS deployment).
- Valid API keys for:
- Google Cloud Platform (with Places API enabled)
- Google AI Studio (for Gemini API)
Clone the repository and create a .env file for your API keys.
git clone https://github.com/your-username/marketing-campaign-recommendation.git
cd marketing-campaign-recommendation
touch .env

Add your API keys to the .env file:
# .env
GEMINI_API_KEY="your_gemini_api_key_here"
GOOGLE_PLACES_API_KEY="your_google_places_api_key_here"

This is the recommended method for local development.
# Build and run the container in detached mode
docker-compose up --build -d

The application will be available at http://localhost:3000.
Other useful commands:
# View container logs
docker-compose logs -f
# Check running services
docker-compose ps
# Stop and remove the containers
docker-compose down

# Install Python dependencies
pip install -r requirements.txt
# Run the Flask application
python server.py

This application is designed to be deployed as a container on AWS Elastic Container Service (ECS) with a Fargate launch type.
- Build and Push the Docker Image to ECR:
  - The image must be built for the linux/amd64 platform to be compatible with AWS Fargate. This is pre-configured in the docker-compose.yml file.
  - Create an ECR repository and push the multi-platform image to it.
- Set Up ECS Cluster:
  - Create a new ECS cluster to host the service.
- Create a Task Definition:
  - This blueprint defines how to run the container. It specifies the ECR image URI, CPU/memory resources, port mappings, and environment variables.
  - Crucially, API keys should be injected securely using AWS Secrets Manager, not as plaintext environment variables.
- Create an Application Load Balancer (ALB):
  - Set up an ALB and a Target Group to expose the service to the internet.
  - The Target Group's health check should point to the / endpoint on the container's port (3000).
- Create the ECS Service:
  - The service launches the task and connects it to the ALB's Target Group.
  - Critical Networking Configuration: Ensure the service's Security Group allows inbound traffic on port 3000 from the ALB's Security Group so health checks pass.
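The ingress rule from the last step can be expressed as the parameters you would pass to EC2's `authorize_security_group_ingress` API (via boto3, for example). The security group IDs below are placeholders; substitute your own:

```python
# Placeholder security group IDs -- replace with the real ones.
SERVICE_SG = "sg-0123456789abcdef0"  # ECS service security group
ALB_SG = "sg-0fedcba9876543210"      # ALB security group

# Allow the ALB's security group (not a CIDR range) to reach port 3000.
ingress_rule = {
    "GroupId": SERVICE_SG,
    "IpPermissions": [
        {
            "IpProtocol": "tcp",
            "FromPort": 3000,
            "ToPort": 3000,
            "UserIdGroupPairs": [{"GroupId": ALB_SG}],
        }
    ],
}
# With boto3 and valid credentials:
# boto3.client("ec2").authorize_security_group_ingress(**ingress_rule)
```

Referencing the ALB's security group as the source, rather than 0.0.0.0/0, ensures only the load balancer can reach the container directly.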
GET /

- Description: Confirms that the API is running.
- Success Response (200 OK):

{ "status": "healthy" }
GET /recommend?zipcode={zipcode}&store_type={store_type}

- Description: Generates marketing campaigns based on the location and store type.
- Query Parameters:
  - zipcode (string, required): The target postal code (e.g., 90210).
  - store_type (string, required): The type of store (e.g., grocery_store, book_store).
- Example Request:

curl "https://api.eesita.me/recommend?zipcode=10001&store_type=clothing_store"

- Example Success Response (200 OK):

{
  "Insights": [
    "Insight 1: With a sunny, warm 7-day forecast, foot traffic is expected to increase.",
    "Insight 2: Local competition is moderate, but customer ratings for competitors are average, indicating an opportunity to capture market share with a superior experience."
  ],
  "Campaigns": [
    {
      "Campaign Title": "Sunshine & Style: Early Summer Showcase",
      "Campaign Description": "The weather is perfect for a wardrobe refresh! Visit us this week to explore our new summer collection. Enjoy a complimentary iced tea while you shop.",
      "Campaign Duration": "June 25, 2024 - July 1, 2024",
      "Discount/Promo": "15% off all new summer arrivals."
    }
  ]
}
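A consumer of the API can pull the pieces it needs out of the response body. This sketch is keyed to the example response above and uses only the two documented top-level keys:

```python
import json

def parse_recommendations(body: str):
    """Extract insights and campaign titles from a /recommend response body."""
    data = json.loads(body)
    insights = data.get("Insights", [])
    campaigns = data.get("Campaigns", [])
    titles = [c["Campaign Title"] for c in campaigns]
    return insights, titles

# Truncated sample mirroring the documented response shape
sample = (
    '{"Insights": ["Insight 1"], '
    '"Campaigns": [{"Campaign Title": "Sunshine & Style: Early Summer Showcase"}]}'
)
insights, titles = parse_recommendations(sample)
```

Using `.get` with list defaults keeps the parser tolerant of a response that omits one of the two keys.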
.
├── src/                   # Core data processing and feature engineering modules
│   ├── cleaning.py
│   ├── feature_extraction.py
│   ├── feature_pipeline.py
│   ├── fetch_data.py
│   ├── sentiment.py
│   └── weather_features.py
├── data/                  # Directory for storing intermediate data files (auto-generated)
├── logs/                  # Directory for storing logs (auto-generated)
├── .dockerignore
├── .gitignore
├── .env                   # Local environment variables (must be created manually)
├── docker-compose.yml     # Docker Compose configuration for local development
├── Dockerfile             # Defines the container image
├── README.md              # This file
├── requirements.txt       # Python dependencies
└── server.py              # Flask API server and main application logic
Weather Data (Open-Meteo):
- 7-day daily forecast
- Temperature (max/min)
- Precipitation
- Weather codes
- Humidity and wind data
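Open-Meteo reports conditions as WMO weather interpretation codes. A small helper can translate the common ones into promotion-friendly labels; the mapping below covers only a subset of the codes:

```python
# Subset of WMO weather interpretation codes as returned by Open-Meteo.
WMO_CODES = {
    0: "clear sky",
    1: "mainly clear",
    2: "partly cloudy",
    3: "overcast",
    45: "fog",
    51: "light drizzle",
    61: "slight rain",
    63: "moderate rain",
    65: "heavy rain",
    71: "slight snow",
    95: "thunderstorm",
}

def describe_weather(code: int) -> str:
    """Map a WMO code to a human-readable label, defaulting to 'unknown'."""
    return WMO_CODES.get(code, "unknown")
```

A campaign generator can branch on these labels, for example pairing "clear sky" days with outdoor promotions.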
Local Business Data (Google Places):
- Competitor store locations
- Customer ratings
- Store categories
- Geographic coordinates
- Business information
AI Capabilities (Gemini):
- Campaign generation
- Market insights
- Consumer psychology
- Competitive analysis
Security & Production:
- API keys stored as environment variables
- No sensitive data in codebase
- Production-ready configuration
- Health checks and error handling
This project is licensed under the MIT License.