Turns Slack + GitHub activity into structured work analytics for Engineering Managers, Tech Leads, and CTOs.
- Slack activity — pulls messages from public and private channels you're in, including standup bots (Standuply-style)
- GitHub activity — PRs (created + reviewed) cross-org via Search API + PAT, with optional per-repo deep mode for commits/issues
- Team management — add engineers to your roster; they don't need to sign in
- Batch sync — sync yourself, your whole team, or any subset at once with per-member progress
- Background sync — runs in a daemon thread; switch pages freely without losing progress
- Flexible date ranges — "Last N days" or custom range; default 3 months
- AI classification — Claude surfaces feature work, bug fixes, architecture, mentorship, and incidents
- Shareable reports — activity feed, AI insights, and one-click copy summary per member
- Database cleanup — remove ignored-channel data or stale member data in bulk
- Notion Dev Track Sync — reads per-developer track pages from a Notion database and syncs skill status + objectives into the Google Sheet snapshot
| Layer | Technology |
|---|---|
| UI | Streamlit |
| Database | PostgreSQL (SQLAlchemy async + NullPool) |
| Slack | Sign in with Slack — user OAuth (no bot, no webhooks) |
| GitHub | GitHub REST + Search API — Personal Access Token (PAT) |
| AI | Anthropic Claude API |
| Notion | Notion API v1 — Internal Integration token |
| Migrations | Alembic |
The optional FastAPI app (app/main.py) exposes JSON routes under /api/* and POST /auth/github/link. Threat model: treat any host/port that serves this app as sensitive. In production, set INTERNAL_API_KEY (at least 16 characters) and send it as the X-Internal-Key header from callers; APP_ENV=production refuses to start without a strong APP_SECRET_KEY and INTERNAL_API_KEY. Set CORS_ORIGINS to a comma-separated allowlist if a browser (not Streamlit server-side) calls the API. For local Docker, bind Uvicorn to 127.0.0.1 unless you intend the API to be reachable on the LAN. Slack browser OAuth callbacks (/auth/slack*) stay unauthenticated by design; they rely on signed OAuth state plus Slack’s code exchange.
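The header check itself is simple; a minimal sketch (the helper name is hypothetical — the app's actual gating code may differ) shows why a constant-time comparison matters:

```python
import hmac

def check_internal_key(header_value: str, expected: str) -> bool:
    """Hypothetical sketch of an X-Internal-Key check, not the app's real code.

    hmac.compare_digest is constant-time, so a caller cannot recover the key
    byte-by-byte from response-timing differences.
    """
    if len(expected) < 16:
        # Mirrors the documented >= 16 character requirement.
        raise ValueError("INTERNAL_API_KEY must be at least 16 characters")
    return hmac.compare_digest(header_value.encode(), expected.encode())
```

Callers would send the same value in the `X-Internal-Key` request header.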
| Item | Notes |
|---|---|
| `APP_ENV=production` | Enables stricter validation and redacts exception text in the Streamlit UI and FastAPI JSON errors so secrets are less likely to leak through error pages. |
| `APP_SECRET_KEY` | Random string, ≥ 32 characters — signs Streamlit session URL tokens and Slack OAuth state. |
| `INTERNAL_API_KEY` | ≥ 16 characters — required header `X-Internal-Key` for gated FastAPI routes. |
| `CORS_ORIGINS` | Comma-separated browser origins if anything client-side calls FastAPI; empty in production means no cross-origin browser access. |
| HTTPS only | Slack OAuth redirects and session hand-off query params must not go over plaintext HTTP. |
| PostgreSQL | Use encryption at rest (default on most managed DBs). Lock down network access (VPC, firewall, non-public binding). Slack user OAuth tokens are stored in slack_user_tokens.access_token. |
| Migrations + Slack | After upgrades that add client_session_key, have each operator run Sign in with Slack once on Connect Accounts so session restore stays bound to the correct user. |
| Operational hygiene | Do not paste secrets, tokens, or full tracebacks into tickets or screenshots; use host logs and secret managers. Rotate GITHUB_PAT and INTERNAL_API_KEY on suspected compromise. |
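Parsing `CORS_ORIGINS` into an allowlist is straightforward; a sketch with a hypothetical helper name:

```python
def parse_cors_origins(raw: str) -> list[str]:
    """Split the comma-separated CORS_ORIGINS value into an allowlist.

    An empty value yields an empty list — i.e. no cross-origin browser access.
    """
    return [origin.strip() for origin in raw.split(",") if origin.strip()]
```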
No Redis or Celery. Syncs run in daemon threads within the Streamlit process.
- Python 3.13+
- A PostgreSQL database (Supabase recommended for cloud, Docker for local)
- A Slack OAuth app
- A GitHub Personal Access Token (PAT) with `repo` + `read:org` scopes
- An Anthropic API key
- A Google Cloud service account with Sheets API access — for Developer Track
- A Notion Internal Integration token — for Notion Dev Track Sync
Why cloud first? Slack OAuth requires `https://` redirect URLs — even for `localhost`. Streamlit Cloud gives you HTTPS out of the box with zero cert setup.
a. Create the OAuth app — api.slack.com/apps → Create New App → From Scratch.
b. Add User Token Scopes under OAuth & Permissions → User Token Scopes:
- `channels:history` — read public channel messages
- `channels:read` — list public channels
- `groups:history` — read private channel messages
- `groups:read` — list private channels
- `users:read` — resolve user profiles
- `users:read.email` — resolve user emails
- `usergroups:read` — resolve @subteam mentions to group handles
These must be User Token Scopes, not Bot Token Scopes.
c. Set the redirect URL — OAuth & Permissions → Redirect URLs → add https://yourapp.streamlit.app (root URL, no /callback path).
d. Copy credentials — copy the Client ID and Client Secret from Basic Information. You'll paste them into the secrets step below.
a. Create a Personal Access Token — github.com/settings/tokens → Generate new token (classic).
| Scope | Why |
|---|---|
| `repo` | Read PRs, reviews, commits in private + public repos |
| `read:org` | Cross-org Search API visibility (needed for Overview-mode sync) |
b. Copy the token (ghp_… or github_pat_…) and set it as GITHUB_PAT in .env or Streamlit secrets. The PAT is never stored in the database — rotate by updating the secret and rebooting. One PAT covers the whole team.
- Push your code to GitHub
- Go to share.streamlit.io → New app → select repo/branch → set main file to `streamlit_app.py` → Deploy
Streamlit Cloud blocks direct TCP on port 5432 — use Supabase's Transaction Pooler (port 6543).
- Create a Supabase project → Settings → Database → Connection Pooling → Mode: Transaction
- Copy the pooler connection string: `postgresql+asyncpg://postgres.xxxx:[password]@aws-0-us-east-1.pooler.supabase.com:6543/postgres?ssl=require`
| Connection type | Port | Streamlit Cloud |
|---|---|---|
| Direct | 5432 | ❌ Blocked |
| Session pooler | 5432 | ❌ Blocked |
| Transaction pooler | 6543 | ✅ Works |
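A quick guard like the following (hypothetical helper, stdlib only) can catch a mis-pasted direct-connection URL before the app ever tries to connect:

```python
from urllib.parse import urlsplit

def is_pooler_url(database_url: str) -> bool:
    """True only for the port-6543 transaction pooler — the one connection
    type Streamlit Cloud can reach (see the table above)."""
    return urlsplit(database_url).port == 6543
```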
Run migrations from your local machine using the direct connection (port 5432):
```bash
DATABASE_URL="postgresql+asyncpg://postgres:[password]@db.xxxx.supabase.co:5432/postgres" \
.venv/bin/alembic upgrade head
```

Go to your Streamlit app → ⋮ → Settings → Secrets:
```toml
DATABASE_URL = "postgresql+asyncpg://postgres.xxxx:[password]@aws-0-us-east-1.pooler.supabase.com:6543/postgres?ssl=require"
SLACK_CLIENT_ID = "..."
SLACK_CLIENT_SECRET = "..."
# GitHub PAT — single token for the whole team. Required.
# Required scopes: repo + read:org.
GITHUB_PAT = "ghp_..."
ANTHROPIC_API_KEY = "sk-ant-..."
ANTHROPIC_MODEL = "claude-sonnet-4-6"
APP_BASE_URL = "https://yourapp.streamlit.app"
APP_SECRET_KEY = "..."  # see note below
ENABLE_AI_EXTRACTION = "true"
# See "Developer Track (Google Sheets)" under Usage
GOOGLE_SHEETS_CREDENTIALS_JSON = ""
DEV_TRACK_SHEET_ID = ""
# See "Notion Dev Track Sync" under Usage
NOTION_API_KEY = ""
NOTION_DEV_TRACK_DATABASE_ID = ""
NOTION_DEV_TRACK_VIEW_ID = ""  # optional — filter by a specific view
```
`APP_SECRET_KEY` signs the session cookie that keeps you logged in across page navigations and OAuth redirects. Generate one with `python -c "import secrets; print(secrets.token_hex(32))"`. Use the same key every deploy — changing it invalidates all existing sessions.
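The signing scheme can be sketched roughly like this (function names and token format are illustrative, not the app's actual implementation):

```python
import hashlib
import hmac
import secrets

def sign_state(secret_key: str) -> str:
    """Produce a nonce plus an HMAC-SHA256 signature keyed by APP_SECRET_KEY."""
    nonce = secrets.token_urlsafe(16)
    sig = hmac.new(secret_key.encode(), nonce.encode(), hashlib.sha256).hexdigest()
    return f"{nonce}.{sig}"

def verify_state(secret_key: str, state: str) -> bool:
    """Re-derive the signature and compare in constant time."""
    nonce, _, sig = state.rpartition(".")
    expected = hmac.new(secret_key.encode(), nonce.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)
```

Because verification re-derives the signature from the key, rotating `APP_SECRET_KEY` invalidates every outstanding token — which is exactly why the key must stay stable across deploys.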
Click Reboot app after saving.
Go to 🔗 Connect Accounts → Sign in with Slack.
GitHub uses a single server-wide PAT configured via the GITHUB_PAT env var / Streamlit secret — no PAT is pasted in the UI and none is stored in the DB. The Connect page only validates the server PAT and links your GitHub handle so sync can query your PRs/reviews.
One PAT covers the whole team. Required scopes: `repo` + `read:org` (for cross-org Search API access).
Go to 👥 Team Overview → Load workspace users → select your direct reports → Add selected members.
Set each member's GitHub handle (github_login) so the manager's PAT can query their PRs/reviews via the Search API.
Team members do not need to sign in. They do not need their own PAT.
Go to 🔄 Sync Data, pick members (All / My Team / individual), set the date range, and click Sync Slack or Sync GitHub.
The sync runs in the background — switch pages freely and come back to see progress.
GitHub sync — two modes:
| Mode | Toggle | What it pulls | When to use |
|---|---|---|---|
| 🔭 Overview mode (default) | ON | PRs created + PRs reviewed by member, across all organizations the PAT can see | Default. Mirrors github.com/<user>?tab=overview&from=YYYY-MM-DD&to=YYYY-MM-DD. Cross-org. Fast. Single Search API query per member. |
| Per-repo mode | OFF | Commits + PRs + reviews + issues, repo-by-repo (only repos the token lists) | When you need commits, or the member is in a small fixed set of repos and you want the full payload. Slower. |
Overview mode uses the GitHub Search API with your Personal Access Token (PAT) to pull everything the member did in any org your PAT has visibility into — even repos you've never indexed. Skips commits (commit-search is heavily rate-limited on GitHub's side).
Token resolution (per member):
- `GITHUB_PAT` env/secret + member's stored `github_login` → sync.
- Either missing → skip.
No DB-stored PAT path. Set GITHUB_PAT in env/secrets and each member's handle in Team Overview.
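The skip rule is simple enough to sketch (hypothetical helper; the `env` and `member` dict shapes are assumed for illustration):

```python
def resolve_sync_inputs(env: dict, member: dict):
    """Return (pat, login) when both inputs exist, else None (skip member)."""
    pat = env.get("GITHUB_PAT")
    login = member.get("github_login")
    if not pat or not login:
        return None  # either missing -> this member is skipped
    return pat, login
```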
Date range maps directly to the overview URL: `from=sync_start.date()`, `to=sync_end.date()` (or "now" if open-ended).
Slack sync behaviour:
| Scenario | What's captured |
|---|---|
| Syncing yourself | All messages from all joined channels |
| Syncing a team member | Only messages sent by or @mentioning that member |
| Standup bot (Standuply-style) | Bot's username matched to member's display name |
Ignored channels (always skipped): `nimble-*`, `*-activity`, `*-corner`, `ic-*`, and a few exact names.
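The ignore rule can be sketched with glob matching (the exact-name set below is a made-up placeholder; the real list differs):

```python
from fnmatch import fnmatch

IGNORED_PATTERNS = ["nimble-*", "*-activity", "*-corner", "ic-*"]
IGNORED_EXACT = {"random"}  # placeholder — not the app's real exact-name list

def is_ignored(channel: str) -> bool:
    """True when a channel name matches an ignore glob or an exact name."""
    return channel in IGNORED_EXACT or any(fnmatch(channel, p) for p in IGNORED_PATTERNS)
```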
Database cleanup (bottom of Sync page): remove ignored-channel data across all users, or clear stale data for removed members.
For demo deployments (for example Streamlit Cloud), you can run scheduled syncs without an always-on Celery worker by using the included workflow:
- Workflow file: `.github/workflows/scheduled-sync.yml`
- Trigger: weekdays at 07:30 in `Asia/Ho_Chi_Minh` (ICT, UTC+7 — same wall clock as `Asia/Bangkok`)
- Execution mode: inline dispatcher (`dispatch_due_sync_schedules_inline`) — no Redis broker required
Setup steps:
- Push this repository (including the workflow) to GitHub.
- In GitHub repo settings, add these Actions Secrets:
  - `DATABASE_URL`
  - `SLACK_CLIENT_ID`
  - `SLACK_CLIENT_SECRET`
  - `APP_SECRET_KEY`
  - `INTERNAL_API_KEY` (required, ≥ 16 chars when `APP_ENV=production`)
  - `GH_PAT` (required only if scheduled GitHub sync is enabled)
- In Actions tab, run Scheduled Sync once via Run workflow to verify.
- Confirm logs show `Scheduled sync inline execution completed.` and that due schedules update `last_run_at` / `next_run_at`.
Notes:
- The workflow uses GitHub's `timezone` field so the cron expresses local 07:30 ICT directly (no mental UTC conversion). Without `timezone`, cron is interpreted in UTC only.
- If no schedule is due at runtime, the job exits successfully without syncing.
- This mode is intended for demos/low volume. For higher throughput, use Celery worker + beat + Redis.
Go to 📊 Work Report, select a member, choose a date range, and click Generate Report.
- 🔗 Pull Request Links — three grouped expanders (PRs Created · PRs Merged · PRs Reviewed) with direct links, deduped per PR
- Activity Feed — commits, PRs, reviews, standups, all browsable
- AI Insights — Claude-powered work classification and leadership summary
- Developer Track — level + skill progress from a Google Sheet (see below)
- Share Summary — one-click copy for Slack, email, or docs
Shows each member's skill-vetting progress in the Work Report. Cell background colors drive status:
| Swatch | Color | Status |
|---|---|---|
| 🟢 | green | vetted |
| 🔵 | blue | in progress |
| 🟣 | purple | proposed |
| 🟡 | yellow | focus |
| ⚪ | white | not started |
Cell notes render inline.
Sheet format — one tab per person. The tab name must share a token (≥3 chars) with the member's Slack display name, real name, or email local-part (e.g. don.vo@… → Don Vo), so first name, last name, or full name all work. Column A = integer level, column B = level title, column C = Technical skills, column D = Soft skills.
| Col A | Col B | Col C — Technical skills | Col D — Soft skills |
|---|---|---|---|
| 3 | Junior Software Developer | skill text | skill text |
| 4 | Mid-senior Software Developer | skill text | skill text |
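The tab-matching rule can be sketched like this (hypothetical helper; the real matcher may tokenize differently):

```python
def _tokens(name: str) -> set[str]:
    """Lowercased tokens of >= 3 chars; dots split so email local-parts
    like 'don.vo' yield the same tokens as 'Don Vo'."""
    return {t.lower() for t in name.split("@")[0].replace(".", " ").split() if len(t) >= 3}

def tab_matches_member(tab_name: str, slack_names: list[str]) -> bool:
    """True when the tab shares a >=3-char token with any Slack display
    name, real name, or email local-part."""
    tab_tokens = _tokens(tab_name)
    return any(tab_tokens & _tokens(name) for name in slack_names)
```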
Setup:
- In Google Cloud Console, enable the Google Sheets API and create a Service Account (no roles needed). Add a JSON key — a file downloads.
- In the sheet, click Share and grant Editor to the key's `client_email` (Editor is required so the Notion Dev Track Sync can write back).
- Set `GOOGLE_SHEETS_CREDENTIALS_JSON` (full JSON as a single-line string) and `DEV_TRACK_SHEET_ID` (URL segment between `/d/` and `/edit`) in secrets / `.env`. Reboot.

TOML tip: wrap the JSON in single quotes so its double quotes parse; leave `\n` in `private_key` as-is.
Troubleshooting: "No developer-track tab found" → rename the tab to include the member's first name, last name, or full name as it appears in Slack. "Caller does not have permission" → share the sheet with client_email. "not valid JSON" → re-copy the full file, single-quote-wrapped.
Reads per-developer track data from a Notion database and writes skill statuses, objectives, and evidence notes into the matching Google Sheet tab. Notion is always treated as the source of truth; the Sheet is the snapshot.
Notion database setup:
- Each database entry represents one developer.
- Page title format: `{developer name} <> {manager name}` — e.g. `Don <> Mike`.
- Each page body must contain a `## Skills Development` section with `### Level N` headings, toggle/bullet skills (bold), and `- [ ]` / `- [x]` to-do objectives.
- Optionally add a `## Focus Areas` section with bulleted skill names; those skills get "focus" (yellow) status in the Sheet.
Setup:
- In Notion: Settings → Connections → Develop or manage integrations → New integration. Set type to Internal, copy the Internal Integration Secret.
- Share the Notion database with the integration (open the database → ··· → Connections → add your integration).
- Copy the database ID from its URL: `notion.so/.../{DATABASE_ID}?v={VIEW_ID}`.
- (Optional) Copy the view ID — the `v=` segment of the same URL — if you want the sync to mirror a specific Notion view's filter + sort (e.g. only active developers). Leave blank to sync every page in the database.
- Set `NOTION_API_KEY` (integration secret), `NOTION_DEV_TRACK_DATABASE_ID`, and optionally `NOTION_DEV_TRACK_VIEW_ID` in secrets / `.env`.
- Re-share the Google Sheet with the service account as Editor (Viewer was enough for the Work Report read path; writes need Editor).
- Navigate to 📋 Notion Dev Track Sync in the app → Fetch from Notion → preview matches → Sync.
View-based filtering uses Notion's Views API (`/v1/views/{view_id}/queries`, `Notion-Version: 2026-03-11`). Your integration must have access to the parent database; shared-view access is inherited from the database share.
Sync behaviour:
- Notion = source of truth; Sheet = snapshot.
- Skills in Notion but not in Sheet → added. Skills in Sheet but not in Notion → untouched (no deletes). Note mismatch → Notion wins.
- Status from objective phrasing (priority):
  - Unchecked V-ing ("Working as…", "Actively raising…") or `In-progress:` / `In-review:` → blue.
  - Unchecked `New objective:` / `To-review objective:` → yellow.
  - All checked + current cell blue/yellow → white.
  - Current cell green/purple → unchanged.
- Focus intent driven by to-do items only, never skill title.
- Focus Areas kept in sync: skills gaining focus intent → bullet added; losing it → bullet removed. Bullets are appended inside the heading when it is toggleable, otherwise as siblings directly under the heading (never at the page bottom).
- Preview before apply — per-member diff shown; sync only on explicit button. Per-member failures don't block others.
- Timestamps render in GMT+7 (DB stays UTC).
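The status rules above can be sketched as a single decision function (illustrative only; each objective is modeled as a `(text, checked)` pair, which is an assumed shape):

```python
def skill_status(objectives: list[tuple[str, bool]], current: str) -> str:
    """Apply the documented priority: green/purple untouched, then blue,
    then yellow, then white when everything is checked."""
    if current in ("green", "purple"):
        return current  # vetted/proposed cells are never changed
    unchecked = [text for text, checked in objectives if not checked]
    if any(text.startswith(("In-progress:", "In-review:"))
           or (text.split() or [""])[0].endswith("ing")  # "Working as…" style
           for text in unchecked):
        return "blue"
    if any(text.startswith(("New objective:", "To-review objective:")) for text in unchecked):
        return "yellow"
    if not unchecked and current in ("blue", "yellow"):
        return "white"
    return current
```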
The ingester handles two patterns:
Thread replies — member replies to a bot's top-level post with their own account. Captured via conversations.replies.
Bot-reposted summaries (Standuply-style) — bot reposts each member's standup as a bot_message with username set to the member's full name. The ingester matches this against the team roster (case-insensitive). The member must be in Team Overview with a matching display name.
Note: Slack OAuth rejects plain `http://` redirect URLs. Run Streamlit over HTTPS locally using self-signed certs.
```bash
git clone <repo-url>
cd el-operation-manager-ai
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
cp .env.example .env
```

Required keys:
```bash
APP_SECRET_KEY=<run: python -c "import secrets; print(secrets.token_hex(32))">
APP_BASE_URL=https://localhost:8501
DATABASE_URL=postgresql+asyncpg://postgres:postgres@localhost:5432/el_ops
SLACK_CLIENT_ID=...
SLACK_CLIENT_SECRET=...
GITHUB_PAT=ghp_...            # repo + read:org. Never stored in DB.
ANTHROPIC_API_KEY=sk-ant-...
```

Optional (enable their features): `GOOGLE_SHEETS_CREDENTIALS_JSON` + `DEV_TRACK_SHEET_ID` (Developer Track), `NOTION_API_KEY` + `NOTION_DEV_TRACK_DATABASE_ID` (+ optional `NOTION_DEV_TRACK_VIEW_ID`).
APP_SECRET_KEY signs the session cookie — keep it stable across restarts or your Slack session resets every reload.
Config source priority: `st.secrets` (if present) → `.env` file.
```bash
make up       # docker compose up -d db redis (Redis is defined but unused by the Streamlit UI)
make migrate  # alembic upgrade head
```

Syncs run in daemon threads inside the Streamlit process — no Celery worker / Redis needed at runtime. The `worker` / `beat` / `api` Makefile targets exist for legacy paths and are not required for normal use.
```bash
./scripts/setup_local_https.sh
```

Uses mkcert if available, otherwise falls back to openssl. Writes certs to `.certs/localhost.pem` and `.certs/localhost-key.pem`.
```bash
streamlit run streamlit_app.py \
  --server.port 8501 \
  --server.address localhost \
  --server.sslCertFile .certs/localhost.pem \
  --server.sslKeyFile .certs/localhost-key.pem
```

Set the Slack OAuth redirect URL to `https://localhost:8501`. GitHub needs no redirect — the `GITHUB_PAT` env var is read directly at sync time.
```
streamlit_app.py               # Main UI + OAuth callback handler
pages/
├── 1_Connect.py               # Slack OAuth + GitHub handle linking
├── 2_Work_Report.py           # Work reports — charts, feed, share summary
├── 3_Team_Overview.py         # Team management — add/remove/edit members
├── 4_Sync.py                  # Slack + GitHub sync with background progress
└── 5_Notion_Dev_Track.py      # Notion dev track preview, diff, and sync
app/
├── config.py                  # Settings (pydantic-settings + .env)
├── database.py                # SQLAlchemy async engine + session
├── models/                    # SlackMessage, GitHubActivity, WorkUnit, etc.
├── ingestion/                 # Slack + GitHub data ingestion
├── normalization/             # Raw → WorkUnit normalizer
├── analytics/                 # Report aggregation + Notion dev track parser/sync
├── ai/                        # Claude-powered classification + insights
├── integrations/              # Google Sheets + Notion async clients
├── slack/                     # OAuth flow + workspace user listing
└── github/                    # GitHub handle validation + user linking (oauth.py)
```
```bash
ENABLE_AI_EXTRACTION=true        # AI work classification from standups
ENABLE_BURNOUT_DETECTION=false   # Not yet implemented
ENABLE_ORG_ANALYTICS=false       # Not yet implemented
```

- Only channels the EM has joined are synced
- For team member syncs, only that member's messages are stored
- GitHub access uses a single server-wide PAT set via the `GITHUB_PAT` env/secret; it is never stored in the DB. Revoke any time at github.com/settings/tokens.
- All data stays in your own database
- Team members can be removed at any time