Tweakcn-inspired AI + manual theme generator for shadcn/ui projects. Describe a vibe in chat, get paired light/dark palettes, and refine tokens (colors, typography, shadows, spacing, sidebar) with instant preview.
- AI Theme Chat – Convex actions stream LLM responses and tool calls that directly update `ThemeProvider` tokens in real time.
- Manual Theme Controls – Color pickers, typography selectors, and spacing/radius/shadow sliders for precise tweaks.
- Live Preview Playground – A shadcn component gallery reflects every change so designers and developers can validate instantly.
- Persistent Theme State – `ThemeProvider` syncs to `localStorage`, keeps light/dark tokens in sync when required, and supports sharing/export.
- Convex Backend – Convex functions handle chat persistence/streaming via `@convex-dev/persistent-text-streaming`; AI tools are typed with Zod schemas.
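The persistence behavior above can be sketched roughly as follows; the storage key, config shape, and helper names are illustrative assumptions, not the actual code in `src/providers/themeProvider.tsx`:

```typescript
// Hypothetical sketch of theme persistence — key name and shape are assumed.
const STORAGE_KEY = "theme-config";

interface ThemeConfig {
  light: Record<string, string>; // CSS variable name -> value
  dark: Record<string, string>;
}

// Minimal storage interface so the logic works with localStorage or a test double.
interface KVStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

function saveTheme(config: ThemeConfig, storage: KVStore): void {
  storage.setItem(STORAGE_KEY, JSON.stringify(config));
}

function loadTheme(storage: KVStore): ThemeConfig | null {
  const raw = storage.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as ThemeConfig) : null;
}

// Applying a palette amounts to writing each token as a CSS variable on :root.
function toCssVariables(tokens: Record<string, string>): string {
  return Object.entries(tokens)
    .map(([key, value]) => `${key}: ${value};`)
    .join(" ");
}
```

In the browser, `storage` would be `window.localStorage` and the variable string would be applied to `document.documentElement`.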
- React 19 + Vite
- Tailwind CSS with shadcn/ui primitives
- Convex backend + `@convex-dev/persistent-text-streaming`
- ai-sdk for model + tool orchestration (OpenRouter provider)
- Bun for package/runtime (scripts use `bun`)
1. Install dependencies: `bun install`
2. Configure environment:
   - Copy `.env.example` → `.env.local` (if present) or create `.env.local`.
   - Set `OPENROUTER_API_KEY=<your key>` so `convex/ai.ts` can call OpenRouter.
   - Optional Convex vars (auth, deployment) go in `.env.local` as well.
3. Run Convex dev tools once (recommended): `bunx convex dev --once` (generates `convex/_generated/*`)
4. Start the app: `bun run dev` (runs Vite + Convex together), then visit http://localhost:5173.
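A minimal `.env.local` might look like this; the OpenRouter key is the only value the AI chat strictly needs, and the commented Convex variable is an illustrative example rather than a required setting:

```bash
# Required: lets convex/ai.ts call OpenRouter
OPENROUTER_API_KEY=sk-or-...

# Optional Convex settings (names depend on your deployment setup)
# CONVEX_DEPLOYMENT=dev:your-deployment
```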
| Command | Description |
|---|---|
| `bun run dev:frontend` | Start Vite only |
| `bun run dev:backend` | Start Convex dev server only |
| `bun run build` | Type-check + Vite build to `dist/` |
| `bun run preview` | Serve production build locally |
| `bun run lint` | `tsc` + ESLint (treats warnings as errors) |
- Theme Provider (`src/providers/themeProvider.tsx`) stores the light/dark token config, writes CSS variables to `:root`, and persists them in `localStorage`.
- AI Tooling (`convex/lib/theme.ts`) defines the `updateThemeTokens` tool schema using Zod; the LLM can only change tokens declared in this schema.
- Streaming Chat:
  - The frontend posts the prompt plus the current theme snapshot to `api.messages.sendMessage`.
  - The Convex action `streamChat` runs `streamText` with our system prompt and tool.
  - When the model calls `updateThemeTokens`, we emit a base64 marker inside the stream; the client decodes it, applies the updates via `ThemeProvider`, and shows a Task summary in the chat panel.
- Manual Controls – Tabs under `src/components/controls` let users edit tokens directly (color pickers, sliders, etc.), debounced into `ThemeProvider`.
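The marker round-trip in the streaming flow can be sketched like this; the `[[THEME:…]]` delimiter and payload shape are assumptions for illustration, not the project's actual wire format, and Node's `Buffer` stands in for whatever base64 codec the client uses:

```typescript
// Hypothetical delimiters — the real marker format lives in the Convex action.
const MARKER_START = "[[THEME:";
const MARKER_END = "]]";

interface ThemeUpdate {
  mode: "light" | "dark";
  tokens: Record<string, string>;
}

// Server side: wrap a tool-call result as a base64 marker embedded in the text stream.
function encodeThemeMarker(update: ThemeUpdate): string {
  const b64 = Buffer.from(JSON.stringify(update), "utf8").toString("base64");
  return `${MARKER_START}${b64}${MARKER_END}`;
}

// Client side: strip markers out of a streamed chunk and decode their payloads.
function extractThemeMarkers(chunk: string): { text: string; updates: ThemeUpdate[] } {
  const updates: ThemeUpdate[] = [];
  const text = chunk.replace(
    /\[\[THEME:([A-Za-z0-9+\/=]+)\]\]/g,
    (_match, b64: string) => {
      updates.push(JSON.parse(Buffer.from(b64, "base64").toString("utf8")));
      return ""; // markers stay invisible in the chat transcript
    },
  );
  return { text, updates };
}
```

Encoding the payload as base64 keeps the JSON opaque to the streaming layer, so theme updates ride inside the ordinary text stream without a separate channel.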
- Add new tokens – Extend `themeVariableKeys` and `themeUpdatesSchema`, then update the UI controls and preview components to use them.
- Model provider – Swap OpenRouter in `convex/ai.ts` for another provider compatible with ai-sdk.
- Prompt tuning – `SYSTEM_MESSAGE` in `convex/ai.ts` contains all guardrails (spacing limits, light/dark sync); adjust it for new rules.
- Convex codegen errors – Run `bunx convex dev --once` after modifying anything in `convex/`.
- AI responses missing updates – Ensure your OpenRouter key is valid and the model supports tool calls.
- Lint failures – `bun run lint` fails on warnings per repo policy; fix ESLint and TypeScript warnings before committing.
- Known issue – The "dark-to-light" theme converter flow is still unreliable; expect mismatched tokens when transplanting dark palettes into light mode. Track progress in the issue tracker.