cto-new bot commented Dec 10, 2025

Summary

Introduce a new Browser AI chat component that runs entirely in the browser using WebGPU and the @mlc-ai/web-llm library. It is exposed as a custom element and integrated into the component registry.
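For context, engine initialization with @mlc-ai/web-llm typically follows the pattern sketched below. This is a minimal illustration, not the exact code in BrowserAI.tsx; in particular, the model id and progress logging are assumptions.

```ts
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function initChatEngine() {
  // Downloads model weights into the browser and compiles WebGPU kernels.
  // The model id below is an illustrative assumption, not necessarily the
  // one BrowserAI.tsx ships with.
  const engine = await CreateMLCEngine("Llama-3.2-1B-Instruct-q4f32_1-MLC", {
    initProgressCallback: (report) => console.log(report.text),
  });

  // OpenAI-style chat completions API, running entirely client-side.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Hello from the browser!" }],
  });
  console.log(reply.choices[0].message.content);
  return engine;
}
```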

Details

  • Added BrowserAI.tsx implementing an in-browser AI chat UI and engine initialization
  • Exposed as the browser-ai-chat custom element and wired into main.jsx
  • Updated package.json to include @mlc-ai/web-llm
  • Added example/docs to demonstrate usage and WebGPU requirements (see the usage sketch after this list)
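As a rough usage sketch, a host page might mount the component only when WebGPU is available. The browser-ai-chat tag name comes from this PR; the feature check and mounting logic are assumptions about how a consumer could use it, and any attributes or props the element accepts are not shown because they are not documented here.

```ts
// Mount the chat element only when WebGPU is available.
if ("gpu" in navigator) {
  const chat = document.createElement("browser-ai-chat");
  document.body.appendChild(chat);
} else {
  console.warn("WebGPU is unavailable; the in-browser AI chat cannot run.");
}
```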

Warning: The task VM test is not passing; cto.new will perform much better if you fix the setup.
