
Add tool calling support#538

Draft
Aaryan-Kapoor wants to merge 3 commits into Vali-98:master from Aaryan-Kapoor:feature/tool-call-mcp-support

Conversation


Aaryan-Kapoor commented Mar 22, 2026

Hey Folks,

Related to #535

This PR paves the way for future MCP tool support. Any recommendations on local vs. remote MCP support are welcome too!

In short, this PR:

  • Adds OpenAI-compatible tool calling with streaming support and an agentic inference loop (generate → detect tool calls → execute → re-generate, up to 10 rounds)
  • Includes DB schema migration for tool definitions, tool call metadata on swipes, and a role field on chat entries
  • Ships with two built-in tools (get_current_datetime, and calculate with a safe math parser) and a Tool Manager UI for creating/editing/toggling custom tool definitions (primarily to pave the way for MCP support)
  • Tool call accumulator handles incremental JSON parsing from SSE streams, including parallel tool calls via index-based accumulation
  • Enabled for OpenAI, OpenRouter, Google AI Studio, and Chat Completions API templates via the useTools feature flag
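The agentic loop described above (generate → detect tool calls → execute → re-generate, capped at 10 rounds) can be sketched roughly as follows. The `runAgenticLoop`, `generate`, and `executeTool` names and the message shapes here are illustrative assumptions, not the PR's actual API:

```typescript
type Message =
    | { role: 'user' | 'assistant' | 'system'; content: string }
    | { role: 'assistant'; content: string; tool_calls: ToolCall[] }
    | { role: 'tool'; tool_call_id: string; content: string }

interface ToolCall {
    id: string
    name: string
    args: unknown
}

const MAX_ROUNDS = 10

async function runAgenticLoop(
    messages: Message[],
    generate: (msgs: Message[]) => Promise<{ text: string; toolCalls: ToolCall[] }>,
    executeTool: (call: ToolCall) => Promise<string>
): Promise<string> {
    for (let round = 0; round < MAX_ROUNDS; round++) {
        const { text, toolCalls } = await generate(messages)
        // No tool calls requested: the model's text is the final answer.
        if (toolCalls.length === 0) return text
        // Record the assistant turn that requested the tools...
        messages.push({ role: 'assistant', content: text, tool_calls: toolCalls })
        // ...then execute each tool and append its result as a tool message,
        // so the next generate() call sees the results.
        for (const call of toolCalls) {
            const result = await executeTool(call)
            messages.push({ role: 'tool', tool_call_id: call.id, content: result })
        }
    }
    throw new Error('Tool call loop exceeded maximum rounds')
}
```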

What's included

  • Engine: ToolCallAccumulator, ToolExecutor, ToolTypes, built-in tool definitions
  • Inference: Agentic loop in chatInferenceStreamWithTools with abort support
  • Context/Request builders: Tool message serialization (role: "tool", tool_calls, tool_call_id)
  • DB: Migration 0018 - tool_definitions table, role on chat_entries, tool_calls/tool_call_id on chat_swipes
  • State: ToolState zustand store with DB-backed CRUD and built-in tool seeding
  • UI: Tool Manager screen (list/add/edit/delete/toggle), collapsible tool call indicators and tool result headers in ChatBubble
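The index-based accumulation idea behind ToolCallAccumulator can be sketched like this. OpenAI-style SSE streams send each tool call's id and name once, then argument JSON in fragments, all keyed by the same `index`; the class and field names below are illustrative, and the real implementation in the PR may differ:

```typescript
interface ToolCallDelta {
    index: number
    id?: string
    function?: { name?: string; arguments?: string }
}

interface AccumulatedCall {
    id: string
    name: string
    arguments: string // raw JSON text, concatenated across chunks
}

class ToolCallAccumulator {
    private calls = new Map<number, AccumulatedCall>()

    // Merge one streamed delta into the call at its index. Parallel tool
    // calls arrive interleaved, distinguished only by index.
    add(delta: ToolCallDelta): void {
        const existing = this.calls.get(delta.index) ?? { id: '', name: '', arguments: '' }
        if (delta.id) existing.id = delta.id
        if (delta.function?.name) existing.name += delta.function.name
        if (delta.function?.arguments) existing.arguments += delta.function.arguments
        this.calls.set(delta.index, existing)
    }

    // Once the stream ends, parse each call's accumulated argument string.
    finish(): { id: string; name: string; args: unknown }[] {
        return [...this.calls.entries()]
            .sort(([a], [b]) => a - b)
            .map(([, c]) => ({ id: c.id, name: c.name, args: JSON.parse(c.arguments || '{}') }))
    }
}
```

Deferring `JSON.parse` until `finish()` is what makes incremental delivery safe: argument fragments are rarely valid JSON on their own.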

Known limitations

  • OpenAI format only - Claude native tool_use format not yet supported
  • Custom tools are definition-only (sent to the model, but with no local execution handler; this paves the way for MCP)
  • Still a WIP since this is a huge change; I've verified that basic tool calling works. Contributions are welcome :)
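For reference on the first limitation, the OpenAI-compatible message shapes this PR targets look roughly like the following. Field names follow the OpenAI Chat Completions format; the id and values are invented for illustration:

```typescript
// Assistant turn requesting a tool call (OpenAI Chat Completions format).
const assistantTurn = {
    role: 'assistant' as const,
    content: null,
    tool_calls: [
        {
            id: 'call_abc123',
            type: 'function' as const,
            function: { name: 'get_current_datetime', arguments: '{}' },
        },
    ],
}

// Tool result turn; tool_call_id must echo the id of the call it answers.
const toolTurn = {
    role: 'tool' as const,
    tool_call_id: 'call_abc123',
    content: '2026-03-22T10:15:00Z',
}
```

Claude's native tool_use content blocks use a different shape entirely, which is why they need separate handling.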

Screenshots

Using GPT-5.4-nano through OpenRouter and the default AI Bot


Aaryan-Kapoor (Author) commented:

I'd appreciate some feedback on how to handle the UI tool surfacing.

Right now, it just shows up as a user message, which might seem somewhat unsettling and lead to a lot of scrolling!

Vali-98 (Owner) commented Mar 25, 2026

Hey there, thanks for the PR!

Just to let you know, PRs should preferably target the dev branch, as many underlying processes have likely changed. Otherwise, I'll take a proper look at this over the weekend.

Vali-98 (Owner) commented on a diff hunk:

```typescript
 * Returns extracted text content (if any).
 */
processChunk(parsed: any): { text: string | null } {
    const choice = parsed?.choices?.[0]
```
This extractor should be using the API-defined message extractor.

Vali-98 (Owner) commented Mar 31, 2026

I left a few comments on the implementation. There is also the matter of how to represent this in the chat: I believe the idea of having tools as separate messages from the tool caller/response is viable, so it's just a matter of prettifying the UI for it.
