
Conversation

@lgandecki lgandecki commented Dec 30, 2025

I took https://github.com/robertpelloni/claude-mem/, which from what I can see was 100% vibecoded and untested, and had some obvious omissions. I spent today massaging it into working shape. At first I tried to make it work directly with oh-my-opencode through their compatibility layer ( code-yeongyu/oh-my-opencode#341 ), but it turns out there are limitations in what's possible there, and building a custom plugin is probably the better idea anyway. Things seem to be working now.
I'm not sure this is the right place for this code anyway; maybe it should be a separate repository with a separate npm deployment. I asked the AI to create a PR and it opened it against this original repo, not the fork with the plugin.
Anyway, feel free to close it.

Everything below was AI-generated.

Summary

  • Fix broken session summaries in OpenCode plugin (was passing empty strings to summarize)
  • Track all user messages per session and capture assistant responses
  • Pass last user/assistant messages to summarize endpoint, matching Claude Code's behavior

Changes

  • Message tracking: Added sessionUserMessages and sessionAssistantMessages Maps to collect conversation history
  • Assistant capture: Capture assistant responses from message.updated events
  • Context injection: Inject memory context via chat.message hook (prepends to first message)
  • Session lifecycle: 60-second idle delay before summarization (matches Claude Code Stop hook behavior)
  • Context stripping: Strip [Claude-Mem Context] tags before saving prompts to avoid recursive context
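
The tracking and stripping changes above can be sketched roughly as follows. The Map names come from the changelist; the exact `[Claude-Mem Context]` delimiter format (in particular the closing marker) is an assumption for illustration:

```typescript
// Per-session conversation history, keyed by session ID.
const sessionUserMessages = new Map<string, string[]>();
const sessionAssistantMessages = new Map<string, string[]>();

// Remove any injected memory context before persisting the prompt, so
// saved prompts don't recursively accumulate old context. The closing
// [/Claude-Mem Context] marker is assumed; the real plugin may differ.
function stripContextTags(prompt: string): string {
  return prompt
    .replace(/\[Claude-Mem Context\][\s\S]*?\[\/Claude-Mem Context\]/g, "")
    .trim();
}

function trackUserMessage(sessionID: string, prompt: string): void {
  const messages = sessionUserMessages.get(sessionID) ?? [];
  messages.push(stripContextTags(prompt));
  sessionUserMessages.set(sessionID, messages);
}

// The summarize call then gets the *last* user/assistant pair instead
// of the empty strings the broken version was passing.
function lastMessages(sessionID: string): { user: string; assistant: string } {
  const users = sessionUserMessages.get(sessionID) ?? [];
  const assistants = sessionAssistantMessages.get(sessionID) ?? [];
  return {
    user: users[users.length - 1] ?? "",
    assistant: assistants[assistants.length - 1] ?? "",
  };
}
```
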

Testing

  1. Build the plugin: `cd opencode-plugin && npm run build`
  2. Start OpenCode session
  3. Have a conversation (with or without tool usage)
  4. Wait for session to idle (60s)
  5. Check summary includes actual conversation context

Logs

Debug logs are written to `/tmp/claude-mem-opencode.log`.

google-labs-jules bot and others added 7 commits December 15, 2025 04:45
This commit introduces a new plugin for OpenCode that connects to the claude-mem worker service.
It implements:
- `session.created` hook to inject memory context.
- `tool.execute.after` hook to capture observations.
- `mem-search` tool for memory retrieval.
- `session.idle` hook for summarization.

The plugin resides in `opencode-plugin/` and includes its own build and test configuration.
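
The hook names above come from the commit message; a self-contained sketch of that shape (the real OpenCode plugin API at https://opencode.ai/docs/plugins/ passes richer typed arguments, so this is only illustrative) might look like:

```typescript
// Illustrative hook shapes; argument types are simplified assumptions.
type Hooks = {
  "session.created": (input: { sessionID: string }) => void;
  "tool.execute.after": (input: { sessionID: string; tool: string; output: string }) => void;
  "session.idle": (input: { sessionID: string }) => void;
};

// Observations collected per session between creation and idle.
const observations = new Map<string, string[]>();

const hooks: Hooks = {
  // Inject memory context when a session starts (context fetch elided).
  "session.created": ({ sessionID }) => {
    observations.set(sessionID, []);
  },
  // Capture an observation after each tool call.
  "tool.execute.after": ({ sessionID, tool, output }) => {
    observations.get(sessionID)?.push(`${tool}: ${output}`);
  },
  // Summarize collected observations when the session goes idle.
  "session.idle": ({ sessionID }) => {
    const seen = observations.get(sessionID) ?? [];
    // A real implementation would POST `seen` to the worker's
    // summarize endpoint here.
    void seen;
  },
};
```
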

Port claude-mem to OpenCode plugin
This commit adds a Gemini CLI extension that proxies memory capabilities to the claude-mem worker.
It includes:
- An MCP server implementation using `@modelcontextprotocol/sdk`.
- `mem-search` tool exposed to Gemini.
- Manifest and context files (`gemini-extension.json`, `GEMINI.md`).
- Build configuration for ESM compatibility.

The extension connects to the local worker on port 37777.
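
Only the worker port (37777) is stated above; a sketch of building a request URL for the worker, with a hypothetical `/search` path and `q`/`limit` query parameters, could look like:

```typescript
// Base URL of the local claude-mem worker (port from the commit message).
const WORKER_BASE = "http://localhost:37777";

// Build a search URL; the /search path and parameter names are
// assumptions for illustration, not the worker's documented API.
function buildSearchUrl(query: string, limit = 10): string {
  const url = new URL("/search", WORKER_BASE);
  url.searchParams.set("q", query);
  url.searchParams.set("limit", String(limit));
  return url.toString();
}
```
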

Add Gemini CLI extension for Claude-Mem
- Track all user messages per session via sessionUserMessages Map
- Capture assistant responses from message.updated events
- Pass last user/assistant messages to summarize (was empty strings)
- Add context injection via chat.message hook
- Add session idle detection with 60s delay before summarization
- Strip [Claude-Mem Context] tags before saving prompts
- Add comprehensive logging to /tmp/claude-mem-opencode.log
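
The 60-second idle detection in the list above is essentially a per-session debounce; a minimal sketch (delay made configurable here so it can be exercised quickly) might be:

```typescript
// One pending timer per session; any new activity resets the clock,
// so summarization only fires after a full quiet period (mirroring
// the Claude Code Stop hook behavior described in the PR).
const idleTimers = new Map<string, ReturnType<typeof setTimeout>>();

function scheduleSummarize(
  sessionID: string,
  summarize: (sessionID: string) => void,
  delayMs = 60_000,
): void {
  const existing = idleTimers.get(sessionID);
  if (existing !== undefined) clearTimeout(existing); // activity resets the timer
  idleTimers.set(
    sessionID,
    setTimeout(() => {
      idleTimers.delete(sessionID);
      summarize(sessionID);
    }, delayMs),
  );
}
```
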
@lgandecki lgandecki force-pushed the fix/opencode-plugin-message-tracking branch from ade8992 to e080986 Compare December 30, 2025 14:18
@thedotmack
Owner

Can we figure out a way to have it follow a similar structure to the Cursor hooks I just added? I want a unified, clean way of porting things to other platforms.

@thedotmack
Owner

@lgandecki please message me on our discord :) https://discord.com/invite/J4wttp9vDu

@namphamdev

I think creating a plugin for OpenCode is the best approach, because OpenCode doesn't support hooks; they use plugins to listen for events instead: https://opencode.ai/docs/plugins/


4 participants