OpenCode provider that reads `~/.codex/config.toml` and uses the configured Codex model provider + API key.
- Install Codex CLI and make sure `codex` is on your PATH.
- Configure Codex in `~/.codex/config.toml` and log in (`codex login`).
- Clone this repo:

  ```sh
  git clone https://github.com/JakkuSakura/opencode-codex-provider
  ```

- Install dependencies (pnpm) and build if you plan to edit TypeScript (the full command sequence is also collected in the sketch after this list):

  ```sh
  pnpm install
  pnpm run build
  ```

- Configure OpenCode to use the provider. Edit `~/.config/opencode/opencode.json`:
  ```json
  {
    "$schema": "https://opencode.ai/config.json",
    "model": "codex-config/default",
    "provider": {
      "codex-config": {
        "npm": "file:///Users/jakku/Dev/opencode-codex-provider",
        "name": "Codex Config",
        "options": {
          "codexHome": "/Users/jakku/.codex",
          "useCodexConfigModel": true
        },
        "models": {
          "default": {
            "id": "default",
            "name": "Codex (from ~/.codex)",
            "family": "codex",
            "reasoning": true,
            "limit": { "context": 272000, "output": 128000 },
            "modalities": { "input": ["text", "image"], "output": ["text"] },
            "options": {
              "reasoningEffort": "medium",
              "reasoningSummary": "auto",
              "textVerbosity": "medium"
            }
          }
        }
      }
    }
  }
  ```
- Restart OpenCode.
- In the TUI, run `/models` and select `codex-config/default`.
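Putting the CLI-side steps above together, a first-time setup looks roughly like the sketch below. The checkout directory name is assumed from the repo URL, and `which codex` is only a PATH check; adjust to your environment.

```sh
# Prerequisite: the Codex CLI must be installed and on PATH, and you must be logged in.
which codex          # should print a path; otherwise install the Codex CLI first
codex login          # the provider later reuses these credentials (see ~/.codex/auth.json below)

# Fetch and build this provider (the build step only matters if you edit the TypeScript sources).
git clone https://github.com/JakkuSakura/opencode-codex-provider
cd opencode-codex-provider
pnpm install
pnpm run build

# Then point ~/.config/opencode/opencode.json at the checkout (see the example above),
# restart OpenCode, and select codex-config/default via /models in the TUI.
```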
- The provider reads `~/.codex/config.toml` on each request and uses the selected `model_provider` and `model` (a sketch of such a file follows this list).
- API keys are resolved from `~/.codex/auth.json` (same as Codex CLI) or from the env var specified by `env_key`. `wire_api` controls whether requests go through Chat Completions (`chat`) or Responses (`responses`).
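For orientation, a `~/.codex/config.toml` driving this provider might look roughly like the sketch below. The exact schema is defined by the Codex CLI; the provider id, `name`, `base_url`, and model value here are illustrative placeholders, while `model`, `model_provider`, `env_key`, and `wire_api` are the fields referenced above.

```toml
# Model and provider that Codex (and therefore this OpenCode provider) will use.
model = "gpt-5"
model_provider = "openai"

# Provider table: the id "openai" and base_url below are placeholders.
[model_providers.openai]
name = "OpenAI"
base_url = "https://api.openai.com/v1"
env_key = "OPENAI_API_KEY"   # env var the API key can be read from (alternative to ~/.codex/auth.json)
wire_api = "responses"       # "chat" = Chat Completions, "responses" = Responses API
```

If you rely on `env_key` rather than `~/.codex/auth.json`, make sure that variable is exported in the environment OpenCode runs in.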
Provider options (set under `options` in `opencode.json`):

- `codexHome`: path to Codex home (default: `~/.codex`)
- `useCodexConfigModel`: when true, always use the model from `~/.codex/config.toml`
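As a hypothetical variation (the path below is a placeholder), pointing the provider at a Codex home other than `~/.codex` only changes the `options` block inside the `codex-config` provider entry shown earlier; with `useCodexConfigModel` left at `true`, the model is still taken from that directory's `config.toml`:

```json
"options": {
  "codexHome": "/srv/codex-profiles/work",
  "useCodexConfigModel": true
}
```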