# ocgo

ocgo is a small Go CLI that lets Claude Code and Codex CLI run against an OpenCode Go subscription. It starts a local compatibility proxy, translates Claude Code's Anthropic Messages API requests when needed, exposes OpenAI-compatible endpoints for Codex, and launches tools with the right configuration.
## Quick start

```sh
# 1. Set up your OpenCode API key
ocgo setup

# 2. Start coding!
ocgo launch claude --model kimi-k2.6
ocgo launch codex --model kimi-k2.6
```

Use your OpenCode Go subscription from Claude Code or Codex CLI in one command, with no manual proxy setup required.
## Features

- Save and reuse your OpenCode Go API key.
- List known OpenCode Go model IDs.
- Run Claude Code through OpenCode Go with one command.
- Run Codex CLI through OpenCode Go with one command.
- Start, stop, and inspect a local proxy server.
- Expose Anthropic-compatible and OpenAI-compatible local API layers.
- Stream text responses and translate basic tool calls.
## Requirements

- Go 1.22 or newer.
- A valid OpenCode Go API key.
- Claude Code or Codex CLI installed and available.
## Installation

Install with Homebrew:

```sh
brew install emanuelcasco/tap/ocgo
```

Or tap the repository first:

```sh
brew tap emanuelcasco/tap
brew install ocgo
```

Build from source:

```sh
git clone https://github.com/emanuelcasco/ocgo.git
cd ocgo
make install
```

## Setup

Run setup and paste your OpenCode Go API key when prompted:
```sh
ocgo setup
```

Or pass the key directly:

```sh
ocgo setup --api-key sk-opencode-your-key
```

Configuration is saved to:

```
~/.config/ocgo/config.json
```

You can also provide the key at runtime with an environment variable:

```sh
export OCGO_API_KEY=sk-opencode-your-key
```

By default, the local proxy listens on `127.0.0.1:3456`.
## List models

```sh
ocgo list
```

Aliases are also available:

```sh
ocgo ls
ocgo models
```

## Launch Claude Code

Start Claude Code through the local proxy:
```sh
ocgo launch claude
```

Use a specific OpenCode Go model:

```sh
ocgo launch claude --model kimi-k2.6
```

Pass arguments through to Claude Code after `--`:

```sh
ocgo launch claude --model kimi-k2.6 -- -p "How does this repository work?"
```

Allow Claude Code to skip permission prompts:

```sh
ocgo launch claude --yes
```

When `ocgo launch claude` starts Claude Code, it sets:

```sh
ANTHROPIC_BASE_URL=http://127.0.0.1:3456
ANTHROPIC_AUTH_TOKEN=unused
```

When `--model` is provided, it also sets:

```sh
ANTHROPIC_MODEL=<model>
ANTHROPIC_SMALL_FAST_MODEL=<model>
```

If Claude Code requests a Claude model name or does not provide a model, ocgo defaults the upstream OpenCode Go model to kimi-k2.6.
## Launch Codex CLI

Start Codex CLI through the local proxy:

```sh
ocgo launch codex
```

Use a specific OpenCode Go model:

```sh
ocgo launch codex --model kimi-k2.6
```

Pass arguments through to Codex after `--`:

```sh
ocgo launch codex --model kimi-k2.6 -- --sandbox workspace-write
```

Configure Codex without launching it:

```sh
ocgo launch codex --config
```

When `ocgo launch codex` runs, it writes or updates this profile in `~/.codex/config.toml`:

```toml
[profiles.ocgo-launch]
openai_base_url = "http://127.0.0.1:3456/v1/"
forced_login_method = "api"
model_provider = "ocgo-launch"
model_catalog_json = "/Users/you/.codex/ocgo-models.json"

[model_providers.ocgo-launch]
name = "OpenCode Go"
base_url = "http://127.0.0.1:3456/v1/"
wire_api = "responses"
```

It then launches:

```sh
codex --profile ocgo-launch -m <model>
```

The Codex process receives `OPENAI_API_KEY=ocgo`; the local proxy injects your real OpenCode Go API key upstream. ocgo also writes `~/.codex/ocgo-models.json` so Codex has metadata for OpenCode Go model IDs such as deepseek-v4-pro.
## Proxy server

Run the proxy in the foreground:

```sh
ocgo serve
```

Run it in the background:

```sh
ocgo serve --background
# or
ocgo serve -b
```

Check whether the proxy is running:

```sh
ocgo status
```

Stop the background proxy:

```sh
ocgo stop
```

Proxy runtime files are stored in:

```
~/.config/ocgo/ocgo.pid
~/.config/ocgo/ocgo.log
```
## Development

Clone the repository and enter the project directory:

```sh
git clone <repository-url>
cd ocgo-cc
```

Install Go 1.22 or newer, then download dependencies:

```sh
go mod download
```

Build the binary:

```sh
make build
```

The binary is written to:

```
bin/ocgo
```

Optionally install it to ~/go/bin:

```sh
make install
```

Make sure the install location is in your PATH:

```sh
export PATH="$HOME/go/bin:$PATH"
```

Configure an OpenCode Go API key for local testing:

```sh
bin/ocgo setup
# or, if installed:
ocgo setup
```

Run the CLI without building:

```sh
make run
```

Run tests:

```sh
make test
```

Remove built binaries:

```sh
make clean
```

## Releasing

This project uses a plain Bash release script; no GoReleaser is required. It uses the GitHub CLI to create the GitHub release and update a Homebrew tap formula.
Requirements:

```sh
brew install gh
gh auth login
```

Release a new version:

```sh
make release TAG=v0.1.0
```

By default, releases are published to emanuelcasco/ocgo and the Homebrew formula is pushed to emanuelcasco/homebrew-tap. You can override these with `GITHUB_REPOSITORY=owner/repo` and `HOMEBREW_TAP_REPO=owner/homebrew-tap`.

The script builds macOS/Linux amd64 and arm64 archives, uploads them to GitHub Releases, and commits `Formula/ocgo.rb` to the tap repo.
## API

ocgo exposes a local compatibility API used by Claude Code and Codex CLI:

- `GET /health`
- `POST /v1/messages`
- `POST /v1/messages/count_tokens`
- `POST /v1/chat/completions`
- `POST /v1/responses`
Requests sent to /v1/messages are converted from Anthropic Messages format into OpenAI-compatible chat completion requests.
Requests sent to /v1/chat/completions are passed through as OpenAI-compatible chat completion requests while ocgo injects the configured OpenCode Go API key.
Requests sent to /v1/responses use a lightweight OpenAI Responses API adapter for Codex CLI. The adapter converts common Responses input, tool definitions, and streaming text events to and from chat completions.
All upstream requests are forwarded to:

```
https://opencode.ai/zen/go/v1/chat/completions
```
Claude Code responses are converted back into Anthropic-compatible responses. Codex responses are returned in OpenAI-compatible Chat Completions or Responses API shapes depending on the requested endpoint.
## Limitations

ocgo is intentionally lightweight. Token counting currently returns 0, and Anthropic/OpenAI compatibility focuses on the request and response shapes Claude Code and Codex CLI need rather than full API parity. The /v1/responses adapter is minimal and targets the text and tool workflows Codex uses; it is not a complete OpenAI Responses API implementation.
## License

MIT. See LICENSE.