# OpenAI Codex CLI

OpenAI Codex CLI integrates with Isartor via `OPENAI_BASE_URL`, routing requests
through Isartor's OpenAI-compatible `/v1` surface, including `/v1/chat/completions`
and `/v1/models`.
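The redirection mechanism can be sketched as follows: any OpenAI-compatible client that honors `OPENAI_BASE_URL` will send its `/v1` traffic to Isartor instead of the default API host. The port and key value below are illustrative placeholders, not Isartor defaults:

```shell
# Point an OpenAI-compatible client at a local Isartor instance.
# Port 8080 and the key value here are assumptions for illustration.
export OPENAI_BASE_URL="http://localhost:8080/v1"
export OPENAI_API_KEY="sk-isartor-local"
echo "Requests will go to: $OPENAI_BASE_URL"
```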
## Step-by-step setup

```shell
# 1. Start Isartor
isartor up

# 2. Configure Codex
isartor connect codex

# 3. Source the env file
source ~/.isartor/env/codex.sh

# 4. Run Codex
codex --model o3-mini
```
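After step 3, a quick sanity check confirms the variables are in the current shell. Variable names follow the env file described above; the values depend on your setup:

```shell
# Print the base URL and whether a key is present, without leaking the key itself.
echo "Base URL: ${OPENAI_BASE_URL:-unset}"
echo "API key:  ${OPENAI_API_KEY:+set}"
```

If the first line prints `unset`, the env file was not sourced in this shell.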
## How it works

- `isartor connect codex` writes `OPENAI_BASE_URL` and `OPENAI_API_KEY` to `~/.isartor/env/codex.sh`
- Codex can query `/v1/models` to discover the configured model
- Codex sends chat requests to Isartor's `/v1/chat/completions` endpoint
- Isartor supports OpenAI streaming SSE and tool-call passthrough for compatible agent workflows
- Isartor forwards to the configured upstream (Layer 3) when the request is not deflected
- Use `--model` to select any model name configured in your L3 provider
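The chat requests described above use ordinary OpenAI chat-completions JSON. A sketch of the body Codex would send, with the model name and prompt as placeholders:

```shell
# Build a minimal chat-completions request body and confirm it is well-formed
# JSON before POSTing it to "$OPENAI_BASE_URL/chat/completions" with curl.
cat > /tmp/isartor-req.json <<'EOF'
{
  "model": "o3-mini",
  "messages": [{"role": "user", "content": "Say hello"}],
  "stream": true
}
EOF
python3 -m json.tool /tmp/isartor-req.json > /dev/null && echo "request body is valid JSON"
```

Setting `"stream": true` exercises the SSE passthrough path noted above.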
## Disconnecting

```shell
isartor connect codex --disconnect
```
## Troubleshooting

| Symptom | Cause | Fix |
|---|---|---|
| Codex not routing through Isartor | Env vars not loaded | Run `source ~/.isartor/env/codex.sh` in your shell |
| Codex cannot list models | `/v1/models` unreachable or auth mismatch | Test `curl http://localhost:8080/v1/models` with the same auth settings |
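If neither fix resolves it, checking that the env file exists at all can narrow things down. This is a minimal sketch using the path from the setup steps:

```shell
# Report whether the Codex env file exists; if it is missing,
# `isartor connect codex` has not been run (or wrote elsewhere).
ENV_FILE="$HOME/.isartor/env/codex.sh"
if [ -f "$ENV_FILE" ]; then
  echo "env file present: $ENV_FILE"
else
  echo "env file missing; run 'isartor connect codex' first"
fi
```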