
Codex setup

Configure Codex to use OpenAI as the model provider via 0-0.pro (wire_api = responses).

Scope

Compatible: This setup applies to all Codex clients that read the standard ~/.codex directory, including VS Code Codex extensions/plugins, Codex CLI, and the Codex app.

1) Install or open Codex

Install or open the Codex client you use. Codex reads its shared configuration from ~/.codex.

2) Update ~/.codex/config.toml

Open ~/.codex/config.toml. Below is a minimal example you can copy:

```toml
model = "gpt-5.5"
model_reasoning_effort = "xhigh"

# Added OpenAI provider via 0-0.pro (only 6 lines)
model_provider = "OpenAI"

[model_providers.OpenAI]
name = "OpenAI"
base_url = "https://api.0-0.pro/v1"
wire_api = "responses"
requires_openai_auth = true
# End added block
```

If you already have a config, you only need to add the following 6 lines (plus an optional blank line) near the top:

```toml
model_provider = "OpenAI"

[model_providers.OpenAI]
name = "OpenAI"
base_url = "https://api.0-0.pro/v1"
wire_api = "responses"
requires_openai_auth = true
```
Note: Keep your existing model and model_reasoning_effort settings unchanged.
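The edit above can also be scripted. Below is a minimal sketch of a hypothetical helper (`add_provider` is not part of Codex) that adds the provider block to an existing config.toml. Note that in TOML, top-level keys such as model_provider must appear before any [table] header, so the sketch puts that line at the very top and the [model_providers.OpenAI] table at the end:

```python
import pathlib

def add_provider(config_path: pathlib.Path) -> bool:
    """Add the 0-0.pro provider block to config.toml.

    Returns True if the block was added, False if it was already present.
    """
    text = config_path.read_text() if config_path.exists() else ""
    if "[model_providers.OpenAI]" in text:
        return False  # already configured; leave model/model_reasoning_effort alone
    # model_provider is a top-level key, so it must come before any [table];
    # the [model_providers.OpenAI] table itself can safely go at the end.
    top = 'model_provider = "OpenAI"\n\n'
    table = (
        '\n[model_providers.OpenAI]\n'
        'name = "OpenAI"\n'
        'base_url = "https://api.0-0.pro/v1"\n'
        'wire_api = "responses"\n'
        'requires_openai_auth = true\n'
    )
    config_path.write_text(top + text + table)
    return True
```

Re-running the helper is a no-op once the block exists, so it is safe to include in a dotfiles setup script.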

3) Update ~/.codex/auth.json

Set your Uni API key as the value of OPENAI_API_KEY:

```json
{
  "OPENAI_API_KEY": "your-api-key"
}
```
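If auth.json already exists and holds other keys, you may prefer to merge rather than overwrite. A minimal sketch (the `set_api_key` helper is hypothetical, and the key value is a placeholder):

```python
import json
import pathlib

def set_api_key(auth_path: pathlib.Path, api_key: str) -> None:
    """Set OPENAI_API_KEY in auth.json, preserving any other existing keys."""
    data = {}
    if auth_path.exists():
        data = json.loads(auth_path.read_text())
    data["OPENAI_API_KEY"] = api_key
    auth_path.write_text(json.dumps(data, indent=2) + "\n")
```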

4) Restart or reload Codex

Restart or reload the Codex client you use: reload the VS Code window for VS Code extensions/plugins, restart your Codex CLI session, or reopen the Codex app.

Troubleshooting

Network
If Codex reports stream disconnected before completion: error sending request for url (https://api.0-0.pro/v1/responses), first check your local VPN or proxy. A common cause is a local 60-second idle timeout cutting the streaming connection; Codex may retry up to 5 times, which can make the request appear stuck for about 5 minutes.

Why did my past chats disappear?

Codex associates each conversation with the model_provider it was created under. All session data remains on disk, but switching providers can make older sessions invisible in the session list.

To show older sessions under the new provider, find the JSONL files in ~/.codex/sessions and update the model_provider in the first line:

Before:

```json
{ "model_provider": "openai", "...": "..." }
```

After:

```json
{ "model_provider": "OpenAI", "...": "..." }
```
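Doing this by hand for many files is tedious. Below is a sketch of a hypothetical `retag_sessions` helper, assuming each session file is JSONL with the metadata on the first line (as shown above); it rewrites model_provider in that line and leaves the rest of the file untouched:

```python
import json
import pathlib

def retag_sessions(sessions_dir: pathlib.Path,
                   old: str = "openai", new: str = "OpenAI") -> int:
    """Rewrite model_provider in the first line of each session JSONL file.

    Returns the number of files changed.
    """
    changed = 0
    for path in sessions_dir.rglob("*.jsonl"):
        lines = path.read_text().splitlines(keepends=True)
        if not lines:
            continue
        meta = json.loads(lines[0])
        if meta.get("model_provider") == old:
            meta["model_provider"] = new
            lines[0] = json.dumps(meta) + "\n"
            path.write_text("".join(lines))
            changed += 1
    return changed
```

Consider backing up ~/.codex/sessions before running anything like this, since it rewrites files in place.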

Restart or reload your Codex client after the change so it picks up the sessions again.