Oracle uses the official OpenAI Node.js SDK, which allows it to connect to any API that adheres to the OpenAI API specification. This includes:
- Official OpenAI API
- Azure OpenAI Service
- Local inference servers (e.g., vLLM, Ollama)
- Proxy servers (e.g., LiteLLM)
Oracle uses Azure's v1 Responses endpoint when `--azure-endpoint` (or `azure.endpoint`) is set.
Pass your resource endpoint, Azure key, and optionally a deployment name when it differs from Oracle's CLI model alias:
```bash
export AZURE_OPENAI_ENDPOINT="https://your-resource-name.openai.azure.com/"
export AZURE_OPENAI_API_KEY="your-azure-api-key"
export AZURE_OPENAI_DEPLOYMENT="gpt-5-1-pro"
```

Key lookup for GPT-family models when an Azure endpoint is set:
- First looks for `AZURE_OPENAI_API_KEY`.
- Falls back to `OPENAI_API_KEY` if the Azure key is missing.
Without an Azure endpoint, Oracle keeps using `OPENAI_API_KEY` as before.
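As a rough sketch of that lookup order (illustrative TypeScript, not Oracle's actual source; the `resolveApiKey` helper and its signature are assumptions):

```typescript
// Illustrative sketch of the key-lookup order described above.
// resolveApiKey is a hypothetical helper, not part of Oracle's code.
function resolveApiKey(
  env: Record<string, string | undefined>,
  azureEndpointSet: boolean,
): string | undefined {
  if (azureEndpointSet) {
    // Azure endpoint configured: prefer the Azure key, fall back to OPENAI_API_KEY.
    return env.AZURE_OPENAI_API_KEY ?? env.OPENAI_API_KEY;
  }
  // No Azure endpoint: only the standard OpenAI key is consulted.
  return env.OPENAI_API_KEY;
}
```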
Notes:
- Oracle calls Azure at `https://<resource>.openai.azure.com/openai/v1`.
- For Responses API runs, Azure expects `model` to be your deployment name. Use `--azure-deployment` or `azure.deployment` when the deployment name does not exactly match the CLI model alias.
- `AZURE_OPENAI_API_VERSION` is still accepted for back-compat, but Azure's v1 Responses endpoint does not require it.
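The v1 URL in the first note is derived mechanically from `AZURE_OPENAI_ENDPOINT`; a minimal sketch (the `buildAzureBaseUrl` helper name is an assumption, not Oracle's code):

```typescript
// Build the v1 base URL Oracle targets from the configured Azure resource endpoint.
// buildAzureBaseUrl is illustrative only.
function buildAzureBaseUrl(endpoint: string): string {
  // Normalize any trailing slashes, then append the /openai/v1 path.
  return endpoint.replace(/\/+$/, "") + "/openai/v1";
}
```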
You can also pass the Azure settings via CLI flags (env for the key is still recommended):

```bash
oracle --azure-endpoint https://... --azure-deployment my-deployment-name
```

For other compatible services that use the standard OpenAI protocol but a different URL:

```bash
oracle --base-url http://localhost:4000
```

Or via config.json:
```json
{
  "apiBaseUrl": "http://localhost:4000"
}
```

Oracle keeps a stable CLI-facing model set, but some names are aliases for the concrete API model ids it sends:
- `gpt-5.1-pro`, `gpt-5.2-pro` → `gpt-5.4-pro` (API)
Notes:
- `gpt-5.1-pro` and `gpt-5.2-pro` are CLI aliases for "the current Pro API model"; OpenAI's API uses `gpt-5.4-pro`.
- If you want the classic Pro tier explicitly, use `gpt-5-pro`.
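The alias behavior above amounts to a small lookup table; a hedged sketch (table contents are taken from the notes above, while the `resolveModelId` helper is hypothetical):

```typescript
// Sketch of the CLI-alias-to-API-model mapping described above.
// MODEL_ALIASES and resolveModelId are illustrative, not Oracle's source.
const MODEL_ALIASES: Record<string, string> = {
  "gpt-5.1-pro": "gpt-5.4-pro",
  "gpt-5.2-pro": "gpt-5.4-pro",
};

function resolveModelId(cliModel: string): string {
  // Names without an alias entry (e.g. gpt-5-pro) pass through unchanged.
  return MODEL_ALIASES[cliModel] ?? cliModel;
}
```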
`--base-url` / `apiBaseUrl` only affect API runs. For browser automation, use `--chatgpt-url` (or `browser.chatgptUrl` in config) to point Chrome at a specific ChatGPT workspace/folder such as `https://chatgpt.com/g/.../project`.
LiteLLM lets you use Azure, Anthropic, VertexAI, and more via the OpenAI format.
- Start LiteLLM:
  ```bash
  litellm --model azure/gpt-4-turbo
  ```
- Connect Oracle:
  ```bash
  oracle --base-url http://localhost:4000
  ```
Oracle can also talk to OpenRouter (Responses API compatible) with any model id:

```bash
export OPENROUTER_API_KEY="sk-or-..."
oracle --model minimax/minimax-m2 --prompt "Summarize the notes"
```

- If `OPENROUTER_API_KEY` is set and no provider-specific key is available for the chosen model, Oracle defaults the base URL to `https://openrouter.ai/api/v1`.
- You can still set `--base-url` explicitly; if it points at OpenRouter (with or without a trailing `/responses`), Oracle will use `OPENROUTER_API_KEY` and forward optional attribution headers (`OPENROUTER_REFERER` / `OPENROUTER_TITLE`).
- Multi-model runs accept OpenRouter ids alongside built-in ones. See `docs/openrouter.md` for details.
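The "points at OpenRouter, with or without a trailing `/responses`" check could look roughly like this; `isOpenRouterBaseUrl` is an illustrative name and a simplification of whatever matching Oracle actually does:

```typescript
// Sketch: decide whether an explicit --base-url targets OpenRouter.
// Accepts the base URL with or without a trailing /responses segment.
function isOpenRouterBaseUrl(baseUrl: string): boolean {
  const normalized = baseUrl
    .replace(/\/+$/, "")        // drop trailing slashes
    .replace(/\/responses$/, ""); // drop an optional /responses suffix
  return normalized === "https://openrouter.ai/api/v1";
}
```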