This is a local proxy that bridges API-key-only OpenAI-compatible clients to ChatGPT OAuth authentication.
Many Agent/SDK tools only support:
```bash
OPENAI_BASE_URL=https://api.openai.com/v1
OPENAI_API_KEY=<api_key>
```
But in Teams/Business setups, users often authenticate with browser OAuth instead of directly using an OpenAI platform API key.
This project helps you:
- keep existing clients on OpenAI v1 + API key style configuration
- route requests to an OAuth-backed upstream session locally
- reuse OAuth accounts in current agent workflows
High-level path:
```text
Agent/Client -> OpenAI v1 request -> local openai-oauth-proxy -> OAuth token -> ChatGPT/Codex upstream
```
In detail:
- The client sends requests to the local `/v1/*` endpoints with a placeholder `Authorization: Bearer proxy` header.
- The proxy resolves the token source by priority: `OPENAI_PROXY_BEARER_TOKEN` -> `OPENAI_OAUTH_TOKEN`/`OPENAI_API_KEY` -> local token file (auto-refresh supported).
- For the `chatgpt.com/backend-api` upstream with `/v1/chat/completions`, requests and responses are transformed through the Codex responses format.
- The client still receives OpenAI-style responses.
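The credential priority above can be sketched in shell. This is a simplified illustration of the documented order only, not the proxy's actual Rust implementation (the real proxy parses the JSON auth file and refreshes expired tokens):

```bash
# Simplified sketch of the documented credential priority (illustrative only;
# the real proxy parses the JSON auth file and auto-refreshes tokens).
resolve_token() {
  if [ -n "$OPENAI_PROXY_BEARER_TOKEN" ]; then
    echo "$OPENAI_PROXY_BEARER_TOKEN"   # 1) explicit bearer token
  elif [ -n "$OPENAI_OAUTH_TOKEN" ]; then
    echo "$OPENAI_OAUTH_TOKEN"          # 2) manually provided OAuth token
  elif [ -n "$OPENAI_API_KEY" ]; then
    echo "$OPENAI_API_KEY"              # 2b) API-key fallback
  else
    # 3) local token file (path taken from AGENT_AUTH_FILE, with the default below)
    cat "${AGENT_AUTH_FILE:-$HOME/.config/openai-oauth-proxy/aopenai-browser-token.json}" 2>/dev/null
  fi
}

OPENAI_PROXY_BEARER_TOKEN=tok-a
OPENAI_API_KEY=tok-b
resolve_token   # prints tok-a (the explicit bearer token wins)
```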
Prerequisites:

- Rust (stable recommended)
- Network access to `auth.openai.com`
Install:

```bash
cargo install --git https://github.com/liuhongru/openai-oauth-proxy --branch main
```

```bash
# 1) First-time OAuth login
openai-oauth-proxy auth

# 2) Start local proxy (default 127.0.0.1:8788)
openai-oauth-proxy serve
```

Or use Docker. Pull the published image (or build locally):

```bash
docker pull ghcr.io/devineliu/openai-oauth-proxy:latest
```

```bash
docker build -t openai-oauth-proxy .
```

First-time OAuth login in Docker:

```bash
docker run --rm -it \
  -e OPENAI_OAUTH_NO_BROWSER=1 \
  -v "$HOME/.config/openai-oauth-proxy:/home/appuser/.config/openai-oauth-proxy" \
  openai-oauth-proxy auth
```

Start the proxy in Docker:

```bash
docker run --rm -p 8788:8788 \
  -e OPENAI_PROXY_UPSTREAM=https://chatgpt.com/backend-api \
  -v "$HOME/.config/openai-oauth-proxy:/home/appuser/.config/openai-oauth-proxy" \
  openai-oauth-proxy
```

When working in Chat with coding agents, first paste this branch file URL:
https://raw.githubusercontent.com/DevineLiu/openai-oauth-proxy/main/LLM_INSTALL.md
Then let the agent follow that document end-to-end.
Default first-run flow:
- Run `openai-oauth-proxy` (no subcommand).
- If a local token exists, it starts the proxy directly.
- If no token exists, it runs OAuth login first, then starts the proxy.
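That decision can be sketched as follows (illustrative only; the binary performs this check internally when invoked with no subcommand):

```bash
# Illustrative sketch of the default first-run flow; the actual binary
# does this internally when run with no subcommand.
first_run() {
  auth_file="${AGENT_AUTH_FILE:-$HOME/.config/openai-oauth-proxy/aopenai-browser-token.json}"
  if [ -f "$auth_file" ]; then
    echo "token found: starting proxy"                         # i.e. serve
  else
    echo "no token: running OAuth login, then starting proxy"  # i.e. auth, then serve
  fi
}

AGENT_AUTH_FILE=/nonexistent/token.json
first_run   # prints: no token: running OAuth login, then starting proxy
```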
By default, `openai-oauth-proxy serve` listens on `127.0.0.1:8788` and uses the upstream `https://chatgpt.com/backend-api`. It reads credentials in this order: `OPENAI_PROXY_BEARER_TOKEN` -> `OPENAI_OAUTH_TOKEN`/`OPENAI_API_KEY` -> local auth file (`AGENT_AUTH_FILE`).

You can still use the explicit commands if needed: `openai-oauth-proxy auth` and `openai-oauth-proxy serve`.
```bash
export OPENAI_BASE_URL=http://127.0.0.1:8788/v1
export OPENAI_API_KEY=proxy
```

Health check:

```bash
curl -s http://127.0.0.1:8788/healthz
```

CLI reference:

```bash
openai-oauth-proxy auth
openai-oauth-proxy serve
openai-oauth-proxy serve --proxy-host 0.0.0.0 --proxy-port 8788
openai-oauth-proxy --print-auth-file
openai-oauth-proxy --print-access-token
openai-oauth-proxy --list-models
```

Run locally:

```bash
export OPENAI_BASE_URL=http://127.0.0.1:8788/v1
export OPENAI_API_KEY=proxy
openai-oauth-proxy serve
```

Run in Docker:

```bash
docker run --rm -p 8788:8788 \
  -e OPENAI_PROXY_UPSTREAM=https://chatgpt.com/backend-api \
  -e OPENAI_OAUTH_NO_BROWSER=1 \
  -v "$HOME/.config/openai-oauth-proxy:/home/appuser/.config/openai-oauth-proxy" \
  openai-oauth-proxy
```

Run with an explicit bearer token:

```bash
export OPENAI_BASE_URL=http://127.0.0.1:8788/v1
export OPENAI_API_KEY=proxy
export OPENAI_PROXY_BEARER_TOKEN="<your_oauth_or_bearer_token>"
openai-oauth-proxy serve
```

Environment variables:

- `OPENAI_PROXY_UPSTREAM`: upstream URL (default `https://chatgpt.com/backend-api`)
- `OPENAI_PROXY_BEARER_TOKEN`: explicit bearer token for forwarding
- `OPENAI_API_KEY`: client compatibility field; can be a fallback token source
- `OPENAI_OAUTH_TOKEN`: manually provided OAuth token
- `AGENT_AUTH_FILE`: token file path (default `~/.config/openai-oauth-proxy/aopenai-browser-token.json`)
- `OPENAI_OAUTH_AUTH_URL`: OAuth authorize URL override
- `OPENAI_OAUTH_TOKEN_URL`: OAuth token URL override
- `OPENAI_OAUTH_CLIENT_ID`: OAuth client ID override
- `OPENAI_OAUTH_REDIRECT_URI`: OAuth redirect URI override
- `OPENAI_OAUTH_SCOPE`: OAuth scopes override
- `OPENAI_OAUTH_NO_PROXY=1`: bypass the system proxy for OAuth/upstream HTTP calls
- `OPENAI_OAUTH_NO_BROWSER=1`: disable browser auto-open and use the manual login flow
- `OPENAI_OAUTH_PROXY_DEBUG=1`: enable debug logs
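As one hypothetical combination of these variables, a headless debug setup might look like this (values are illustrative only):

```bash
# Hypothetical combination of the variables above; values are examples only.
export OPENAI_PROXY_UPSTREAM=https://chatgpt.com/backend-api   # default upstream, shown explicitly
export OPENAI_OAUTH_NO_BROWSER=1    # headless: use the manual login flow
export OPENAI_OAUTH_PROXY_DEBUG=1   # enable debug logs
export AGENT_AUTH_FILE="$HOME/.config/openai-oauth-proxy/aopenai-browser-token.json"
openai-oauth-proxy serve
```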
- Default branch: `main`
- Image publishing: pushes to `main` publish `ghcr.io/devineliu/openai-oauth-proxy:latest` (multi-arch: `linux/amd64`, `linux/arm64`)
- Security policy: `SECURITY.md`
- Security fix support: latest `main` branch
- Private vulnerability reporting: https://github.com/DevineLiu/openai-oauth-proxy/security/advisories/new
- License: MIT (`LICENSE`)
- CI workflow: `.github/workflows/ci.yml`
- Security workflow (cargo-audit + CodeQL): `.github/workflows/security.yml`
- Dependency updates: `.github/dependabot.yml`