feat: add --openai-api-target and --anthropic-api-target flags for custom LLM endpoints#1247

Closed
Rubyj wants to merge 1 commit into github:main from Rubyj:feat/custom-openai-anthropic-api-target

Conversation

Rubyj commented Mar 11, 2026

Summary

Adds --openai-api-target <host> and --anthropic-api-target <host> CLI flags to the AWF, enabling the API proxy sidecar to forward requests to custom/internal LLM endpoints instead of the hardcoded api.openai.com / api.anthropic.com defaults.

This follows the existing --copilot-api-target pattern already implemented for GitHub Enterprise Server support.

Closes #20590 (reported via github/gh-aw#20590)

Problem

When users configure a custom OPENAI_BASE_URL or ANTHROPIC_BASE_URL (e.g., an internal corporate LLM router) in their gh-aw workflow's engine.env, the AWF API proxy ignores it. The proxy overrides OPENAI_BASE_URL inside the agent container to point to its internal sidecar, then forwards all requests to the hardcoded api.openai.com. The custom endpoint is never reached, resulting in 401 errors when the internal API key is sent to OpenAI's public API.

The only workaround was to disable the entire AWF sandbox (sandbox.agent: false), which removes firewall protection and credential isolation.

Changes

containers/api-proxy/server.js

  • Read OPENAI_API_TARGET env var (defaults to api.openai.com)
  • Read ANTHROPIC_API_TARGET env var (defaults to api.anthropic.com)
  • Use these variables in all proxy handlers instead of hardcoded hostnames
  • Include targets in startup log
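The server.js changes above amount to replacing two hardcoded hostnames with env-var lookups. A minimal sketch of that plumbing follows; the actual server.js structure and the `upstreamHost` helper name are assumptions, not the PR's code:

```javascript
// Read proxy targets from the environment, falling back to the public APIs.
const OPENAI_API_TARGET = process.env.OPENAI_API_TARGET || 'api.openai.com';
const ANTHROPIC_API_TARGET = process.env.ANTHROPIC_API_TARGET || 'api.anthropic.com';

// Resolve the upstream host for a provider instead of hardcoding it
// in each proxy handler.
function upstreamHost(provider) {
  switch (provider) {
    case 'openai':    return OPENAI_API_TARGET;
    case 'anthropic': return ANTHROPIC_API_TARGET;
    default: throw new Error(`unknown provider: ${provider}`);
  }
}

// Include both targets in the startup log so misrouting is easy to spot.
console.log(`api-proxy targets: openai=${OPENAI_API_TARGET} anthropic=${ANTHROPIC_API_TARGET}`);
```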

src/cli.ts

  • Add --openai-api-target <host> option
  • Add --anthropic-api-target <host> option
  • Wire both to config (with env var fallbacks OPENAI_API_TARGET / ANTHROPIC_API_TARGET)
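The wiring described above implies a precedence order: explicit CLI flag, then the env-var fallback, then the built-in default. A hypothetical sketch of that resolution (`resolveTarget` and the `opts` shape are illustrative names, not the PR's actual code):

```javascript
// CLI flag wins, then the environment variable, then the hardcoded default.
function resolveTarget(flagValue, envValue, defaultHost) {
  return flagValue ?? envValue ?? defaultHost;
}

// Example: building the wrapper config from parsed CLI options.
const config = {
  openaiApiTarget: resolveTarget(
    undefined,                        // e.g. opts.openaiApiTarget from the CLI parser
    process.env.OPENAI_API_TARGET,
    'api.openai.com'
  ),
  anthropicApiTarget: resolveTarget(
    undefined,
    process.env.ANTHROPIC_API_TARGET,
    'api.anthropic.com'
  ),
};
```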

src/docker-manager.ts

  • Pass OPENAI_API_TARGET / ANTHROPIC_API_TARGET to the api-proxy container environment when set
  • Add debug log lines for both new targets
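Passing the variables "when set" matters for backward compatibility: if neither target is configured, the proxy container sees no env vars and falls back to its own defaults. A sketch of that conditional forwarding (`buildProxyEnv` is a hypothetical helper name):

```javascript
// Build the api-proxy container's environment from the wrapper config.
// Variables are only set when the user supplied a target, so the proxy's
// own defaults (api.openai.com / api.anthropic.com) still apply otherwise.
function buildProxyEnv(config) {
  const env = {};
  if (config.openaiApiTarget) env.OPENAI_API_TARGET = config.openaiApiTarget;
  if (config.anthropicApiTarget) env.ANTHROPIC_API_TARGET = config.anthropicApiTarget;
  return env;
}
```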

src/types.ts

  • Add openaiApiTarget?: string to WrapperConfig
  • Add anthropicApiTarget?: string to WrapperConfig

Usage

# Route OpenAI/Codex requests to an internal LLM router
sudo awf --enable-api-proxy \
  --openai-api-target llm-router.internal.example.com \
  --allow-domains llm-router.internal.example.com \
  -- codex exec --dangerously-bypass-approvals-and-sandbox 'do something'

# Route Anthropic/Claude requests to an internal LLM router
sudo awf --enable-api-proxy \
  --anthropic-api-target llm-router.internal.example.com \
  --allow-domains llm-router.internal.example.com \
  -- claude --print 'do something'

Or via environment variables:

export OPENAI_API_TARGET=llm-router.internal.example.com
sudo awf --enable-api-proxy --allow-domains llm-router.internal.example.com -- codex exec ...

Backward compatibility

Fully backward compatible. Both flags are optional and default to the existing hardcoded values (api.openai.com and api.anthropic.com). No existing behaviour changes.

Copilot AI review requested due to automatic review settings March 11, 2026 21:24
Contributor

Copilot AI left a comment


Pull request overview

Adds CLI/config support for overriding the upstream OpenAI/Anthropic API hostnames used by the AWF API-proxy sidecar, enabling routing to internal/custom LLM endpoints while keeping sandboxing/credential isolation intact.

Changes:

  • Introduces --openai-api-target and --anthropic-api-target CLI flags (with OPENAI_API_TARGET / ANTHROPIC_API_TARGET env fallbacks).
  • Passes the new target values into the api-proxy container environment via docker-compose generation.
  • Updates the Node.js api-proxy sidecar to forward OpenAI/Anthropic requests to configurable targets (and logs them on startup).

Reviewed changes

Copilot reviewed 4 out of 4 changed files in this pull request and generated 4 comments.

File | Description
---- | -----------
src/types.ts | Adds openaiApiTarget / anthropicApiTarget to WrapperConfig with documentation.
src/cli.ts | Adds CLI flags and wires them into the wrapper config with env fallbacks.
src/docker-manager.ts | Propagates targets to the api-proxy container env; adds debug logging.
containers/api-proxy/server.js | Reads target env vars, logs them, and uses them for proxying.


feat: add --openai-api-target and --anthropic-api-target flags for custom LLM endpoints

Adds support for routing OpenAI and Anthropic API proxy traffic to custom
endpoints, enabling use with internal LLM routers, Azure OpenAI, and other
OpenAI/Anthropic-compatible APIs.

Follows the existing --copilot-api-target pattern.

Changes:
- cli.ts: Add --openai-api-target and --anthropic-api-target CLI flags
- types.ts: Add openaiApiTarget and anthropicApiTarget to WrapperConfig
- docker-manager.ts: Pass OPENAI_API_TARGET / ANTHROPIC_API_TARGET to proxy container
- containers/api-proxy/server.js: Read OPENAI_API_TARGET / ANTHROPIC_API_TARGET
  env vars (defaulting to api.openai.com / api.anthropic.com) instead of hardcoding

Closes: github/gh-aw#20590
Rubyj (Author) commented Mar 12, 2026

Looks like this will be covered in #1249
