feat: add --openai-api-target and --anthropic-api-target flags for custom LLM endpoints #1247
Closed
Rubyj wants to merge 1 commit into github:main from
Conversation
Contributor
Pull request overview
Adds CLI/config support for overriding the upstream OpenAI/Anthropic API hostnames used by the AWF API-proxy sidecar, enabling routing to internal/custom LLM endpoints while keeping sandboxing/credential isolation intact.
Changes:
- Introduces `--openai-api-target` and `--anthropic-api-target` CLI flags (with `OPENAI_API_TARGET`/`ANTHROPIC_API_TARGET` env fallbacks).
- Passes the new target values into the api-proxy container environment via docker-compose generation.
- Updates the Node.js api-proxy sidecar to forward OpenAI/Anthropic requests to configurable targets (and logs them on startup).
Reviewed changes
Copilot reviewed 4 out of 4 changed files in this pull request and generated 4 comments.
| File | Description |
|---|---|
| `src/types.ts` | Adds `openaiApiTarget` / `anthropicApiTarget` to `WrapperConfig` with documentation. |
| `src/cli.ts` | Adds CLI flags and wires them into the wrapper config with env fallbacks. |
| `src/docker-manager.ts` | Propagates targets to the api-proxy container env; adds debug logging. |
| `containers/api-proxy/server.js` | Reads target env vars, logs them, and uses them for proxying. |
Force-pushed from f804f03 to af8eea0
…stom LLM endpoints

Adds support for routing OpenAI and Anthropic API proxy traffic to custom endpoints, enabling use with internal LLM routers, Azure OpenAI, and other OpenAI/Anthropic-compatible APIs. Follows the existing --copilot-api-target pattern.

Changes:
- cli.ts: Add --openai-api-target and --anthropic-api-target CLI flags
- types.ts: Add openaiApiTarget and anthropicApiTarget to WrapperConfig
- docker-manager.ts: Pass OPENAI_API_TARGET / ANTHROPIC_API_TARGET to proxy container
- containers/api-proxy/server.js: Read OPENAI_API_TARGET / ANTHROPIC_API_TARGET env vars (defaulting to api.openai.com / api.anthropic.com) instead of hardcoding

Closes: github/gh-aw#20590
Force-pushed from af8eea0 to a502b11
Author
Looks like this will be covered in #1249
Summary
Adds `--openai-api-target <host>` and `--anthropic-api-target <host>` CLI flags to the AWF, enabling the API proxy sidecar to forward requests to custom/internal LLM endpoints instead of the hardcoded `api.openai.com`/`api.anthropic.com` defaults.

This follows the existing `--copilot-api-target` pattern already implemented for GitHub Enterprise Server support.

Closes #20590 (reported via github/gh-aw#20590)
Problem
When users configure a custom `OPENAI_BASE_URL` or `ANTHROPIC_BASE_URL` (e.g., an internal corporate LLM router) in their gh-aw workflow's `engine.env`, the AWF API proxy ignores it. The proxy overrides `OPENAI_BASE_URL` inside the agent container to point to its internal sidecar, then forwards all requests to the hardcoded `api.openai.com`. The custom endpoint is never reached, resulting in 401 errors when the internal API key is sent to OpenAI's public API.

The only workaround was to disable the entire AWF sandbox (`sandbox.agent: false`), which removes firewall protection and credential isolation.

Changes
`containers/api-proxy/server.js`
- Reads the `OPENAI_API_TARGET` env var (defaults to `api.openai.com`)
- Reads the `ANTHROPIC_API_TARGET` env var (defaults to `api.anthropic.com`)

`src/cli.ts`
- Adds the `--openai-api-target <host>` option
- Adds the `--anthropic-api-target <host>` option
- Env var fallbacks (`OPENAI_API_TARGET` / `ANTHROPIC_API_TARGET`)

`src/docker-manager.ts`
- Passes `OPENAI_API_TARGET` / `ANTHROPIC_API_TARGET` to the api-proxy container environment when set

`src/types.ts`
- Adds `openaiApiTarget?: string` to `WrapperConfig`
- Adds `anthropicApiTarget?: string` to `WrapperConfig`

Usage
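The original usage snippet did not survive extraction; a hypothetical invocation is sketched below. The `awf` binary name, the hostname, and the trailing agent command are illustrative assumptions — only the two target flags come from this PR.

```shell
# Hypothetical example: route both providers through an internal LLM router.
awf \
  --openai-api-target llm-router.internal.example.com \
  --anthropic-api-target llm-router.internal.example.com \
  -- <agent command>
```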
Or via environment variables:
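The env-var form's snippet was also lost in extraction; a hypothetical equivalent, with the same assumed binary name and hostname:

```shell
# Hypothetical example: same effect via the documented env fallbacks.
OPENAI_API_TARGET=llm-router.internal.example.com \
ANTHROPIC_API_TARGET=llm-router.internal.example.com \
awf -- <agent command>
```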
Backward compatibility
Fully backward compatible. Both flags are optional and default to the existing hardcoded values (`api.openai.com` and `api.anthropic.com`). No existing behaviour changes.
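The default-fallback behaviour described above can be sketched in a few lines of Node.js. This is a minimal illustration, not the actual `server.js` code; `resolveTarget` and the log format are hypothetical names.

```javascript
// Sketch of the env-var fallback pattern: use the override when set,
// otherwise fall back to the public API hostname.
function resolveTarget(envValue, fallback) {
  return envValue && envValue.trim() !== "" ? envValue.trim() : fallback;
}

const openaiTarget = resolveTarget(process.env.OPENAI_API_TARGET, "api.openai.com");
const anthropicTarget = resolveTarget(process.env.ANTHROPIC_API_TARGET, "api.anthropic.com");

// The PR notes the sidecar logs its targets on startup; format here is illustrative.
console.log(`api-proxy: OpenAI target = ${openaiTarget}`);
console.log(`api-proxy: Anthropic target = ${anthropicTarget}`);
```

When neither variable is set, both targets resolve to the existing hardcoded hosts, which is why the change is backward compatible.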