
feat: emit --openai-api-target / --anthropic-api-target when OPENAI/ANTHROPIC_BASE_URL is set in engine.env#20596

Closed
Rubyj wants to merge 1 commit into github:main from Rubyj:feat/custom-openai-anthropic-api-target

Conversation


@Rubyj Rubyj commented Mar 11, 2026

Summary

When a workflow sets a custom OPENAI_BASE_URL or ANTHROPIC_BASE_URL in engine.env (e.g., to point at an internal LLM router), the compiler now extracts the hostname and emits the corresponding --openai-api-target / --anthropic-api-target flag in the AWF command. The API proxy sidecar then forwards requests to the correct upstream.

Depends on github/gh-aw-firewall#1247 which adds --openai-api-target and --anthropic-api-target to AWF.

Closes #20590

Problem

When users configure a custom endpoint in engine.env:

engine:
  id: codex
  env:
    OPENAI_BASE_URL: "https://llm-router.internal.example.com/v1"
    OPENAI_API_KEY: ${{ secrets.LLM_KEY }}

The AWF API proxy overrides OPENAI_BASE_URL inside the agent container to point to its internal sidecar, then forwards requests to the hardcoded api.openai.com. The custom endpoint is never used, and requests fail with 401 because the internal key is rejected by OpenAI's public API.

Changes

pkg/workflow/awf_helpers.go

In BuildAWFArgs(), after the --enable-api-proxy flag is appended:

  1. Check engine.env for OPENAI_BASE_URL — if set, parse the hostname and emit --openai-api-target <hostname>
  2. Check engine.env for ANTHROPIC_BASE_URL — if set, parse the hostname and emit --anthropic-api-target <hostname>

Added extractHostname(rawURL string) string helper that uses net/url to safely parse the URL and return just the hostname.
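The helper and flag emission described above could look roughly like the following. This is a hedged sketch, not the PR's actual code: appendAPITargets is a hypothetical stand-in for the logic added inside BuildAWFArgs, and the engine.env map type is assumed.

```go
package main

import (
	"fmt"
	"net/url"
)

// extractHostname parses rawURL with net/url and returns only the
// hostname, or "" if the URL cannot be parsed.
func extractHostname(rawURL string) string {
	u, err := url.Parse(rawURL)
	if err != nil {
		return ""
	}
	return u.Hostname()
}

// appendAPITargets is a hypothetical sketch of the logic added to
// BuildAWFArgs: for each base-URL env var present in engine.env,
// emit the matching --*-api-target flag with the extracted hostname.
func appendAPITargets(args []string, engineEnv map[string]string) []string {
	targets := []struct{ envVar, flag string }{
		{"OPENAI_BASE_URL", "--openai-api-target"},
		{"ANTHROPIC_BASE_URL", "--anthropic-api-target"},
	}
	for _, t := range targets {
		if raw, ok := engineEnv[t.envVar]; ok {
			if host := extractHostname(raw); host != "" {
				args = append(args, t.flag, host)
			}
		}
	}
	return args
}

func main() {
	env := map[string]string{
		"OPENAI_BASE_URL": "https://llm-router.internal.example.com/v1",
	}
	args := appendAPITargets([]string{"--enable-api-proxy"}, env)
	fmt.Println(args)
	// [--enable-api-proxy --openai-api-target llm-router.internal.example.com]
}
```

Note that when neither env var is set, appendAPITargets returns args unchanged, which matches the backward-compatibility claim below.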

Example

Given this frontmatter:

engine:
  id: codex
  env:
    OPENAI_BASE_URL: "https://llm-router.internal.example.com/v1"

The compiled awf command will now include:

sudo -E awf --env-all ... --enable-api-proxy --openai-api-target llm-router.internal.example.com ...

Backward compatibility

Fully backward compatible. The new flags are only emitted when OPENAI_BASE_URL / ANTHROPIC_BASE_URL are present in engine.env. Workflows without these env vars produce identical compiled output.

Copilot AI review requested due to automatic review settings March 11, 2026 21:30

Copilot AI left a comment


Pull request overview

This PR updates AWF command compilation so that when a workflow sets custom OPENAI_BASE_URL and/or ANTHROPIC_BASE_URL in engine.env, the compiler extracts the hostname and emits --openai-api-target / --anthropic-api-target so the AWF API proxy sidecar forwards to the intended upstream.

Changes:

  • Parse OPENAI_BASE_URL / ANTHROPIC_BASE_URL from engine.env and emit --openai-api-target / --anthropic-api-target with the extracted hostname.
  • Add extractHostname(rawURL string) helper using net/url parsing.


…NTHROPIC_BASE_URL

When engine.env sets OPENAI_BASE_URL or ANTHROPIC_BASE_URL pointing to a
custom endpoint, extract the hostname and pass it as --openai-api-target /
--anthropic-api-target to the AWF command.

This allows the API proxy sidecar to forward requests to internal LLM routers
or other custom OpenAI/Anthropic-compatible endpoints, rather than the hardcoded
api.openai.com / api.anthropic.com defaults.

Depends on gh-aw-firewall PR github#1247 which adds the --openai-api-target and
--anthropic-api-target flags to AWF.

Closes: github#20590
@Rubyj Rubyj force-pushed the feat/custom-openai-anthropic-api-target branch from 54d2fdc to 4734c05 on March 11, 2026 21:53
@pelikhan pelikhan (Contributor) commented:

File an issue thanks.

@pelikhan pelikhan closed this Mar 11, 2026
Rubyj commented Mar 11, 2026

> File an issue thanks.

I did! After reading the community guidelines I added agentic instructions here #20590 (comment)


Linked issue: API proxy does not forward to custom OPENAI_BASE_URL / ANTHROPIC_BASE_URL endpoints (e.g., internal LLM routers)