A proxy API server that lets you use GitHub Copilot in Xcode, either as a custom model provider or as the backend for Claude Agent and Codex Agent.
Xcode 26 added support for third-party LLM providers, but it only supports ChatGPT and Claude out of the box. If you have a GitHub Copilot subscription, there's no built-in way to use it.
This server wraps the GitHub Copilot SDK and exposes it as an API that Xcode can talk to. It's built on copilot-sdk-proxy, which handles the SDK integration and protocol translation. It supports three providers:
- OpenAI (default): Exposes an OpenAI-compatible completions API so Xcode can use Copilot as a custom model provider. Xcode handles tool execution directly.
- Claude: Exposes an Anthropic-compatible API so Xcode can use Copilot as the backend for Claude Agent. A built-in tool bridge intercepts tool calls and routes them back to Xcode for execution.
- Codex: Exposes an OpenAI Responses-compatible API so Xcode can use Copilot as the backend for Codex Agent. Same tool bridge as Claude.
In OpenAI mode, the server also connects to Xcode's built-in MCP tools (via `xcrun mcpbridge`), giving Copilot access to your project's build logs, indexes, and other context. This requires Xcode 26.3 or later. Claude and Codex handle MCP internally through their own agents.
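Under the hood, the OpenAI provider speaks the standard OpenAI chat-completions wire format, so you can exercise it outside Xcode. A minimal sketch, assuming the server is running on the default port 8080 (the model name and the `Xcode/26.0` user-agent string are illustrative; the server rejects unexpected user agents, see the security notes):

```shell
# Build a request in the standard OpenAI chat-completions format.
cat > /tmp/copilot-req.json <<'EOF'
{
  "model": "gpt-4o",
  "messages": [
    {"role": "user", "content": "Summarize what a Swift actor is in one sentence."}
  ],
  "stream": false
}
EOF

# Validate the payload locally, then (with the server running) send it:
python3 -m json.tool /tmp/copilot-req.json > /dev/null && echo "payload ok"
# curl -s http://localhost:8080/v1/chat/completions \
#   -H 'Content-Type: application/json' -A 'Xcode/26.0' \
#   -d @/tmp/copilot-req.json
```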
You need Node.js 25.6.0 or later and a GitHub Copilot subscription.
1. Authenticate with one of these (you only need one):
   ```shell
   copilot login   # Copilot CLI
   gh auth login   # GitHub CLI
   ```

   Or set a `GITHUB_TOKEN` environment variable with a valid fine-grained Copilot access token.
2. Install:
   ```shell
   npm install -g xcode-copilot-server
   ```

3. Start the server:
   ```shell
   xcode-copilot-server
   ```

The server runs in auto mode by default, which registers all three providers and auto-patches Claude and Codex settings. Pick the provider you want to use in Xcode and follow the setup below.
> [!TIP]
> If you only need one provider, use `--proxy` to run a single provider instead (e.g. `xcode-copilot-server --proxy claude`). See CLI reference for details.
### OpenAI (custom model provider)
- In Xcode, go to Settings > Intelligence > Add a provider
- Select "Locally hosted" and set the port to 8080 (or whatever port you chose)
- Give it a description, e.g. "Copilot", and save
To enable tool calling, select the provider and enable "Allow tools" under "Advanced". To connect Xcode's MCP tools (Xcode 26.3+), enable "Xcode Tools" under "Model Context Protocol".
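To sanity-check the provider outside Xcode, you can hit the models route directly. A small sketch, assuming the default port; the `Xcode/26.0` user-agent string is illustrative (the server expects an `Xcode/` prefix, see the security notes):

```shell
# List the models the proxy exposes, or report that the server is down.
curl -s --max-time 2 -A 'Xcode/26.0' http://localhost:8080/v1/models \
  || echo "server not reachable on port 8080"
```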
### Claude (Claude Agent)
- In Xcode, go to Settings > Intelligence > Anthropic > Claude Agent
- Enable Claude Agent and sign in with an API key (the key can be any random text, since calls are proxied through the server)
The tool bridge is enabled by default (toolBridge: true in the config). It intercepts tool calls from the Copilot session and forwards them to Xcode, so Claude Agent can read files, search code, and make edits through the IDE.
### Codex (Codex Agent)
- In Xcode, go to Settings > Intelligence > OpenAI > Codex Agent
- Enable Codex Agent and sign in with an API key (the key can be any random text, since calls are proxied through the server)
You might need to restart Xcode so it picks up the new environment variables.
The tool bridge works the same way as Claude, intercepting tool calls and routing them back to Xcode for execution.
> [!TIP]
> If you want to run the server in the background with automatic start/stop, see Launchd agent below.
The server reads its configuration from a config.json5 file. By default, it uses the bundled one, but you can point to your own with `--config`:

```shell
xcode-copilot-server --config ./my-config.json5
```

The config file uses JSON5 format, which supports comments and trailing commas. In auto mode, all three provider sections are active. In single-provider mode (`--proxy`), only the specified section is used:
```json5
{
  openai: {
    // No tool bridge needed, as Xcode drives tool execution directly.
    toolBridge: false,
    mcpServers: {
      // Proxies Apple's xcrun mcpbridge (Xcode 26.3+).
      xcode: {
        type: "local",
        command: "node",
        args: ["./scripts/mcpbridge-proxy.mjs"],
        allowedTools: ["*"],
      },
    },
  },
  claude: {
    // Intercepts tool calls and forwards them to Xcode so Claude Agent
    // drives tool execution through the IDE instead of the Copilot CLI.
    toolBridge: true,
    // No MCP servers needed, as Claude Agent handles tools natively.
    mcpServers: {},
  },
  codex: {
    // Same as Claude: intercepts tool calls and forwards them to Xcode
    // so Codex drives tool execution through the IDE.
    toolBridge: true,
    mcpServers: {},
  },
  // Built-in CLI tools allowlist.
  // ["*"] to allow all, [] to deny all, or a list of specific tool names.
  //
  // Empty by default so Xcode can handle all operations (search, read, edit)
  // through its UI. Enabling CLI tools lets the Copilot session perform
  // those operations directly, bypassing Xcode.
  allowedCliTools: [],
  // Maximum request body size in MiB.
  bodyLimitMiB: 10,
  // Filename patterns to filter out from search results in the prompt.
  //
  // Xcode can include full file contents for every search match, so add patterns
  // here to strip files that bloat the prompt (e.g. ["mock", "generated"]).
  excludedFilePatterns: [],
  // Reasoning effort for models that support it: "low", "medium", "high", "xhigh".
  reasoningEffort: "xhigh",
  // Auto-approve permission requests.
  // true to approve all, false to deny all,
  // or an array of kinds: "read", "write", "shell", "mcp", "url".
  autoApprovePermissions: ["read", "mcp"],
}
```

In auto mode, Claude and Codex settings are patched on startup and restored on shutdown. For Claude, the server creates (or updates) `settings.json` at `~/Library/Developer/Xcode/CodingAssistant/ClaudeAgentConfig/` to point to the server. For Codex, `OPENAI_BASE_URL` and `OPENAI_API_KEY` are set via `launchctl setenv` so Xcode (and any Codex process it spawns) can reach the server.
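For example, you could extend the `openai` section with an extra MCP server next to the Xcode bridge. The `files` entry below is purely illustrative (it uses the public `@modelcontextprotocol/server-filesystem` package, and the tool names are that server's, not this project's); since it isn't documented here whether a partial config merges with the bundled defaults, the safe route is to start from a copy of the full bundled config:

```json5
openai: {
  toolBridge: false,
  mcpServers: {
    // Apple's Xcode bridge, as in the bundled config.
    xcode: {
      type: "local",
      command: "node",
      args: ["./scripts/mcpbridge-proxy.mjs"],
      allowedTools: ["*"],
    },
    // Illustrative extra server: read-only filesystem access to one directory.
    files: {
      type: "local",
      command: "npx",
      args: ["-y", "@modelcontextprotocol/server-filesystem", "/Users/me/Projects"],
      allowedTools: ["read_file", "list_directory"],
    },
  },
},
```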
OpenAI doesn't need any settings patching since it uses Xcode's built-in locally hosted provider support.
If you'd rather configure settings yourself instead of using auto-patch, see the manual steps below.
### Claude

Create `settings.json` at `~/Library/Developer/Xcode/CodingAssistant/ClaudeAgentConfig/`:

```json
{
  "env": {
    "ANTHROPIC_BASE_URL": "http://localhost:8080",
    "ANTHROPIC_AUTH_TOKEN": "12345"
  }
}
```

Set the port to match your `--port` flag (default 8080). The auth token can be any non-empty string.
You can also use the patch-settings and restore-settings subcommands to patch or restore settings without starting the server:
```shell
xcode-copilot-server patch-settings --proxy claude --port 8080
xcode-copilot-server restore-settings --proxy claude
```

### Codex
Set the environment variables via launchctl:
```shell
launchctl setenv OPENAI_BASE_URL http://localhost:8080/v1
launchctl setenv OPENAI_API_KEY 12345
```

Set the port to match your `--port` flag (default 8080). The API key can be any non-empty string. To restore the original values when you're done:
```shell
launchctl unsetenv OPENAI_BASE_URL
launchctl unsetenv OPENAI_API_KEY
```

You can also use the `patch-settings` and `restore-settings` subcommands to do this without starting the server:
```shell
xcode-copilot-server patch-settings --proxy codex --port 8080
xcode-copilot-server restore-settings --proxy codex
```

## Agent skills

Agent skills are an open standard for adding custom instructions and file context to AI coding agents. All three providers support them through the Copilot SDK session, and each agent has its own skill paths:
| Agent | Project skills | Personal skills |
|---|---|---|
| Copilot | `.github/skills/`, `.claude/skills/` | `~/.copilot/skills/` |
| Claude | `.claude/skills/` | `~/.claude/skills/` |
| Codex | `.codex/skills/` | `~/.codex/skills/` |
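A skill is a directory containing a `SKILL.md` with YAML frontmatter; agents typically load the body only when the description matches the task at hand. A minimal project-skill sketch (the name, description, and rules are illustrative):

```markdown
<!-- .github/skills/swift-style/SKILL.md -->
---
name: swift-style
description: House Swift style rules. Use when writing or reviewing Swift code.
---

- Prefer `struct` over `class` unless reference semantics are required.
- Use `guard` for early exits instead of deeply nested `if let`.
```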
## Launchd agent

Instead of starting the server manually every time, you can install it as a launchd agent. This uses macOS socket activation, so the server starts automatically when something connects to the port (e.g. when Xcode sends its first request), and you don't need to keep a terminal open.
```shell
xcode-copilot-server install-agent
```

This writes a plist to `~/Library/LaunchAgents/` and loads it with `launchctl`. The agent is set up with socket activation on the specified port, so launchd owns the socket and starts the server on demand. Settings are patched at install time.
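The socket-activation piece of that plist uses launchd's standard `Sockets` key, roughly like this (the exact layout the server generates may differ; this is a sketch of the launchd schema):

```xml
<key>Sockets</key>
<dict>
  <key>Listener</key>
  <dict>
    <!-- launchd binds this address/port and starts the server on first connect -->
    <key>SockNodeName</key>
    <string>127.0.0.1</string>
    <key>SockServiceName</key>
    <string>8080</string>
  </dict>
</dict>
```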
The `install-agent` subcommand accepts the same options as the main command (`--port`, `--proxy`, `--log-level`, `--config`, `--cwd`), plus `--idle-timeout`, which defaults to 60 minutes for the agent. After 60 minutes with no requests, the server shuts itself down. The next incoming connection starts it again automatically.
Server logs go to `~/Library/Logs/xcode-copilot-server.out.log` and `~/Library/Logs/xcode-copilot-server.err.log`.
To remove it:

```shell
xcode-copilot-server uninstall-agent
```

This unloads the agent, deletes the plist, and restores any patched settings if the agent was installed with `--auto-patch`.
Launchd creates a socket on the configured port and waits. When a connection comes in (e.g. Xcode sends a request), launchd starts the server process and hands over the socket. The server handles requests as normal.
If the server crashes, launchd doesn't restart it immediately, but the next incoming connection will start a fresh process. If `--idle-timeout` is set (it defaults to 60 minutes for the agent), the server exits after that many minutes of inactivity, and launchd starts it again on the next connection.
To check if the agent is loaded:
```shell
launchctl list | grep xcode-copilot-server
```

## CLI reference

```
xcode-copilot-server [options]

Options:
  -p, --port <number>        Port to listen on (default: 8080)
  --proxy <provider>         API format: openai, claude, codex (default: auto)
  -l, --log-level <level>    Log verbosity (default: info)
  -c, --config <path>        Path to config file
  --cwd <path>               Working directory for Copilot sessions
  --auto-patch               Auto-patch settings on start, restore on exit (implicit in auto mode)
  --idle-timeout <minutes>   Shut down after N minutes of inactivity (default: 0, disabled)
  -v, --version              Output the version number
  -h, --help                 Show help

Commands:
  patch-settings             Patch provider settings and exit (all providers unless --proxy is set)
  restore-settings           Restore provider settings from backup and exit
  install-agent              Install a launchd agent with socket activation
  uninstall-agent            Uninstall the launchd agent and restore settings
```
By default, the server runs in auto mode and registers all providers. Use `--proxy` to run a single provider instead:
| Provider | Flag | Routes |
|---|---|---|
| OpenAI | `--proxy openai` | `GET /v1/models`, `POST /v1/chat/completions` |
| Claude | `--proxy claude` | `POST /v1/messages`, `POST /v1/messages/count_tokens` |
| Codex | `--proxy codex` | `POST /v1/responses` |
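The Claude routes speak Anthropic's Messages API shape. A sketch of a token-count request, assuming the default port (the model name is illustrative, and the `claude-cli/1.0` user agent mimics the client the server expects, see the security notes):

```shell
# Anthropic-style token-count payload for the Claude route.
cat > /tmp/count-req.json <<'EOF'
{
  "model": "claude-sonnet-4",
  "messages": [{"role": "user", "content": "Refactor this view model to use async/await."}]
}
EOF

python3 -m json.tool /tmp/count-req.json > /dev/null && echo "payload ok"
# With the server running:
# curl -s http://localhost:8080/v1/messages/count_tokens \
#   -H 'Content-Type: application/json' -A 'claude-cli/1.0' \
#   -d @/tmp/count-req.json
```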
This server acts as a local proxy between Xcode and GitHub Copilot. It runs on your machine and isn't meant for the internet or shared networks.
- The server binds to `127.0.0.1`, so it's only reachable from your machine. Incoming requests are checked for expected user-agent headers (`Xcode/` for OpenAI and Codex, `claude-cli/` for Claude), which means casual or accidental connections from other tools will be rejected. This isn't a strong security boundary since user-agent headers can be trivially spoofed, but it means only the expected client talks to the server under normal use.
- The bundled config sets `autoApprovePermissions` to `["read", "mcp"]`, which lets the Copilot session read files and call MCP tools without prompting. Writes, shell commands, and URL fetches are denied by default. You can set it to `true` to approve everything, `false` to deny everything, or pick specific kinds from `"read"`, `"write"`, `"shell"`, `"mcp"`, and `"url"`.
- MCP servers defined in the config are spawned as child processes. The bundled config uses `xcrun mcpbridge`, which is an Apple-signed binary. If you add your own MCP servers, make sure you trust the commands you're configuring.
- When you use `install-agent`, the generated plist file includes your `PATH` in cleartext so the agent can find Node.js. The file is written to `~/Library/LaunchAgents/`, which is only readable by your user account by default.
> [!NOTE]
> If you authenticated with a `GITHUB_TOKEN` environment variable, that token is also embedded in the plist. If you'd rather not have a token in the plist, use `gh auth login` or `copilot login` instead.
MIT License
Copyright (c) 2026 Suyash Srijan
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
