Description
I'm trying to use ccproxy on Windows via WSL2 (Ubuntu) to proxy Claude Code requests. The proxy starts successfully and Windows can connect to it, but all requests to Anthropic models fail with 403 Forbidden - Request not allowed.
Environment
- OS: Windows 11 with WSL2 (Ubuntu 24.04)
- ccproxy version: 1.2.0 (installed via `uv tool install claude-ccproxy --with "litellm[proxy]"`)
- Claude Code: Running on Windows, connecting to the proxy in WSL
Setup
Claude Code is installed on Windows. I'm running ccproxy in WSL and setting `ANTHROPIC_BASE_URL=http://localhost:4000` in Windows before running Claude Code.
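WSL2 forwards localhost traffic from Windows into the VM, so no extra networking should be needed; a quick reachability check from the Windows side (curl.exe ships with current Windows 10/11):

```bash
# From Windows CMD or PowerShell: hits the proxy running inside WSL
curl.exe http://localhost:4000/health
```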
ccproxy.yaml
```yaml
ccproxy:
  debug: true
  # OAuth token - path to WINDOWS credentials from WSL
  oat_sources:
    anthropic: "jq -r '.claudeAiOauth.accessToken' /mnt/c/Users/neil_/.claude/.credentials.json"
  hooks:
    - ccproxy.hooks.rule_evaluator
    - ccproxy.hooks.model_router
    - ccproxy.hooks.forward_oauth
  rules:
    - name: sonnet_to_glm
      rule: ccproxy.rules.MatchModelRule
      params:
        - model_name: claude-sonnet-4-5-20250929
    - name: sonnet_old_to_glm
      rule: ccproxy.rules.MatchModelRule
      params:
        - model_name: claude-3-5-sonnet-20241022
    - name: haiku_to_glm
      rule: ccproxy.rules.MatchModelRule
      params:
        - model_name: claude-haiku-4-5-20251001
    - name: haiku_old_to_glm
      rule: ccproxy.rules.MatchModelRule
      params:
        - model_name: claude-3-5-haiku-20241022
    - name: opus
      rule: ccproxy.rules.MatchModelRule
      params:
        - model_name: claude-opus-4-5-20251101
    - name: opus
      rule: ccproxy.rules.ThinkingRule

litellm:
  host: 0.0.0.0
  port: 4000
  num_workers: 4
  debug: true
```
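For what it's worth, the command configured under oat_sources works when run by hand from WSL, so the Windows credentials file is readable across the /mnt/c mount:

```bash
# Same command as oat_sources.anthropic; prints only the token prefix
jq -r '.claudeAiOauth.accessToken' /mnt/c/Users/neil_/.claude/.credentials.json | cut -c1-13
# Expected: sk-ant-oat01-
```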
config.yaml

```yaml
model_list:
  - model_name: sonnet_to_glm
    litellm_params:
      model: openai/glm-4.7
      api_base: https://api.z.ai/v1
      api_key: "REDACTED"
  - model_name: sonnet_old_to_glm
    litellm_params:
      model: openai/glm-4.7
      api_base: https://api.z.ai/v1
      api_key: "REDACTED"
  - model_name: haiku_to_glm
    litellm_params:
      model: openai/glm-4.7
      api_base: https://api.z.ai/v1
      api_key: "REDACTED"
  - model_name: haiku_old_to_glm
    litellm_params:
      model: openai/glm-4.7
      api_base: https://api.z.ai/v1
      api_key: "REDACTED"
  - model_name: opus
    litellm_params:
      model: anthropic/claude-opus-4-5-20251101
      api_base: https://api.anthropic.com
  - model_name: default
    litellm_params:
      model: anthropic/claude-opus-4-5-20251101
      api_base: https://api.anthropic.com

litellm_settings:
  callbacks:
    - ccproxy.handler

general_settings:
  forward_client_headers_to_llm_api: true
```
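Since ccproxy drives a LiteLLM proxy underneath, the OpenAI-compatible model listing should reflect this config; a sanity check I'd expect to work (assuming no master_key is configured, so no auth header is needed):

```bash
# Ask the proxy which model groups it has registered
curl http://localhost:4000/v1/models
```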
Steps to Reproduce
- Install ccproxy in WSL: `uv tool install claude-ccproxy --with "litellm[proxy]"`
- Install jq: `sudo apt install jq`
- Run `ccproxy install` and configure as above
- Start the proxy: `ccproxy start --detach`
- Verify the OAuth token is readable:

  ```bash
  jq -r '.claudeAiOauth.accessToken' /mnt/c/Users/neil_/.claude/.credentials.json
  # Returns valid token: sk-ant-oat01-...
  ```

- From Windows CMD:

  ```
  set ANTHROPIC_BASE_URL=http://localhost:4000
  claude -p "say hi"
  ```
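To take Claude Code out of the picture, the proxy's messages endpoint can also be exercised directly from WSL; this mirrors the request path the logs show (`POST /v1/messages`), though I haven't captured its output separately:

```bash
# Direct request to the proxy, bypassing Claude Code entirely
curl -X POST http://localhost:4000/v1/messages \
  -H "Content-Type: application/json" \
  -H "anthropic-version: 2023-06-01" \
  -d '{"model":"claude-opus-4-5-20251101","max_tokens":100,"messages":[{"role":"user","content":"hi"}]}'
```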
Error
```
API Error: 403 {"error":{"message":"{\n \"error\": {\n \"type\": \"forbidden\",\n \"message\": \"Request not allowed\"\n }\n}. Received Model Group=anthropic/claude-opus-4-5-20251101\nAvailable Model Group Fallbacks=None","type":"None","param":"None","code":"403"}} · Please run /login
```
Observations
- The OAuth token is successfully loaded at startup (no errors in logs)
- `ccproxy status` shows all models configured correctly
- `curl http://localhost:4000/health` works from both WSL and Windows
- The proxy receives requests (logs show `POST /v1/messages?beta=true HTTP/1.1" 403 Forbidden`)
- Manually testing the OAuth token with curl also returns 403:

```bash
TOKEN=$(jq -r '.claudeAiOauth.accessToken' /mnt/c/Users/neil_/.claude/.credentials.json)
curl -X POST https://api.anthropic.com/v1/messages \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -H "anthropic-version: 2023-06-01" \
  -d '{"model":"claude-sonnet-4-5-20250929","max_tokens":100,"messages":[{"role":"user","content":"hi"}]}'
# Returns: {"error":{"type":"forbidden","message":"Request not allowed"}}
```
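One thing I couldn't rule out: Claude Code may send OAuth/beta-specific headers that my manual curl omits, and the token may only be accepted alongside them. The header value below is an assumption on my part, not something I've confirmed:

```bash
# Assumed variant: same request plus an OAuth beta header (the header value is a guess)
curl -X POST https://api.anthropic.com/v1/messages \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -H "anthropic-version: 2023-06-01" \
  -H "anthropic-beta: oauth-2025-04-20" \
  -d '{"model":"claude-sonnet-4-5-20250929","max_tokens":100,"messages":[{"role":"user","content":"hi"}]}'
```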
Questions
- Is Windows/WSL officially supported? The CLI has a bug where it looks for `litellm` instead of `litellm.exe` on Windows, which is what led me to try WSL.
- Is the `forward_oauth` hook supposed to forward the token from Claude Code's request headers, or to use the token loaded from `oat_sources`?
- Do I need additional configuration for the OAuth token to be forwarded correctly when Claude Code is on Windows and ccproxy is in WSL?
- Could there be an issue with how the OAuth token itself works - does it need to be used from the same session/machine that generated it?
Feature Request
It would be great to have official Windows support, either natively or via documented WSL setup instructions.