support for windows and GLM 4.7 config question #8

@neilhuang007

Description


I'm trying to use ccproxy on Windows via WSL2 (Ubuntu) to proxy Claude Code requests. The proxy starts successfully and Windows can connect to it, but every request routed to an Anthropic model fails with 403 Forbidden ("Request not allowed").

Environment

  • OS: Windows 11 with WSL2 (Ubuntu 24.04)
  • ccproxy version: 1.2.0 (installed via uv tool install claude-ccproxy --with "litellm[proxy]")
  • Claude Code: Running on Windows, connecting to proxy in WSL

Setup

Claude Code is installed on Windows. I'm running ccproxy in WSL and setting ANTHROPIC_BASE_URL=http://localhost:4000 in Windows before running Claude Code.
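
WSL2 normally forwards localhost ports to Windows automatically, but that forwarding can silently break. If it does, a generic WSL workaround (not something from the ccproxy docs) is to point Claude Code at the WSL adapter's IP directly:

    :: From Windows CMD: print the WSL adapter IP(s)
    wsl hostname -I

    :: Substitute the printed IP for <wsl-ip> if localhost forwarding is broken
    set ANTHROPIC_BASE_URL=http://<wsl-ip>:4000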

ccproxy.yaml

ccproxy:
  debug: true

  # OAuth token - extracted from the Windows credentials file via the WSL mount
  oat_sources:
    anthropic: "jq -r '.claudeAiOauth.accessToken' /mnt/c/Users/neil_/.claude/.credentials.json"

  hooks:
    - ccproxy.hooks.rule_evaluator
    - ccproxy.hooks.model_router
    - ccproxy.hooks.forward_oauth

  rules:
    - name: sonnet_to_glm
      rule: ccproxy.rules.MatchModelRule
      params:
        - model_name: claude-sonnet-4-5-20250929
    - name: sonnet_old_to_glm
      rule: ccproxy.rules.MatchModelRule
      params:
        - model_name: claude-3-5-sonnet-20241022
    - name: haiku_to_glm
      rule: ccproxy.rules.MatchModelRule
      params:
        - model_name: claude-haiku-4-5-20251001
    - name: haiku_old_to_glm
      rule: ccproxy.rules.MatchModelRule
      params:
        - model_name: claude-3-5-haiku-20241022
    - name: opus
      rule: ccproxy.rules.MatchModelRule
      params:
        - model_name: claude-opus-4-5-20251101
    - name: opus
      rule: ccproxy.rules.ThinkingRule

litellm:
  host: 0.0.0.0
  port: 4000
  num_workers: 4
  debug: true

config.yaml

model_list:
  - model_name: sonnet_to_glm
    litellm_params:
      model: openai/glm-4.7
      api_base: https://api.z.ai/v1
      api_key: "REDACTED"

  - model_name: sonnet_old_to_glm
    litellm_params:
      model: openai/glm-4.7
      api_base: https://api.z.ai/v1
      api_key: "REDACTED"

  - model_name: haiku_to_glm
    litellm_params:
      model: openai/glm-4.7
      api_base: https://api.z.ai/v1
      api_key: "REDACTED"

  - model_name: haiku_old_to_glm
    litellm_params:
      model: openai/glm-4.7
      api_base: https://api.z.ai/v1
      api_key: "REDACTED"

  - model_name: opus
    litellm_params:
      model: anthropic/claude-opus-4-5-20251101
      api_base: https://api.anthropic.com

  - model_name: default
    litellm_params:
      model: anthropic/claude-opus-4-5-20251101
      api_base: https://api.anthropic.com

litellm_settings:
  callbacks:
    - ccproxy.handler

general_settings:
  forward_client_headers_to_llm_api: true
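
To isolate whether the 403 is confined to the Anthropic route, the GLM model groups above can be exercised directly against the proxy. This is a sketch that assumes LiteLLM's standard /v1/chat/completions endpoint and no master key (none is set in the config above):

    # A 200 here means the z.ai path works and the failure is
    # specific to the anthropic/claude-opus-* route
    curl -s http://localhost:4000/v1/chat/completions \
      -H "Content-Type: application/json" \
      -d '{"model":"sonnet_to_glm","messages":[{"role":"user","content":"hi"}]}'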

Steps to Reproduce

  1. Install ccproxy in WSL: uv tool install claude-ccproxy --with "litellm[proxy]"
  2. Install jq: sudo apt install jq
  3. Run ccproxy install and configure as above
  4. Start proxy: ccproxy start --detach
  5. Verify OAuth token is readable:
    jq -r '.claudeAiOauth.accessToken' /mnt/c/Users/neil_/.claude/.credentials.json
    # Returns valid token: sk-ant-oat01-...
  6. From Windows CMD:
    set ANTHROPIC_BASE_URL=http://localhost:4000
    claude -p "say hi"
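
For completeness, the PowerShell equivalent of step 6:

    # Same as step 6, from PowerShell instead of CMD
    $env:ANTHROPIC_BASE_URL = "http://localhost:4000"
    claude -p "say hi"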

Error

API Error: 403 {"error":{"message":"{\n  \"error\": {\n    \"type\": \"forbidden\",\n    \"message\": \"Request not allowed\"\n  }\n}. Received Model Group=anthropic/claude-opus-4-5-20251101\nAvailable Model Group Fallbacks=None","type":"None","param":"None","code":"403"}} · Please run /login

Observations

  1. The OAuth token is successfully loaded at startup (no errors in logs)
  2. ccproxy status shows all models configured correctly
  3. curl http://localhost:4000/health works from both WSL and Windows
  4. The proxy receives requests (logs show "POST /v1/messages?beta=true HTTP/1.1" 403 Forbidden)
  5. Manually testing the OAuth token with curl also returns 403 (a variant with an extra header is sketched after this list):
    TOKEN=$(jq -r '.claudeAiOauth.accessToken' /mnt/c/Users/neil_/.claude/.credentials.json)
    curl -X POST https://api.anthropic.com/v1/messages \
      -H "Authorization: Bearer $TOKEN" \
      -H "Content-Type: application/json" \
      -H "anthropic-version: 2023-06-01" \
      -d '{"model":"claude-sonnet-4-5-20250929","max_tokens":100,"messages":[{"role":"user","content":"hi"}]}'
    # Returns: {"error":{"type":"forbidden","message":"Request not allowed"}}
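
One possibility worth ruling out (an assumption on my part, not confirmed by the ccproxy or Anthropic docs): OAuth access tokens are reported to need an anthropic-beta flag that plain API keys do not, which would make the bare curl test above return 403 even with a valid token. A variant of the same test with that header:

    TOKEN=$(jq -r '.claudeAiOauth.accessToken' /mnt/c/Users/neil_/.claude/.credentials.json)
    # ASSUMPTION: oauth-2025-04-20 is the beta flag Claude Code sends with
    # sk-ant-oat01-... tokens; if this still returns 403, the session/machine
    # binding raised in question 4 looks more likely
    curl -X POST https://api.anthropic.com/v1/messages \
      -H "Authorization: Bearer $TOKEN" \
      -H "Content-Type: application/json" \
      -H "anthropic-version: 2023-06-01" \
      -H "anthropic-beta: oauth-2025-04-20" \
      -d '{"model":"claude-sonnet-4-5-20250929","max_tokens":100,"messages":[{"role":"user","content":"hi"}]}'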

Questions

  1. Is Windows/WSL officially supported? The CLI has a bug where it looks for litellm instead of litellm.exe on Windows, which led me to try WSL.

  2. Is the forward_oauth hook supposed to forward the token from Claude Code's request headers, or use the token loaded from oat_sources?

  3. Do I need additional configuration for the OAuth token to be forwarded correctly when Claude Code is on Windows and ccproxy is in WSL?

  4. Could there be an issue with how the OAuth token works - does it need to be used from the same session/machine that generated it?

Feature Request

It would be great to have official Windows support, either natively or via documented WSL setup instructions.
