Hello,
I'm trying to use this proxy with Gemini only (not OpenRouter). I have set my base URL, BASE_URL=https://generativelanguage.googleapis.com/v1beta/openai/ (according to here), and created and configured my API key. The proxy is running and set up in Raycast, but sending a request to one of the Gemini models returns a 400 status code (no body) error.
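For reference, with that BASE_URL the OpenAI client should resolve chat requests against Google's compatibility endpoint as follows (a minimal sketch of the usual base-URL + path joining, not the proxy's actual code; note the trailing slash matters):

```python
from urllib.parse import urljoin

# Base URL as set in the proxy's environment. The trailing slash matters:
# without it, urljoin drops the final "openai" path segment.
BASE_URL = "https://generativelanguage.googleapis.com/v1beta/openai/"

# The OpenAI-compatible chat endpoint the client should end up calling.
print(urljoin(BASE_URL, "chat/completions"))
# https://generativelanguage.googleapis.com/v1beta/openai/chat/completions
```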
Raycast is configured with this proxy and can detect the five models I defined.
My models.json:
[
  {
    "name": "Gemini 2.5 Flash",
    "id": "google/gemini-2.5-flash",
    "contextLength": 1000000,
    "capabilities": ["vision", "tools"],
    "temperature": 0
  },
  {
    "name": "Gemini 2.5 Flash Thinking",
    "id": "google/gemini-2.5-flash:thinking",
    "contextLength": 1000000,
    "capabilities": ["vision", "tools"],
    "temperature": 1
  },
  {
    "name": "DeepSeek V3",
    "id": "deepseek/deepseek-chat-v3-0324",
    "contextLength": 128000,
    "capabilities": ["tools"]
  },
  {
    "name": "GPT-4o Mini",
    "id": "openai/gpt-4o-mini",
    "contextLength": 128000,
    "capabilities": ["vision", "tools"]
  },
  {
    "name": "Claude Sonnet 4",
    "id": "anthropic/claude-sonnet-4",
    "contextLength": 200000,
    "capabilities": ["vision", "tools"],
    "temperature": 0.7
  }
]
Docker logs show this:
{"level":30,"time":1752684959349,"pid":19,"hostname":"cb4fdb6b1427","category":"HttpEvent","reqId":"6b58d6f9-ee22-4193-8ae1-5939d19512f3","configWithoutMessages":{"model":"google/gemini-2.5-flash","stream":true,"stream_options":{"include_usage":true},"temperature":0},"msg":"ChatCompletionRequest"}
{"level":30,"time":1752684959546,"pid":19,"hostname":"cb4fdb6b1427","category":"HttpEvent","reqId":"6b58d6f9-ee22-4193-8ae1-5939d19512f3","msg":"ConnectionCleanup"}
{"level":50,"time":1752684959548,"pid":19,"hostname":"cb4fdb6b1427","category":"HttpEvent","reqId":"6b58d6f9-ee22-4193-8ae1-5939d19512f3","err":{"type":"BadRequestError","message":"400 status code (no body)","stack":"Error: 400 status code (no body)\n at APIError.generate (/app/node_modules/openai/core/error.js:45:20)\n at OpenAI.makeStatusError (/app/node_modules/openai/client.js:151:32)\n at OpenAI.makeRequest (/app/node_modules/openai/client.js:293:30)\n at process.processTicksAndRejections (node:internal/process/task_queues:105:5)\n at async chatCompletion (/app/dist/controllers/api.js:50:32)","status":400,"headers":{},"requestID":null},"msg":"ErrorHandler"}
{"level":50,"time":1752684959549,"pid":19,"hostname":"cb4fdb6b1427","category":"HttpEvent","reqId":"6b58d6f9-ee22-4193-8ae1-5939d19512f3","req":{"id":"6b58d6f9-ee22-4193-8ae1-5939d19512f3","method":"POST","url":"/api/chat","query":{},"params":{},"headers":{"host":"10.17.7.24:11435","content-type":"application/json","connection":"keep-alive","accept":"application/json","user-agent":"Raycast/0 CFNetwork/3826.500.131 Darwin/24.5.0","content-length":"697","accept-language":"en-GB,en;q=0.9","accept-encoding":"gzip, deflate"},"remoteAddress":"::ffff:10.17.7.11","remotePort":64905},"res":{"statusCode":500,"headers":{"x-powered-by":"Express","x-request-id":"6b58d6f9-ee22-4193-8ae1-5939d19512f3","content-type":"application/json; charset=utf-8","content-length":"79","etag":"W/\"4f-KpVXNRNPplPd6ywl1CWgHirD6rw\""}},"err":{"type":"Error","message":"failed with status code 500","stack":"Error: failed with status code 500\n at onResFinished (/app/node_modules/pino-http/logger.js:115:39)\n at ServerResponse.onResponseComplete (/app/node_modules/pino-http/logger.js:178:14)\n at ServerResponse.emit (node:events:530:35)\n at onFinish (node:_http_outgoing:1082:10)\n at callback (node:internal/streams/writable:766:21)\n at afterWrite (node:internal/streams/writable:710:5)\n at afterWriteTick (node:internal/streams/writable:696:10)\n at process.processTicksAndRejections (node:internal/process/task_queues:89:21)"},"responseTime":202,"msg":"request errored"}
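One detail the log does show: the request forwarded upstream carries the OpenRouter-style id google/gemini-2.5-flash, while Google's OpenAI-compatible endpoint expects the bare model name (e.g. gemini-2.5-flash). If no mapping happens, that alone might explain the 400. A hypothetical sketch of the kind of id mapping that would be needed (the function name and the ":thinking"-suffix handling are assumptions, not the proxy's code):

```python
def to_gemini_model_id(openrouter_id: str) -> str:
    """Map an OpenRouter-style id to the bare model name that the
    Gemini OpenAI-compatible endpoint expects (hypothetical helper)."""
    # Drop the "provider/" prefix, then any ":variant" suffix.
    bare = openrouter_id.split("/", 1)[-1]
    return bare.split(":", 1)[0]

print(to_gemini_model_id("google/gemini-2.5-flash"))           # gemini-2.5-flash
print(to_gemini_model_id("google/gemini-2.5-flash:thinking"))  # gemini-2.5-flash
```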
But I'm not sure what to make of this. Could you help me out?
Thanks!