Description
🐛 Bug Report: Incorrect Endpoint Configuration for Ollama
Summary
By default, Ollama exposes the following RESTful endpoints as documented:

- http://localhost:11434/api/generate
- http://localhost:11434/api/chat

However, I'm not sure why the documentation lists the following as the REST API path:

http://localhost:11434/v1/
Also, the tool hardcodes requests to /v1/chat/completions, which does not exist in Ollama's native API. This leads to a 404 Not Found error when attempting to POST to that path.
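For context, here is a minimal sketch of the kind of request the tool appears to be sending (the request body below is my assumption; the path is taken from the log output further down):

```powershell
# Hypothetical reproduction of the hardcoded /v1/chat/completions request.
# On my setup this fails with the same 404 shown in the log output below.
Invoke-WebRequest -Uri "http://localhost:11434/v1/chat/completions" `
  -Method POST `
  -Body '{"model": "llama3.2", "messages": [{"role": "user", "content": "Why is the sky blue?"}]}' `
  -Headers @{ "Content-Type" = "application/json" }
```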
Here's the relevant log output from the tool:

```
[GIN] 2025/10/16 - 05:22:18 404 372.804µs 127.0.0.1 POST "/v1/chat/completions"
```

Requests should be sent to Ollama's actual endpoints:

- /api/chat for chat-style interactions (see the sketch after this list)
- /api/generate for single-shot completions
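For reference, here is a minimal sketch of what a chat-style request against the native endpoint could look like (assuming the llama3.2 model is already pulled; adjust the model name to whatever is available locally):

```powershell
# Sketch of a native /api/chat request; assumes the llama3.2 model is available locally.
Invoke-WebRequest -Uri "http://localhost:11434/api/chat" `
  -Method POST `
  -Body '{"model": "llama3.2", "messages": [{"role": "user", "content": "Why is the sky blue?"}]}' `
  -Headers @{ "Content-Type" = "application/json" }
```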
Verification
I manually tested the correct endpoint using PowerShell:
```powershell
Invoke-WebRequest -Uri "http://localhost:11434/api/generate" `
  -Method POST `
  -Body '{"model": "llama3.2", "prompt": "Why is the sky blue?"}' `
  -Headers @{ "Content-Type" = "application/json" }
```

Response:
```
StatusCode        : 200
StatusDescription : OK
Content           : {123, 34, 109, 111...}
RawContent        : HTTP/1.1 200 OK
                    Transfer-Encoding: chunked
                    Content-Type: application/x-ndjson
                    Date: Thu, 16 Oct 2025 05:03:18 GMT

                    {"model":"llama3.2","created_at":"2025-10-16T05:03:18.6668943Z","response":"The"...
Content-Type      : application/x-ndjson
RawContentLength  : 28027
```

Please let me know if there are any additional steps needed. Thanks!
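One additional note that may help whoever picks this up: the application/x-ndjson content type above appears because Ollama streams responses by default. If a single JSON object is easier to work with, the documented "stream": false option can be added to the same request (a minor variant of the verification call above, not something required for the fix):

```powershell
# Non-streaming variant of the verification call; the response body is a single JSON object.
Invoke-WebRequest -Uri "http://localhost:11434/api/generate" `
  -Method POST `
  -Body '{"model": "llama3.2", "prompt": "Why is the sky blue?", "stream": false}' `
  -Headers @{ "Content-Type" = "application/json" }
```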
Reproduction steps
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
Screenshots
Logs
Browsers
No response
OS
No response
Additional information
No response