Replies: 2 comments 3 replies
hi @scaprile quite a few variables - but that error suggests the protocol has somehow broken down (likely a model that is swamped by context). A context window of > 32k is realistically needed to even start; 8K won't really do it and may result in things being truncated, which could cause this. Are you using the Docker Model Runner provider, perhaps? Can you provide the exact GOOSE_PROVIDER and _MODEL from ~/.config/goose/config.yaml?
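For reference, those two keys normally sit at the top level of that file. A minimal sketch only; the values below are placeholders, not a recommendation, so check your own file for the exact strings:

```yaml
# ~/.config/goose/config.yaml -- illustrative sketch only.
# The key names come from this thread; the values are placeholders.
GOOSE_PROVIDER: openai      # or whichever provider entry points at DMR
GOOSE_MODEL: ai/gemma3      # the model name exactly as the server advertises it
```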

Hi,
I'm an embedded systems developer, but a complete novice with AI.
I have DMR (Docker Model Runner) running fine and can chat with it on its CLI; I'm using a Gemma model.
Configured Goose to connect to the docker server.
After a couple of attempts that got nowhere, I started Wireshark on port 12434 to see what's going on.
I just type "Hello" and send (in the Goose chat).
Goose sends a >20 KB query to the server, which responds with a JSON error saying the context has been exceeded.
Goose shows me that and says it is compacting the chat.
Goose then sends a >3 KB query requesting a model called "gpt-4o-mini", which of course I don't have; how would I know?
Goose tells me it can't go on, and that's the end of my front-end experience.
"gpt-4o-mini" is not in the config file, so I can't change it.
Docker doesn't seem to have that model under ai/.
This is capture file "take1"
I can't find a model with that exact name on Hugging Face, so I pull one that claims to be it, and iterate until one has a size small enough to justify the word "mini"; most are the same size as the main model I intend to run...
I tag the model with the required name and test it on Docker; it works fine on the CLI.
I try again: same result, Docker reports context exceeded again.
This is capture file "take2"
End of my experience.
I don't have a clue where to set the context size. Quite likely it is Docker's setting, not Goose's. If you can help me: fine; just please take care of that chicken-and-egg issue with models, and try to be more verbose so a non-tech-savvy user can tell what is going on. (Yes, non-tech-savvy users won't install their own LLM... you're right...)
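For what it's worth, the size mismatch described above can be sanity-checked with a rough back-of-the-envelope token estimate. This is a sketch using the common ~4-characters-per-token heuristic; real tokenizers (Gemma's included) vary by model:

```python
# Rough token estimate using the common ~4 chars/token heuristic.
# Treat this as an order-of-magnitude check, not an exact count.
def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)

# A stand-in for the ~20 KB request Goose sent for a bare "Hello":
# the system prompt and tool schemas dominate the payload,
# not the user's one-word message.
request = "x" * 20_000
print(estimate_tokens(request))  # -> 5000
```

Around 5,000 tokens for the very first exchange leaves almost no room for history and the reply in a small (e.g. 8K) context window, which lines up with the advice above that a 32K+ window is realistically needed.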
Wireshark_capture_files.zip
diagnostics_20260201_1.zip