Thanks for the amazing plugin. It would be great if you could support https://openrouter.ai

Replies: 4 comments
This looks interesting. I'll look into it. I've been wanting to add better support for local and other models but haven't looked into all the options for doing that. I've seen some other providers that are compatible with the OpenAI request format, and those should work out of the box, but it looks like this one has its own request format: https://openrouter.ai/docs/requests. Is there a particular model or feature from OpenRouter you were interested in using?
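That said, the linked docs look close to the OpenAI chat-completions shape, so it might be possible to reuse the existing OpenAI client by only changing the base URL. A minimal, unverified sketch; the model id and environment variable name are placeholders, not something taken from this plugin's code:

```ts
import OpenAI from "openai";

// Assumption: OpenRouter exposes an OpenAI-compatible API under /api/v1,
// so the stock OpenAI SDK can be reused with a different base URL and key.
const client = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: process.env.OPENROUTER_API_KEY, // placeholder env var name
});

async function ask(prompt: string): Promise<string | null> {
  const completion = await client.chat.completions.create({
    model: "openrouter/auto", // placeholder; any OpenRouter model id should work here
    messages: [{ role: "user", content: prompt }],
  });
  return completion.choices[0]?.message?.content ?? null;
}

ask("Summarize this page in one sentence.").then(console.log);
```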
@briansunter The good thing about OpenRouter is that it would give us access to a variety of different models, depending on user needs. There are many free and paid models, so everyone can pick the one that fits their needs.
We need standard API tooling like LiteLLM to cover all the services out there. OpenRouter plus any of the top FOSS models (Qwen3, K2, DeepSeek, GLM 4.5) would suffice.
I am very interested in this as well. I'm led to believe there is a relatively standard OpenAI-compatible API that providers such as OpenRouter, and even local-only LLM software, implement to allow access to LLMs of the end user's preference. If it helps, there are alternative providers to OpenRouter, e.g. https://nano-gpt.com/, that may give you another angle on the implementation.
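To illustrate what a provider-agnostic setup could look like if those services really do share the OpenAI-compatible shape, here is a rough sketch; the base URLs (the nano-gpt path in particular) and the local example are assumptions to be checked against each provider's docs:

```ts
import OpenAI from "openai";

// Candidate base URLs; each is assumed to expose an OpenAI-compatible
// chat-completions endpoint. Verify the exact paths in each provider's docs.
const PROVIDERS = {
  openai: "https://api.openai.com/v1",
  openrouter: "https://openrouter.ai/api/v1",
  nanogpt: "https://nano-gpt.com/api/v1", // assumed path, not verified
  local: "http://localhost:11434/v1",     // e.g. Ollama's OpenAI-compatible endpoint
} as const;

// One factory covers every provider; only the base URL and API key differ.
function makeClient(provider: keyof typeof PROVIDERS, apiKey?: string): OpenAI {
  return new OpenAI({
    baseURL: PROVIDERS[provider],
    apiKey: apiKey ?? "sk-local-placeholder", // local servers usually ignore the key
  });
}

// Usage: const client = makeClient("openrouter", process.env.OPENROUTER_API_KEY);
```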