I'm creating this ticket hoping it will be straightforward to add support for OpenAI-compatible APIs and services, as well as non-local GPUs.
Add support for non-OpenAI providers to surfkit: the ability to specify the base URL of an OpenAI-compatible API provider, to support self-hosting with
ollama, vLLM, OpenRouter, etc.
I especially like how OpenAI Codex supports alternate providers (I would copy/emulate the Codex config.json approach) -- see: https://github.com/openai/codex?tab=readme-ov-file#custom-ai-provider-configuration
Examples of custom providers:
- OpenRouter
- LiteLLM
- vLLM
- Ollama
- RamaLama
- Azure OpenAI Service