
Support for OpenAI compatible LLM interfaces #129

@elasticdotventures

Description

I'm creating this ticket hoping it will be straightforward to add support for OpenAI-compatible APIs and services, and for non-local GPUs.

Add support for non-OpenAI providers to surfkit: the ability to specify the base URL of an OpenAI-compatible API endpoint would support self-hosting ollama, vllm, openrouter, etc.
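
To illustrate why a configurable base URL is essentially all that's needed: the openai Python SDK already accepts one, so any OpenAI-compatible server can be targeted without other code changes. A minimal sketch (the ollama URL and model name are just examples; how surfkit would expose the setting is up to the maintainers):

```python
from openai import OpenAI

# Point the stock OpenAI client at a self-hosted, OpenAI-compatible server.
# Here: a local ollama instance, which serves the OpenAI API under /v1.
client = OpenAI(
    base_url="http://localhost:11434/v1",  # example: ollama's OpenAI-compatible endpoint
    api_key="ollama",  # ollama ignores the key, but the SDK requires a value
)

response = client.chat.completions.create(
    model="llama3",  # whichever model the local server has pulled
    messages=[{"role": "user", "content": "Hello from surfkit!"}],
)
print(response.choices[0].message.content)
```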

I especially like how OpenAI Codex supports alternate providers (I would copy/emulate the Codex config.json approach) -- see: https://github.com/openai/codex?tab=readme-ov-file#custom-ai-provider-configuration
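
For reference, a Codex-style provider entry looks roughly like the sketch below. The field names are illustrative, written from memory of the linked README (check it for the authoritative schema); the point is that a provider is fully described by a display name, a base URL, and the environment variable holding its API key:

```jsonc
{
  "provider": "openrouter",
  "providers": {
    // illustrative field names -- see the Codex README linked above
    "openrouter": {
      "name": "OpenRouter",
      "baseURL": "https://openrouter.ai/api/v1",
      "envKey": "OPENROUTER_API_KEY"
    },
    "ollama": {
      "name": "Ollama",
      "baseURL": "http://localhost:11434/v1",
      "envKey": "OLLAMA_API_KEY" // ollama doesn't check keys; placeholder
    }
  }
}
```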

examples of custom providers:

  • openrouter
  • LiteLLM
  • vllm
  • ollama
  • ramalama
  • azure openai service
