Support for Ollama LLM Client #43

@bhaskarblur

Description

We should add integration for the Ollama LLM client as one of the supported language models in NeoBase.

Why?

Ollama offers a local, fast, and privacy-first way to run large language models. Adding support for it will:

  • Enable users to run NeoBase queries without needing external API keys.
  • Support offline and secure environments.
  • Offer faster iteration for local development and testing.

Expected Features:

  • Allow users to configure and select Ollama as the LLM backend in the NeoBase backend.
  • Let users point NeoBase at their own custom-trained or self-hosted Ollama models.

Metadata


Labels

enhancement (New feature or request)
