Add Ollama LLM Client Support #62
Conversation
This commit introduces support for the Ollama LLM provider, including configuration options in the environment variables, constants, and the LLM manager. The Ollama client is implemented to handle requests and responses, ensuring proper JSON formatting and schema compliance. Additionally, the necessary constants and prompts for Ollama are defined to facilitate its integration with the existing system.
This commit enhances the MySQL prompt with detailed instructions and rules for generating SQL queries, ensuring compliance with schema requirements and improving user interaction through structured JSON responses.
This commit enhances the Ollama prompts by adding support for YugabyteDB and MySQL, including specific response schemas and optimization guidelines tailored for each database type. The changes improve the assistant's ability to generate accurate and efficient SQL queries while ensuring compliance with the defined schema.
This commit adds support for Clickhouse and MongoDB in the Ollama prompts, including specific response schemas and detailed guidelines for generating SQL and MongoDB queries. The enhancements improve the assistant's ability to provide accurate and efficient responses while ensuring compliance with the defined schema for both database types.
This commit updates the README.md to accurately reflect the current support status for the Ollama LLM client, removing it from the "Planned to be supported LLM Clients" section and adding it to the "Supported LLM Clients" section. Additionally, a minor grammatical correction was made in the contributing guidelines.
@SumonRayy is attempting to deploy a commit to the Aurivance Technologies Team on Vercel. A member of the Team first needs to authorize it.

Hi @SumonRayy, I will review this soon.
…and YugabyteDB system prompts

This commit extends the Ollama prompt functionality by adding support for PostgreSQL, MySQL, Clickhouse, MongoDB, and YugabyteDB. Each database type now has a corresponding prompt, enhancing the assistant's ability to generate accurate and efficient SQL queries while ensuring compliance with the defined schemas.
ollamaMessages := make([]OllamaMessage, 0)

// Add system message first with explicit JSON formatting instruction
systemPrompt = systemPrompt + "\n\nCRITICAL INSTRUCTION: You MUST respond with ONLY a valid JSON object that strictly follows the schema above. Your response MUST include all required fields: assistantMessage, queries (array), and optionally actionButtons. Do not include any other text, markdown, or HTML in your response. Your entire response must be a single JSON object starting with { and ending with }. Do not include any explanations or additional text."
Please inject this hardcoded prompt from the Ollama constants rather than writing it inline here.
// Add a final instruction message to reinforce JSON formatting
ollamaMessages = append(ollamaMessages, OllamaMessage{
	Role: "system",
	Content: "Remember: Your response must be ONLY a valid JSON object with all required fields: assistantMessage, queries (array), and optionally actionButtons. Do not include any other text or explanations.",
Same comment: pick this from constants.
Stream: false,
Options: OllamaOptions{
	Temperature: c.temperature,
	NumPredict: c.maxCompletionTokens,
Is NumPredict equivalent to maxCompletionTokens? Is this mapping correct?
for _, q := range queries {
	if queryStr, ok := q.(string); ok {
		queryInfos = append(queryInfos, constants.QueryInfo{
			Query: queryStr,
Why are these values hardcoded instead of taken from the LLM response? Also, many properties are missing from this append.
Please also update the files in the docker-compose folder (the env and docker compose files) to reflect these configuration changes.
Hi @SumonRayy, any news on this?
Description
This PR adds support for the Ollama LLM client as one of the supported language models in NeoBase. This enhancement enables users to run NeoBase queries locally without requiring external API keys, supporting offline and secure environments.
Changes
Database-Specific Optimizations
Clickhouse
YugabyteDB
MongoDB
Benefits
Testing
Related Issues
Resolves #43
Security Considerations