houserwx/Codependement

Ollama Chat - VS Code Extension

A VS Code extension for chatting with locally hosted Ollama LLM models. It offers a simple Ask Mode, an advanced Agent Mode with workspace management, and a Multi-Agent Mode that coordinates specialized agents with research intelligence.

πŸš€ Features

πŸ€– Multiple Chat Modes

  • Ask Mode: Simple question-answer interactions for general queries
  • Agent Mode: Advanced AI assistant with file operations, code analysis, and workspace management
  • Multi-Agent Mode: Intelligent task coordination using specialized agents for complex workflows
  • General Chat: Intelligent mode switching based on query context

🎯 Multi-Agent System

  • Planner Agent: Breaks down complex tasks into manageable subtasks
  • Research Agent: Leverages MCP servers to gather relevant information and context
  • Coder Agent: Handles implementation and code generation tasks
  • Tester Agent: Creates and executes tests for code validation
  • Debugger Agent: Identifies and resolves issues in code
  • Documenter Agent: Generates comprehensive documentation
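As a rough illustration of how a planner might route work to the agents listed above, consider the sketch below. The types, names, and keyword heuristic are hypothetical, not the extension's actual internals:

```typescript
// Hypothetical multi-agent routing sketch; the extension's real planner
// may work very differently. Roles mirror the agent list above.
type AgentRole = "planner" | "research" | "coder" | "tester" | "debugger" | "documenter";

interface Subtask {
  description: string;
  role: AgentRole;
}

// The planner breaks a complex task into role-tagged subtasks.
// The keyword matching here is illustrative only.
function planTask(task: string): Subtask[] {
  const subtasks: Subtask[] = [
    { description: `Research context for: ${task}`, role: "research" },
    { description: `Implement: ${task}`, role: "coder" },
  ];
  if (/test/i.test(task)) subtasks.push({ description: `Write tests for: ${task}`, role: "tester" });
  if (/document/i.test(task)) subtasks.push({ description: `Document: ${task}`, role: "documenter" });
  return subtasks;
}
```

The key idea is that research runs first so its findings can inform the coder, tester, and documenter agents downstream.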

πŸ” Research Intelligence

  • MCP Integration: Connects to Model Context Protocol servers for external data access
  • Information Gathering: Automatically researches relevant patterns, examples, and best practices
  • Context Enhancement: Provides research findings to other agents for informed decision-making
  • Smart Caching: Caches research results to improve performance
  • Resource Discovery: Finds and utilizes available documentation, code examples, and APIs
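The smart-caching behavior can be sketched as a simple time-to-live cache keyed by research topic. This is a hypothetical illustration, not the extension's actual cache implementation:

```typescript
// Minimal TTL cache sketch for research results (hypothetical).
// The injectable clock makes expiry easy to test.
class ResearchCache<T> {
  private store = new Map<string, { value: T; expires: number }>();

  constructor(private ttlMs: number, private now: () => number = Date.now) {}

  set(topic: string, value: T): void {
    this.store.set(topic, { value, expires: this.now() + this.ttlMs });
  }

  // Returns the cached result, or undefined if missing or expired.
  get(topic: string): T | undefined {
    const entry = this.store.get(topic);
    if (!entry) return undefined;
    if (this.now() > entry.expires) {
      this.store.delete(topic);
      return undefined;
    }
    return entry.value;
  }
}
```

Expiring entries keeps repeated research queries fast without letting stale findings linger indefinitely.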

πŸ”§ Agent Mode Capabilities

  • File Operations: Read, write, create, and search files in your workspace
  • Code Analysis: Analyze project structure, review code, and provide insights
  • Workspace Management: Get project information, list files, and navigate directories
  • Terminal Integration: Execute commands and get system information
  • Git Integration: Check repository status and version control information
  • Smart Context: Understands your codebase and provides relevant suggestions
  • MCP Tool Access: Utilize external tools and resources via Model Context Protocol
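Conceptually, these capabilities behave like named tools the agent can invoke. A minimal registry sketch follows; the tool names and interface are hypothetical, and the real extension would back `readFile` with the VS Code workspace API rather than a stub:

```typescript
// Hypothetical tool registry; not the extension's actual interface.
type ToolHandler = (args: Record<string, string>) => string;

class ToolRegistry {
  private tools = new Map<string, ToolHandler>();

  register(name: string, handler: ToolHandler): void {
    this.tools.set(name, handler);
  }

  invoke(name: string, args: Record<string, string>): string {
    const handler = this.tools.get(name);
    if (!handler) throw new Error(`Unknown tool: ${name}`);
    return handler(args);
  }
}

const registry = new ToolRegistry();
// Stand-in for a real file read via the VS Code API.
registry.register("readFile", (args) => `contents of ${args.path}`);
```

Registering each capability under a name lets the model request tools by name in its responses, which is also how MCP-provided tools would slot in alongside the built-in ones.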

🎨 User Experience

  • Clean, modern chat interface with VS Code theming
  • Real-time model switching
  • Chat history export
  • Quick action buttons for common tasks
  • Status bar integration
  • Persistent chat sessions

πŸ“‹ Requirements

  • Ollama: Must be installed and running locally
    • Download from ollama.ai
    • At least one language model pulled (e.g., ollama pull llama2)
  • VS Code: Version 1.102.0 or higher
  • Node.js: For development (if building from source)

βš™οΈ Extension Settings

This extension contributes the following settings:

  • ollama-chat.baseUrl: Base URL for Ollama API (default: http://localhost:11434)
  • ollama-chat.defaultModel: Default model to use for chat (default: llama2)
  • ollama-chat.temperature: Temperature for model responses (default: 0.7)
  • ollama-chat.maxTokens: Maximum tokens for model responses (default: 2048)
  • codependent.contextBufferSize: Context buffer size for conversation history and agent memory (default: 32768 tokens)
  • codependent.enableMultiAgent: Enable multi-agent processing for complex tasks (default: true)
  • codependent.enableMcp: Enable MCP (Model Context Protocol) integration (default: true)
  • codependent.mcpServers: Array of MCP server configurations for research agent
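Taken together, these settings go in your VS Code settings.json. The MCP server entry below is a hypothetical example; check your MCP server's own documentation for the exact fields it expects:

```json
{
  "ollama-chat.baseUrl": "http://localhost:11434",
  "ollama-chat.defaultModel": "llama2",
  "ollama-chat.temperature": 0.7,
  "ollama-chat.maxTokens": 2048,
  "codependent.contextBufferSize": 32768,
  "codependent.enableMultiAgent": true,
  "codependent.enableMcp": true,
  "codependent.mcpServers": [
    { "name": "docs-server", "command": "npx", "args": ["-y", "@example/mcp-docs-server"] }
  ]
}
```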

πŸ” Research Commands

The research agent can be activated using specific commands:

  • "research [topic]": Gather information about a specific topic using available MCP servers
  • "mcp status": Check the status of connected MCP servers and available tools
  • "research status": Alias for "mcp status"; shows current research capabilities

πŸ’‘ Multi-Agent Usage Examples

Multi-Agent Complex Tasks

implement a new REST API endpoint with tests and documentation

Research-Driven Development

research best practices for TypeScript error handling and implement them

MCP Integration

research mcp status
research documentation patterns for VS Code extensions

🧠 Context Buffer Management

The extension includes intelligent context buffer management to optimize performance with large conversations:

Features

  • Automatic Context Trimming: Keeps conversations within the configured context buffer size
  • Smart Message Retention: Preserves system messages and recent conversation history
  • Visual Status Indicator: Real-time context buffer usage display in the chat interface
  • Configurable Buffer Size: Adjustable context buffer (default: 32,768 tokens)

Visual Indicators

  • Green: Normal usage (0-60% of buffer)
  • Yellow: High usage (60-80% of buffer)
  • Red: Near capacity (80%+ of buffer)

The context buffer status shows current usage and helps you understand when conversations might be trimmed for optimal performance.
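A rough sketch of how such trimming and the status colors can work is below. The ~4 characters per token estimate and all function names are assumptions for illustration, not the extension's actual code:

```typescript
// Hypothetical context-trimming sketch. Token counts are approximated
// at ~4 characters per token, a common rough heuristic.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Drops the oldest non-system messages until the conversation fits the buffer,
// preserving system messages and the most recent history.
function trimToBuffer(messages: ChatMessage[], bufferTokens: number): ChatMessage[] {
  const system = messages.filter((m) => m.role === "system");
  const rest = messages.filter((m) => m.role !== "system");
  const used = (msgs: ChatMessage[]) =>
    msgs.reduce((sum, m) => sum + estimateTokens(m.content), 0);
  while (rest.length > 0 && used(system) + used(rest) > bufferTokens) {
    rest.shift(); // remove the oldest non-system message
  }
  return [...system, ...rest];
}

// Maps buffer usage to the indicator colors described above.
function bufferStatus(usedTokens: number, bufferTokens: number): "green" | "yellow" | "red" {
  const ratio = usedTokens / bufferTokens;
  if (ratio >= 0.8) return "red";
  if (ratio >= 0.6) return "yellow";
  return "green";
}
```

Keeping system messages out of the trimming pool is what preserves the agent's instructions even in long conversations.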

🚦 Getting Started

  1. Install Ollama if you haven't already:

    # Download and install from https://ollama.ai/
    # Then pull a model:
    ollama pull llama2
  2. Start Ollama:

    ollama serve
  3. Open VS Code and install this extension

  4. Start Chatting:

    • Use Ctrl+Shift+P β†’ "Ollama: Open Chat"
    • Or click the "Ollama Chat" button in the status bar
    • Or use specific modes: "Ollama: Open Ask Mode" or "Ollama: Open Agent Mode"

πŸ“– Usage Examples

Ask Mode

Perfect for general questions and explanations:

  • "Explain how async/await works in JavaScript"
  • "What are the best practices for error handling?"
  • "How do I optimize database queries?"

Agent Mode

Ideal for development tasks and code assistance:

  • "Analyze the current project structure"
  • "Review the code in the current file and suggest improvements"
  • "Generate unit tests for the selected function"
  • "Find all TODO comments in the project"
  • "Refactor this code to improve readability"

πŸ”§ Commands

  • Ollama: Open Chat - Open the general chat interface
  • Ollama: Open Ask Mode - Open Ask Mode for simple Q&A
  • Ollama: Open Agent Mode - Open Agent Mode for development tasks
  • Ollama: Select Model - Choose your default Ollama model

πŸ› Known Issues

  • File operations in Agent Mode require appropriate permissions
  • Large files may take time to process
  • Some models may have slower response times depending on hardware

πŸ› οΈ Development

To set up the development environment:

# Clone the repository
git clone <repository-url>
cd ollama-chat-extension

# Install dependencies
npm install

# Compile and watch for changes
npm run watch

# Run in Extension Development Host
# Press F5 in VS Code

πŸ“ Release Notes

0.0.1

  • Initial release
  • Ask Mode and Agent Mode functionality
  • Ollama API integration
  • File operations and workspace management
  • Modern chat interface with VS Code theming
  • Model selection and configuration options

🀝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

πŸ“„ License

This project is licensed under the MIT License - see the LICENSE file for details.

πŸ™ Acknowledgments

  • Ollama for providing the local LLM infrastructure
  • VS Code team for the excellent extension API
  • The open-source community for inspiration and tools

