A generic chatbot agent with MCP (Model Context Protocol) server integration for extensible tool use.
- Modular Architecture: Clean separation between LLM clients, MCP integration, and agent logic
- Multiple LLM Support: Azure OpenAI and Ollama clients built on the OpenAI SDK
- MCP Integration: Connect to multiple MCP servers for tool access
- Streaming Responses: Real-time response streaming with tool calling
- Extensible: Easy to create specialized agents (WebAgent example included)
- Secure: Environment-based configuration with no hardcoded secrets
- Python 3.10+
- uv package manager
```bash
# Clone the repository
git clone <repository-url>
cd simple-agent

# Install with uv
uv sync
```

- Copy the example environment file:

```bash
cp .env.example .env
```

- Edit `.env` with your configuration:
```ini
# Azure OpenAI Configuration
AZURE_OPENAI_KEY=your_azure_openai_key_here
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
AZURE_OPENAI_API_VERSION=2023-12-01-preview
AZURE_OPENAI_MODEL=your-deployment-name

# Ollama Configuration (alternative)
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=llama2
```

Run the included examples:
```bash
# Basic agent example with multiple MCP servers and a function tool
python examples/general_agent.py

# Web automation agent demonstration
python examples/web_agent.py
```

Add additional MCP servers in `mcp_config.json`, then select which servers to use when instantiating `Agent`.
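A typical `mcp_config.json` follows the common `mcpServers` layout used by many MCP clients; the entries below are illustrative assumptions (the filesystem server is a standard published package, and the research server path comes from this repository's tree), so the exact schema and names may differ from the shipped config:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    },
    "research": {
      "command": "python",
      "args": ["src/simple_agent/mcp/research_server.py"]
    }
  }
}
```

Each entry names a server and the command used to launch it; the agent can then be given a subset of these names to connect to.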
```
simple-agent/
├── src/simple_agent/
│   ├── agents/              # Agent implementations
│   │   └── agent.py         # Core Agent class
│   ├── llm/                 # LLM client implementations
│   │   ├── azure_client.py
│   │   └── ollama_client.py
│   └── mcp/                 # MCP utilities
│       ├── mcp_manager.py
│       └── research_server.py
├── examples/                # Usage examples
├── mcp_config.json          # MCP server configuration
└── pyproject.toml           # Project configuration
```
```bash
# Format code
ruff format .

# Lint code
ruff check .

# Type checking
mypy src/
```
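Both tools are usually configured in `pyproject.toml`. The section names below are the standard ruff and mypy conventions, but the values are illustrative, not this project's actual settings:

```toml
# Illustrative tool configuration; adjust values to taste.
[tool.ruff]
line-length = 100

[tool.mypy]
strict = true
```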