Welcome to the AI Customer Support Agent, a production-ready backend system that demonstrates advanced AI workflows with Retrieval-Augmented Generation (RAG), LLM reasoning, caching, ticketing, and monitoring. It's a customer support system that automatically answers user queries, tracks unanswered questions via a ticketing system, and provides real-time analytics through a dashboard. It combines caching, context-aware responses, and monitoring to ensure reliable performance and actionable insights.
- Semantic Search: Retrieve relevant context from PDFs, FAQs, and the knowledge base using ChromaDB / FAISS (see the retrieval sketch after this list)
- LLM Agent: Generate intelligent responses using Ollama or OpenAI
- Redis Memory & Cache: Speed up responses and maintain per-user context
- Ticketing System: Auto-create tickets for unanswered or low-confidence queries
- Analytics Dashboard: Track queries, cache hits, tickets, and visualize metrics
- Monitoring: Prometheus + Grafana integration for real-time metrics
- Dockerized: Easy setup and deployment across environments
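A rough idea of what the semantic-search step can look like, as a minimal sketch using the `chromadb` Python client. The collection name and sample documents below are illustrative, not taken from this repo:

```python
# Minimal semantic-search sketch with ChromaDB (illustrative names and data).
import chromadb

client = chromadb.PersistentClient(path="data/chroma")           # local persistent store
collection = client.get_or_create_collection(name="knowledge_base")

# Index a few knowledge-base snippets (normally extracted from PDFs/FAQs).
collection.add(
    ids=["faq-1", "faq-2"],
    documents=[
        "Refunds are processed within 5-7 business days.",
        "You can reset your password from the account settings page.",
    ],
)

# Retrieve the most relevant context for an incoming user query.
results = collection.query(query_texts=["How long do refunds take?"], n_results=2)
print(results["documents"][0])   # top-matching snippets to feed the LLM
```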
To set up the project locally:

```bash
git clone https://github.com/vickytilotia/AI_Support_Agent.git
cd AI-Support-Agent

python -m venv venv
# Linux/Mac
source venv/bin/activate
# Windows
venv\Scripts\activate

pip install -r requirements.txt
```

Start the full stack with Docker Compose:

```bash
docker-compose up --build
```

Access services:
- FastAPI → http://localhost:8000
- Prometheus → http://localhost:9090
- Grafana → http://localhost:3000 (admin/admin)
Database tables are created automatically on FastAPI startup.
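A hedged sketch of how such a startup hook typically looks with SQLAlchemy; the model, table, and database path below are illustrative, not the repo's actual code:

```python
# Illustrative startup hook: create any missing tables when FastAPI boots.
from fastapi import FastAPI
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base

Base = declarative_base()
engine = create_engine("sqlite:///data/support.db")   # assumed SQLite path

class Ticket(Base):                                    # hypothetical ticket model
    __tablename__ = "tickets"
    id = Column(Integer, primary_key=True)
    user_id = Column(String)
    query = Column(String)
    status = Column(String, default="open")

app = FastAPI()

@app.on_event("startup")
def create_tables() -> None:
    # Safe to run on every startup: only creates tables that do not exist yet.
    Base.metadata.create_all(bind=engine)
```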
Run the API locally:

```bash
uvicorn app.main:app --reload
```

Endpoints:

- Chat: `POST /chat/` → Send a user query (see the example request below)
- Analytics: `GET /analytics/` → View system metrics
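Example client calls with `requests`. The JSON field names are assumptions, not confirmed by the repo; check the interactive docs at http://localhost:8000/docs for the actual request schema:

```python
# Illustrative client calls against the two endpoints above.
import requests

resp = requests.post(
    "http://localhost:8000/chat/",
    json={"user_id": "demo-user", "query": "How do I reset my password?"},
)
print(resp.json())        # answer, confidence, cache-hit info, etc.

metrics = requests.get("http://localhost:8000/analytics/")
print(metrics.json())     # query counts, cache hits, open tickets
```

The flow below shows how such a request moves through the system.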
```
User Query
   │
   ▼
ChromaDB / FAISS (RAG) → Retrieve context
   │
   ▼
LLM Agent → Generate Answer
   │
   ├─> Redis Cache → Check / Store
   └─> If Low Confidence → Create Ticket (SQLite)
   │
   ▼
Update Analytics → Prometheus Scrapes → Grafana Visualizes
```

Flowchart image: `flowchart.png`
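A condensed sketch of that flow in one function. The helper names (`llm_answer`, `create_ticket`), the confidence threshold, and the cache TTL are illustrative assumptions, not the repo's actual services:

```python
# Illustrative request pipeline: cache -> RAG -> LLM -> optional ticket.
import hashlib
import json

import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)
CONFIDENCE_THRESHOLD = 0.6      # assumed cut-off for "low confidence"
CACHE_TTL_SECONDS = 3600        # assumed cache lifetime

def handle_query(user_id: str, query: str, collection, llm_answer, create_ticket) -> dict:
    cache_key = "answer:" + hashlib.sha256(query.lower().encode()).hexdigest()

    # 1. Redis cache: return a previously generated answer if one exists.
    cached = r.get(cache_key)
    if cached:
        return json.loads(cached)

    # 2. RAG: pull the most relevant knowledge-base snippets from ChromaDB/FAISS.
    context = collection.query(query_texts=[query], n_results=3)["documents"][0]

    # 3. LLM agent: generate an answer from the retrieved context
    #    (llm_answer wraps Ollama or OpenAI and returns a confidence score).
    answer, confidence = llm_answer(query=query, context=context)

    # 4. Low confidence -> open a ticket (SQLite) so a human can follow up.
    if confidence < CONFIDENCE_THRESHOLD:
        create_ticket(user_id=user_id, query=query)

    result = {"answer": answer, "confidence": confidence}
    r.setex(cache_key, CACHE_TTL_SECONDS, json.dumps(result))
    return result
```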
| Scenario | Screenshot |
|---|---|
| Query & Answer | ![]() |
| Query & Answer from Cache | ![]() |
| Ticket Creation | ![]() |
```
AI-Support-Agent/
├─ app/
│  ├─ api/              # FastAPI routes
│  ├─ services/         # LLM, RAG, Memory, Ticket, Analytics
│  ├─ core/             # DB and Redis setup
│  ├─ main.py           # FastAPI entrypoint
├─ data/                # SQLite DB
├─ requirements.txt
├─ Dockerfile
├─ docker-compose.yml
├─ prometheus.yml
├─ flowchart.png        # Visual project flow
├─ README.md            # Project documentation
```
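The monitoring pieces in the tree (`prometheus.yml`, the analytics service) typically boil down to a few counters exposed on a `/metrics` endpoint that Prometheus scrapes. A hedged sketch with `prometheus_client`; the counter names and route are illustrative, and the repo may use a different instrumentation approach:

```python
# Illustrative Prometheus instrumentation for the support API.
from fastapi import FastAPI
from prometheus_client import Counter, make_asgi_app

QUERIES_TOTAL = Counter("support_queries_total", "Total user queries handled")
CACHE_HITS_TOTAL = Counter("support_cache_hits_total", "Answers served from Redis")
TICKETS_TOTAL = Counter("support_tickets_total", "Tickets opened for low-confidence answers")

app = FastAPI()
app.mount("/metrics", make_asgi_app())   # Prometheus scrapes this endpoint

@app.post("/chat/")
def chat(payload: dict) -> dict:
    QUERIES_TOTAL.inc()
    # ... cache lookup, RAG retrieval, LLM call as sketched above ...
    return {"answer": "..."}
```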
- Add API authentication (API keys / OAuth2)
- Implement Celery async tasks for ticket notifications
- Build frontend dashboard for queries and analytics
- Enhance Grafana dashboards with user-level metrics
- Demonstrate multi-instance scaling with Docker Compose
MIT License


