
🧠 AI Customer Support Agent (RAG + LLM + Redis + Ticketing + Monitoring)

Welcome to the AI Customer Support Agent, a production-ready backend system that demonstrates advanced AI workflows with Retrieval-Augmented Generation (RAG), LLM reasoning, caching, ticketing, and monitoring. It automatically answers user queries, tracks unanswered questions via a ticketing system, and provides real-time analytics through a dashboard, using caching, context-aware responses, and monitoring to deliver reliable performance and actionable insights.


✨ Features

  • Semantic Search: Retrieve relevant context from PDFs, FAQs, and the knowledge base using ChromaDB / FAISS (see the sketch after this list)
  • LLM Agent: Generate intelligent responses using Ollama or OpenAI
  • Redis Memory & Cache: Speed up responses and maintain per-user context
  • Ticketing System: Auto-create tickets for unanswered or low-confidence queries
  • Analytics Dashboard: Track queries, cache hits, tickets, and visualize metrics
  • Monitoring: Prometheus + Grafana integration for real-time metrics
  • Dockerized: Easy setup and deployment across environments
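
To make the semantic-search step concrete, here is a minimal sketch using ChromaDB's in-process client. The collection name, documents, and query are illustrative placeholders, not the repository's actual code.

```python
# Minimal semantic-search sketch (ChromaDB in-process client).
# Collection name, documents, and query are illustrative placeholders.
import chromadb

client = chromadb.Client()
collection = client.create_collection("support_kb")

# Index a few knowledge-base snippets; ChromaDB embeds them with its
# default embedding function.
collection.add(
    documents=[
        "To reset your password, open Settings > Security and click Reset.",
        "Refunds are processed within 5-7 business days.",
    ],
    ids=["kb-1", "kb-2"],
)

# Retrieve the snippet most relevant to a user query.
results = collection.query(query_texts=["How do I reset my password?"], n_results=1)
print(results["documents"][0][0])
```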

🚀 Quick Setup

1️⃣ Clone Repository

git clone https://github.com/vickytilotia/AI_Support_Agent.git
cd AI_Support_Agent

2️⃣ Python Environment

python -m venv venv
# Linux/Mac
source venv/bin/activate
# Windows
venv\Scripts\activate

pip install -r requirements.txt

3️⃣ Docker Setup

docker-compose up --build

Access the services (typical defaults; confirm the port mappings in docker-compose.yml):

  • FastAPI app: http://localhost:8000 (interactive docs at /docs)
  • Prometheus: http://localhost:9090
  • Grafana: http://localhost:3000

4️⃣ Initialize Database

Tables are created automatically on FastAPI startup, so no manual migration step is needed. Run the app locally with:

uvicorn app.main:app --reload
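
Under the hood, the auto-creation typically follows the standard SQLAlchemy pattern sketched below; the Ticket model and SQLite path are hypothetical examples, not the project's actual models.

```python
# Sketch of tables auto-creating on startup (SQLAlchemy + FastAPI).
# The Ticket model and the SQLite path are hypothetical examples.
from fastapi import FastAPI
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base

Base = declarative_base()
engine = create_engine("sqlite:///data/tickets.db")

class Ticket(Base):
    __tablename__ = "tickets"
    id = Column(Integer, primary_key=True)
    query = Column(String, nullable=False)
    status = Column(String, default="open")

app = FastAPI()

@app.on_event("startup")
def create_tables() -> None:
    # Creates any missing tables; a no-op if they already exist.
    Base.metadata.create_all(bind=engine)
```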

5️⃣ Test Endpoints

  • Chat: POST /chat/ → Send user query
  • Analytics: GET /analytics/ → View system metrics
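
A quick smoke test from Python might look like the following; the payload fields (user_id, query) and port 8000 are assumptions, so check the interactive docs at /docs for the real schema.

```python
# Smoke-test both endpoints with requests.
# Payload field names and port 8000 are assumptions; consult /docs
# for the actual request schema.
import requests

BASE = "http://localhost:8000"

chat = requests.post(
    f"{BASE}/chat/",
    json={"user_id": "demo-user", "query": "How do I reset my password?"},
    timeout=30,
)
print(chat.status_code, chat.json())

analytics = requests.get(f"{BASE}/analytics/", timeout=10)
print(analytics.json())
```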

📊 Project Flow

User Query
   │
   ▼
ChromaDB / FAISS (RAG) → Retrieve context
   │
   ▼
LLM Agent → Generate Answer
   │
   ├─> Redis Cache → Check / Store
   ├─> If Low Confidence → Create Ticket (SQLite)
   │
   ▼
Update Analytics → Prometheus Scrapes → Grafana Visualizes

Flowchart image: flowchart.png
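
In code, this flow might reduce to a single handler along the lines of the sketch below. Every helper, the metric, and the 0.5 confidence threshold are hypothetical stand-ins for the project's actual services, included only to show how the pieces connect.

```python
# Hypothetical orchestration of the flow above; the helpers, metric,
# and confidence threshold are illustrative stand-ins.
import redis
from prometheus_client import Counter

cache = redis.Redis(host="localhost", port=6379, decode_responses=True)
QUERIES = Counter("support_queries_total", "Queries handled", ["outcome"])

def retrieve_context(query: str) -> str:
    """Stand-in for the ChromaDB / FAISS retrieval step."""
    return "relevant knowledge-base snippet"

def ask_llm(query: str, context: str) -> tuple[str, float]:
    """Stand-in for the Ollama / OpenAI call; returns (answer, confidence)."""
    return f"Answer grounded in: {context}", 0.9

def create_ticket(user_id: str, query: str) -> None:
    """Stand-in for persisting a ticket to SQLite."""

def handle_query(user_id: str, query: str) -> str:
    cache_key = f"answer:{query}"
    cached = cache.get(cache_key)
    if cached is not None:
        QUERIES.labels(outcome="cache_hit").inc()
        return cached

    context = retrieve_context(query)
    answer, confidence = ask_llm(query, context)

    if confidence < 0.5:
        create_ticket(user_id, query)
        QUERIES.labels(outcome="ticketed").inc()
    else:
        cache.setex(cache_key, 3600, answer)  # cache for one hour
        QUERIES.labels(outcome="answered").inc()
    return answer
```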


🖼️ Screenshots

| Scenario | Screenshot |
| --- | --- |
| Query & Answer | *query answer* |
| Query & Answer from Cache | *query answer from cache* |
| Ticket Creation | *creating ticket* |

📁 Folder Structure

AI_Support_Agent/
├─ app/
│  ├─ api/            # FastAPI routes
│  ├─ services/       # LLM, RAG, Memory, Ticket, Analytics
│  ├─ core/           # DB and Redis setup
│  ├─ main.py         # FastAPI entrypoint
├─ data/              # SQLite DB
├─ requirements.txt
├─ Dockerfile
├─ docker-compose.yml
├─ prometheus.yml
├─ flowchart.png      # Visual project flow
├─ README.md          # Project documentation

🌟 Next Steps / Enhancements

  • Add API authentication (API keys / OAuth2); a minimal sketch follows this list
  • Implement Celery async tasks for ticket notifications
  • Build frontend dashboard for queries and analytics
  • Enhance Grafana dashboards with user-level metrics
  • Demonstrate multi-instance scaling with Docker Compose
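
For the first enhancement, API-key authentication in FastAPI could start from a dependency like this; the header name, key set, and protected route are placeholder choices, not part of the current codebase.

```python
# Sketch of API-key auth for FastAPI; header name, key set, and the
# protected route are placeholders.
from fastapi import Depends, FastAPI, HTTPException, Security
from fastapi.security import APIKeyHeader

api_key_header = APIKeyHeader(name="X-API-Key", auto_error=False)
VALID_KEYS = {"demo-key"}  # in practice, load from an env var or secret store

def require_api_key(api_key: str = Security(api_key_header)) -> str:
    if api_key not in VALID_KEYS:
        raise HTTPException(status_code=401, detail="Invalid or missing API key")
    return api_key

app = FastAPI()

@app.get("/analytics/", dependencies=[Depends(require_api_key)])
def analytics() -> dict:
    return {"status": "ok"}
```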

📝 License

MIT License
