High-performance analytics dashboard for ABT Corporation's transaction data. Built with Go backend, React frontend, and powered by DuckDB for blazing-fast SQL analytics.
Author: Randil Tharusha
- Country-level revenue analysis with transaction counts and pagination
- Top 20 most purchased products with stock levels
- Monthly sales volume trends with date-based aggregation
- Top 30 regions by revenue and items sold
- Blazing-fast SQL queries using DuckDB's columnar storage
- Real-time analytics with automatic CSV loading
- Concurrent data processing for optimal performance
- Memory-efficient in-memory database with instant queries
- Backend: Go 1.24
- Frontend: React + Vite
- Database: DuckDB (in-memory columnar analytics)
- Data Processing: SQL-powered queries
- Containerization: Docker + Docker Compose
- Go 1.24 or later
- Node.js 18 or later
- npm or yarn
- Make (optional, for using make commands)
The repository includes a dummy data file for reference:

- `data/raw/transactions.csv` (100 rows: 99 records + 1 header)

This file is for reference only. Before running the application:

- Remove the dummy file:

```sh
rm data/raw/transactions.csv
```

- Place your full transaction dataset at `data/raw/transactions.csv`
Note: DuckDB handles large datasets efficiently thanks to its columnar storage and SQL optimization.
Use the start.sh script, which automates the setup and startup process:

```sh
# Make the script executable
chmod +x start.sh

# Run the script
./start.sh
```

The script will:
- Install backend dependencies
- Build the backend
- Install frontend dependencies
- Build the frontend
- Start both services
- Clone the repository:

```sh
git clone https://github.com/randilt/abt-corp-analytics-dashboard.git
cd abt-corp-analytics-dashboard
```

- Backend Setup:
```sh
# Install Go dependencies
go mod download

# Build the backend
go build -o bin/server cmd/server/main.go

# Run the backend
./bin/server
```

- Frontend Setup:
```sh
cd web
npm install
npm run build
npm run preview
```

Pull from Docker Hub:
```sh
docker pull randilt/abt-corp-analytics-dashboard:latest
```

Or start with Docker Compose:

```sh
docker-compose up -d
```

The application can be configured using environment variables:
```sh
SERVER_HOST=localhost                        # Server hostname
SERVER_PORT=8080                             # Server port
SERVER_READ_TIMEOUT=15s                      # Read timeout
SERVER_WRITE_TIMEOUT=15s                     # Write timeout
SERVER_IDLE_TIMEOUT=60s                      # Idle timeout
CSV_FILE_PATH=./data/raw/transactions.csv    # Path to CSV file
LOG_LEVEL=info                               # Log level (debug, info, warn, error)
```

Example usage:

```sh
# Run with custom configuration
CSV_FILE_PATH=./my-data.csv ./bin/server
```

Access the application at:

- Frontend: http://localhost:4173
- Backend API: http://localhost:8080/api/v1
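Timeout and path settings like the ones listed above are typically read from the environment with a fallback default. A minimal sketch of that pattern (the helper name `getDuration` is illustrative, not the project's actual code):

```go
package main

import (
	"fmt"
	"os"
	"time"
)

// getDuration reads an environment variable such as SERVER_READ_TIMEOUT and
// falls back to a default when the variable is unset or malformed.
func getDuration(key string, fallback time.Duration) time.Duration {
	if v := os.Getenv(key); v != "" {
		if d, err := time.ParseDuration(v); err == nil {
			return d
		}
	}
	return fallback
}

func main() {
	readTimeout := getDuration("SERVER_READ_TIMEOUT", 15*time.Second)
	fmt.Println("read timeout:", readTimeout)
}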
- DuckDB Integration: CSV data is loaded directly into DuckDB's in-memory columnar database
- SQL-Powered Analytics: All analytics are generated using optimized SQL queries
- Automatic Loading: Data is loaded fresh on every application startup
- Memory Efficient: Only loads data as needed for each query
- Real-time: No caching needed - queries are always fresh and fast
- `GET /api/v1/analytics` - Get all analytics data summary
- `GET /api/v1/analytics/stats` - Get analytics statistics
- `GET /api/v1/analytics/country-revenue?limit=100&offset=0` - Country revenue data with pagination
- `GET /api/v1/analytics/top-products` - Top 20 products
- `GET /api/v1/analytics/monthly-sales` - Monthly sales trends
- `GET /api/v1/analytics/top-regions` - Top 30 regions
- `POST /api/v1/analytics/refresh` - Force data reload
- `GET /health` - Health check
- `GET /ready` - Readiness check
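The country-revenue endpoint takes `limit` and `offset` query parameters for pagination. A sketch of building such a request URL from a client (the helper name `paginatedURL` is mine; the host and port match the default configuration):

```go
package main

import (
	"fmt"
	"net/url"
)

// paginatedURL builds a request URL for the country-revenue endpoint
// with the given pagination parameters.
func paginatedURL(limit, offset int) string {
	u, _ := url.Parse("http://localhost:8080/api/v1/analytics/country-revenue")
	q := url.Values{}
	q.Set("limit", fmt.Sprint(limit))
	q.Set("offset", fmt.Sprint(offset))
	u.RawQuery = q.Encode()
	return u.String()
}

func main() {
	fmt.Println(paginatedURL(100, 0))
	// → http://localhost:8080/api/v1/analytics/country-revenue?limit=100&offset=0
}
```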
- CSV Loading: ~25ms for 99 records
- Query Performance: Sub-millisecond response times
- Memory Usage: Minimal - only loads what's needed
- Scalability: Scales efficiently to large datasets
- Concurrent Queries: All analytics generated in parallel
- Columnar Storage: Optimized for analytical workloads
- SQL Interface: Familiar and powerful query language
- Memory Efficient: Only loads necessary data
- Blazing Fast: Sub-second queries on large datasets
- No External Dependencies: Embedded database, no server needed
- CSV Native: Direct CSV loading without preprocessing
```sh
# Run all tests
make test

# Generate coverage report
make test-coverage

# Generate detailed coverage report
make test-coverage-detailed
```

The coverage report will be generated in ./coverage/coverage.html. Open this file in your browser to view:
- Line-by-line coverage
- Function coverage
- Overall coverage statistics
Note: Coverage report is generated even if some tests fail.
If data isn't loading:

- Check if the backend is running (`curl http://localhost:8080/health`)
- Verify the CSV file exists at `data/raw/transactions.csv`
- Check server logs for DuckDB initialization errors

If the frontend build fails:

- Clear node_modules and reinstall: `rm -rf node_modules && npm install`
- Ensure you're using Node.js 18 or later
If queries are slow:
- Check CSV file format and column names
- Verify data types are correct (dates, numbers)
- Check server logs for SQL errors
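A quick way to spot type problems is to validate a few rows before loading. The sketch below checks that date and amount columns parse cleanly; the column positions and formats are assumptions, so adjust them to your CSV's actual layout:

```go
package main

import (
	"encoding/csv"
	"fmt"
	"strconv"
	"strings"
	"time"
)

// checkRow validates that a transaction row's date (column 1, YYYY-MM-DD)
// and amount (column 2, decimal) parse cleanly. The layout is hypothetical.
func checkRow(record []string) error {
	if _, err := time.Parse("2006-01-02", record[1]); err != nil {
		return fmt.Errorf("bad date %q: %w", record[1], err)
	}
	if _, err := strconv.ParseFloat(record[2], 64); err != nil {
		return fmt.Errorf("bad amount %q: %w", record[2], err)
	}
	return nil
}

func main() {
	// Two sample rows: the second has a wrong date format and a non-numeric amount.
	sample := "tx_001,2024-01-15,19.99\ntx_002,15/01/2024,oops\n"
	rows, _ := csv.NewReader(strings.NewReader(sample)).ReadAll()
	for _, rec := range rows {
		if err := checkRow(rec); err != nil {
			fmt.Println(err)
		}
	}
}
```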
```
┌─────────────────┐    ┌──────────────────┐    ┌─────────────────┐
│ React Frontend  │    │   Go Backend     │    │     DuckDB      │
│                 │    │                  │    │                 │
│ - Dashboard UI  │◄──►│ - REST API       │◄──►│ - In-memory DB  │
│ - Charts        │    │ - Handlers       │    │ - SQL Queries   │
│ - Real-time     │    │ - Middleware     │    │ - Columnar      │
└─────────────────┘    └──────────────────┘    └─────────────────┘
                                │
                                ▼
                       ┌──────────────────┐
                       │     CSV File     │
                       │                  │
                       │ - transactions   │
                       │ - Auto-loaded    │
                       └──────────────────┘
```
