A comprehensive climate data platform for the ClimaBorough project, enabling cities to visualize, analyze, and manage climate-related data through customizable dashboards.
- Introduction
- Architecture
- Prerequisites
- Installation
- Docker Deployment
- Database Management
- Configuration
- Development
- API Documentation
- Contributing
- License
BESSER-FOR-CLIMA is a model-driven platform that provides a low-code/no-code dashboard solution for climate data visualization and management. As part of the EU-funded ClimaBorough project, the platform enables:
- Dynamic Dashboard Creation: Build customizable dashboards with drag-and-drop widgets
- Multi-City Support: Manage data for multiple cities with role-based access control
- Rich Visualizations: Line charts, bar charts, pie charts, stat widgets, tables, timelines, and maps
- Real-time Data: WebSocket-based live updates and chat assistant integration
- RESTful API: Comprehensive FastAPI backend with automatic OpenAPI documentation
- Code Generation: Automated generation of models, schemas, and API endpoints from domain models
The platform consists of three main components:
BESSER-FOR-CLIMA/
├── backend/             # FastAPI application with PostgreSQL
├── frontend/            # React + TypeScript + Vite SPA
└── climasolutionsbot/   # Python-based chat assistant
Technology Stack:
- Backend: Python 3.11, FastAPI, SQLAlchemy, Pydantic, PostgreSQL
- Frontend: React 18, TypeScript, Vite, TailwindCSS, shadcn/ui
- Authentication: Keycloak (OpenID Connect)
- Deployment: Docker, Kubernetes, nginx
- Python 3.11.1 or higher
- Node.js 22.x or higher
- Docker and Docker Compose
- PostgreSQL 14+ (for local development without Docker)
- Git
cd backend
python -m venv besser_venv
# Windows
besser_venv\Scripts\activate
# Linux/macOS
source besser_venv/bin/activate
pip install -r requirements.txt
The platform uses model-driven engineering to generate code from domain models:
python generate.py
This generates:
- SQLAlchemy ORM models
- FastAPI route handlers
- Pydantic schemas for request/response validation
- Repository pattern implementations
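For orientation, the generated artifacts fit together roughly as in the sketch below. The City entity, its fields, and the docstring comments are hypothetical; the actual output depends on the domain model defined in backend/plantuml/buml_model.py.

```python
# Hypothetical sketch of the generator's output for a "City" entity.
# The entity name and fields depend on the actual B-UML domain model.
from pydantic import BaseModel, ConfigDict
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class City(Base):
    """Generated SQLAlchemy ORM model (would live under models/)."""
    __tablename__ = "city"

    id = Column(Integer, primary_key=True, index=True)
    name = Column(String, nullable=False)
    code = Column(String, unique=True, index=True)

class CityRead(BaseModel):
    """Generated Pydantic schema for responses (would live under schemas/)."""
    model_config = ConfigDict(from_attributes=True)  # build from ORM objects

    id: int
    name: str
    code: str
```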
Create a .env file in backend/src/:
DATABASE_URL=postgresql://postgres:password@localhost:5432/climaborough
SECRET_KEY=your-secret-key-here
KEYCLOAK_SERVER_URL=https://auth.climaplatform.eu
KEYCLOAK_REALM=climaborough
KEYCLOAK_CLIENT_ID=climaborough-backend
Initialize the database:
cd src
python init_db.py
Start the development server:
cd src
uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
Access the API at:
- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
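As a quick way to confirm the server is up, the sketch below fetches the auto-generated OpenAPI document and one endpoint; the /cities route is taken from the key endpoints listed under API Documentation, and protected routes may return 401 without a token.

```python
# Quick smoke test against the local development server.
# Protected routes may require a Bearer token (see API Documentation).
import requests

BASE_URL = "http://localhost:8000"

# FastAPI publishes the OpenAPI document automatically.
spec = requests.get(f"{BASE_URL}/openapi.json", timeout=10)
spec.raise_for_status()
print(f"API exposes {len(spec.json()['paths'])} paths")

# Example endpoint from the API Documentation section.
resp = requests.get(f"{BASE_URL}/cities", timeout=10)
print(resp.status_code)
```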
cd frontend
npm install
Create a .env file in frontend/:
VITE_API_URL=http://localhost:8000
VITE_WEBSOCKET_URL=ws://localhost:8765
VITE_KEYCLOAK_URL=https://auth.climaplatform.eu
VITE_KEYCLOAK_REALM=climaborough
VITE_KEYCLOAK_CLIENT_ID=climaborough-frontend
Start the development server:
npm run dev
The application will be available at http://localhost:5173
Build for production:
npm run build
The optimized build will be in the dist/ folder.
cd climasolutionsbot
python -m venv besser_venv
besser_venv\Scripts\activate  # Windows
pip install -r requirements.txt
Edit config.ini with your settings:
[API]
base_url = http://localhost:8000
[WebSocket]
host = 0.0.0.0
port = 8765
Run the bot:
python climabot.py
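To verify that the bot's WebSocket endpoint is reachable, a minimal client sketch is shown below. The host and port follow config.ini above; the message format is illustrative only, so check climabot.py for the actual protocol the bot expects.

```python
# Minimal WebSocket connectivity check for the chat assistant.
# Host/port follow config.ini; the message format is illustrative only.
import asyncio
import websockets  # pip install websockets

async def main() -> None:
    async with websockets.connect("ws://localhost:8765") as ws:
        await ws.send("Hello from a test client")
        reply = await asyncio.wait_for(ws.recv(), timeout=30)
        print("Bot replied:", reply)

asyncio.run(main())
```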
cd backend/docker
docker-compose -f docker-compose.local.yaml up -d
This starts:
- PostgreSQL database on port 5432
- PgAdmin on http://localhost:80
- Backend API on http://localhost:8000
cd backend/docker
docker-compose -f docker-compose.remote.yaml build
docker-compose -f docker-compose.remote.yaml up -d
Build and run the frontend image:
cd frontend
docker build -t climaborough-frontend .
docker run -p 80:80 \
  -e API_URL=https://api.climaplatform.eu \
  -e WEBSOCKET_URL=wss://bot.climaplatform.eu \
  climaborough-frontend
Push the image to the registry:
docker tag climaborough-frontend artefacts.list.lu/climaborough/frontend:latest
docker push artefacts.list.lu/climaborough/frontend:latest
Back up the remote database:
cd backend/docker
docker exec climaborough_postgres_remote pg_dump -U postgres climaborough > backup_$(date +%Y%m%d).sql
Restore from a backup:
docker exec -i climaborough_postgres_remote psql -U postgres climaborough < backup.sql
Apply StatChart schema updates:
docker exec -i climaborough_postgres_remote psql -U postgres climaborough < add_statchart_columns.sql
See Remote-Database-Backup-Guide.md for detailed instructions.
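If you want to run the backup step on a schedule (for example from cron), a small Python wrapper around the pg_dump command above might look like the following sketch; the container, user, and database names are taken from the commands above.

```python
# Sketch: run the pg_dump backup from Python, e.g. on a cron schedule.
# Container, user, and database names match the docker commands above.
import subprocess
from datetime import date
from pathlib import Path

backup_file = Path(f"backup_{date.today():%Y%m%d}.sql")

with backup_file.open("wb") as out:
    subprocess.run(
        [
            "docker", "exec", "climaborough_postgres_remote",
            "pg_dump", "-U", "postgres", "climaborough",
        ],
        stdout=out,
        check=True,  # raise CalledProcessError if docker/pg_dump fails
    )

print(f"Wrote {backup_file} ({backup_file.stat().st_size} bytes)")
```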
Key configuration files:
- backend/src/app/core/config.py - Application settings
- backend/src/app/core/security.py - Authentication configuration
- backend/plantuml/buml_model.py - Domain model definitions
- frontend/vite.config.ts - Build configuration
- frontend/src/config/env.js - Runtime environment variables
- frontend/tailwind.config.js - Styling configuration
Backend:
- DATABASE_URL - PostgreSQL connection string
- SECRET_KEY - JWT signing key
- KEYCLOAK_SERVER_URL - Authentication server
- CORS_ORIGINS - Allowed CORS origins
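These variables are typically read once at startup. The sketch below shows one way backend/src/app/core/config.py could load them with pydantic-settings; this is an assumption about that file, not a copy of its contents.

```python
# Sketch: loading the backend environment variables with pydantic-settings.
# This is an assumption about config.py, not its actual contents.
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env")  # backend/src/.env

    DATABASE_URL: str
    SECRET_KEY: str
    KEYCLOAK_SERVER_URL: str
    CORS_ORIGINS: list[str] = ["http://localhost:5173"]  # JSON list in .env

settings = Settings()
print(settings.DATABASE_URL)
```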
Frontend:
- VITE_API_URL - Backend API endpoint
- VITE_WEBSOCKET_URL - WebSocket server
- VITE_KEYCLOAK_URL - Keycloak server
- VITE_KEYCLOAK_REALM - Keycloak realm
- VITE_KEYCLOAK_CLIENT_ID - OAuth client ID
Backend:
backend/src/app/
├── api/           # API route handlers
├── core/          # Configuration and security
├── models/        # SQLAlchemy ORM models
├── schemas/       # Pydantic validation schemas
├── repositories/  # Data access layer
└── services/      # Business logic
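The repositories/ layer keeps SQLAlchemy session handling out of the route handlers and services. Below is a sketch of what one generated repository might look like; the import path, City model, and method names are illustrative, not the actual generated code.

```python
# Illustrative repository-pattern sketch for the generated data access layer.
# The import path, City model, and method names are hypothetical.
from sqlalchemy.orm import Session

from app.models import City  # hypothetical import path

class CityRepository:
    """Encapsulates all database access for City entities."""

    def __init__(self, db: Session) -> None:
        self.db = db

    def get(self, city_id: int) -> City | None:
        return self.db.query(City).filter(City.id == city_id).first()

    def list(self, skip: int = 0, limit: int = 100) -> list[City]:
        return self.db.query(City).offset(skip).limit(limit).all()

    def create(self, city: City) -> City:
        self.db.add(city)
        self.db.commit()
        self.db.refresh(city)
        return city
```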
Frontend:
frontend/src/
├── components/  # Reusable UI components
├── pages/       # Page-level components
├── contexts/    # React context providers
├── services/    # API client services
└── lib/         # Utility functions
- Line Chart - Time series data visualization
- Bar Chart - Categorical comparisons with horizontal/vertical orientation
- Pie Chart - Distribution and proportions
- Stat Chart - Single KPI metric with optional trend indicators
- Table - Paginated data grid with sorting
- Timeline - Project milestones and events
- Map - Geographic data visualization with WMS/GeoJSON layers
- Free Text - Rich text content blocks
- Update Domain Model in backend/plantuml/buml_model.py
- Regenerate Code with python generate.py
- Create Migration if the database schema changed
- Update Frontend Services in frontend/src/services/
- Add UI Components as needed
- Test Thoroughly before deploying
Interactive API documentation is available at:
- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
- GET /cities - List all cities
- GET /dashboards - List dashboards
- POST /dashboards/{id}/visualizations/bulk - Bulk create widgets
- GET /kpis - List key performance indicators
- POST /kpis/{id}/values/bulk - Bulk insert KPI data
- GET /mapdata/city/{city_code} - Get map layers for a city
The API uses Keycloak for authentication:
- Obtain access token from Keycloak
- Include it in request headers: Authorization: Bearer <token>
- Token is validated via JWT verification
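A sketch of that flow in Python is shown below. The token endpoint path assumes a recent Keycloak release (no /auth prefix), and the client ID, user credentials, and grant type are placeholders to adapt to your realm configuration.

```python
# Sketch: obtain an access token from Keycloak and call a protected endpoint.
# Token URL assumes Keycloak >= 17 (no /auth prefix); credentials are placeholders.
import requests

KEYCLOAK_URL = "https://auth.climaplatform.eu"
REALM = "climaborough"

token_resp = requests.post(
    f"{KEYCLOAK_URL}/realms/{REALM}/protocol/openid-connect/token",
    data={
        "grant_type": "password",             # or client_credentials, per your setup
        "client_id": "climaborough-frontend",
        "username": "demo-user",              # placeholder
        "password": "demo-password",          # placeholder
    },
    timeout=10,
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

# Include the token as a Bearer header on API requests.
headers = {"Authorization": f"Bearer {access_token}"}
resp = requests.get("http://localhost:8000/cities", headers=headers, timeout=10)
print(resp.status_code, resp.json())
```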
We welcome contributions! Please follow these guidelines:
- Fork the repository
- Create a feature branch: git checkout -b feature/amazing-feature
- Commit your changes: git commit -m 'Add amazing feature'
- Push to the branch: git push origin feature/amazing-feature
- Open a Pull Request
- Python: Follow PEP 8, use type hints
- TypeScript: ESLint configuration, strict mode enabled
- Commits: Use conventional commits format
- Testing: Add tests for new features
This project is licensed under the MIT License - see the LICENSE.md file for details.
- Project Website: https://climaplatform.eu
- Issue Tracker: GitHub Issues
- Documentation: Wiki
For support or questions, please open an issue on GitHub or contact the project maintainers.
Funded by the European Union 🇪🇺
This project has received funding from the European Union's Horizon 2020 research and innovation programme. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or CINEA. Neither the European Union nor the granting authority can be held responsible for them.