A microservices-based architecture designed to predict electricity consumption for specific building meters. The system orchestrates data acquisition from an external provider (Kunna) and processes it through a Multi-Layer Perceptron (MLP) model to generate usage forecasts.
The system is composed of three microservices, managed as Git submodules within this repository:
- Orchestrator: The central controller that manages the workflow between data acquisition and prediction.
- Acquisitor: Interfaces with the external Kunna API to retrieve raw meter data and format it into the features used for prediction.
- Predictor: Loads a pre-trained MLP model to generate predictions based on provided features.
All services are containerized via Docker and utilize MongoDB for caching and auditing.
- Git
- Docker & Docker Compose
- Clone the repository and its submodules:
Simply do:

```bash
git clone <repo_url>
cd <repo_name>
git submodule update --init --recursive
```

- Configure Environment Variables:
Ensure valid `.env` (if running manually) or `.env.docker` (if running with docker-compose) files exist in the service directories (see Configuration below).
You can, for example, do:
```bash
for svc in acquisitor predictor orquestrator; do
  cp "./${svc}/.env.docker.example" "./${svc}/.env.docker";
done
sed -i "s/KUNNA_TOKEN=/KUNNA_TOKEN='<token>'/" ./acquisitor/.env.docker
```

- Start the System:
Using the provided script:
```bash
./scripts/start.sh
```

Role: Coordinates the prediction flow. It requests data from the Acquisitor and forwards it to the Predictor.
| Path | Method | Description | Payload / Response | Default |
|---|---|---|---|---|
| `/health` | GET | Check the current status of the service. | Response: `{ "service": "orquestrator", "status": "ok" }` | N/A |
| `/run` | POST | Triggers the prediction pipeline for a list of dates. | Payload: `{ "dates": ["YYYY-MM-DD", ...] }` | Current date (Today) |
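As a quick sanity check, assuming the orchestrator runs on its default address (`http://localhost:8080`), the pipeline can be exercised with `curl`; the dates below are placeholders and the exact response shape depends on the implementation:

```bash
# Verify the orchestrator is up
curl http://localhost:8080/health

# Trigger the prediction pipeline for two example dates
curl -X POST http://localhost:8080/run \
  -H "Content-Type: application/json" \
  -d '{ "dates": ["2024-01-01", "2024-01-02"] }'

# With no dates supplied, the service should fall back to the current date (per the table above)
curl -X POST http://localhost:8080/run \
  -H "Content-Type: application/json" \
  -d '{}'
```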
Role: Connects to the Kunna API to fetch data for a specific meter and date, transforming it into the input features required by the model.
| Path | Method | Description | Payload / Response | Default |
|---|---|---|---|---|
| `/health` | GET | Check the current status of the service. | Response: `{ "service": "acquisitor", "status": "ok" }` | N/A |
| `/data` | POST | Retrieves features for a specific date. | Payload: `{ "date": "YYYY-MM-DD" }` | Current date (Today) |
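The Acquisitor can also be called directly; the sketch below assumes the default port `3001` and that the response carries the extracted feature vector (the exact fields are implementation-defined):

```bash
# Request the model input features for a given (placeholder) date
curl -X POST http://localhost:3001/data \
  -H "Content-Type: application/json" \
  -d '{ "date": "2024-01-01" }'
```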
Role: Hosts the MLP model and performs inference.
| Path | Method | Description | Payload / Response | Default |
|---|---|---|---|---|
| `/health` | GET | Check the current status of the service. | Response: `{ "service": "predictor", "status": "ok" }` | N/A |
| `/ready` | GET | Checks if the MLP model is loaded and ready for inference. | Response: `{ "ready": true/false, "modelVersion": "vX.X", [message] }` | N/A |
| `/predict` | POST | Generates a prediction based on input features (current model uses 7 features). | Payload: `{ "features": [Number, ...], "meta": { "featureCount": Number } }` | N/A |
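A minimal sketch of calling the Predictor directly, assuming the default port `3002`; the seven feature values are placeholders rather than real meter-derived features:

```bash
# Confirm the MLP model is loaded before requesting inference
curl http://localhost:3002/ready

# Request a prediction for a 7-feature input vector (placeholder values)
curl -X POST http://localhost:3002/predict \
  -H "Content-Type: application/json" \
  -d '{ "features": [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7], "meta": { "featureCount": 7 } }'
```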
Each service exposes an optional lightweight web interface for manual testing.
- Functionality: Displays health status and provides a way to manually trigger the `/run`, `/data`, or `/predict` endpoints.
- Configuration: Can be enabled/disabled via the `EXPOSE_APP` environment variable.
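For example, to turn on the Predictor's web UI under docker-compose (assuming `EXPOSE_APP` already appears in the copied `.env.docker`), flip the flag and restart the stack:

```bash
# Switch the flag in the predictor's environment file
sed -i "s/EXPOSE_APP=.*/EXPOSE_APP=true/" ./predictor/.env.docker

# Re-run the start script (a container restart/rebuild may be needed to pick up the change)
./scripts/start.sh
```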
All services persist data to MongoDB to optimize performance and ensure traceability.
- Cache (`POST /db/cache`):
  - TTL: 1 Hour.
  - Behavior: If a request is repeated within the TTL, the service returns cached data instead of reprocessing. The Orchestrator caches results individually per date.
- Audit (`POST /db/audit`):
  - TTL: 1 Month.
  - Purpose: Debugging and historical tracking.
  - Indexing:
    - Acquisitor: Indexed by `dataId`.
    - Predictor: Indexed by `predictionId`.
    - Orchestrator: Indexed by `correlationId` (links `dataId` to `predictionId`).
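As an illustration of how an audit record could be inspected, the query below assumes MongoDB is reachable at the default `MONGO_URI` and that the orchestrator stores its audit documents in a database/collection named `orquestrator`/`audit`; both names are assumptions, not confirmed by this README:

```bash
# Look up an orchestrator audit record by correlation id
# (database and collection names are hypothetical)
mongosh "mongodb://localhost:27017/orquestrator" --quiet \
  --eval 'db.audit.find({ correlationId: "<correlationId>" })'
```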
Each service is configured via environment variables.
| Variable | Description | Default |
|---|---|---|
| `SERVICE` | Internal service identifier | `acquisitor` |
| `LOG_LEVEL` | Logging verbosity (`info` or `error`) | `info` |
| `PORT` | Service internal port | `3001` |
| `ADDR` | Host address | `localhost` |
| `MONGO_URI` | MongoDB Connection String | `mongodb://localhost:27017` |
| `EXPOSE_DB` | Expose DB endpoints | `false` |
| `KUNNA_URL` | Kunna API Endpoint | `https://openapi.kunna.es` |
| `KUNNA_TOKEN` | API Authorization Token | `[SECRET]` |
| `KUNNA_METER` | Target Building Meter ID | `MLU00360002` |
| `EXPOSE_APP` | Enable Web UI | `false` |
| Variable | Description | Default/Example |
|---|---|---|
| `SERVICE` | Internal service identifier | `predictor` |
| `LOG_LEVEL` | Logging verbosity (`info` or `error`) | `info` |
| `PORT` | Service internal port | `3002` |
| `ADDR` | Host address | `localhost` |
| `MONGO_URI` | MongoDB Connection String | `mongodb://localhost:27017` |
| `EXPOSE_DB` | Expose DB endpoints | `false` |
| `MODEL_VERSION` | Current ML Model Tag | `v1.0` |
| `EXPOSE_APP` | Enable Web UI | `false` |
| Variable | Description | Default/Example |
|---|---|---|
| `SERVICE` | Internal service identifier | `orquestrator` |
| `LOG_LEVEL` | Logging verbosity (`info` or `error`) | `info` |
| `PORT` | Service internal port | `8080` |
| `ADDR` | Host address | `localhost` |
| `ACQUISITOR_URI` | Address of Acquisitor Service | `http://localhost:3001` |
| `PREDICTOR_URI` | Address of Predictor Service | `http://localhost:3002` |
| `MONGO_URI` | MongoDB Connection String | `mongodb://localhost:27017` |
| `EXPOSE_DB` | Expose DB endpoints | `false` |
| `EXPOSE_APP` | Enable Web UI | `false` |