This is a LoopBack application that provides an API for the DataWorkbench data.
The application is currently "polymorphic" and can run in three different modes:
- "public": suitable for public exposure; several API endpoints are closed
- "private": used inside the cluster, with all endpoints available
- "datastore": used to run a regular synchronisation with the datastore
The startup mode is determined by environment variables; see below.
To run a local version without Docker, responsive to changes in the code:

```sh
npm install
nodemon server/server.js
```

This will use an in-memory database and local file storage in the test/tmp directory.
To run the tests:

```sh
npm test
```

To run a version connected to, for instance, a Google Kubernetes cluster, have a look at the configuration in server/datasources.cluster-example.json for how to override datasource settings, using local kubectl port-forwarding to a MongoDB database and a local service account file to access Google Cloud Storage.
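For illustration, a local override along those lines might look like the following. The datasource names, database name, project id, and file paths here are assumptions for the sketch, not values from this repo; adapt them to the cluster-example file:

```json
{
  "db": {
    "name": "db",
    "connector": "mongodb",
    "host": "localhost",
    "port": 27017,
    "database": "dataworkbench"
  },
  "storage": {
    "name": "storage",
    "connector": "loopback-component-storage",
    "provider": "google",
    "keyFilename": "./local/service-account.json",
    "projectId": "my-gcp-project"
  }
}
```

With kubectl port-forwarding active, the MongoDB connector then talks to localhost:27017 as if the database were local.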
To run with a different datasource configuration in server/datasources.sandbox.json:

```sh
NODE_ENV=sandbox nodemon server/server.js
```

To create a Docker image, run:

```sh
docker build --no-cache --build-arg BUILD_ENV=sandbox -t my-api:v1 .
```

The *.local.js and *.local.json configuration files will be excluded from the Docker image.
The local folder can contain, for instance, the Google account info; it is excluded from git.
API_TYPE - can be public or empty. If set to public, all routes are closed and only the routes listed in server/custom-config/api.__env__.js are opened.
RUN_JOBS - can be run or empty. Set this on one instance only to run the job tasks that retrieve files from the IATI Datastore.
DATASTORE_CRONSCHEDULE - the desired cronjob schedule when running jobs.
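As a rough illustration of how these variables select the startup mode, the decision could be sketched like this (a hypothetical helper, not the actual server code; the precedence shown is an assumption based on the descriptions above):

```js
'use strict';
// Hypothetical sketch: derive the startup mode from the environment
// variables described in this README. The real server may combine them
// differently.
function startupMode(env) {
  if (env.RUN_JOBS === 'run') return 'datastore'; // runs the Datastore sync jobs
  if (env.API_TYPE === 'public') return 'public'; // only allow-listed routes open
  return 'private';                               // all endpoints available
}

module.exports = startupMode;
```

For example, an instance started with RUN_JOBS=run would act as the single job-running instance, regardless of its API settings.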
We added a specific layer for model-config.__env__.js. Every such file must use bootstrap(config) from server/custom-config/bootstrap.js.
Example:

```js
'use strict';
const modelConfig = require('./model-config.json');
const bootstrap = require('./custom-config/bootstrap');
module.exports = bootstrap(modelConfig);
```

The datastore job copies new files from the Datastore. It runs as a cronjob (default: refresh once per hour).
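For illustration, the bootstrap layer itself might look roughly like the sketch below. The allow-list parameter and the handling of the public flag are assumptions made for this example; the real server/custom-config/bootstrap.js may work differently:

```js
'use strict';
// Hypothetical sketch of a bootstrap(config) helper: in public mode,
// mark every model that is not on an allow-list as non-public, so only
// the intended routes stay open. Not the repo's actual implementation.
function bootstrap(config, allowList = [], apiType = process.env.API_TYPE) {
  if (apiType !== 'public') return config; // private/datastore: leave everything open
  const result = {};
  for (const [name, model] of Object.entries(config)) {
    if (name.startsWith('_') || allowList.includes(name)) {
      result[name] = model; // keep LoopBack metadata entries and allowed models as-is
    } else {
      result[name] = Object.assign({}, model, { public: false }); // close the rest
    }
  }
  return result;
}

module.exports = bootstrap;
```

Passing the config through a single helper like this keeps the public/private difference in one place instead of maintaining two diverging model-config files.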