Kepler is a vulnerability database, lookup store, and API that currently uses the National Vulnerability Database (NVD) as a data source, implementing CPE 2.3 tree expressions and version range evaluation in real time.
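For illustration, a CPE 2.3 URI pinning a single product version looks like this (a hypothetical example using the libxml2 product from the search example below; the vendor field is an assumption):
cpe:2.3:a:xmlsoft:libxml2:2.9.10:*:*:*:*:*:*:*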
When setting up the Kepler project locally, you can choose either the Docker or Podman container runtime.
Docker (recommended)
We provide a Docker bundle with Kepler, a dedicated PostgreSQL database, and Ofelia as a job scheduler for continuous updates:
export CONTAINER_SOCKET=/var/run/docker.sock
docker compose build
docker-compose up
Podman (optional)
export CONTAINER_SOCKET=/run/user/1000/podman/podman.sock
podman compose build
podman-compose up
Or just use an alias (if you're using podman):
alias docker=podman
The /data directory serves as the source directory for downloading and extracting CVE JSON files and importing data into the Kepler DB. When building the Kepler image with docker-compose.yaml, the local /data directory is bound to the container:
volumes:
- ./data:/data:Z
The system supports two scenarios:
- Pre-populated /data: contains .gz files for faster development setup; data is extracted and imported directly
- Empty /data: triggers an automatic download of the NIST sources before extraction and import (takes longer, as recent years contain large files)
This flexibility allows for reduced initial image size in deployed environments, where sources are updated frequently and downloaded as needed.
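A quick way to tell which scenario applies before starting (a minimal sketch; the ./data path matches the volume binding above):
ls -lh ./data/*.json.gz 2>/dev/null || echo "data/ is empty - NIST sources will be downloaded on import"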
# Remove previous volumes
docker-compose down -v
# Re-build a new image
docker compose build
# Spin up a new kepler + kepler_db cluster
docker-compose up
# Run the import task
for year in $(seq 2002 2025); do
docker exec -it kepler kepler import_nist $year -d /data
done
Note:
- Ensure you have removed old /data contents and only have the v2.0 .gz NIST files
- Kepler doesn't automatically populate the database from .gz files until you explicitly run the import_nist command
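You can sanity-check the directory first (the filename pattern matches the logs below):
ls /data/*.json.gz
A pre-populated import run then produces output like the following: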
[2025-09-15T09:17:46Z INFO domain_db::cve_sources::nist] reading /data/nvdcve-2.0-2025.json ...
[2025-09-15T09:17:46Z INFO domain_db::cve_sources::nist] loaded 11536 CVEs in 351.54686ms
[2025-09-15T09:17:46Z INFO kepler] connected to database, importing records ...
[2025-09-15T09:17:46Z INFO kepler] configured 'KEPLER__BATCH_SIZE' 5000
[2025-09-15T09:17:46Z INFO kepler] 11536 CVEs pending import
[2025-09-15T09:17:47Z INFO domain_db::db] batch imported 5000 object records ...
[2025-09-15T09:17:47Z INFO domain_db::db] batch imported 5000 object records ...
[2025-09-15T09:17:47Z INFO domain_db::db] batch imported 1536 object records ...
[2025-09-15T09:17:48Z INFO kepler] batch imported 5000 cves ...
[2025-09-15T09:17:48Z INFO kepler] batch imported 10000 cves ...
[2025-09-15T09:17:48Z INFO kepler] batch imported 15000 cves ...
[2025-09-15T09:17:48Z INFO kepler] batch imported 20000 cves ...
[2025-09-15T09:17:48Z INFO kepler] batch imported 25000 cves ...
[2025-09-15T09:17:48Z INFO kepler] batch imported 30000 cves ...
[2025-09-15T09:17:49Z INFO kepler] batch imported 35000 cves ...
[2025-09-15T09:17:49Z INFO kepler] imported 37592 records Total
[2025-09-15T09:17:49Z INFO kepler] 37592 new records created
Steps:
# 1. Delete all `.gz` files from `/data`
# 2. Destroy the existing volume where we bound populated `/data`.
docker-compose down -v
# 3. Build a new image with an empty `/data` mount.
docker compose build
# 4. Re-trigger import (this time Kepler will download all year `.gz` files first, then proceed with `.json` extraction and database import)
for year in $(seq 2002 2025); do
docker exec -it kepler kepler import_nist $year -d /data
done
Notice: compared to a normal import with pre-populated /data, an extra download step appears here.
for year in $(seq 2002 2025); do podman exec -it kepler kepler import_nist $year -d /data; done
[2025-09-15T09:20:59Z INFO domain_db::cve_sources] downloading https://nvd.nist.gov/feeds/json/cve/2.0/nvdcve-2.0-2002.json.gz to /data/nvdcve-2.0-2002.json.gz ...
[2025-09-15T09:21:00Z INFO domain_db::cve_sources::nist] extracting /data/nvdcve-2.0-2002.json.gz to /data/nvdcve-2.0-2002.json ...
[2025-09-15T09:21:00Z INFO domain_db::cve_sources::nist] reading /data/nvdcve-2.0-2002.json ...
[2025-09-15T09:21:00Z INFO domain_db::cve_sources::nist] loaded 6546 CVEs in 92.942702ms
[2025-09-15T09:21:00Z INFO kepler] connected to database, importing records ...
[2025-09-15T09:21:00Z INFO kepler] configured 'KEPLER__BATCH_SIZE' 5000
[2025-09-15T09:21:00Z INFO kepler] 6546 CVEs pending import
[2025-09-15T09:21:01Z INFO domain_db::db] batch imported 5000 object records ...
[2025-09-15T09:21:01Z INFO domain_db::db] batch imported 1546 object records ...
[2025-09-15T09:21:01Z INFO kepler] batch imported 5000 cves ...
[2025-09-15T09:21:01Z INFO kepler] imported 9159 records Total
[2025-09-15T09:21:01Z INFO kepler] 9159 new records created
[2025-09-15T09:21:01Z INFO domain_db::cve_sources] downloading https://nvd.nist.gov/feeds/json/cve/2.0/nvdcve-2.0-2003.json.gz to /data/nvdcve-2.0-2003.json.gz ...
[2025-09-15T09:21:02Z INFO domain_db::cve_sources::nist] extracting /data/nvdcve-2.0-2003.json.gz to /data/nvdcve-2.0-2003.json ...
Kepler automatically prevents duplicate data imports through database constraints:
- Object table: a unique constraint on the cve field prevents duplicate objects
- CVEs table: a composite unique constraint on (cve, vendor, product) prevents duplicate vulnerability entries
This ensures data integrity and prevents redundant imports when running import commands multiple times.
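To verify these constraints on a running bundle, you can inspect the tables from the kepler_db container (a sketch; the user, database, and table names are assumptions and may differ in your compose file):
docker exec -it kepler_db psql -U kepler -d kepler -c '\d objects'
docker exec -it kepler_db psql -U kepler -d kepler -c '\d cves'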
Database constraints source code: see the Diesel migrations in the Kepler repository.
When the application starts with the --migrate option, it automatically checks for and applies any pending database migrations. To prevent automatic migration and instead stop when a pending migration is detected, remove the --migrate option.
When using our Docker bundle, the system automatically fetches and imports new vulnerability records every 3 hours. Historical data must be imported manually using the commands below.
Kepler currently supports two data sources: the National Vulnerability Database and NPM Advisories. Historical data can be imported using the following methods:
To import NIST records from all available years (2002 to 2025):
for year in $(seq 2002 2025); do
docker exec -it kepler kepler import_nist $year -d /data
done
- The system automatically fetches and imports new records every 3 hours using a scheduled Ofelia job
- Use the --refresh argument to force re-downloading from the National Vulnerability Database (NVD) source
Example - Refresh data for 2025
docker exec -it kepler kepler import_nist 2025 -d /data --refresh
Example - Custom batch size via -e KEPLER__BATCH_SIZE
docker exec -it -e KEPLER__BATCH_SIZE=4500 kepler kepler import_nist 2025 -d /data --refresh
NOTE: Postgres supports at most 65535 bind parameters per statement, so be aware of this limit when changing the default KEPLER__BATCH_SIZE=5000 (see the Postgres limits documentation).
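For example, if each inserted row binds 13 parameters (an assumed column count for illustration), a batch of 5000 rows binds 65000 parameters, just under the 65535 limit; a larger batch or a wider table would exceed it.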
There are two primary APIs right now: the product API and the CVE API, detailed below.
Products can be listed:
curl http://localhost:8000/products
Grouped by vendor:
curl http://localhost:8000/products/by_vendor
Or searched:
curl http://localhost:8000/products/search/iphone
To use the vulnerabilities search API via cURL (prepend node- to the product name in order to search for NPM-specific packages):
curl \
--header "Content-Type: application/json" \
--request POST \
--data '{"product":"libxml2","version":"2.9.10"}' \
http://localhost:8000/cve/search
Responses are cached in memory with an LRU limit of 4096 elements.
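For example, to search the NPM advisories for the lodash package (package and version chosen purely for illustration):
curl \
--header "Content-Type: application/json" \
--request POST \
--data '{"product":"node-lodash","version":"4.17.20"}' \
http://localhost:8000/cve/search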
If you're interested in adding new migrations, you should check out and install diesel-cli.
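diesel-cli can typically be installed through cargo with only the Postgres feature enabled; it reads the connection string from the DATABASE_URL environment variable (the credentials below are illustrative):
cargo install diesel_cli --no-default-features --features postgres
export DATABASE_URL=postgres://kepler:kepler@localhost:5432/kepler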
After you have diesel-cli installed, you can run:
diesel migration generate <name_your_migration>
This will generate up.sql and down.sql files, which you can then apply with:
diesel migration run
- Or by restarting your Kepler container (this automatically triggers migrations)
Alternatively, you can build Kepler from source. You'll need rust, cargo, and libpq-dev (or the equivalent PostgreSQL client library package for your Linux distribution):
cargo build --release
If you get a "linking with cc" error similar to the one below, you're likely missing some C-related tooling or libraries.
error: linking with `cc` failed: exit status: 1
//...
= note: /usr/bin/ld: cannot find -lpq: No such file or directory
collect2: error: ld returned 1 exit status
This error requires installing PostgreSQL-related C libraries:
Fedora:
sudo dnf install postgresql-devel
Arch:
sudo pacman -S postgresql-libs
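Debian/Ubuntu (installs the libpq-dev package mentioned above; verify the package name for your release):
sudo apt install libpq-dev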