MSStats is a tool for extracting MemoryStore database metrics. The script processes all Redis databases, both single-instance and replicated (Basic or Standard tier), that belong to a specific service account. Multiple service accounts can be used at once.
The script uses only the Google Cloud Monitoring API to collect metrics. It never connects to the Redis databases and does NOT send any commands to them, so it cannot affect the performance of, or the data stored in, the databases it scans.
The script runs on any system with Python 3.9 or later installed. If you receive dependency errors, try a more recent version of Python; the Python ecosystem changes frequently, and some changes are outside our control.
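Before creating the virtual environment, you can confirm the interpreter on your path is recent enough. A minimal check using only the standard library:

```python
import sys

# msstats requires Python 3.9 or newer
ok = sys.version_info >= (3, 9)
print(f"Python {sys.version.split()[0]} {'meets' if ok else 'does NOT meet'} the 3.9+ requirement")
```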
Download the repository
git clone https://github.com/Redislabs-Solution-Architects/msstats && cd msstats
Prepare and activate the virtual environment.
Ensure the right version of Python is on your path. On a Mac the command may be python3, since the python command can point to the system-installed Python 2.7.
python -m venv .env && source .env/bin/activate
Install necessary libraries and dependencies
pip install -r requirements.txt
Copy your service account .json files in the root directory of the project:
(Skip this step if you already use gcloud on this machine and want to use its user-account credentials.)
cp path/to/service_account.json .
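If you are unsure which service account a downloaded key file belongs to, the client_email field inside the JSON identifies it. A quick sketch (the key contents below are hypothetical placeholders, not a real key):

```python
import json

# A service-account key is plain JSON; client_email identifies the account
# the metrics queries will run as. These values are placeholders.
sample_key = """{
  "type": "service_account",
  "project_id": "my-project",
  "client_email": "[email protected]"
}"""

key = json.loads(sample_key)
print(key["client_email"])
```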
This script only retrieves information and metrics for Redis standalone instances. It does not capture information about Valkey or Redis Cluster instances. If you have Valkey or Redis Cluster instances, run the memorystore.py script instead.
python msstats.py
This generates an .xlsx file. Retrieve that file and send it to Redis.
This script retrieves information and metrics for Redis, Valkey and Redis Cluster instances. It leverages logic from the msstats.py script to consolidate and package the metrics.
# To use the copied service-account:
python memorystore.py --project YOUR_PROJECT_ID --credentials /path/to/sa.json --out /path/to/out.csv
This generates a CSV file. Retrieve that file and send it to Redis. By default, the script uses a step of 60 seconds and a period of 7 days (604800 seconds). You can set different values as follows:
python memorystore.py --project YOUR_PROJECT_ID --credentials /path/to/sa.json --out /path/to/out.csv --duration 1800 --step 300
This can help resolve errors like:
google.api_core.exceptions.ResourceExhausted: 429 Maximum response size of 200000000 bytes reached.
Consider querying less data by increasing the step or interval, using more filters and aggregations, or limiting the time duration.
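The response size grows with the number of samples per time series, which is roughly the duration divided by the step. A back-of-the-envelope comparison of the defaults against the reduced settings in the example above shows why a larger step shrinks the response:

```python
# Default: 7 days of data at 60-second resolution
default_samples = 604_800 // 60   # samples per time series

# Reduced: 30 minutes of data at 300-second resolution
reduced_samples = 1_800 // 300    # samples per time series

print(default_samples, reduced_samples)
```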
Follow the installation process described in the previous section, then apply the following steps before executing the script.
Grant the monitoring.viewer role to the service account in all associated Google Cloud projects:
./grant_sa_monitoring_viewer.sh <service_account>
For example,
./grant_sa_monitoring_viewer.sh [email protected]
Edit the batch_run_msstats.sh file to set the right path to the credentials file.
To collect metrics and information for Redis standalone instances, execute:
./batch_run_msstats.sh
To collect metrics and information for Redis, Valkey and Redis Cluster instances, execute:
./batch_run_msstats.sh
Remove the monitoring.viewer role from the service account in all associated Google Cloud projects:
./remove_sa_monitoring_viewer.sh <service_account>
For example,
./remove_sa_monitoring_viewer.sh [email protected]
Run all tests (unit and integration):
pytest test_msstats.py
Format code:
black *.py
When finished, do not forget to deactivate the virtual environment:
deactivate