Version 1.0
A simple and secure backup script for Linux: compress → encrypt → upload to the cloud.
Uses rclone + GPG to create encrypted archives and send them to your cloud storage (OneDrive, Google Drive, S3, etc).
- Compress multiple folders/files into one archive
- Encrypt the archive with GPG (public key)
- Upload encrypted archive to any rclone remote
- Automatic cleanup of old backups (local + remote)
- Beginner-friendly and cron-ready
- Colorful logs and optional ASCII banner
- Linux with bash
- GPG (recipient public key required)
- rclone configured
- tar, zstd or pigz/gzip
➡️ Full list is in `requirements.txt`
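Before running anything, you can sanity-check that the required tools are on your PATH. A minimal sketch (the `missing_deps` helper is illustrative, not part of the script):

```shell
# Illustrative helper: print any commands from the argument list
# that are not found on PATH (empty output means everything is installed).
missing_deps() {
  local cmd
  for cmd in "$@"; do
    command -v "$cmd" >/dev/null 2>&1 || printf '%s\n' "$cmd"
  done
}

# Example: check the tools this backup flow relies on
missing_deps tar gpg rclone zstd
```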
git clone https://github.com/popek1990/rclone-gpg-cloud-backup.git
cd rclone-gpg-cloud-backup
Make the files executable. Before running the script, make sure it has execute permissions:
sudo chmod +x rclone-gpg-cloud-backup.sh
Install required packages
Install all system dependencies automatically from the included list:
sudo apt update
grep -v '^#' requirements.txt | xargs -r sudo apt install -y
- rclone latest version / official installer:
curl https://rclone.org/install.sh | sudo bash
cd rclone-gpg-cloud-backup && \
sudo nano .rclone.conf
After opening this file in your text editor (for example nano), set:
- `BACKUP_ITEMS`: which folders/files to back up
- `GPG_RECIPIENT_FPR`: your GPG public key fingerprint (see `gpg --list-keys`)
- `REMOTE_NAME`: rclone remote name (e.g. onedrive, gdrive, s3)
- `REMOTE_DIR`: base folder on cloud storage
Make sure your rclone remote is working:
rclone config
You can test access to your remote with (example for OneDrive):
rclone lsd onedrive:
Run a full compression and encryption test without sending files to the cloud:
./rclone-gpg-cloud-backup.sh --dry-run
If everything works correctly, you will see messages similar to:
✅ Dependencies OK.
✅ Encrypted: /path/to/archive.tar.zst.gpg
🚧 Dry-run: upload skipped
To create and upload an encrypted backup:
./rclone-gpg-cloud-backup.sh
This will:
- Compress all items listed in `BACKUP_ITEMS`
- Encrypt the archive with your GPG key
- Upload the `.gpg` file to your rclone cloud remote
- Clean up old backups automatically (based on retention)
To automate daily backups at 02:00, add this line to your crontab:
0 2 * * * /path/to/rclone-gpg-cloud-backup.sh >> /var/log/rclone-gpg-cloud-backup.log 2>&1
Check your cron logs to confirm it's running correctly.
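If you prefer not to open an interactive editor, you can append the entry to your existing crontab from the shell. A sketch (the `cron_line` helper is hypothetical; the paths are examples):

```shell
# Hypothetical helper: build a crontab entry for a daily run at HH:MM,
# redirecting both stdout and stderr into a log file.
cron_line() {
  local hour="$1" minute="$2" script="$3" log="$4"
  printf '%s %s * * * %s >> %s 2>&1\n' "$minute" "$hour" "$script" "$log"
}

# Append to the current crontab (uncomment to actually install):
# ( crontab -l 2>/dev/null; cron_line 2 0 /path/to/rclone-gpg-cloud-backup.sh /var/log/rclone-gpg-cloud-backup.log ) | crontab -
```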
- Collects all paths from `BACKUP_ITEMS`
- Creates a compressed archive (`.tar.zst` or `.tar.gz`)
- Encrypts it with your GPG public key → `.gpg`
- Uploads to: `REMOTE_NAME:REMOTE_DIR/LABEL/HOST_TAG/YYYY-MM-DD/`
- Deletes old archives (based on retention settings)
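The steps above can be sketched in shell. Everything here is illustrative: the function names are mine, not the script's, and the real script adds error handling and logging around each step.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Build the destination path: REMOTE_NAME:REMOTE_DIR/LABEL/HOST_TAG/YYYY-MM-DD/
build_remote_path() {
  printf '%s:%s/%s/%s/%s/\n' "$1" "$2" "$3" "$4" "$5"
}

# Compress -> encrypt -> upload (requires tar, zstd, gpg, rclone on PATH,
# and the config variables from .rclone.conf in the environment)
run_backup() {
  local archive="backup_$(date +%Y%m%d_%H%M%S).tar.zst"
  tar -cf - "${BACKUP_ITEMS[@]}" | zstd -q > "$archive"       # compress
  gpg --encrypt --recipient "$GPG_RECIPIENT_FPR" "$archive"   # produces "$archive.gpg"
  rclone copy "$archive.gpg" \
    "$(build_remote_path "$REMOTE_NAME" "$REMOTE_DIR" "$LABEL" "$HOST_TAG" "$(date +%F)")"
}
```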
Edit your local config file ./.rclone.conf with values similar to:
BACKUP_ITEMS=( "/etc" "$HOME/projects" )
BACKUP_ROOT="$HOME/cloud-backup"
LABEL="myserver"
HOST_TAG="$(hostname -s)"
COMPRESSION="zstd"
GPG_RECIPIENT_FPR="0123456789ABCDEF0123456789ABCDEF01234567"
GPG_IMPORT_KEY_FILE=""
REMOTE_NAME="onedrive"
REMOTE_DIR="Backups"
LOCAL_RETENTION_DAYS="30"
REMOTE_RETENTION_DAYS="45"
- Quick health check (deps + config + GPG + rclone):
./rclone-gpg-cloud-backup.sh --check
- Skip cleanup (keep all backups):
./rclone-gpg-cloud-backup.sh --no-retain
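Retention can be sketched as two passes: local archives are pruned by modification time, and remote ones via rclone's built-in `--min-age` filter. A hypothetical sketch (function names are mine; the script's actual cleanup logic may differ):

```shell
#!/usr/bin/env bash
set -euo pipefail

# rclone expresses ages as a number plus unit, e.g. "45d"
min_age() { printf '%sd\n' "$1"; }

# Delete local .gpg archives older than N days under a directory
cleanup_local() {
  find "$1" -name '*.gpg' -mtime +"$2" -delete
}

# Delete remote files older than N days (requires a configured rclone remote)
cleanup_remote() {
  rclone delete --min-age "$(min_age "$2")" "$1"
}
```

Usage would look like `cleanup_local "$BACKUP_ROOT" 30` and `cleanup_remote "onedrive:Backups" 45`, matching the retention values in the config above.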
- Logs are stored under:
$BACKUP_ROOT/YYYY-MM-DD/<label>_cloud_backup_<timestamp>.log
- Always verify your GPG key fingerprint before placing it in the config file.
- Import a public key manually if needed:
gpg --import /path/to/public_key.asc
- List available keys:
gpg --list-keys
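Note that `gpg --fingerprint` prints fingerprints in space-separated groups (e.g. `1234 5678 ...`), while the config value is typically written as one unbroken string. A small normalizing helper (hypothetical, not part of the script):

```shell
# Strip spaces and uppercase a fingerprint as copied from gpg output,
# so it can be pasted directly into GPG_RECIPIENT_FPR.
normalize_fpr() {
  printf '%s\n' "$1" | tr -d ' ' | tr '[:lower:]' '[:upper:]'
}
```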
If you find this project useful — star ⭐ it on GitHub and share it.
Pull requests with small improvements (e.g., better logging or new cloud examples) are welcome.
Version 1.0
- Initial release: compression, encryption, upload + retention
- Separate config file (./.rclone.conf) and cron support
- Beginner-friendly and portable