Backup & restore your dotfiles, packages and tooling across multiple machines — using GitHub Releases.
A lightweight, practical tool to replicate your working environment on several computers quickly and safely.
This project creates an archive of your selected config files, package lists (APT / Flatpak / Snap / pip / npm / go) and system metadata, optionally encrypts it with GPG, and uploads it as a GitHub Release.
Restore is interactive and safe — it takes a snapshot before changing files and offers rollback.
Perfect if you:
- maintain multiple developer workstations (e.g., three machines),
- want a one-command sync of dotfiles + packages,
- need simple, auditable backups stored in your GitHub account.
- ✅ Create a compressed backup archive (`backup_YYYY-MM-DD_HHMMSS.tar.gz`)
- 🔒 Optional GPG encryption of archives
- 🧾 Save package lists for apt, flatpak, snap, pip, npm, go
- 🕒 Systemd timer-friendly (e.g., scheduled weekly)
- ♻️ Cleanup of old releases (keeps the latest N)
- 🧰 Restore that:
  - makes a snapshot before modifying files
  - validates backup contents
  - offers rollback (`--rollback`)
  - runs post-restore health checks for critical tools
- 🧪 `--dry-run` support for safe testing
- 📝 Colorful, readable logs for each step
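The package-list step can be sketched roughly as follows. This is an illustration, not the script's actual internals: the function name, output file names, and exact commands are assumptions.

```bash
# Illustrative sketch: dump installed-package lists for each supported
# manager, quietly skipping any manager that is not installed.
save_package_lists() {
  out_dir="$1"
  mkdir -p "$out_dir"
  if command -v apt >/dev/null 2>&1; then
    apt list --installed 2>/dev/null > "$out_dir/apt_packages.txt"
  fi
  if command -v flatpak >/dev/null 2>&1; then
    flatpak list --app --columns=application > "$out_dir/flatpak_packages.txt"
  fi
  if command -v snap >/dev/null 2>&1; then
    snap list > "$out_dir/snap_packages.txt"
  fi
  if command -v pip >/dev/null 2>&1; then
    pip freeze > "$out_dir/pip_packages.txt"
  fi
  if command -v npm >/dev/null 2>&1; then
    npm ls -g --depth=0 --parseable > "$out_dir/npm_packages.txt"
  fi
  if command -v go >/dev/null 2>&1; then
    ls "$(go env GOPATH)/bin" > "$out_dir/go_tools.txt" 2>/dev/null
  fi
  return 0
}
```

On restore, each list can be replayed with the matching install command (e.g. `xargs -a pip_packages.txt pip install`).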
Prereqs (typical):
`bash`, `gh` (GitHub CLI), `tar`, `gzip`, `gpg` (optional), `sudo` (for some restore steps), `pv` (optional, for progress display).
Install any missing tools or accept the warnings; the script logs missing commands but continues where possible.
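A minimal check along these lines (the variable and function names are hypothetical) reproduces that warn-and-continue behaviour:

```bash
# Hypothetical prerequisite check: warn about missing tools but keep going.
required_tools=(bash gh tar gzip)
optional_tools=(gpg sudo pv)

check_tools() {
  missing=0
  for t in "${required_tools[@]}"; do
    command -v "$t" >/dev/null 2>&1 \
      || { echo "WARN: required tool missing: $t" >&2; missing=1; }
  done
  for t in "${optional_tools[@]}"; do
    command -v "$t" >/dev/null 2>&1 \
      || echo "INFO: optional tool missing: $t" >&2
  done
  return "$missing"
}

# check_tools   # returns non-zero if any required tool is missing
```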
- Clone the repo:

```bash
git clone https://github.com/Xzar-x/github-release-dotfiles-backup.git
cd github-release-dotfiles-backup
```

- Create and edit the config:

```bash
# edit backup_restore.config to point to your GH repo, backup paths and GPG recipient (optional)
```

- Dry-run to verify what would happen:

```bash
./backup-cloud.sh --dry-run
```

- Run a real backup:

```bash
./backup-cloud.sh
```

- Restore from an existing backup:

```bash
./restore-cloud.sh
# follow interactive menu to select release & sections to restore
```

- Rollback the last restore (if needed):

```bash
./restore-cloud.sh --rollback
```

Place a real config in the script directory (`backup_restore.config`):
```bash
# backup_restore.config (example)
GH_REPO="youruser/your-private-backup-repo"
GPG_RECIPIENT_EMAIL="you@example.com"  # optional: leave empty to disable encryption
LOG_FILE="$HOME/.backup_logs/backup.log"
# paths to back up (absolute or relative)
BACKUP_PATHS=( "$HOME/.zshrc" "$HOME/.config" "$HOME/.ssh" )
KEEP_LATEST_RELEASES=5
# optional: array of go tools for restore if .log_go_packages.txt missing
GO_TOOLS_TO_BACKUP=( "github.com/golangci/golangci-lint/cmd/golangci-lint@latest" )
```

Important: Do not commit your real `backup_restore.config` with secrets or tokens. Add it to `.gitignore` if needed.
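For illustration, here is one way such a config could be sourced and sanity-checked before a run. This is a sketch; `load_config` is an assumed name, not necessarily how the script does it:

```bash
# Source the config file and validate the settings the backup depends on.
load_config() {
  cfg="$1"
  [ -f "$cfg" ] || { echo "ERROR: config not found: $cfg" >&2; return 1; }
  # shellcheck source=/dev/null
  . "$cfg"
  [ -n "${GH_REPO:-}" ] \
    || { echo "ERROR: GH_REPO must be set" >&2; return 1; }
  [ "${#BACKUP_PATHS[@]}" -gt 0 ] \
    || { echo "ERROR: BACKUP_PATHS is empty" >&2; return 1; }
}
```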
a) Repeat every 48 hours (independent of calendar dates)
`/etc/systemd/system/backup.service`:

```ini
[Unit]
Description=Run dotfiles backup (oneshot)

[Service]
Type=oneshot
WorkingDirectory=/path/to/github-release-dotfiles-backup
ExecStart=/path/to/github-release-dotfiles-backup/backup-cloud.sh
```

`/etc/systemd/system/backup.timer`:

```ini
[Unit]
Description=Run dotfiles backup every 48 hours

[Timer]
OnBootSec=10min
OnUnitActiveSec=2d
Persistent=true

[Install]
WantedBy=timers.target
```

b) Calendar-based: every other day (even days)
If you prefer calendar parity:

```ini
[Timer]
OnCalendar=*-*-02/2
Persistent=true
```

Note: `*-*-02/2` triggers on even calendar days (2, 4, 6, ...). Because month lengths vary, this is not an exact 48-hour cadence; use `OnUnitActiveSec=2d` for exact 48 h gaps.
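Before enabling either variant, you can preview how systemd parses a calendar expression and when it would next fire with `systemd-analyze`:

```bash
# Show the normalized form and the next elapse times of a calendar expression
systemd-analyze calendar "*-*-02/2"
# Compare with the shorthand used by the daily cleanup timer
systemd-analyze calendar "daily"
```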
Enable:

```bash
sudo systemctl daemon-reload
sudo systemctl enable --now backup.timer
systemctl list-timers | grep backup
```

`/etc/systemd/system/backup-clean.service`:
```ini
[Unit]
Description=Cleanup old pre_restored files

[Service]
Type=oneshot
ExecStart=/usr/bin/find /home/youruser -name "*.pre_restored*" -type f -mtime +14 -delete
```

`/etc/systemd/system/backup-clean.timer`:

```ini
[Unit]
Description=Run backup-clean once a day

[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target
```

Enable it:
```bash
sudo systemctl daemon-reload
sudo systemctl enable --now backup-clean.timer
```

Backup (`backup-cloud.sh`) does:

```
backup-cloud.sh
├─ prepare BACKUP_DIR
├─ copy selected files → BACKUP_DIR
├─ save system metadata & package lists
├─ tar + gzip (pipe via pv if available)
├─ optional: encrypt with gpg → .gpg file
└─ upload to GitHub Releases using `gh`
   └─ cleanup_old_releases() keeps the latest N releases
```
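The pipeline above can be sketched in a few lines of bash. This is a simplified illustration using the example config's variable names; the real script adds logging, validation and error handling:

```bash
# Sketch: archive BACKUP_DIR, optionally encrypt, then publish as a release.
create_and_upload() {
  local stamp archive tag
  stamp="$(date +%Y-%m-%d_%H%M%S)"
  archive="backup_${stamp}.tar.gz"
  tag="backup-${stamp}"

  # tar + gzip; pipe through pv for a progress bar when it is available
  if command -v pv >/dev/null 2>&1; then
    tar -cf - -C "$BACKUP_DIR" . | pv | gzip > "$archive"
  else
    tar -czf "$archive" -C "$BACKUP_DIR" .
  fi

  # optional GPG encryption for the configured recipient
  if [ -n "${GPG_RECIPIENT_EMAIL:-}" ]; then
    gpg --encrypt --recipient "$GPG_RECIPIENT_EMAIL" \
        --output "${archive}.gpg" "$archive"
    archive="${archive}.gpg"
  fi

  # publish the archive as a GitHub Release
  gh release create "$tag" "$archive" --repo "$GH_REPO" --title "Backup $stamp"
}
```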
Restore (`restore-cloud.sh`) does:
- list releases (gh)
- download selected archive
- decrypt (if needed)
- unpack to temp folder
- validate content
- create a snapshot of existing configs
- interactively restore sections (config files, apt, debs, flatpak, snap, pip, npm, go, git repos)
- post-restore health check
- show restore summary
- offer rollback via stored snapshot
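The snapshot step can be illustrated like this (a sketch with GNU `cp`; the actual snapshot location and naming are internal to the script):

```bash
# Copy files that are about to be overwritten into a timestamped snapshot
# directory, so a later --rollback can restore them.
snapshot_before_restore() {
  local snap_dir="$HOME/.backup_snapshots/$(date +%Y-%m-%d_%H%M%S)"
  mkdir -p "$snap_dir"
  for path in "$@"; do
    # --parents recreates the full directory layout inside the snapshot
    [ -e "$path" ] && cp -a --parents "$path" "$snap_dir/"
  done
  echo "$snap_dir"
}
```

Calling `snapshot_before_restore "$HOME/.zshrc" "$HOME/.config"` would print the snapshot directory, which rollback can replay in reverse.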
- Make the GitHub repo private. The script checks that the repo is private before uploading.
- Use a minimal-purpose GitHub token for the `gh` CLI (scopes needed: `repo` for releases; choose the least privilege necessary).
- Keep `backup_restore.config` private; do not push it to a public repo.
- If using GPG encryption, ensure the recipient's public key is present on the machine that creates backups; the private key is needed only for decrypting restores on machines you trust.
- Validate `gh` authentication (`gh auth status`) before running the real backup.
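The privacy check mentioned above could look roughly like this (`is_repo_private` is an illustrative name; assumes a `gh` version whose `repo view` supports `--json`):

```bash
# Return success only when the target repo's visibility is "private".
is_repo_private() {
  local visibility
  visibility="$(gh repo view "$GH_REPO" --json visibility -q .visibility)" \
    || return 1
  [ "$(echo "$visibility" | tr '[:upper:]' '[:lower:]')" = "private" ]
}

# Example gate before uploading:
#   is_repo_private || { echo "Refusing to upload: $GH_REPO is not private" >&2; exit 1; }
```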
- Always run `./backup-cloud.sh --dry-run` to check operations without making changes.
- Check the logs (if `LOG_FILE` is set in the config) or run the script interactively to see the colored output.
- If an upload fails: check `gh` auth and repo visibility.
- If a restore fails: use `./restore-cloud.sh --rollback` to revert to the last snapshot.
- `KEEP_LATEST_RELEASES` controls how many releases you keep; older ones are deleted automatically by `cleanup_old_releases()`.
- Add a `backup-clean` timer (example above) to automatically remove `.pre_restored*` files older than N days and avoid clutter.
- If you want a backup frequency other than weekly, change the systemd timer as described earlier.
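One plausible shape for `cleanup_old_releases()` is shown below. This is a sketch, not the script's actual code; it assumes a `gh` version where `release list` supports `--json`:

```bash
# Keep the newest KEEP_LATEST_RELEASES releases, delete everything older.
cleanup_old_releases() {
  local keep="${KEEP_LATEST_RELEASES:-5}"
  # gh lists releases newest first; skip the first $keep tags
  gh release list --repo "$GH_REPO" --limit 100 --json tagName -q '.[].tagName' \
    | tail -n +"$((keep + 1))" \
    | while read -r tag; do
        gh release delete "$tag" --repo "$GH_REPO" --yes --cleanup-tag
      done
}
```

`--cleanup-tag` also removes the underlying git tag, so deleted backups do not leave dangling tags in the repo.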
This project is MIT-licensed — feel free to adapt for personal or internal use.
