Can you please share your backup strategies for Linux? I'm curious to know what tools you use and why. How do you automate/schedule backups? Which files/folders do you back up? What is your preferred hardware/cloud storage, and how do you manage storage space?
One reason for moving to Nix was declarative config, so at least that part of my system is a series of Nix files to build into a working setup.
…The rest… let’s just say “needs improvement” & I would like to set up a NAS.
- Work in a cloud-synced folder by default.
That's all my steps 🦥
Same, but with rsync.
I’m just not competent.😆
Timeshift for configs to a locally attached drive. Home partition to cloud with rsync.
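Roughly, that combo looks like this (the remote host and paths here are placeholders):

```bash
# Timeshift snapshot of system configs to the locally attached drive
sudo timeshift --create --comments "weekly config snapshot"
# Home partition to cloud storage over SSH
rsync -aAX --delete ~/ user@cloud.example.com:backups/home/
```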
Firstly, for my dotfiles, I use home-manager. I keep the config on my git server and in theory I can pull it down and set up a system the way I like it.
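Roughly, the pull-down flow looks like this (the repo URL and flake output name are placeholders):

```bash
git clone https://git.example.com/me/home-config ~/home-config
cd ~/home-config
home-manager switch --flake .#myuser   # applies the declared dotfiles and packages
```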
In terms of backups, I use Pika to back up my home directory to my hard disk every day, so I can, in theory, pull back files I delete.
I also push a core selection of my files to my server using Pika, just in case my house burns down. Likewise, I pull backups from my server to my desktop (again with Pika) in case Linode starts messing me about.
I also have a 2 TiB SSD I keep in a strongbox, and some cloud storage that I push bigger things to sporadically.
I also take occasional data exports from online services I use. Because hey, Google or Discord can ban you at any time for no reason. :P
etckeeper, and borg/vorta for /home
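The etckeeper half is only a couple of commands to set up (assuming a Debian-style system; adjust the install step for your distro):

```bash
sudo apt install etckeeper            # package name on Debian/Ubuntu
sudo etckeeper init                   # puts /etc under version control
sudo etckeeper commit "baseline /etc"
```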
I try to be good about everything being installed as packages, even if I'm the one that made the package. That means I only have to worry about backing up my local package archive. But I've never actually recreated a personal system from a backup, and I usually end up starting from a fresh install, slowly adding back things from the backup if I missed them. This tends to cut down on cruft and no-longer-needed hacks and fixes. It also makes for a good way to be exposed to new paradigms (desktop environments, shells, etc.).
Something that helps is daily notes: one file for any day I'm working on my system and want to remember what a custom file, config edit, or downloaded/created package does and why. These get saved separately, and I try to remember to grep them before asking the internet.
I see the benefit of snapshots, but disk space is expensive, and I'm (usually) careful (enough) not to lock myself out or prevent boots. Anything catastrophic I have to fix is usually seen as a fun, stressful learning experience! That rarely happens anymore, for better or for worse.
I use Borg Backup, automated with a bash script that Borg provides. A cron job runs the script at the desired frequency. I keep backups on different computers; ideally, I'd recommend one copy in the cloud and one copy on a local machine. Borg compresses and encrypts its backups.
Edit: I migrated a server once using the backups from this system and it worked great.
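The core of the script boils down to something like this (the repo location and retention numbers are illustrative, not the exact script Borg ships):

```bash
export BORG_REPO=ssh://backup-host/./borg-repo   # placeholder repo
export BORG_PASSCOMMAND='cat ~/.borg-passphrase' # keeps the key out of the script

# Create a compressed, encrypted archive of home, then thin out old ones
borg create --compression zstd ::'{hostname}-{now}' ~/
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6
borg compact
```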
I should really cron my Borg script rather than waiting for a sinking anxiety to set in and doing backups at random intervals.
Make sure to check that it actually ran from the cron job; cron is a finicky tool.
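One low-effort way to check: log the output and ping a monitoring URL only on success (the healthcheck UUID below is a placeholder):

```
0 3 * * * /usr/local/bin/borg-backup.sh >>/var/log/borg-backup.log 2>&1 && curl -fsS https://hc-ping.com/your-uuid-here >/dev/null
```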
Keep everything on Nextcloud and back that up via Proxmox Backup Server.
Nuke and pave, reconfiguring Plasma and installing the NC client, takes me less time than bothering to back anything up directly.
I use syncthing to sync almost everything across my computer, laptop (occasional usage), server (RAID1), old laptop (powered up once every month or so), and a few other devices (that only get a small subset of my data, though). On the computer, laptop, and server, I have btrfs snapshots (snapper). Overall, this works very well, I always have 4+ copies of my data in 2+ geographical locations.
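The snapper side is only a couple of commands to get going (the config name is arbitrary):

```bash
sudo snapper -c home create-config /home                  # enable snapshots for /home
sudo snapper -c home create --description "checkpoint"    # manual snapshot
snapper -c home list                                      # see what you can roll back to
```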
Software & Services:
- Restic client
- Restic REST server
- https://github.com/rbuchberger/res-man + systemd timers or cron to configure & run restic nightly
- healthchecks.io for monitoring
- ntfy.sh for notifications
Destinations:
- Local raspberry pi with external hdd, running restic REST server
- RAID 1 NAS at parents’ house, connected via tailscale, also running restic REST
I’ve been meaning to set up a drive rotation for the local backup so I always have one offline in case of ransomware, but I haven’t gotten to it.
Edit: For the backup set I back up pretty much everything. I’m not paying per gig, though.
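For context, the restic side is roughly this (host name, repo path, and retention are illustrative):

```bash
export RESTIC_REPOSITORY=rest:http://raspberrypi.local:8000/desktop
export RESTIC_PASSWORD_FILE=~/.config/restic/password

restic init                                        # once per repository
restic backup /home/me --exclude ~/.cache          # the nightly job
restic forget --keep-daily 7 --keep-weekly 5 --prune
```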
Here’s one that probably nobody else here is doing. The backup goes on my mobile device. Yes, the thing in my pocket.
- Mount it over SSHFS on the local network
- Unlock a LUKS container in the form of a 30GB sparse file on the device
- rsync the files across
- Lock, unmount
The backup is incremental but the container file never changes size, no matter what’s in it. Your data is in two places and always under your physical control. But the key is never stored on the remote device, so you could also do this with a VPS.
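To make the cycle concrete, a sketch (mount points, host alias, and file names are placeholders; creating the 30GB sparse container is a one-time `truncate -s 30G` plus `cryptsetup luksFormat`):

```bash
sshfs phone:/storage/backup /mnt/phone                    # mount the device
sudo cryptsetup open /mnt/phone/vault.img vault --key-file ~/.vault.key
sudo mount /dev/mapper/vault /mnt/vault
rsync -a --delete ~/important/ /mnt/vault/                # incremental sync
sudo umount /mnt/vault
sudo cryptsetup close vault
fusermount -u /mnt/phone
```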
Highly recommended.
Where is the key stored?
Locally.
If your local machine dies, and you have a backup on your phone which you cannot unlock… aren’t you screwed?
Good question. No, but at a small cost in security. I generated the key with `sha512sum` from a very solid memorized passphrase. This means I can regenerate the key in the scenario you describe.
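Concretely, something like this, where the key file is deterministic and can be rebuilt anywhere (the passphrase is a placeholder, and `sha512sum` is not a purpose-built KDF like argon2, which is the security cost I mentioned):

```bash
# Regenerate the LUKS key file from the memorized passphrase
printf '%s' 'my very solid memorized passphrase' | sha512sum | cut -d' ' -f1 > /tmp/luks.key
sudo cryptsetup open /mnt/phone/vault.img vault --key-file /tmp/luks.key
shred -u /tmp/luks.key   # don't leave the key lying around
```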
All important files go in `/data`. `/data` is ZFS, snapped and sent to the NAS regularly.

Every time I change a setting, it gets added to a `dconf` script. Every time I install software, I write a script. Dotfiles git repo for home directory.
With that, I can spin up a fresh machine in minutes with scripts.
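For illustration, the two moving parts look roughly like this (pool, dataset, and host names are placeholders):

```bash
# Snap /data and send it to the NAS incrementally
SNAP="tank/data@$(date +%F)"
zfs snapshot "$SNAP"
zfs send -i tank/data@last "$SNAP" | ssh nas zfs receive -u backup/data

# Capture a settings change as a replayable dconf script
dconf dump /org/gnome/desktop/interface/ > interface.dconf
dconf load /org/gnome/desktop/interface/ < interface.dconf
```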
Example of a Bash script that performs the following tasks:
- Checks the availability of an important web server.
- Checks disk space usage.
- Makes a backup of the specified directories.
- Sends a report to the administrator’s email.
Example script:
```bash
#!/bin/bash
# Settings
WEB_SERVER="https://example.com"
BACKUP_DIR="/backup"
TARGET_DIRS="/var/www /etc"
DISK_USAGE_THRESHOLD=90
ADMIN_EMAIL="admin@example.com"
DATE=$(date +"%Y-%m-%d")
BACKUP_FILE="$BACKUP_DIR/backup-$DATE.tar.gz"

# Checking web server availability (-f makes curl fail on HTTP errors,
# which also works for HTTP/2 responses that never literally say "200 OK")
echo "Checking web server availability..."
if curl -sSf --head "$WEB_SERVER" > /dev/null; then
    echo "Web server is available."
else
    echo "Warning: Web server is unavailable!" | mail -s "Problem with web server" "$ADMIN_EMAIL"
fi

# Checking disk space
echo "Checking disk space..."
DISK_USAGE=$(df / | awk 'NR==2 { print $5 }' | sed 's/%//')
if [ "$DISK_USAGE" -gt "$DISK_USAGE_THRESHOLD" ]; then
    echo "Warning: Disk space usage exceeded $DISK_USAGE_THRESHOLD%!" | mail -s "Problem with disk space" "$ADMIN_EMAIL"
else
    echo "There is enough disk space."
fi

# Creating backup ($TARGET_DIRS is intentionally unquoted so each
# directory is passed as a separate argument)
echo "Creating backup..."
if tar -czf "$BACKUP_FILE" $TARGET_DIRS; then
    echo "Backup created successfully: $BACKUP_FILE"
else
    echo "Error creating backup!" | mail -s "Error creating backup" "$ADMIN_EMAIL"
fi

# Sending report
echo "Sending report to $ADMIN_EMAIL..."
REPORT="Report for $DATE\n\n"
REPORT+="Web server status: $(curl -s --head "$WEB_SERVER" | head -n 1)\n"
REPORT+="Disk space usage: $DISK_USAGE%\n"
REPORT+="Backup location: $BACKUP_FILE\n"
echo -e "$REPORT" | mail -s "Daily system report" "$ADMIN_EMAIL"
echo "Done."
```
Description:
- Check web server: uses the `curl` command to check if the site is available.
- Check disk space: uses `df` and `awk` to check disk usage. If the threshold (90%) is exceeded, a notification is sent.
- Create a backup: the `tar` command archives and compresses the directories specified in the `TARGET_DIRS` variable.
- Send a report: a report on all operations is sent to the administrator's email using `mail`.
How to use:
- Set the desired parameters, such as the web server address, directories to back up, disk usage threshold, and email.
- Make the script executable: `chmod +x /path/to/your/script.sh`
- Add the script to `cron` with `crontab -e` to run it on a regular basis. Example to run every day at 00:00:

```
0 0 * * * /path/to/your/script.sh
```
Bareos. It's a newer fork of Bacula and a real workhorse.
Dump configs to backup drive. Pray to the machine spirit that things don’t blow up. Only update when I remember. I’m a terrible admin for my own stuff.
Thanks to you, I don't need to answer OP anymore 👍
Dotfiles on GitHub, an HDD for storing photos, downloads, and documents, as well as games I'm not currently playing. I also sync KeePass files across all network devices.