Can you please share your backup strategies for Linux? I’m curious to know what tools you use and why. How do you automate/schedule backups? Which files/folders do you back up? What is your preferred hardware/cloud storage, and how do you manage storage space?

  • @[email protected] · 9 months ago

    One reason for moving to Nix was declarative config so at least that part of my system is a series of Nix files to build into a working setup.

    …The rest… let’s just say “needs improvement” & I would like to set up a NAS.

  • Cynicus Rex · 9 months ago
    1. Work in a cloud-synced folder by default.

    That’s all my step 🦥

  • SavvyWolf · 9 months ago

    Firstly, for my dotfiles, I use home-manager. I keep the config on my git server and in theory I can pull it down and set up a system the way I like it.

    In terms of backups, I use Pika to backup my home directory to my hard disk every day, so I can, in theory, pull back files I delete.

    I also push a core selection of my files to my server using Pika, just in case my house burns down. Likewise, I pull backups from my server to my desktop (again with Pika) in case Linode starts messing me about.

    I also have a 2TiB ssd I keep in a strongbox and some cloud storage which I push bigger things to sporadically.

    I also take occasional data exports from online services I use. Because hey, Google or Discord can ban you at any time for no reason. :P

  • aquafunkalisticbootywhap · 9 months ago

    etckeeper, and borg/vorta for /home

    I try to be good about everything being installed in packages, even if I’m the one that made the package. That means I only have to worry about backing up my local package archive. But I’ve never actually recreated a personal system from a backup, and usually end up starting from a fresh install, slowly adding things back from the backup if I missed them. This tends to cut down on cruft and no-longer-needed hacks and fixes. It also makes for a good way to be exposed to new paradigms (desktop environments, shells, etc.).
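    The borg side of this can be sketched with a few commands. The repository path and retention policy below are assumptions for illustration, not the commenter’s actual setup:

```shell
# Hypothetical repository location and a one-archive-per-day naming scheme
REPO="/mnt/backup/borg-repo"
ARCHIVE="home-$(date +%Y-%m-%d)"

if command -v borg >/dev/null 2>&1 && [ -d "$REPO" ]; then
    # One-time repo creation (encrypted client-side):
    # borg init --encryption=repokey "$REPO"

    # Deduplicated, compressed archive of /home
    borg create --compression zstd "$REPO::$ARCHIVE" /home

    # Thin out old archives
    borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 "$REPO"
else
    echo "borg or repo missing; would have created archive $ARCHIVE"
fi
```

    Vorta is a GUI front-end for the same borg repositories, so both can manage one repo.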

    Something that helps is daily notes: one file for any day I’m working on my system and want to remember what a custom file, config edit, or downloaded/created package does and why. These get saved separately, and I try to remember to grep them before asking the internet.

    I see the benefit of snapshots, but disk space is expensive, and I’m (usually) careful (enough) not to lock myself out or prevent boots. Anything catastrophic I have to fix is usually seen as a fun, stressful learning experience! That rarely happens anymore, for better or for worse.

  • Earth Walker · 9 months ago

    I use Borg Backup, automated with a bash script that Borg provides. A cron job runs the script at the desired frequency. I keep backups on different computers; ideally, I would recommend one copy in the cloud and one on a local machine. Borg compresses and encrypts its backups.

    Edit: I migrated a server once using the backups from this system and it worked great.
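    A cron-driven setup along these lines might look as follows; the wrapper script path, schedule, and remote host are assumptions, not the commenter’s config:

```shell
# Hypothetical wrapper script path and schedule (assumptions)
BACKUP_SCRIPT="/usr/local/bin/borg-backup.sh"
CRON_LINE="30 2 * * * $BACKUP_SCRIPT"   # every night at 02:30

# Inside the wrapper, the core calls would be along these lines
# ({hostname} and {now:...} are borg archive-name placeholders):
# borg create --compression lz4 ssh://user@cloudhost/./repo::'{hostname}-{now:%Y-%m-%d}' /etc /home
# borg prune --keep-daily 7 --keep-weekly 4 ssh://user@cloudhost/./repo

# Install the schedule non-interactively instead of via crontab -e:
# (crontab -l 2>/dev/null; echo "$CRON_LINE") | crontab -
echo "$CRON_LINE"
```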

  • @[email protected] · 9 months ago

    Keep everything on Nextcloud and back that up via Proxmox Backup Server.

    Nuke-and-pave: it takes me less time to reconfigure Plasma and install the NC client than to bother backing anything up directly.

  • @[email protected] · 9 months ago

    I use syncthing to sync almost everything across my computer, laptop (occasional usage), server (RAID1), old laptop (powered up once every month or so), and a few other devices (that only get a small subset of my data, though). On the computer, laptop, and server, I have btrfs snapshots (snapper). Overall, this works very well, I always have 4+ copies of my data in 2+ geographical locations.
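    The snapper side of a setup like this is roughly the following; the config name and paths are assumptions:

```shell
# One-time: register /home as a snapper config
# (assumes /home is a btrfs subvolume)
# snapper -c home create-config /home

# Timeline snapshots then run automatically via systemd timers;
# a manual snapshot before risky changes looks like:
# snapper -c home create --description "before dist-upgrade"

# The raw btrfs equivalent of a read-only, dated snapshot:
SNAP_NAME=".snapshots/home-$(date +%Y%m%d)"
# btrfs subvolume snapshot -r /home "/home/$SNAP_NAME"
echo "$SNAP_NAME"
```

    Note that snapshots on the same disk protect against accidental deletion, not disk failure; the syncthing copies cover that part.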

  • @[email protected] · 9 months ago

    Software & Services:

    Destinations:

    • Local raspberry pi with external hdd, running restic REST server
    • RAID 1 NAS at parents’ house, connected via tailscale, also running restic REST

    I’ve been meaning to set up a drive rotation for the local backup so I always have one offline in case of ransomware, but I haven’t gotten to it.

    Edit: For the backup set I back up pretty much everything. I’m not paying per gig, though.
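    Pointing restic at a REST server like those destinations is roughly this; the host name, port, and password file are assumptions:

```shell
# Hypothetical REST server address and password file (assumptions)
export RESTIC_REPOSITORY="rest:http://backup-pi.local:8000/desktop"
export RESTIC_PASSWORD_FILE="$HOME/.config/restic/password"

# One-time repository setup:
# restic init

# Back up the whole home directory, skipping caches:
# restic backup "$HOME" --exclude "$HOME/.cache"

# Retention; --prune rewrites the repo, so an offline rotated drive
# (as mentioned above) is still the only real ransomware protection:
# restic forget --keep-daily 7 --keep-weekly 5 --prune
echo "$RESTIC_REPOSITORY"
```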

  • @[email protected] · 9 months ago

    Here’s one that probably nobody else here is doing. The backup goes on my mobile device. Yes, the thing in my pocket.

    • Mount it over SSHFS on the local network
    • Unlock a LUKS container in the form of a 30GB sparse file on the device
    • rsync the files across
    • Lock, unmount

    The backup is incremental but the container file never changes size, no matter what’s in it. Your data is in two places and always under your physical control. But the key is never stored on the remote device, so you could also do this with a VPS.

    Highly recommended.
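    The four steps above can be sketched like this; the host, mount points, and filenames are hypothetical (the commands need root and real hardware, so they are shown commented):

```shell
# Hypothetical phone address, mount point, and container file
PHONE="user@phone.local"
MNT="/mnt/phone"
IMG="$MNT/backup.img"

# 1. Mount the phone's storage over SSHFS
# sshfs "$PHONE:/storage" "$MNT"

# One-time: create a 30GB sparse file and format it as LUKS
# truncate -s 30G "$IMG"
# cryptsetup luksFormat "$IMG"

# 2. Unlock the container and mount the filesystem inside it
# cryptsetup open "$IMG" phonebackup
# mount /dev/mapper/phonebackup /mnt/backup

# 3. Incremental copy
# rsync -a --delete "$HOME/" /mnt/backup/home/

# 4. Lock and unmount
# umount /mnt/backup && cryptsetup close phonebackup && fusermount -u "$MNT"
echo "$IMG"
```

    Because the file is sparse, unused space inside the container takes no room on the phone, which matches the "never changes size" behaviour described above.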

  • fmstrat · 9 months ago

    All important files go in /data.

    /data is ZFS, snapped and sent to NAS regularly

    Every time I change a setting, it gets added to a dconf script. Every time I install software, I write a script.
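    The dconf part can be a plain script of `dconf write` calls that gets replayed on a fresh machine; the GNOME keys below are just illustrative examples, not the commenter’s settings:

```shell
# Illustrative dconf restore script (example keys, not real settings)
dconf_restore() {
    dconf write /org/gnome/desktop/interface/color-scheme "'prefer-dark'"
    dconf write /org/gnome/desktop/interface/clock-show-seconds true
}

if command -v dconf >/dev/null 2>&1; then
    dconf_restore || echo "dconf write failed (no session bus?)"
else
    echo "dconf not available; skipping"
fi
```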

    Dotfiles git repo for home directory.

    With that, I can spin up a fresh machine in minutes with scripts.
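    The ZFS snapshot-and-send flow described above looks roughly like this; the pool, dataset, and NAS names are hypothetical:

```shell
# Hypothetical pool/dataset and NAS host (assumptions)
TODAY=$(date +%Y%m%d)
SNAP="tank/data@$TODAY"

# Take a snapshot of /data:
# zfs snapshot "$SNAP"

# First run: full replication to the NAS:
# zfs send "$SNAP" | ssh nas zfs receive backup/data

# Subsequent runs send only the delta since the previous snapshot
# ($PREVIOUS is a placeholder for yesterday's snapshot name):
# zfs send -i "tank/data@$PREVIOUS" "$SNAP" | ssh nas zfs receive backup/data
echo "$SNAP"
```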

  • @[email protected] · 9 months ago

    Here’s an example of a Bash script that performs the following tasks:

    1. Checks the availability of an important web server.
    2. Checks disk space usage.
    3. Makes a backup of the specified directories.
    4. Sends a report to the administrator’s email.

    Example script:

     #!/bin/bash
     
     # Settings
     WEB_SERVER="https://example.com"
     BACKUP_DIR="/backup"
     TARGET_DIRS="/var/www /etc"
     DISK_USAGE_THRESHOLD=90
     ADMIN_EMAIL="[email protected]"
     DATE=$(date +"%Y-%m-%d")
     BACKUP_FILE="$BACKUP_DIR/backup-$DATE.tar.gz"
     
     # Checking web server availability (compare the status code instead of
     # grepping for "200 OK", which modern HTTP/2 responses don't contain)
     echo "Checking web server availability..."
     HTTP_CODE=$(curl -s -o /dev/null -w '%{http_code}' "$WEB_SERVER")
     if [ "$HTTP_CODE" = "200" ]; then
         echo "Web server is available."
     else
         echo "Warning: Web server is unavailable (HTTP $HTTP_CODE)!" | mail -s "Problem with web server" "$ADMIN_EMAIL"
     fi
     
     # Checking disk space (NR==2 picks the data row of df's output)
     echo "Checking disk space..."
     DISK_USAGE=$(df / | awk 'NR==2 { print $5 }' | sed 's/%//')
     if [ "$DISK_USAGE" -gt "$DISK_USAGE_THRESHOLD" ]; then
         echo "Warning: Disk space usage exceeded $DISK_USAGE_THRESHOLD%!" | mail -s "Problem with disk space" "$ADMIN_EMAIL"
     else
         echo "There is enough disk space."
     fi
     
     # Creating backup (TARGET_DIRS is deliberately unquoted so each
     # directory becomes a separate tar argument)
     echo "Creating backup..."
     if tar -czf "$BACKUP_FILE" $TARGET_DIRS; then
         echo "Backup created successfully: $BACKUP_FILE"
     else
         echo "Error creating backup!" | mail -s "Error creating backup" "$ADMIN_EMAIL"
     fi
     
     # Sending report
     echo "Sending report to $ADMIN_EMAIL..."
     REPORT="Report for $DATE\n\n"
     REPORT+="Web server status: HTTP $HTTP_CODE\n"
     REPORT+="Disk space usage: $DISK_USAGE%\n"
     REPORT+="Backup location: $BACKUP_FILE\n"
     
     echo -e "$REPORT" | mail -s "Daily system report" "$ADMIN_EMAIL"
     
     echo "Done."
     

    Description:

    1. Check web server: uses curl to check whether the site returns HTTP 200; anything else triggers a mail notification.
    2. Check disk space: uses df and awk to read root filesystem usage. If the threshold (90%) is exceeded, a notification is sent.
    3. Create a backup: the tar command archives and compresses the directories listed in the TARGET_DIRS variable.
    4. Send a report: a report on all operations is sent to the administrator’s email using mail.

    How to use:

    1. Set the desired parameters: the web server address, the directories to back up, the disk usage threshold, and the admin email.
    2. Make the script executable:

       chmod +x /path/to/your/script.sh

    3. Add the script to cron to run on a regular basis:

       crontab -e

       Example entry to run the script every day at 00:00:

       0 0 * * * /path/to/your/script.sh
    
  • nomad · 9 months ago

    Bareos. It’s a newer fork of Bacula and a real workhorse.

  • @[email protected] · 9 months ago

    Dump configs to backup drive. Pray to the machine spirit that things don’t blow up. Only update when I remember. I’m a terrible admin for my own stuff.

  • @[email protected] · 9 months ago

    Dotfiles on GitHub; an HDD for storing photos, downloads, and documents, as well as my not-in-use games. I also sync KeePass files across all network devices.