Can you please share your backup strategies for Linux? I’m curious to know what tools you use and why. How do you automate/schedule backups? Which files/folders do you back up? What is your preferred hardware/cloud storage, and how do you manage storage space?

  • Wanderer@r.nf

    The glorious life of openSUSE: it defaults to btrfs on install, and everything is preconfigured with snapper out of the box. Easy life, nothing to worry about.
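
    A minimal look at what that preconfigured snapper setup gives you (the snapshot numbers below are made up):

    ```bash
    # List the snapshots snapper has already taken on the root config
    sudo snapper list

    # Take a manual snapshot before a risky change
    sudo snapper create --description "before driver install"

    # See what changed between two snapshots, then roll back if needed
    sudo snapper status 42..43
    sudo snapper rollback 42
    ```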

  • PetteriPano@lemmy.world

    My desktop, laptop and homelab all sync my important stuff over Syncthing. They all keep btrfs snapshots going three months back, in case an oopsie propagates.

    The homelab additionally fetches deduplicated snapshots of my VPS weekly, before syncing all of the above to encrypted Hetzner storage for those burning-down-the-house events.
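
    A rough sketch of that kind of snapshot rotation (the subvolume paths and the 90-day window are assumptions, not the commenter’s actual setup):

    ```bash
    #!/bin/bash
    # Daily read-only btrfs snapshot of a synced subvolume, keeping ~3 months
    set -euo pipefail
    shopt -s nullglob

    SNAP_DIR=/.snapshots/sync                 # assumed snapshot location
    btrfs subvolume snapshot -r /data "$SNAP_DIR/data-$(date +%F)"

    # Expire snapshots whose date-stamped name falls outside the window
    cutoff=$(date -d '90 days ago' +%F)
    for snap in "$SNAP_DIR"/data-*; do
        if [[ "${snap#"$SNAP_DIR"/data-}" < "$cutoff" ]]; then
            btrfs subvolume delete "$snap"
        fi
    done
    ```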

  • digdilem@lemmy.ml

    Scuse the cut and paste, but this is something I recently thought quite hard about and blogged, so stealing my own content:

    What to back up? This is a core question to ask when you start planning. I think it’s quite simply answered by asking the secondary question: “Can I get the data again?” Don’t back up stuff you downloaded from the public internet unless it’s particularly rare. No TV, no Movies, no software installers. Don’t hoard data you can replace. Do back up stuff you’ve personally created and that doesn’t exist elsewhere, or stuff that would cause you a lot of effort or upset if it wasn’t available. Letters you’ve written, pictures you’ve taken, code you authored, configurations and systems that took you a lot of time to set up and fine tune.

    If you want to be able to restore a full system, that’s something else, and generally best dealt with by imaging – I’m talking about individual file backups here!

    Backup Scenario: Multiple household computers. Home Linux servers. Many services running natively and in docker. A couple of Windows computers.

    Daily backups: Once a day, automate backups of your important files.

    On my Linux machines, that’s directories like /etc, /root and /docker-data, plus some shared files.

    On my Windows machines, that’s some mapping data, Word documents, pictures, geocaching files, generated backups and so on.

    Work out which files those are, and get an idea of how much space you need to set aside.

    Then, with automated methods, have these files copied or zipped up to a common directory on an always-available server. Let’s call that /backup.

    These should be versioned, so that older ones get expired automatically. You can do that with simple bash scripts (a minimal sketch follows below) or with automated backup software (I use backup-manager for local machines, and BackupPC or robocopy for Windows ones).

    How many copies you keep depends on your preferences – 3 is a sound number, but choose what you want and what disk space you have. More than 1 is a good idea since you may not notice the next day if something is missing or broken.
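
    As a concrete illustration, a bare-bones version of that copy-and-expire idea (paths and names are placeholders – the author uses backup-manager rather than this script):

    ```bash
    #!/bin/bash
    # Nightly: archive important paths into /backup, keep the newest 3 copies
    set -euo pipefail

    DEST=/backup/$(hostname)
    KEEP=3
    mkdir -p "$DEST"

    # Zip up the kind of paths mentioned above
    tar -czf "$DEST/files-$(date +%F).tar.gz" /etc /root /docker-data

    # Expire everything beyond the newest $KEEP archives
    ls -1t "$DEST"/files-*.tar.gz | tail -n +"$((KEEP + 1))" | xargs -r rm --

    # Run it from cron, e.g.:  0 3 * * * /usr/local/sbin/daily-backup.sh
    ```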

    Monthly Backups – Make them Offline if possible

    I puzzled a long time over the best way to do offline backups. For years I would manually copy the contents of /backup to large HDDs once a month. That took an hour or two for a few terabytes.

    Now, I attach an external USB hard drive to my server, with a smart power socket controlled by Home Assistant.

    This means it’s “cold storage”. The computer can’t access it unless the switch is turned on – something no ransomware knows about. But I can write a script that turns on the power, waits a minute for it to spin up, then mounts the drive and copies the data. When it’s finished, it’ll then unmount the drive and turn off the switch, and lastly, email me to say “Oi, change the drives, human”.
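
    For flavour, here is roughly what such a script can look like – the Home Assistant URL, token, switch entity and drive label are all invented for the example:

    ```bash
    #!/bin/bash
    # Monthly cold-storage backup: power the drive via Home Assistant,
    # copy /backup to it, power it off again, then nag the human.
    set -euo pipefail

    HA=http://homeassistant.local:8123
    TOKEN=REPLACE_ME                      # long-lived HA access token
    SWITCH=switch.backup_drive            # the smart socket's entity id

    ha_switch() {
        curl -sf -X POST "$HA/api/services/switch/$1" \
            -H "Authorization: Bearer $TOKEN" \
            -H "Content-Type: application/json" \
            -d "{\"entity_id\": \"$SWITCH\"}" > /dev/null
    }

    ha_switch turn_on
    sleep 60                              # give the drive time to spin up

    mount /dev/disk/by-label/OFFLINE /mnt/offline
    rsync -a --delete /backup/ /mnt/offline/backup/
    umount /mnt/offline

    ha_switch turn_off
    echo "Offline backup done. Oi, change the drives, human." \
        | mail -s "Monthly backup complete" you@example.com
    ```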

    Once I get that email, I open my safe (fireproof and in a different physical building) and take out the oldest of three USB caddies. I swap that with the one on the server and put that one away. Classic Grandfather/Father/Son backups.

    Once a year, I relabel the oldest of those caddies “Annual backup, 2024” and buy a new one. That way no monthly drive will be older than three years, and I have a (probably still viable) backup for each year.

    BTW – I use USB3 HDD caddies (and do test them for speed – they vary hugely) because I keep a fair bit of data. But you could also use a large-capacity USB thumb drive or MicroSD card for this. It doesn’t really matter how slowly it writes, since you’ll be asleep when it’s backing up. But you do want it to be reasonably fast to read from, and large enough for your data – the above system gets considerably less simple if you need multiple disks.

    Error Check: Of course, with automated systems you need additional automated systems to ensure they’re working! When a backup completes, touch a file to give yourself a timestamp of when it was done – online and offline. I find using “tree” to catalogue the files is worthwhile too, so you know what’s on there.
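
    In script form, that check can be as small as (paths assumed):

    ```bash
    # Stamp and catalogue each completed backup so a silent failure shows up
    date +%FT%T > /mnt/offline/LAST_BACKUP
    tree /mnt/offline/backup > /mnt/offline/CATALOGUE.txt
    ```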

    Lastly – test your backups. Once or twice a year, pick a backup at random and ensure you can copy and unpack the files. Ensure they are what you expect and free from errors.

  • JustEnoughDucks@feddit.nl

    3-2-1 (three copies, two different media, one offsite):

    Kopia backup to secondary HDD

    • Pictures (phone photos backed up to my server via immich)
    • workspace (git repos, ECAD, MCAD, firmware, etc…)
    • qmk layout
    • Documents
    • vim folder with bundles
    • ebooks

    KDE Vaults are stored on the secondary HDD as well.

    Soon I will set up Kopia to also back up everything via SSH to my server, and then the small essentials and important docs via Google Drive.
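
    If it helps anyone, a Kopia-over-SSH setup looks roughly like this (host, user and paths are made up):

    ```bash
    # Create an SFTP-backed kopia repository on the server
    kopia repository create sftp \
        --host my-server --username backup \
        --keyfile ~/.ssh/id_ed25519 --known-hosts ~/.ssh/known_hosts \
        --path /srv/kopia-repo

    # Snapshot the same directories as the local job
    kopia snapshot create ~/Pictures ~/workspace ~/Documents
    ```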

    I need to set up server cloud backups too, but haven’t had the time…

  • TimeSquirrel@kbin.melroy.org

    I plug in an external drive every so often and drag and drop parts of my home dir into it like it’s 1997. I’m not running a data center here. The boomer method is good enough and I don’t do anything important enough to warrant going all out with professional snapshot based backup solutions and stuff. And I only save personal documents, media, and custom config files. Everything else is replaceable.

    • Papamousse@beehaw.org

      Yeah, about the same – old coot here. I plug in a USB3 SSD (encrypted with LUKS) and rsync from the internal HD to this external drive. That’s it.
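
      That routine, spelled out (device name and mountpoints are assumptions):

      ```bash
      # Unlock and mount the LUKS-encrypted external SSD
      sudo cryptsetup open /dev/sdb1 backup_ssd
      sudo mount /dev/mapper/backup_ssd /mnt/backup

      # Mirror the internal drive's data onto it
      sudo rsync -aHAX --delete /home/ /mnt/backup/home/

      sudo umount /mnt/backup
      sudo cryptsetup close backup_ssd
      ```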

  • LemmyBe@lemmy.world

    I use BlueBuild to create a reproducible system, plus a post-install script to handle other tasks such as setting up initial preferences.

    I also use Vorta to back up files and settings to an external HD, plus the OneDrive Linux client to sync files and settings to the cloud.

  • astrsk@fedia.io

    Borg backup is the gold standard, with Vorta as a very nice GUI on machines that need it. Otherwise, all my other Linux machines run in Proxmox hypervisors and get regular container/snapshot/VM backups through Proxmox Backup Server to another machine. All the backup data is then replicated regularly and remotely via TrueNAS Scale replication tasks.

    • GenderNeutralBro@lemmy.sdf.org

      Borg via Vorta handles the hard parts: encryption, compression, deduplication, and archiving. You can mount backup snapshots like drives, without needing to expand them. It splits archives into small chunks so you can easily upload them to your cloud service of choice.
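
      For example, browsing one snapshot without extracting it (repository and archive names are hypothetical):

      ```bash
      borg mount /path/to/repo::desktop-2024-06-01 /mnt/restore
      ls /mnt/restore        # browse like any other filesystem
      borg umount /mnt/restore
      ```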

  • Nicht BurningTurtle@feddit.org

    I have my important folders synced to my Nextcloud and create nightly snapshots of that to a different drive using borg.

    One thing I still need to do is offsite encrypted backups using rsync.
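
    A minimal nightly job along those lines might look like this (repo location, data path and retention are guesses):

    ```bash
    #!/bin/bash
    # Nightly borg snapshot of the synced Nextcloud folders to another drive
    set -euo pipefail
    export BORG_REPO=/mnt/backup-drive/borg

    borg create --compression zstd ::nextcloud-{now:%Y-%m-%d} /srv/nextcloud/data
    borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6
    ```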

  • The Doctor@beehaw.org

    All of my servers make local dumps of their databases and config files to directories owned by unprivileged users. This includes file paths, permissions, and ownerships (so I know how to put them back).
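
    One plausible shape for those unprivileged dumps (database name, user and paths are invented):

    ```bash
    # Dump the database to a directory owned by an unprivileged backup user
    sudo -u postgres pg_dump mydb | gzip > /home/backupuser/dumps/mydb.sql.gz

    # Record ownerships and permissions alongside the config copies,
    # so everything can be put back exactly as it was
    getfacl -R /etc > /home/backupuser/dumps/etc-permissions.facl
    ```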

    My primary research server at home uses rsync to pull copies of those local backups from my servers.

    My primary research server uses Restic to make a daily incremental backup to Backblaze’s B2 service.
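
    Roughly, the Restic-to-B2 leg looks like this (bucket name, paths and retention are placeholders):

    ```bash
    # Credentials and repository for Backblaze B2
    export B2_ACCOUNT_ID=REPLACE_ME
    export B2_ACCOUNT_KEY=REPLACE_ME
    export RESTIC_REPOSITORY=b2:my-bucket:research-server
    export RESTIC_PASSWORD_FILE=/root/.restic-pass

    # Incremental after the first run; only changed chunks are uploaded
    restic backup /home/backups

    # Keep a bounded history and drop unreferenced data
    restic forget --keep-daily 14 --keep-monthly 12 --prune
    ```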

  • TomBombadil [he/him, she/her]@hexbear.net

    My backup is begging my computer to implode so I can experience the sweet relief of getting offline.

    But I also use external discs and make copies of important files I can’t recreate. I don’t care too much about config, as I’m happy enough to distro hop and set things up anew.