Can you please share your backup strategies for Linux? I'm curious to know what tools you use and why. How do you automate/schedule backups? Which files/folders do you back up? What is your preferred hardware/cloud storage, and how do you manage storage space?

  • clif@lemmy.world · 2 hours ago

    Internal RAID1 as the first line of defense. Rsync to external drives, where at least one is always offsite, as the second. Rclone to cloud storage for my most important data as the third.

    Backups 2 and 3 are manual but I have reminders set and do it about once a month. I don’t accrue much new data that I can’t easily replace so that’s fine for me.
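    Run manually, tiers 2 and 3 as described could look something like this (the mount point, paths, and rclone remote name are all hypothetical):

    ```shell
    # Tier 2: mirror home onto the currently-plugged-in external drive
    rsync -aAX --delete /home/ /mnt/external-backup/home/

    # Tier 3: push only the most important data to a cloud remote
    # ("cloud" must already be configured via `rclone config`)
    rclone sync ~/important cloud:backup/important
    ```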

  • capital@lemmy.world · 2 hours ago

    restic -> Wasabi, automated with a shell script and cron. Uses an include list to tell it what paths to back up.

    The script has Pushover credentials to send me backup alerts. It parses the restic log to tell me how much was backed up and removed, whether the backup succeeded, and the current repo size.

    To be added: a periodic test restore of a random file, with its hash compared against the current version of the file (it will run right after the backup, so the file is unlikely to have changed in my workload). The restored copy will then be deleted and an alert sent letting me know how the restore test went.
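    The log-parsing step can be fairly simple. Here's a rough sketch of pulling counts out of a restic summary line for the alert text (the summary string and repo URL are illustrative, and the Pushover call is omitted):

    ```shell
    #!/usr/bin/env bash
    # The backup itself would be something like:
    #   restic -r s3:https://s3.wasabisys.com/my-bucket backup --files-from ~/backup-includes.txt

    # Hypothetical summary line as printed by `restic backup`
    summary="Files:          12 new,     3 changed,  4107 unmodified"

    # Extract the counts for the notification message
    new=$(echo "$summary" | grep -oE '[0-9]+ new' | cut -d' ' -f1)
    changed=$(echo "$summary" | grep -oE '[0-9]+ changed' | cut -d' ' -f1)

    echo "backup ok: $new new, $changed changed"
    ```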

  • TimeSquirrel@kbin.melroy.org · 8 hours ago

    I plug in an external drive every so often and drag and drop parts of my home dir into it like it’s 1997. I’m not running a data center here. The boomer method is good enough and I don’t do anything important enough to warrant going all out with professional snapshot based backup solutions and stuff. And I only save personal documents, media, and custom config files. Everything else is replaceable.

    • Papamousse@beehaw.org · 7 hours ago

      Yeah, about the same. Old coot here: I plug in a USB3 SSD (encrypted with LUKS) and rsync from the internal HD to this external HD. That's it.
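      For anyone curious, that workflow is just a handful of commands (the device name and mount point are examples):

      ```shell
      # Unlock and mount the LUKS-encrypted external SSD
      sudo cryptsetup open /dev/sdb1 extbackup
      sudo mount /dev/mapper/extbackup /mnt/backup

      # Mirror the internal drive's data onto it
      sudo rsync -aAX --delete /home/ /mnt/backup/home/

      # Detach cleanly
      sudo umount /mnt/backup
      sudo cryptsetup close extbackup
      ```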

  • astrsk@fedia.io · 8 hours ago

    Borg backup is the gold standard, with Vorta as a very nice GUI on machines that need it. Otherwise, all my other Linux machines run in Proxmox hypervisors and get regular container/snapshot/VM backups through Proxmox Backup Server to another machine. All the backup data is then replicated regularly and remotely via TrueNAS SCALE replication tasks.

    • GenderNeutralBro@lemmy.sdf.org · 8 hours ago

      Borg via Vorta handles the hard parts: encryption, compression, deduplication, and archiving. You can mount backup snapshots like drives, without needing to expand them. It splits archives into small chunks so you can easily upload them to your cloud service of choice.
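      A minimal Borg session showing those pieces (the repo path, archive name, and mount point are placeholders):

      ```shell
      # One-time: create an encrypted repository
      borg init --encryption=repokey /mnt/backup/borg-repo

      # Compressed, deduplicated archive of home
      borg create --compression zstd /mnt/backup/borg-repo::'{hostname}-{now}' ~/

      # Browse any archive as a filesystem without extracting it
      borg mount /mnt/backup/borg-repo::myhost-2024-01-01T00:00:00 /mnt/restore
      borg umount /mnt/restore
      ```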

    • NotAnArdvark@lemmy.ca · 2 hours ago

      Adding my “Me too” to Vorta/Borg. I use it with Borgbase, which I like because it’s legitimately cheap and they support Borg development. As well, you can set Borg backups with Borgbase to “append only,” which prevents ransomware or other unexpected “whoopsies” from wiping out your backup history.

      I back up most of my computer every hour, but have pruning rules that make sure things don't get too out of hand. I have a second backup that backs everything up to my NAS (using Vorta, again). This is helpful for things like my downloads folder, virtual machines, or Steam library: things I wouldn't want to back up over the network, but on occasion I do find myself going "whoops, I wanted that."

      I also have Vorta working on my Mom's MacBook, and have Borgbase send me an email when there isn't any activity for longer than a couple of days. Once I got automatic pruning working right I never had to touch this again.
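      For anyone setting up similar pruning, hourly backups stay manageable with retention rules along these lines (the numbers are just an example):

      ```shell
      # Keep recent history dense, older history sparse
      borg prune --keep-hourly 24 --keep-daily 7 --keep-weekly 4 --keep-monthly 6 /path/to/repo

      # Reclaim the space freed by pruning (Borg >= 1.2)
      borg compact /path/to/repo
      ```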

  • JustEnoughDucks@feddit.nl · 6 hours ago

    3-2-1.

    Kopia backup to a secondary HDD:

    • Pictures (phone photos backed up to my server via immich)
    • workspace (git repos, ECAD, MCAD, firmware, etc…)
    • qmk layout
    • Documents
    • vim folder with bundles
    • ebooks

    KDE Vault stores also live on the secondary HDD.

    Soon I will set up Kopia to also back up everything via SSH to my server, and then the small-size essentials and important docs via Google Drive.

    I need to set up server cloud backups too, but haven't had the time…
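    A Kopia setup along those lines is only a couple of commands (the paths are examples):

    ```shell
    # One-time: create the repository on the secondary HDD
    kopia repository create filesystem --path /mnt/hdd2/kopia-repo

    # Snapshot the directories listed above
    kopia snapshot create ~/Pictures ~/workspace ~/Documents
    ```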

  • LemmyBe@lemmy.world · 7 hours ago

    I use BlueBuild to create a reproducible system, plus a script to handle other post-install tasks such as setting up initial preferences.

    Also, Vorta to back up files and settings to an external HD, plus the OneDrive Linux client to sync files and settings to the cloud.

  • aquafunkalisticbootywhap@lemmy.sdf.org · 11 hours ago

    etckeeper, and borg/vorta for /home

    I try to be good about everything being installed from packages, even if I'm the one that made the package. That means I only have to worry about backing up my local package archive. But I've never actually recreated a personal system from a backup; I usually end up starting from a fresh install, slowly adding things back from the backup if I missed them. This tends to cut down on cruft and no-longer-needed hacks and fixes. It also makes for a good way to be exposed to new paradigms (desktop environments, shells, etc.).

    Something that helps is daily notes: one file for any day I'm working on my system and want to remember what a custom file, config edit, or downloaded/created package does and why. These get saved separately, and I try to remember to grep them before asking the internet.

    I see the benefit of snapshots, but disk space is expensive, and I'm (usually) careful (enough) not to lock myself out or prevent boots. Anything catastrophic I have to fix is usually seen as a fun, stressful learning experience! That rarely happens anymore, for better or for worse.
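    For anyone unfamiliar, etckeeper just keeps /etc in a VCS and hooks into the package manager; the manual workflow is tiny:

    ```shell
    # One-time setup: put /etc under version control (git by default)
    sudo etckeeper init
    sudo etckeeper commit "initial state"

    # After hand-editing a config (package-manager runs are auto-committed)
    sudo etckeeper commit "tuned sshd_config"
    ```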

  • seaQueue@lemmy.world · 9 hours ago

    I leverage btrfs or ZFS snapshots. I take rolling system-level snapshots on a schedule (daily, weekly, monthly, and separately before any package upgrades or installs) and user data snapshots every couple of hours. Then I use btrbk to sync those snapshots to an external drive at least once a week. When I have all of my networking gear and home services set up I also sync all of this to storage on my NAS. Any hosts on the network keep rolling snapshots stored on the NAS as well.

    Important data also gets shoveled into a B2 bucket and/or Google drive if I need to be able to access it from a phone.

    I keep snapshots small by splitting data up into well defined subvolumes, anything that can be reacquired from the cloud (downloads, package caches, steam libraries, movies, music, etc) isn’t included in the backup strategy. If I download something and it’s hard to find or important I move it out of downloads and into a location that is covered by my backups.
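    The underlying mechanism btrbk automates looks roughly like this with raw btrfs commands (the snapshot and target paths, and the parent snapshot name, are examples):

    ```shell
    # Read-only snapshot of the home subvolume
    btrfs subvolume snapshot -r /home /.snapshots/home.$(date +%F)

    # Incremental send to the external drive, using the previous snapshot as parent
    btrfs send -p /.snapshots/home.2024-01-01 /.snapshots/home.$(date +%F) \
        | btrfs receive /mnt/external/snapshots
    ```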

  • Minty95@lemm.ee · 9 hours ago

    Timeshift for the system works perfectly: if you screw up the system (a bad update, for instance), just start it and you'll be back up and running in less than ten minutes. Simple cron backups for data, documents, etc., just in case you delete a folder, document, or image. Both of these go to a second internal HD.
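    If it helps anyone, both halves of that can be driven from cron; Timeshift has a CLI, and the data backup is a plain rsync (the schedule and paths below are illustrative):

    ```shell
    # /etc/crontab entries (illustrative)
    # Weekly system snapshot via Timeshift's CLI
    0 3 * * 0  root  timeshift --create --comments "weekly auto"
    # Nightly copy of documents to the second internal HD
    0 2 * * *  root  rsync -a --delete /home/user/Documents/ /mnt/hd2/Documents/
    ```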

  • traches@sh.itjust.works · 10 hours ago

    Software & Services:

    Destinations:

    • Local Raspberry Pi with an external HDD, running a restic REST server
    • RAID 1 NAS at my parents' house, connected via Tailscale, also running restic REST

    I’ve been meaning to set up a drive rotation for the local backup so I always have one offline in case of ransomware, but I haven’t gotten to it.

    Edit: For the backup set I back up pretty much everything. I’m not paying per gig, though.
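    For reference, pointing restic at a REST server over Tailscale is just a repo URL (the host, port, and repo name are examples):

    ```shell
    restic -r rest:http://backup-pi.tailnet:8000/my-repo backup ~/
    ```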

  • Earth Walker@lemmy.world · 11 hours ago

    I use Borg Backup, automated with a bash script that Borg provides. A cron job runs the script at the desired frequency. I keep backups on different computers; ideally I would recommend one copy in the cloud and one copy on a local machine. Borg compresses and encrypts its backups.

    Edit: I migrated a server once using the backups from this system and it worked great.
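    The cron half of such a setup can be as small as one line plus a script that exports the repo location (paths and passphrase handling below are illustrative; don't hard-code real secrets):

    ```shell
    # /etc/cron.d/borg-backup (illustrative): run the wrapper nightly at 02:00
    #   0 2 * * * root /usr/local/bin/borg-backup.sh

    # borg-backup.sh, roughly:
    export BORG_REPO=/mnt/backup/borg-repo
    export BORG_PASSCOMMAND="cat /root/.borg-passphrase"
    borg create --compression zstd ::'{hostname}-{now}' /home /etc
    ```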

  • Lucy :3@feddit.org · 11 hours ago

    I use rsync to incrementally back up / to a separate drive, as well as a drive on another device (my server), which then packs, compresses and encrypts the latest backup of all devices daily, and uploads them to Hetzner as well as GDrive.
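    One common way to get incremental rsync backups like this is --link-dest, which hard-links unchanged files against the previous run so each dated directory looks like a full backup but only changed files use new space (the paths are examples):

    ```shell
    rsync -aAXH --delete \
        --exclude={"/proc/*","/sys/*","/dev/*","/run/*","/tmp/*","/mnt/*"} \
        --link-dest=/mnt/backup/latest \
        / /mnt/backup/$(date +%F)/

    # Point "latest" at the run that just finished
    ln -sfn /mnt/backup/$(date +%F) /mnt/backup/latest
    ```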

  • shapis@lemmy.ml · 5 hours ago

    All my code and projects are on GitHub/codeberg.

    All my personal info and photos are on Proton Drive.

    If Linux shits itself (and it does, often), who cares? I can have it up and running again with a fresh install in ten minutes.