I’m particularly interested in low-bandwidth solutions. My internet connection is pretty rough: 20 Mbps down and 1 Mbps up, with no option to upgrade.

That said, this isn’t limited to low-bandwidth solutions.

I’m planning on redoing my entire setup soon to run on Kubernetes, then expanding the scope of what my server does (currently Plex, an SFTP server, and local client backups). Before I do that, I need a proper offsite backup solution.

  • pound_heap@lemm.ee · 1 year ago

    Not my solution, but I liked the idea and am thinking of using it too: copy your backups onto an external HDD and keep it in your car trunk. Maybe have two drives in rotation.

    It eliminates the need to drive somewhere to rotate drives, and the cost of renting a safe deposit box.

    It doesn’t protect against a serious disaster like a forest fire, an earthquake, or nuclear war, but I keep the most important data in the cloud, and if my house and car both burn down, I’ll have bigger problems than some homelab snapshots.

    • PrettyFlyForAFatGuy@lemmy.ml (OP) · 1 year ago

      Actually not a bad idea. I live in a flat, so my car is parked in a car park about 200 m from my property. If my entire town goes up in smoke, I imagine losing data would be the least of my problems.

  • 486@kbin.social · 1 year ago

    Any backup software that supports incremental backups should behave similarly bandwidth-wise. I like Restic. You can even do incremental backups with plain rsync if you want. If your data doesn’t change much, you should be okay. For the initial run, it helps to have physical access to the remote location, so you can bring the full backup there instead of uploading it through your slow uplink.
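
    For example, a minimal Restic sketch over SFTP; the host, paths, and password file here are hypothetical, and after the initial run each backup only uploads new or changed data:

    ```sh
    # Restic repositories are encrypted; read the key from a file (hypothetical path)
    export RESTIC_PASSWORD_FILE=/root/.restic-pw

    # One-time: create the repository on the remote host
    restic -r sftp:user@offsite.example.com:/srv/backups/homelab init

    # Nightly: incremental backup; unchanged data is never re-uploaded
    restic -r sftp:user@offsite.example.com:/srv/backups/homelab backup /home

    # Keep the repository size bounded
    restic -r sftp:user@offsite.example.com:/srv/backups/homelab \
        forget --keep-daily 7 --keep-weekly 4 --prune
    ```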

    • PrettyFlyForAFatGuy@lemmy.ml (OP) · 1 year ago

      Definitely an option if I’m a bit more selective with what I back up. At the moment, for the client backups, I’m zipping and encrypting the entire home folder for each client once a week. I could probably write something that looks for file changes and uploads just those, along the lines of the sketch below.
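
      One way to do that is rsync with hard-linked snapshots, so only changed files cross the wire; the host and paths here are hypothetical:

      ```sh
      #!/bin/sh
      # Nightly job: upload only changed files; unchanged files are
      # hard-linked against the previous snapshot and cost no bandwidth.
      DEST=user@offsite.example.com:/srv/backups
      TODAY=$(date +%F)
      rsync -az --delete --link-dest=../latest /home/ "$DEST/$TODAY/"
      # Point "latest" at the snapshot we just made
      ssh user@offsite.example.com "ln -sfn $TODAY /srv/backups/latest"
      ```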

  • codefossa@social.codefossa.net · 1 year ago

    @PrettyFlyForAFatGuy@lemmy.ml Remote backups might be rough with that upload speed. For example, you will be looking at over 2 hours per GiB uploaded.

    I personally have a three-node Kubernetes setup and run Longhorn for volume management. I do hourly snapshots, then daily backups of all volumes to an additional drive on one of the three nodes, served by a simple NFS server that also runs in Kubernetes. In Longhorn I also keep two replicas of every volume, so losing one doesn’t hurt anything.

    I would imagine it would be pretty easy in this case to replace my local NFS target with AWS storage, and then I would have remote backups, but since I back up roughly 100 GiB per day, that would be a little time consuming: at my 50 Mbps, about 4.8 hours. Remote backups could also run less often, as a last-resort copy.
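
    As a back-of-envelope check, 1 GiB is about 8590 Mbit, so:

    ```sh
    # hours = GiB * 8589.934592 / Mbps / 3600
    echo 'scale=2; 1 * 8589.934592 / 1 / 3600' | bc     # 1 GiB at 1 Mbps    -> 2.38
    echo 'scale=2; 100 * 8589.934592 / 50 / 3600' | bc  # 100 GiB at 50 Mbps -> 4.77
    ```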

    • PrettyFlyForAFatGuy@lemmy.ml (OP) · 1 year ago

      Yeah, it is pretty rough, although the files don’t change all that much. If I can set up a backup somewhere, prepopulate it with my data as it stands now, and then keep it updated with incremental nightly jobs, I’m hoping each run will mostly be done by morning.
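
      A nightly cron entry would cover that, assuming a Restic repository like the one sketched earlier in the thread (the schedule and paths are placeholders):

      ```
      # /etc/cron.d/offsite-backup: start at 01:00 so the upload
      # runs overnight on the 1 Mbps uplink
      0 1 * * * root restic -r sftp:user@offsite.example.com:/srv/backups/homelab backup /home --quiet
      ```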

      My backup backup plan would be to buy a couple of high-capacity solid-state drives and either take them myself or mail them to my parents once a week. The mailman has pretty high bandwidth, even if the latency is rather rough.