I know that for data storage the best bet is a NAS with RAID1 or something in that vein, but what about all the docker containers you are running, the carefully configured services on your RPi, the *arr services installed on your PC, etc.?

Do you have a simple way to automate backups and reinstalls of these as well, or are you just resigned to having to eventually reconfigure them all when the SD card fails, your OS needs a reinstall, or the disk dies?

  • Eskuero@lemmy.fromshado.ws · 1 year ago

    My docker containers are all configured via docker compose, so I just tar the .yml files and the outside data volumes and back that up to an external drive.
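    For illustration, here's a minimal sketch of that kind of backup in Python. The stack layout under `/home/me/stacks` (one compose file plus a `data/` dir per stack) and the `/mnt/external` mount point are just placeholder assumptions, not anything from the thread:

    ```python
    #!/usr/bin/env python3
    """Sketch: bundle compose files and their bind-mounted data dirs into one tarball."""
    import tarfile
    from datetime import date
    from pathlib import Path

    STACKS_DIR = Path("/home/me/stacks")        # one subfolder per compose stack (placeholder)
    BACKUP_DIR = Path("/mnt/external/backups")  # external drive mount point (placeholder)

    def backup_stacks() -> Path:
        BACKUP_DIR.mkdir(parents=True, exist_ok=True)
        archive = BACKUP_DIR / f"stacks-{date.today().isoformat()}.tar.gz"
        with tarfile.open(archive, "w:gz") as tar:
            for stack in STACKS_DIR.iterdir():
                if not stack.is_dir():
                    continue
                for item in ("docker-compose.yml", "data"):
                    path = stack / item
                    if path.exists():
                        # keep paths relative so a restore is just an extract into STACKS_DIR
                        tar.add(path, arcname=str(path.relative_to(STACKS_DIR)))
        return archive

    if __name__ == "__main__":
        print(f"wrote {backup_stacks()}")
    ```

    Run something like that from cron or a systemd timer, and the restore path is just: extract the tarball, then `docker compose up -d` in each stack folder.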

    For configs living in /etc you can back those up as well, but it's harder to remember what you modified and where, which is why you document your setup step by step.

    Something nice and easy I use for personal documentation is mdbooks.

    • Kaldo@kbin.socialOP · 1 year ago

      Ahh, so the best docker practice is to always just use outside data volumes and back those up separately, seems kinda obvious in retrospect. What about mounting them directly to the NAS (or even running docker from the NAS)? For local networks the performance is probably good enough, and that way I wouldn’t have to schedule regular syncs and transfers between “local” device storage and the NAS. Dunno if it would have a negative effect on drive longevity compared to just running a daily backup.
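      For reference (nobody in the thread confirms this setup), Docker's built-in `local` volume driver can mount an NFS export directly, which is one way to keep container data on the NAS without a separate sync step. A rough sketch using the Docker SDK for Python; the NAS address, export path, and volume name are made up:

      ```python
      import docker  # Docker SDK for Python (pip install docker)

      client = docker.from_env()

      # Named volume backed by an NFS export on the NAS (hypothetical address/path),
      # roughly equivalent to `docker volume create --driver local -o type=nfs ...`.
      client.volumes.create(
          name="sonarr_config",
          driver="local",
          driver_opts={
              "type": "nfs",
              "o": "addr=192.168.1.50,rw,nfsvers=4",
              "device": ":/export/appdata/sonarr",
          },
      )
      ```

      Whether the performance holds up is app-dependent; SQLite-heavy services (most of the *arrs) are usually the first thing worth testing over NFS.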