How do you back up?

Over the last 30 years I have been asked a huge number of times to restore data or systems, only to discover that no viable backup exists.
Home Assistant is a very useful tool, so I wanted to offer some insight into my setup, how I back it up, and how I build in redundancy.

After all, there is nothing worse in IT than for your baby to fail after weeks and weeks of work and hours of tweaking. So I wanted to proffer how I do things, and ask: how do you ensure YOUR hard work is not lost when things go awry?

My network consists of the following:

A laptop - the main brain (MB).
My main workstation - from where I control my world! (MWS).
Many NodeMCUs.
Some Wemos D1 Minis.
Many Sonoff devices - Slamphers, Basics, Touches, and TH10/TH16 devices. Some with Tasmota, some with ESPHome.
A rather funky 16-port gigabit switch (found it cheap and faulty and made it work again).
Three Wi-Fi access points capable of up to four SSIDs each (old ex-corporate equipment).
A home-brew router running open-source software and providing four separate networks, one per SSID - AKA (FW).

So, with such disparate systems, how best to manage backup and disaster recovery?

I am a fanatic for the open-source world - by that I mean there are many, many people smarter than me. I try my best to contribute to each community as best I can, and while I am not rich enough to contribute financially to every project that offers great value, I do what I can. Remember: 'not free beer' but 'free to use!'

So I run Hass.io via Docker on (for argument's sake) Ubuntu. Hass.io's minor configuration changes and updates are handled brilliantly by snapshots, but what of the other components? Those are downloaded manually to my (MWS).

I have an independent MySQL server on the (MB) that caters for a number of applications outside of Docker, up to and including HA - Nextcloud, ZoneMinder, and a wiki, to name but a few! To ensure that the data I create is backed up I use https://sourceforge.net/projects/automysqlbackup/ - it takes the manual pain out of the whole process, and its output is included in my SpiderOakONE backup.
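
Roughly, the cron side looks like this - a sketch only, assuming a Debian-style install; check where your distro puts the script and its config:

```sh
# e.g. /etc/cron.d/automysqlbackup - illustrative paths, not necessarily my
# exact layout. Runs automysqlbackup nightly; it handles its own
# daily/weekly/monthly rotation under the directory named in its config file.
30 2 * * * root /usr/sbin/automysqlbackup /etc/automysqlbackup/myserver.conf
```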

The SpiderOakONE account runs via cron on both the (MB) and (MWS), encrypting and moving the relevant data off-site.
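
The cron entries themselves are nothing fancy - something like this sketch, assuming the client binary is installed as /usr/bin/SpiderOakONE:

```sh
# Illustrative crontab line: --batchmode runs one headless backup/sync pass
# and exits, which is the supported way to drive SpiderOakONE from cron.
0 3 * * * /usr/bin/SpiderOakONE --batchmode >> /var/log/spideroak.log 2>&1
```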

Here is the real nub: I have a 2TB USB3 HDD, and after any truly major change to the disk setup or an OS upgrade, I image the drives from the (MB), (MWS), and (FW) onto it using https://www.clonezilla.org/
The drive is stored in a fire safe (nope, I am not kidding).

Any Tasmota device has its config backed up to my (MWS), which is then included in the SpiderOakONE backup.
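
The pull itself is trivial - Tasmota serves the same .dmp file that the web UI's "Backup Configuration" button downloads, so a sketch like this (device names are placeholders) covers it:

```sh
# Fetch each device's config dump over HTTP; /dl is the endpoint behind
# the web UI's "Backup Configuration" button.
for host in sonoff-lounge sonoff-kitchen; do
  curl -s -o "tasmota-backups/${host}.dmp" "http://${host}/dl"
done
```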

Any ESPHome device is already covered by the snapshots I automate, which are in turn included in the SpiderOakONE backup.

The (FW) has its own automated config backup, which is included in the SpiderOakONE backup.

Once a month I test a restore on an old workstation - just to make sure I can!

Some of you may find it overkill, but I have spent many hours developing and building my systems and would cry buckets if I had to start over from scratch. The most I should ever lose is a day's work!

Of course, these are just the basics I've listed; my security goes much deeper.

How do you deal with yours?


My Home Assistant installation runs in Docker with a lot of other containers, all on an Ubuntu VM inside Proxmox. I do a daily backup of it to my NAS, and weekly I back up the data onto an external USB drive and two large USB sticks - one attached to my keys and the other to my girlfriend's. I also regularly back up my config to GitHub, as sketched below.
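
For the GitHub part, the sort of cron-driven push I mean looks like this - a sketch, with the config path and branch as placeholders, and secrets kept out via .gitignore:

```sh
# Commit and push the HA config if anything changed; path and branch are
# illustrative. Assumes the directory is already a git repo with a remote.
cd /opt/homeassistant/config || exit 1
git add -A
git commit -m "Automated config backup $(date -u +%F)" || exit 0  # nothing to commit
git push origin main
```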

Excellent - regular and off-site. Many miss the latter part.


I’ve blogged about how I do it for Home Assistant, but more widely:

  • My home server uses mirroring (ZFS) - it’s not backup (before anybody starts yelling) but it reduces the chance of data loss through disk failure
  • Every hour
    • My home server takes file system snapshots, and I keep the last week of snapshots (a sketch of the rotation is at the end of this post)
    • Home Assistant pushes (using rclone) changes to cloud backup - versioned, retained indefinitely
    • My backup media server pulls all changes to any media from the home server
  • Every two hours
    • My backup server pulls (using rsnapshot) all system configs and user files. I keep a day's worth of hourly backups, a week of daily backups, a month of weekly backups, a year of monthly backups, and yearly backups indefinitely (see the config fragment after this list)
    • My home server pushes all changes to cloud backup - versioned, retained indefinitely
  • Twice a day
    • Windows systems push changes to the backup server (using urbackup) - retained for at least 40 backups, and at most 100
  • Before major OS upgrades
    • Snapshot the OS (ZFS boot environment)
    • Clone the SD card (using rpi-clone)
  • After a successful OS upgrade
    • As above
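
The rsnapshot rotation above maps onto retain lines roughly like these - illustrative, not my literal config (and note rsnapshot insists on TABs between fields):

```
# "yearly backups indefinitely" approximated with a large count
retain	hourly	12
retain	daily	7
retain	weekly	4
retain	monthly	12
retain	yearly	99
```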

This means that:

  • If the house burns down/everything is stolen, I can recover - I test this twice a year to check it works (spoiler, it does)
  • If I delete a file or mess something up, I can recover quickly - I don’t need to test this, because I’m usually recovering something at least once a month
  • If an OS upgrade goes wrong (or an SD card dies), I can recover quickly - I've switched a few SD cards, and used the boot environment during upgrades, so I know this works
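
As promised above, the hourly snapshot step is conceptually no more than this - the dataset name is a placeholder, GNU head is assumed, and in practice tooling handles the rotation rather than a raw script:

```sh
# Take an hourly snapshot, then destroy everything but the newest 168
# (7 days x 24 hours) hourly snapshots of the dataset.
zfs snapshot "tank/home@hourly-$(date +%Y%m%d-%H%M)"
zfs list -H -t snapshot -o name -s creation -d 1 tank/home \
  | grep '@hourly-' \
  | head -n -168 \
  | xargs -r -n 1 zfs destroy
```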

Superb. Very well thought out and implemented. And lest I forget: tested - kudos for that!

With regard just to my home automation system:

I run HA in Docker on Alpine Linux:

  • config directory is synced using Syncthing to NAS, Desktop, Laptop
  • NAS is sharing out config directory (and a few others, via NFS) to a VM running CrashPlan Pro
    • this retains at least 5 latest versions on each system locally, and infinite versions in CrashPlan
  • config gets synced to my personal Bitbucket every so often from the desktop; I feel no need to do this automatically

Every hour, restic creates a backup of the HA config and every bind-mounted directory for all my other containers to a Minio instance running on my NAS.

  • the Minio instance is snapshotted 3 times a day (ZFS)
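
For reference, the hourly restic run is along these lines - endpoint, bucket, credentials, and paths are all placeholders rather than my real values:

```sh
# Back up the HA config and container bind mounts to an S3-compatible
# Minio bucket; restic handles deduplication and encryption.
export AWS_ACCESS_KEY_ID=minio-access-key
export AWS_SECRET_ACCESS_KEY=minio-secret-key
export RESTIC_PASSWORD_FILE=/root/.restic-pass
restic -r s3:http://nas.local:9000/container-backups \
  backup /docker/ha-config /docker/bind-mounts
```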

I have tested this configuration 3 times in the last two and a half years, and I have enough copies floating around that I can find nearly any version from nearly any point in time - I don't concern myself with losing data.


I accidentally ran a monkey test last weekend and deleted one week of work.
I do take snapshots within HA every night, but the monkey (me) deleted the whole VM.
Lesson learned: I need to push my snapshots out of the VM and into my Nextcloud.
rclone seems to be able to do that, as sketched below. My question is: could that be included as an HA add-on? I feel the flexibility that rclone gives regarding the backup target (SFTP, WebDAV, etc.) could cover 100% of the backup scenarios people have.
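
Something like this is what I have in mind - it assumes an rclone remote named "nextcloud" already configured (type webdav, vendor nextcloud) and the default Supervised snapshot path, both of which are assumptions on my part:

```sh
# Copy finished snapshots to Nextcloud over WebDAV; --min-age skips
# files newer than 15 minutes in case a snapshot is still being written.
rclone copy /usr/share/hassio/backup nextcloud:ha-snapshots --min-age 15m
```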

@Tinkerer what are your thoughts on my idea to make it an add-on?

Sure, more backup options for those running Supervised can only be a good thing.

What would be the way to create an rclone add-on for HA?

See the developer docs
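
Not authoritative, but the heart of such an add-on would just be a small run script driving rclone - a sketch, where the option names are invented for illustration (only the /data/options.json location comes from the add-on docs):

```sh
#!/usr/bin/env sh
# Hypothetical run.sh for an rclone backup add-on. Add-ons receive their
# user-supplied options as JSON in /data/options.json; "source" and
# "remote" here are made-up option names for this sketch.
SOURCE="$(jq -r '.source' /data/options.json)"
REMOTE="$(jq -r '.remote' /data/options.json)"
exec rclone copy "$SOURCE" "$REMOTE" --progress
```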