Back up your RPi!

Actually, what I'm looking for are scripts, run by an HA automation, that would update the files in git.

Would love to see something like this as a component. Turn my office light red when there is a failed backup. :slight_smile:

It wouldn't be hard to do with an automation; it could simply run a command-line script that does a git push.

I figured as much, I just don't know the "git" side of things. I have an automation right now copying everything to a thumb drive, but git with its version control would be nicer. I just don't know how to use it, and rather than beat my head against the documentation trying to figure it out, I thought I would ask for an example like this to get started with.
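For anyone in the same boat, a minimal sketch of such a script might look like this. It assumes you have already run git init in the config directory and added a remote named origin by hand; the path and branch name are assumptions, so adjust them for your install:

#!/bin/bash
# Hypothetical sketch: commit and push the HA config directory to git.
# Assumes "git init" was run here once and a remote "origin" exists.
cd /home/homeassistant/.homeassistant || exit 1
git add -A
git commit -m "config backup $(date +%F_%H%M)"
git push origin master

An HA shell_command (or a plain cron job) could then call this script. Before pushing to any public remote you would also want a .gitignore that excludes secrets.yaml and the database.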

I haven't (yet) made an image of my Pi's SD card, and your post will hopefully motivate me to make one. Anyway, I thought I would paste a script I have running daily via a cron job, which backs up the contents of /home/hass/.homeassistant and copies the archive to the home directory of the pi user. Then it copies it to another Linux machine on the network via scp.

#!/bin/bash
# Archive the Home Assistant config directory, then copy it off-box.
TIME=$(date +%b-%d-%y)
FILENAME=backup-$TIME.tar.gz
SRCDIR=/home/hass/.homeassistant
DESDIR=/home/pi
sudo tar -cpzf "$DESDIR/$FILENAME" "$SRCDIR"
scp "$DESDIR/$FILENAME" user@otherhost:/home/user/backups/"$FILENAME"
#END
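For completeness, assuming the script above is saved as /home/pi/backup-hass.sh (a hypothetical name), the cron entry for a daily run might look like this. The time is arbitrary, and since the script uses sudo it is easiest to put in root's crontab via sudo crontab -e:

# Run the backup script every night at 02:00
0 2 * * * /home/pi/backup-hass.sh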

I think I really need to sit down and try and learn to use git.

Thanks @keith-michael - I have been looking for an example of how to use tar like this. There is a thread here on backup scripts that is worth a look, and that script would probably be worth pasting there too. I'm probably going to pinch some of it.

Thanks for this - I am considering moving from a Raspberry Pi to an older Celeron NUC I have and using an SSD; it might be cheaper and is worth a look.

My usual cause of failure is a power blackout. As I understand it, the outage just needs to be timed while the wrong file is being written for corruption to occur. Some of my other Pis, used in a Squeezebox network, have SD corruption protection built in, and I have never had those corrupt. Unfortunately, I think in one case (Max2Play) this is done by temporarily disabling/enabling write permissions on the card, which would not work with HASS. The other uses Microcore Linux and runs entirely in RAM (piCorePlayer).

It is a nice start, but if something happens you still have to install hass and all the dependencies all over again.
If your other Linux machine has enough GB free and an SD card port, it would be wise to copy an image at least once as well.

I noticed in another thread here that the AIO installer changed its user/directory structure last month, so even with all the files backed up they might not match your new clean install if you don't have an image. Fixable, but not instant like a ready-to-go spare SD card or image.


For those using dd to clone their SD card: since there is activity on the card while doing so, are you experiencing any file corruption issues? Are you using a read-only remount to stop the OS from writing to the card?

As well, I don't want the home-assistant_v2 database imaged. Is there a way to exclude that from being part of the backup?

When you back up an image, you normally don't exclude a specific file; you make an exact copy of the SD card.
I have no idea if it is possible, but I don't see why I would do that.
If I want a newer file after I have restored the image, I just overwrite the file.

If you don't want a file, you could also delete it before you make the image, but then you need to stop hass for a while.
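If you did go that route, a rough sketch might look like this; the service name, paths, and device are pure assumptions that vary by install, so treat it as an outline only:

#!/bin/bash
# Sketch only: stop hass, drop the database, image the card, restart.
# Service name, paths, and device are assumptions; check your own install.
sudo systemctl stop home-assistant.service
sudo rm /home/homeassistant/.homeassistant/home-assistant_v2.db
sudo dd if=/dev/mmcblk0 of=/mnt/backup/clean-image.img bs=1M
sudo systemctl start home-assistant.service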

I use dd to make a live image over the network and it is working as I expect.
I haven't seen any corruption in the images.

As far as backups go, there are two types of backups:

  1. Rebuild the machine.
  2. Oops, did I really delete that file?

These backups work in harmony to protect your system. Make a rebuild-the-machine backup by copying your SD card once a month, or whenever you have done a significant amount of work, like installing a new application that you would hate to have to redo. Don't worry about it for simple configuration changes to HA or AD type files; save it for things that you really beat your head against and felt a great sigh of relief when you finally accomplished them. Make a copy of your SD card then, so you don't cry when it gets corrupted.
Make an oops backup nightly. My oops backup includes the config files from my HA directory and my source files from my AD directory. Those files change almost daily, so having current copies of them is very important.
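As a minimal sketch of such a nightly oops backup (the paths are assumptions; point them at your own HA and AD directories):

#!/bin/bash
# Nightly "oops" backup: dated archive of just the config and source files.
# Paths are assumptions; adjust for your own install.
DATE=$(date +%Y%m%d)
tar -czf "/mnt/backup/oops-$DATE.tar.gz" \
    /home/homeassistant/.homeassistant \
    /home/homeassistant/appdaemon
# Prune oops archives older than 14 days.
find /mnt/backup -name 'oops-*.tar.gz' -mtime +14 -delete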


Okay, as @Bit-River showed above, I use Syncthing to back up my configuration directory to my RPi being used as a file server. So I have that covered, and it has run well for a week or two now. It's automatic, so I don't have to worry too much there.

On the SD image side, I could run something like:

sudo dd if=/dev/mmcblk0 of=/path/to/backups/sd.img bs=4M

Do that once every 24 hours to a folder, then move or make another copy into a monthly folder, perhaps using an automation to do so. Or is it just better to tie it to some kind of slider switch in HA and run it as you said, when installing a new application (like AppDaemon, perhaps)?
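Something along these lines could handle that rotation; this is only a sketch, and the device and mount point are assumptions:

#!/bin/bash
# Daily SD image, with a separate keeper copy on the first of the month.
# /dev/mmcblk0 and /mnt/backup are assumptions; adjust for your setup.
DAILY=/mnt/backup/daily/sd-$(date +%u).img   # day-of-week slot, reused weekly
sudo dd if=/dev/mmcblk0 of="$DAILY" bs=4M
if [ "$(date +%d)" = "01" ]; then
    cp "$DAILY" "/mnt/backup/monthly/sd-$(date +%Y%m).img"
fi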

And if I may impose, your AD directory? That's the AppDaemon directory, correct?


I made a shell script (.sh file) like this:

#!/bin/bash
# Image the whole SD card to a dated file on the mounted network drive.
sudo dd if=/dev/mmcblk0 of=/mnt/backup/sdcards/backup$(date +%Y%m%d).img bs=1M

and I run that once a week from now on.
Until now I did it by hand, but that's not an option anymore. :wink:

And because AppDaemon is my friend, it will be a run_daily callback constrained to one specific day :wink:


So my next question is: can I use this to back up the SD card over the network? In other words, can I back up the RPi hosting Home Assistant to another RPi on the same network? The command as written appears to make the image locally, on the device the command was run on.

You can combine ssh and dd and use a command like this:

ssh user@remote "dd if=/dev/sda | gzip -1 -" | dd of=image.gz

Your remote machine is the rpi.


An image is exactly as big as the SD card, so it can't be made locally. :wink:
I now have 64 GB SD cards, and every image is 64 GB.
It is saved on the mounted network drive.

If you zip it, it can of course be much smaller, but then it must be unzipped before you can write it to an SD card again.
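For reference, the unzipping can even be streamed, so no full-size unzipped copy is needed on disk; a one-liner along these lines should work (the device name is an assumption, and the target card must be at least as large as the original):

gunzip -c image.gz | sudo dd of=/dev/mmcblk0 bs=1M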

I think that the command @fgabriel gives is run from another machine to make a packed image, and that it is saved locally on the rpi.

Yes @ReneTode, you're right. The command I mentioned must be executed on another machine than the rpi (e.g. an Ubuntu desktop).

No @ReneTode, you're wrong. This command logs in to the rpi via ssh, runs dd on the SD card, zips that on the rpi, and finally transmits it over the network to the machine where the whole command was entered, writing it on that machine into a file called image.gz.

Yes @ReneTode, you're right. If you want to write the image to another SD card, you first have to unzip it. But that's really simple.


This applies to anything that HA is loaded on. The Raspberry Pi may have certain issues, but any system (ODROID, NUC, full-blown PC/Mac) WILL fail; it's just a matter of time and luck.

@ReneTode what is the reason for frequent updates of the OS image? I image it once, after I install the OS, HA, and any other items, and then let it be. Yes, the HA version will be out of date, but as quickly as the HA updates come for me, I would spend more time imaging the SD card than just running an update after I re-image. If I need to install something for a new component I would create an image then; otherwise, no regular backups of the SD card.

Just checking that I am not missing something and should have more frequent image backups.

Very nice. I used a minor tweak on this to account for the default device and to give dd root permissions.
ssh pi@my-rpi "sudo dd if=/dev/mmcblk0 | gzip -1 -" | dd of=image.gz

Like I said earlier, there are two different types of backups that need to be maintained. One is a system (SD card) backup, and it should be done once your system is stable. It's a known point in time that you can go back to and recover your system from. I see this as something I do when I have done more work in the way of changes than I would want to redo if there were a failure. For example, about a week ago I moved my log to MySQL. While that wasn't a lot of work, and it went pretty well, it represented enough of a change in my mind that once I got it working the way I wanted, I would do an SD card backup and save it somewhere. Whether that goes to an image file on a larger drive somewhere or to a spare SD card is user preference.
The second type of backup is one I automate to run nightly. It only grabs the configuration files for HA and AD and any Python source in AD. This is my fallback for when I try setting up something and screw up my config files to the point that I can't remember what all I changed, and I want to go back to a known-good configuration to start over from. I've automated it to run nightly at midnight because I don't have enough self-discipline to make myself do it every time before I start editing the config or source files. So automating it nightly works for me. If you have the self-discipline to remember to do it before you start working on a file, then do it yourself. Storing the files in git could be another method for this type of backup.
And finally, yes, all hardware fails at some point. There are things you can do to help protect yourself from it, but it becomes a cost-benefit exercise at that point, and that is different for each one of us.