Automatic Backup of Configuration Files

I have seen a few topics related to automatic backups. I'm still researching backing up the SD card itself, but the setup below should be a good starting point for getting your configuration files backed up to another drive in case your SD card fails or you otherwise corrupt them. I'm sure what I have isn't the most efficient, but it works for me and should give others a place to start.

The first task was to create a set of shell commands that back up the individual files:

shell_command:
  backup_configuration: cp -a /home/hass/.homeassistant/configuration.yaml "/mnt/usbdrive/backup/configuration.yaml-$(date +"%m%d%Y-%H%M")"
  backup_groups: cp -a /home/hass/.homeassistant/groups.yaml "/mnt/usbdrive/backup/groups.yaml-$(date +"%m%d%Y-%H%M")"
  backup_customize: cp -a /home/hass/.homeassistant/customize.yaml "/mnt/usbdrive/backup/customize.yaml-$(date +"%m%d%Y-%H%M")"
  backup_known_devices: cp -a /home/hass/.homeassistant/known_devices.yaml "/mnt/usbdrive/backup/known_devices.yaml-$(date +"%m%d%Y-%H%M")"
  backup_device_tracker: cp -a /home/hass/.homeassistant/device_tracker.yaml "/mnt/usbdrive/backup/device_tracker.yaml-$(date +"%m%d%Y-%H%M")"
  backup_notification: cp -a /home/hass/.homeassistant/notification.yaml "/mnt/usbdrive/backup/notification.yaml-$(date +"%m%d%Y-%H%M")"

Note:

  1. I am backing up to a USB drive. The USB drive folder is mounted at /mnt/usbdrive/backup. You can back up to any drive (network or otherwise) that the hass user has permission to access.
  2. I am appending the month, day, year, hour and minute to each file so I can tell them apart. That is what the $(date +"%m%d%Y-%H%M") part of the command adds to the file name.
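
For example, a backup taken at 1:01 a.m. on January 15th, 2017 would end up on the USB drive with names like these (the date here is just an illustration):

/mnt/usbdrive/backup/configuration.yaml-01152017-0101
/mnt/usbdrive/backup/groups.yaml-01152017-0101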

Next was to provide a means to execute all those shell commands - first, a manual method:

script:
  backup:
    alias: Backup HA Configuration Files
    sequence:
      - service: shell_command.backup_configuration
      - service: shell_command.backup_groups
      - service: shell_command.backup_customize
      - service: shell_command.backup_known_devices
      - service: shell_command.backup_notification
      - service: shell_command.backup_device_tracker

Note: I have this script available under a view in my HA frontend. This way, before I start messing with my configuration, I can click the script and it backs up all my files. I like this because I have wanted to start over a few times while making a lot of changes, and having the files from a known working state is handy.
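
If you want to do the same, a minimal sketch of putting the script in its own view looks something like this (the group name is just an example; yours can be anything):

group:
  backup_view:
    name: Backup
    view: yes
    entities:
      - script.backup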

And a means to execute all those shell commands - the automated method:

automation:
  - alias: Backup HA Configuration Files
    trigger:
      platform: time
      hours: 1
      minutes: 1
      seconds: 0 
    action:
      - service: shell_command.backup_configuration
      - service: shell_command.backup_groups
      - service: shell_command.backup_customize
      - service: shell_command.backup_known_devices
      - service: shell_command.backup_notification
      - service: shell_command.backup_device_tracker

Note: This is basically the same as the script above, but it is automated to perform the backup every morning at 1:01 a.m.

Hope this helps someone. I have gotten a ton of help from this community and wanted to try to contribute a bit.

28 Likes

That looks very useful. I will set that up myself.

I used Dropbox as a backup when I was using openHAB. See the link below for the openHAB Dropbox binding.


It should be possible to do something similar for HA. I might put this in as a feature request and see how much interest there is.
1 Like

I’ve been trying to set up an automated backup using robocopy but may have to move to this.

I don't suppose you know how to change a Samba share? I made the mistake of sharing the folder as "home assistant", with a space in the name, and no matter how many times I redo the Samba share under a different name, or delete and recreate it, the old name seems to stick. I've deleted and rebooted but can't seem to make it change when accessed from Windows 10.

Or alternatively - does anyone know how to make robocopy recognise a space?

This is great! Thanks :)

I’ve set it up and it’s working perfectly.

@tinglis1 The Dropbox integration is a bit over my head; getting the above working took quite a bit of googling on my end. It would be nice to have the configs on a remote host though - if a power surge/fire/flood knocked out my Pi and SD card, it would likely knock out my USB drive too.

@barryhampants try putting the file path in quotes. http://stackoverflow.com/questions/12027987/how-to-copy-directories-with-spaces-in-the-name

Thanks silvrr, that worked great.

I’ll probably switch to your method as soon as I have some time to work it out. And then perhaps have a second task on the networked device to zip and copy that backup to a cloud synced folder.

I'm not sure if this will help or not, @silvrr, but in my backups I'm using wildcards to copy the files I want. It probably isn't useful to paste my code in here as it uses robocopy, but, for example, I tell it to copy "*.yaml" in my HASS directory and it grabs all the yaml files in the HASS folder and all its subdirectories. I think you can even just use *yaml to copy anything with yaml in the name. I'm also grabbing *.sh, *.xml and *.conf, plus the relevant files in my HaDashboard directory.

If wildcards work in this method, you could shrink your code a bit, and if you add more yaml files or folders as your config grows, they will be backed up automatically too.

Thanks, I’ll give it a try.

I can't work it out. I can copy all the yaml files in a directory, but it doesn't seem to include subdirectories:

cp /home/pi/TEST1/*.yaml /home/pi/TEST2

or I can copy all the subdirectories and their yamls, but not the top level:

cd /home/pi/TEST1 ; cp --parents */*.yaml /home/pi/TEST2/

and this is the closest I can get…

find /home/pi/TEST1/ -type f -name '*.yaml' -print0 | xargs -0 -I % cp -a % /home/pi/TEST2

…which copies everything but does not keep the directory structure; it did give me all the yaml files, though. It definitely seems possible.

So, thinking about this some more: I like the idea of using the *.yaml option to copy all the yaml files from the directory. I guarantee that I will split something out of my main configuration file and forget to add it to the backup; the *.yaml option takes care of that automatically.

However, without adding a unique file name to each backup, the files either won't copy or I would be forced to overwrite the previous backup, which I don't want to do. I want to keep older backups for a little while in case I need some of the older text.

If anyone knows of a way to use the wildcard and append a unique file name like I have above please chime in, I would much prefer that option.

I've been trying to keep it as simple as I can, so I haven't tried to append the date bit yet. Believe it or not, those three lines above took me quite a few hours of googling and trial and error - Linux is completely new to me. I still can't work out how to copy the directory structure, but according to my googling last night you can use --backup[=CONTROL].

--backup=numbered

find /home/pi/TEST1/ -type f -name '*.yaml' -print0 | xargs -0 -I % cp --backup=numbered % /home/pi/TEST2

leaves me with
device_trackers.yaml,
and device_trackers.yaml.~1~ next time it runs
and then device_trackers.yaml.~2~

also

-u

Copy only when the SOURCE file is newer than the destination file or when the destination file is missing.
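
So, for example, something like this (same test directories as above, untested) should only copy yaml files that have changed since the last run:

cp -u -a /home/pi/TEST1/*.yaml /home/pi/TEST2/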

I’m sure this including directory structure business must be relatively simple to do.

Maybe use a weekday variable so you only get 7 versions of the backups.
http://bneijt.nl/blog/post/add-a-timestamp-to-your-bash-prompt/
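
For example (untested, using the same paths as the shell commands at the top of the thread), this keeps exactly one copy per weekday and overwrites it the following week:

cp -a /home/hass/.homeassistant/configuration.yaml "/mnt/usbdrive/backup/configuration.yaml-$(date +%A)"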

I would either zip the files up and store them all in a zip with the specific date/time on it, or create a new directory in your backup folder that has the date/time in the name. That way you can keep a full copy for as long as you want.

backup_dir
    HASS-2016-10-11-152530
         put all the files here 
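
A rough sketch of that in shell (untested, using the same paths as the shell commands at the top of the thread): make a dated folder, then copy every yaml file into it while keeping the directory structure.

backup_dir="/mnt/usbdrive/backup/HASS-$(date +"%Y-%m-%d-%H%M%S")"
mkdir -p "$backup_dir"
cd /home/hass/.homeassistant && find . -type f -name '*.yaml' -exec cp -a --parents {} "$backup_dir" \;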

I am not at home right now, but will give this a go later tonight.

OK! This works and keeps the directory structure intact, and it also creates numbered versions, which is a little bit messy. It does, however, copy the directory structure all the way up to and including home:

find /home/pi/TEST1/ -type f -name '*.yaml' -print0 | xargs -0 -I % cp --backup=numbered --parents % /home/pi/TEST2

I believe that if this is run as the HASS user, a ~ can be used to lower the starting directory down to the HASS account's home folder. So in the example above, as I am logged in as user "pi", it can be shortened to

find ~/TEST1/ -type f -name '*.yaml' -print0 | xargs -0 -I % cp --backup=numbered --parents % ~/TEST2

I still get home/pi/TEST1/ copied into the folder TEST2 though. I think it can be done with a ./ somewhere, but I'm not sure where to stick it.
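
Thinking about it, cd-ing into the source directory first should do it, since find then only produces relative paths for --parents to rebuild on the other side (untested beyond the above):

cd ~/TEST1 && find . -type f -name '*.yaml' -print0 | xargs -0 -I % cp --backup=numbered --parents % ~/TEST2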

Just want to add how I do it.
As I don't see the need to back up my configuration files if I haven't made any changes, I just fire off a backup.bat after making some changes, so there's no need for a job to run daily.
The backup.bat is in the same folder as the HASS configs, and I always work in that folder through the Samba link.
The batch file first copies all previous backup files from \\MYPC\pi3\hass\backup\ to \\MYPC\pi3\hass\backup\previous\,
then it copies all current config files (including the Z-Wave XML) to \\MYPC\pi3\hass\backup\.

Next to that, I always use Notepad++ to edit my config files and have set up automatic version backups in there (Preferences -> Backup -> Verbose backup), so each time I save a config file it makes a copy in the \nppBackup subfolder.

On top of that, I have a daily Cobian Backup schedule on my PC which zips the \\MYPC\pi3\hass\backup\ and \\MYPC\pi3\hass\backup\previous\ files and uploads them via FTP to my website hosting. My PC is usually turned on in the evening, so that upload happens often enough.

The only thing I haven't managed yet is a regular full SD card backup.
I currently just take the card out and make an image on my PC right before I update HASS.

1 Like

I've actually just moved to Git for backups. When I make a change, I just run some quick scripts to push everything up to GitHub. The scripts are available at https://github.com/CCOSTAN/Home-AssistantConfig/tree/master/shell_scripts but they are pretty basic. I'm pretty happy with this solution, and it wasn't too hard to do.
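
For anyone curious, the core of such a push script is just a few git commands, something like this (a minimal sketch, not necessarily what's in the repo above; keep anything sensitive out with a .gitignore before pushing to a public repo):

cd /home/hass/.homeassistant
git add -A
git commit -m "Config update $(date +"%Y-%m-%d %H:%M")"
git push origin master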

Protection + Sharing = Winning. ;)

2 Likes

Can you also put it on Git in a way that keeps it private?
I prefer to keep it private, and the reason is (maybe I'm being too paranoid) that I have automations that deal with keeping burglars away, and alarms, and I don't want anyone to know how they work, because then someone could take advantage of it.

1 Like

I think you have to pay for private GitHub repos.

Check out etckeeper. It tracks a directory in a revision control system (configurable with Git, Mercurial, plus others I'm not familiar with). The advantage of using an RCS for config files is less disk use and, more importantly for me, less "noise" when you're trying to see how you used to have something configured.

I use Mercurial at work, and I've been dragging my feet on learning all the Git differences, but they're both "distributed". GitHub is very popular, but it's not Git itself. You can have a Git repository entirely on your local machine, or, for backup purposes, you can push it to other machines under your control.
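
For example (hostnames and paths made up), pushing to a bare repository on another machine on your LAN looks something like this:

# one-time setup on the machine that will hold the backup
ssh pi@otherpi 'git init --bare /home/pi/hass-config.git'

# in the Home Assistant config directory
git init
git remote add backup pi@otherpi:/home/pi/hass-config.git
git add -A && git commit -m "initial import"
git push backup master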

If you're doing any kind of coding, I'd definitely advise you to learn some RCS, because I have no idea how I got by with the old tedious process of manually copying backups, trying changes, and saying "oh no! I broke it! Which named file was good again?". I love Mercurial, but if I had to advise a new user, I'd suggest learning Git, because it seems to own the open-source world.

1 Like

Your backup script got me curious. I wanted to replicate it, but decided I could go with a more scalable approach, so I've made a script in Python. It's more scalable because you can let the script clean up the directory, back up more config files of other applications, etc. For now, it's pretty basic.
I'm referencing it in my HA config like this:

# Launch Backup script
shell_command:
  backup_HassConfig: python /var/opt/homeassistant/PythonScripts/AutoHassBackups.py

The Python script is really basic right now, since it’s my first ever Python. But it works =)

To make this work:
sudo apt-get install sshpass

Then put this script in the appropriate directory:

from datetime import datetime, date, time
import subprocess

# timestamp used to give every backup a unique name, e.g. 20161011_152530
tijd = datetime.now()
tijd = tijd.strftime('%Y%m%d_%H%M%S')

# local HASS config directory and the remote destination on the NAS
localFileLoc = "/var/opt/homeassistant/"
remoteFileLoc = "USERNAME@IP:/share/Backups/PiController/Hass_"
remoteFileLoc = remoteFileLoc + tijd

# recursively copy the whole config directory to the NAS over SSH
shellCommand = 'sshpass -p "YOURPASSWORDHERE" scp -r' + " " + localFileLoc + " " + remoteFileLoc

subprocess.call(shellCommand, shell=True)

This is designed to work with a QNAP NAS via SCP (secure file copy over SSH). This way it backs up not only my config files from HASS but also the Python script, which is in the same directory. I want to improve on it by using an SSH key instead of a password, but for now it's fine.
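
For reference, switching to a key is mostly a one-time setup with the standard OpenSSH tools (the NAS may also need key authentication enabled in its SSH settings):

ssh-keygen -t rsa         # generate a key pair, accept the defaults
ssh-copy-id USERNAME@IP   # install the public key on the NAS

# after that, sshpass is no longer needed and the script can build the command as:
# shellCommand = 'scp -r ' + localFileLoc + " " + remoteFileLoc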

3 Likes