Hass.io Add-on: Auto backup

Marcelveldt’s Hassio Add-ons: Auto backup

About

Automatically create snapshots of your Hass.io installation and optionally upload them to the cloud (currently only Google Drive).
No need to create tasks yourself in Home Assistant automations: this add-on takes care of the scheduling, the backing up and even the uploading to the cloud.
By default the snapshot contains your Home Assistant and Hass.io configuration, the ssl folder and the local addons folder.
Installed add-ons and your share folder (even at subfolder level) can optionally be included in the snapshot.

Installation

The installation of this add-on is pretty straightforward and no different from
installing any other Hass.io add-on.

  1. Add my Hass.io add-ons repository to your Hass.io instance.
  2. Install the “Auto Backup” add-on.
  3. Carefully configure the add-on to your preference with the options (see below).
  4. Click the Save button to store your configuration.
  5. Start the “Auto Backup” add-on.
  6. Check the logs of the add-on to see if everything went well.
  7. At the first start, a snapshot is created immediately; once that is done, the add-on follows the schedule.
  8. If you enabled upload to Google Drive, the log will tell you to authenticate; use the “Open Web UI” button for that.
  9. Ready to go!

Configuration

Note: Remember to restart the add-on when the configuration is changed.

Example add-on configuration:

{
  "log_level": "INFO",
  "log_file": "/backup/autobackup.log",
  "schedule": "0 4 * * *",
  "backup_addons": {
    "enabled": true,
    "whitelist": [],
    "blacklist": []
  },
  "backup_share": {
    "enabled": true,
    "subfolders": [
      "tools",
      "music"
    ]
  },
  "auto_purge": 3,
  "google_drive": {
    "enabled": true,
    "auto_purge": 2,
    "backup_folder": "hassio_backups"
  }
}

Note: This is just an example, don’t copy and paste it! Create your own!

Option: log_level

The log_level option controls the level of log output by the addon and can
be changed to be more or less verbose, which might be useful when you are
dealing with an unknown issue. Possible values are:

  • DEBUG: Shows detailed debug information.
  • INFO: Normal (usually) interesting events. It’s the default choice.
  • WARNING: Exceptional occurrences that are not errors.
  • ERROR: Something went terribly wrong. Add-on becomes unusable.

Note: The log level is only applied to the log file, not to the console output.
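
For example, to get the most verbose output while investigating a problem, you could set the option like this (an illustrative snippet, not a complete configuration), then switch back to INFO once the issue is resolved:

{
  "log_level": "DEBUG"
}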

Option: log_file

Full path to the log file. The root folder can be either /backup or /share; the add-on does not have access to other Hass.io folders.
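
For example (an illustrative snippet; any path below /backup or /share should be acceptable):

{
  "log_file": "/share/autobackup.log"
}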

Option: schedule

Schedule when the backup task should run. By default it runs every night at 04:00.
You can use CRON syntax for this; see http://www.nncron.ru/help/EN/working/cron-format.htm for the format.
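
Assuming the add-on follows the standard five-field CRON format described at the link above (minute, hour, day of month, month, day of week), an illustrative value looks like this:

{
  "schedule": "0 4 * * *"
}

This runs every night at 04:00 (minute 0, hour 4, every day), while a value such as "30 2 * * 0" would run at 02:30 on Sundays only.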

Option: backup_addons

This setting allows you to include installed add-ons (and their configuration) in the snapshot.

  • enabled: include installed add-ons in the snapshot.
  • whitelist: (optional) only include the add-ons in this list in the snapshot.
  • blacklist: (optional) include all add-ons except the ones in this list in the snapshot.

If you do not specify whitelist or blacklist items, all add-ons will be included in the snapshot.
You can use either the full name of an add-on or its short name (which you will have to figure out yourself).
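
As an illustration, a backup_addons block that only includes two specific add-ons might look like this (the names in the whitelist are hypothetical placeholders; replace them with add-ons you actually have installed):

{
  "backup_addons": {
    "enabled": true,
    "whitelist": [
      "Mosquitto broker",
      "Node-RED"
    ],
    "blacklist": []
  }
}

Leaving both lists empty, as in the main example above, simply includes every installed add-on.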

Option: backup_share

This setting allows you to include the share folder in the snapshot.

  • enabled: include the share folder in the snapshot.
  • subfolders: (optional) only include these files/subfolders from /share in the snapshot.

Note: In many cases the share folder is used to store media files etc., and when the subfolders option is omitted, the snapshot can grow large!
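
For instance, to limit the snapshot to a single subfolder of /share (an illustrative sketch; the folder name is a placeholder):

{
  "backup_share": {
    "enabled": true,
    "subfolders": [
      "configs"
    ]
  }
}

Dropping the subfolders list entirely would include the whole share folder, which is exactly the situation the note above warns about.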

Option: auto_purge

Automatically purge old snapshots from Hass.io.
The latest X snapshots will be kept. Set to 0 to disable the auto purging.
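
For example (illustrative snippet): with the value below, only the three most recent snapshots are kept on the Hass.io host and anything older is removed after each backup run:

{
  "auto_purge": 3
}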

Option: google_drive

This setting allows you to enable automatic uploading of your snapshots to Google Drive.

  • enabled: enable auto uploading of snapshots to Google Drive.
  • auto_purge: automatically purge old snapshots from Google Drive. The latest X snapshots will be kept. Set to 0 to disable the auto purging.
  • backup_folder: the name of the folder on your Google Drive in which to store snapshots (it will be created if it does not exist).

If you enable the Google Drive upload, the add-on will ask you to authorize the app once when the first/next backup task is run.
Check the log for details.
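
A sketch of a possible google_drive block (illustrative only; the folder name is just an example):

{
  "google_drive": {
    "enabled": true,
    "auto_purge": 5,
    "backup_folder": "hassio_backups"
  }
}

With this, the add-on would keep the five most recent snapshots on Google Drive and place everything in a folder named hassio_backups.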


I can’t seem to access the web UI; when I try, Chrome says: refused to connect.

I’m using the duckdns/letsencrypt add-on and have tried https://ip address:8055, http://ip address:8055 and https://******.duckdns.org:8055, and none of them work. Any ideas?

My autobackup.log says:
2019-02-16 16:11:06,896 ERROR – No credential to refresh.
2019-02-16 16:11:06,897 WARNING – You need to authenticate Google through the webinterface!

Just use the local IP/hostname of your hassio host…?

Still says refused to connect with 192.*.*.*:8055

Tried opening port 8055 on router but still the same

my best guess is that you have some other addon using port 8055, you might want to try another port. Just adjust it and restart the addon.

No just the same, tried a couple of other ports

Very nice! Works like a charm.

Strange stuff, it’s just a basic web server hosted on that port allowing you to do the one-time auth with Google, nothing fancy. So unless there’s some error in your log (other than the one you posted before), I’m not sure how to help. Maybe you’re using a proxy?

For me it reverted to my domain which does not allow traffic on that port. Make sure you do it on your local network (so for example 192.168.1.45:8055)

Yes I tried that. Is it because I have the duckdns add-on running ?

The deconz add-on also has a webui page on port 9880 that works fine ?

Ok, so re-reading the docs I see that the auth process happens on first backup attempt. I have it set to 4am so it had not tried yet. I opened the webui today and it worked great.


Works perfectly! Thank you @marcelveldt

I also have this issue, it cannot connect no matter what port.
I’m running hassos 64 bit on my pi.

For me it was because I hadn’t reached a scheduled backup time yet. You could temporarily edit the config to a time in the near future, and it should show up in autobackup.log once it tries a backup.

Something else I noticed: if auto_purge is set to 0, it immediately deletes the backup straight after uploading to Google Drive. I just set it to 10.

I’ve seen the log for the attempt

2019-02-17 19:55:48,224 ERROR – No credential to refresh.
2019-02-17 19:55:48,224 WARNING – You need to authenticate Google through the webinterface!

But the web interface is not accessible, no matter what port I open.

Yes, that’s what I had until it scheduled a backup.

Thanks for the add-on - I’ll start using it once the add-on supports backup to SMB or FTP.


There’s a new version now which should solve this: the backup task now runs at startup of the add-on, unless you configure it not to.

This is now solved; it will respect the auto_purge setting.