Hass.io Add-on: Auto backup

Great ideas. I plan to add OneDrive, SMB, and maybe even FTP support too.


Thanks for all your work on this; it's working great.

That would be perfect!

New version 1.0.2 works perfectly for me now. Thank you!


Authentication was a snap and pretty intuitive, snapshot backup worked perfectly out of the box. Thank you!


@marcelveldt as mentioned before, great add-on. Thanks! I see you copy the file 1:1. Would it be possible to add some intelligence to the file name? Maybe the Hass version name plus a number. That would make it easier to find what you need later.
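The naming idea could look something like this minimal Python sketch. The format string is the poster's suggestion, not the add-on's actual behavior, and the version/sequence values are placeholders:

```python
# Hypothetical naming scheme: embed the Hass version and a sequence number
# in the copied snapshot's file name (the add-on itself copies the file 1:1).
def backup_filename(hass_version: str, seq: int) -> str:
    return f"hassio_{hass_version}_{seq:03d}.tar"

print(backup_filename("0.88.1", 7))  # hassio_0.88.1_007.tar
```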

Just installed… great add-on! No problems or errors. Only one question: looking at the Snapshot section, hassio says:
autobackup 2019-02-19 16:47:02
Partial snapshot
This is my config:
{
  "log_level": "INFO",
  "log_file": "/backup/autobackup.log",
  "schedule": "0 4 * * *",
  "run_backup_at_startup": true,
  "backup_addons": {
    "enabled": true,
    "whitelist": [],
    "blacklist": []
  },
  "backup_share": {
    "enabled": true,
    "subfolders": []
  },
  "auto_purge": 3,
  "google_drive": {
    "enabled": true,
    "auto_purge": 3,
    "backup_folder": "hassio_backups"
  }
}

I've done a backup of everything, but it says Partial snapshot.

The filename used is indeed the one of the snapshot file created by hassio, but the description will show the full name including the date.

The add-on uses the partial-snapshot feature of hassio to create the backup, so it says Partial snapshot even though it is in fact a full snapshot of all options. You can ignore the name.
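The idea behind that can be sketched as follows: the hassio supervisor of this era exposed a partial-snapshot endpoint (`POST /snapshots/new/partial`), and listing every add-on and folder in the request body yields an effectively full snapshot. The field names and slugs below are assumptions for illustration, not the add-on's exact payload:

```python
# Build the JSON body for a "partial" snapshot that actually covers
# everything (all add-ons, all folders), which is why hassio still labels
# the result "Partial snapshot". Slugs/folders here are example values.
def partial_snapshot_payload(name, addons, folders):
    return {"name": name, "addons": addons, "folders": folders}

payload = partial_snapshot_payload(
    "autobackup 2019-02-19 16:47:02",
    addons=["core_mosquitto", "core_ssh"],          # example add-on slugs
    folders=["homeassistant", "share", "ssl"],      # assumed folder names
)
```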

Ah, cool. I did not check this yet. Another idea: in my pure enthusiasm to upgrade to a newly available Hass version, I sometimes forget to take a snapshot before I upgrade. It would be great if you could add an option so that an auto backup is triggered before an upgrade. Or is this an event you can't capture?

The event can be captured, but that would be too late (the upgrade would already have taken place). It's best to simply create a snapshot every day, or maybe even twice a day, and use the auto purge to keep only the last X snapshots, to prevent the filesystem from filling up.
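For example, a twice-daily run keeping only the last four snapshots could be configured like this (cron syntax as in the configs elsewhere in this thread; the exact purge semantics are assumed):

```json
{
  "schedule": "0 4,16 * * *",
  "auto_purge": 4
}
```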

Two ideas: let the add-on create a sensor that tells you when the last backup was taken and whether the upload went well, and maybe an option to schedule more than one type of backup. Say you want a full snapshot with all add-ons once every 14 days, but one without add-ons every night, for example.

Is it possible to have the add-on only have permission to one folder, i.e. the backup folder created for the add-on on Google Drive? As of now the add-on has full access to absolutely everything. It should be possible to give it fewer permissions.
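The log output later in this thread shows the add-on uses oauth2client, and Google's OAuth scopes do distinguish between full Drive access and access only to files the app itself created (`drive.file`). A hedged sketch of requesting the narrower scope, with placeholder client credentials:

```python
# Restricting a Drive app to files it created means requesting the
# drive.file scope instead of full drive access.
FULL_DRIVE_SCOPE = "https://www.googleapis.com/auth/drive"
DRIVE_FILE_SCOPE = "https://www.googleapis.com/auth/drive.file"

def make_flow(client_id: str, client_secret: str, redirect_uri: str):
    # Imported lazily so the scope constants are usable without oauth2client.
    from oauth2client.client import OAuth2WebServerFlow
    return OAuth2WebServerFlow(
        client_id=client_id,
        client_secret=client_secret,
        scope=DRIVE_FILE_SCOPE,  # only files the app created
        redirect_uri=redirect_uri,
    )
```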

Tried setting this up today but ran into an error. Any ideas what's wrong?
First I thought it was because of Pi-hole, but I changed back to Google DNS and still have the same problem:

Using cron schedule: 0 4 * * 1 *
crond: crond (busybox 1.29.3) started, log level 8
crond: USER root pid  11 cmd python /usr/src/app/main.py > /proc/1/fd/1 2>/proc/1/fd/2
2019-03-29 20:18:33,234 INFO  --  
2019-03-29 20:18:33,235 INFO  -- ############## AUTO SNAPSHOT TASK STARTED ################
/usr/local/lib/python3.7/site-packages/oauth2client/_helpers.py:255: UserWarning: Cannot access /data/credentials.json: No such file or directory
  warnings.warn(_MISSING_FILE_MESSAGE.format(filename))
2019-03-29 20:18:33,239 ERROR -- No credential to refresh.
2019-03-29 20:18:33,240 WARNING -- You need to authenticate Google through the webinterface!
192.168.1.196 - - [29/Mar/2019 20:18:42] "GET / HTTP/1.1" 200 -
192.168.1.196 - - [29/Mar/2019 20:18:43] "GET /favicon.ico HTTP/1.1" 200 -
2019-03-29 20:19:02,724 INFO  -- Received Google code: supersecretgooglecode
192.168.1.196 - - [29/Mar/2019 20:19:02] "POST /send HTTP/1.1" 200 -
2019-03-29 20:19:07,793 ERROR -- Unable to find the server at accounts.google.com
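That last error looks like a name-resolution failure inside the container. A generic diagnostic (not part of the add-on) for checking whether a host resolves at all:

```python
# Quick DNS diagnostic: can this environment resolve a given host?
import socket

def can_resolve(host: str) -> bool:
    try:
        socket.getaddrinfo(host, 443)
        return True
    except socket.gaierror:
        return False

# e.g. can_resolve("accounts.google.com") should be True on a healthy network
```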

Any movement on a OneDrive link for this instead of Google Drive?

SOLVED:

The issue was the port: another service was probably already using it, and changing the port solved my issue.


Hi @marcelveldt

Thanks for sharing this add-on; I was looking for something like this.

I have installed it, and this is my config:

{
  "log_level": "INFO",
  "log_file": "/backup/autobackup.log",
  "schedule": "0 4 * * *",
  "run_backup_at_startup": true,
  "backup_addons": {
    "enabled": true,
    "whitelist": [],
    "blacklist": []
  },
  "backup_share": {
    "enabled": true,
    "subfolders": [
      "zigbee2mqtt"
    ]
  },
  "auto_purge": 5,
  "google_drive": {
    "enabled": true,
    "auto_purge": 5,
    "backup_folder": "hassio_backups"
  }
}

I left it almost at the defaults, just adding a subfolder in the backup_share section and changing the auto_purge values.

When I click START, it turns red, the add-on does not start, and there is no log in the backup folder.

I’m on Hassio 0.93.2

What can I check?

Thanks

Why do you need so many permissions on Google Drive? There are apps that only require permission to create files and to view/edit/delete the files created by them, not ALL the files on Google Drive.

Thank you very much for the add-on.
Can I ask a stupid question?

Regarding the default backup profile: your instructions say to make your own and not use the default.

What if I just want exactly the same full backup as the normal hass.io backup menu?

I just want a file exported to /backup/ which I can copy out (via a cron script, daily), so that if my Hass.io install catches fire I can re-install it fresh and blank, then import the backup.

Is that how it's pre-configured?
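The daily copy-out step of that workflow could be sketched like this. The /backup path matches the hassio convention used in this thread; the `.tar` glob and destination are assumptions:

```python
# Hypothetical daily-export helper: copy the newest snapshot tar out of
# the backup folder to external storage, as the cron-script workflow above
# describes.
import shutil
from pathlib import Path
from typing import Optional

def copy_latest_snapshot(backup_dir: str, dest_dir: str) -> Optional[Path]:
    # Sort snapshots by modification time; the newest is last.
    snapshots = sorted(
        Path(backup_dir).glob("*.tar"),
        key=lambda p: p.stat().st_mtime,
    )
    if not snapshots:
        return None  # nothing to copy yet
    latest = snapshots[-1]
    target = Path(dest_dir) / latest.name
    shutil.copy2(latest, target)  # preserves timestamps
    return target
```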

I've installed it OK; however, it doesn't seem to be auto backing up.

That would be awesome. I will wait until that’s available.