Created new add-on to back up snapshots to Google Drive

Check the folder ID you gave the add-on during setup. It can only upload to that folder.

As for the name, I thought about looking into this, but given that I back up once per day, it's easy to figure out which backup aligns with which day by looking at the timestamp.

If you look in the /backup folder of your hassio install, you will see that the file names of the snapshots there are exactly the same as in your Google Drive.

I meant the file name as well. It is standard and there is nothing you can do to change it. The data hassio shows is embedded in the tar file itself.

Great idea! So far I've been using the folder_watcher component to tell when the config was updated and trigger a backup, then SyncToy (on Windows) to sync it to Drive, using an eventwatcher command to launch the sync.
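A rough sketch of the Home Assistant half of that setup, in case it helps anyone; the /config path, the whitelist entry, and the hassio.snapshot_full call are my assumptions, not tested config:

homeassistant:
  whitelist_external_dirs:
    - /config

folder_watcher:
  - folder: /config

automation:
  - alias: Snapshot on config change
    trigger:
      - platform: event
        event_type: folder_watcher
        event_data:
          event_type: modified
    action:
      # hassio.snapshot_full creates a full snapshot; SyncToy then picks
      # the new file up from the synced backup folder on the Windows side.
      - service: hassio.snapshot_full

SyncToy itself runs outside Home Assistant, so that half stays on Windows.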

This doesn't seem to be working for me. The log looks like it's working, but I don't see any backup files in my Drive folder. I currently have 5 .tar files in the /backup folder on my hassio system, so there are files to move.

Here are my configs.

Add-on config:

{
  "fromPattern": "/backup/*.tar",
  "backupDirID": "My folder ID",
  "purge": {
    "enabled": false,
    "preserve": 3
  },
  "purge_google": {
    "enabled": false,
    "preserve": 12
  },
  "debug": false
}

configuration.yaml:

rest_command:
  google_backup:
    url: 'http://localhost:8055/gb/doBackup'
    timeout: '300'

I run the service "rest_command.google_backup" in the dev tools. Here is my log after doing that:

GB_DEBUG = false
INFO:root:No local_settings to import
INFO:oauth2client.client:Refreshing access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.client:Refreshing access_token
WARNING:googleapiclient._helpers:build() takes at most 2 positional arguments (3 given)
INFO:googleapiclient.discovery:URL being requested: GET https://www.googleapis.com/discovery/v1/apis/drive/v3/rest
INFO:googleapiclient.discovery:URL being requested: GET https://www.googleapis.com/drive/v3/files?q=name%3D%27dbc6dcea.tar%27+and+%27My Folder ID%27+in+parents+and+trashed+%3D+false&spaces=drive&fields=files%28id%2C+name%29&alt=json
INFO:root:Backing up /backup/dbc6dcea.tar to My Folder ID
INFO:googleapiclient.discovery:URL being requested: POST https://www.googleapis.com/upload/drive/v3/files?alt=json&uploadType=resumable

It has been about an hour since I started the service and nothing has changed in my Drive folder.

Never mind. I was impatient; it works!

Thanks for the great add-on!

This is no longer working after upgrading to 0.91.2. It goes to the Google authorization page and lets me enter the key, but then nothing happens.

Thoughts?

Once you have done that, you need to call the service for it to actually do something.

I think you either posted the wrong link or you’re in the wrong thread. I don’t see anything about a Google Drive add-on in the link you shared.

Oops, sorry. Not sure how that got here; I was indeed trying to post in a different thread.

Invalid config for [automation]: required key not provided @ data['action']. Got None required key not provided @ data['trigger']. Got None. (See ?, line ?). Please check the docs at https://home-assistant.io/components/automation/

I get this when installing as a package.
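For reference, that error means an automation entry is missing its required trigger: and action: keys, which can happen when package indentation gets collapsed. A minimal well-formed automation in a package would look something like this (the alias and schedule are placeholders):

automation:
  - alias: Google Drive backup nightly
    trigger:
      - platform: time
        at: '03:00:00'
    action:
      - service: rest_command.google_backup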

Purge does remove the backups from /backup/, though the snapshots are still visible (0 MB in size) in the frontend. Any way to have those removed as well?

Click the refresh icon in the upper right hand corner.


First of all, great plugin. I am pretty new to Hass, but one of the first things to set up is a good backup.
I have implemented your script and tested it manually. That works.

In an automation the snapshots are created as well, but I cannot get the automation to do a scheduled upload.

In my configuration I have:

# Google Backup
rest_command:
  google_backup:
    url: 'http://localhost:8055/gb/doBackup'
    timeout: '300'

This starts successfully, but in the log files I get:
[08/May/2019 07:15:00] code 400, message Bad request version ('9Á\x8a¹,?óZî%è\x9fsÜü')
[08/May/2019 07:15:00] You're accessing the development server over HTTPS, but it only supports HTTP.

I have set up DuckDNS with Let's Encrypt, so my installation is configured for HTTPS.
But the URL to /gb/doBackup doesn't work over HTTPS?

What am I doing wrong? I want to get a successful backup, of course. :slight_smile:

@Clubeddie, you won't be able to call the backup service through HTTPS (sorry, I know that'd be preferable). Make sure you're really going through HTTP. You can test your REST command via Home Assistant's Services Development Tool. Once you get it working through that tool, it should be straightforward to incorporate it into an automation.
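For example, something along these lines (an untested sketch; the schedule and the 15-minute delay are guesses, the delay just gives the snapshot time to finish before the upload starts):

automation:
  - alias: Nightly snapshot and Drive upload
    trigger:
      - platform: time
        at: '03:00:00'
    action:
      # Create a full snapshot first...
      - service: hassio.snapshot_full
      # ...wait for it to finish writing (adjust to your hardware)...
      - delay: '00:15:00'
      # ...then ask the add-on to upload whatever is in /backup.
      - service: rest_command.google_backup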

Thanks for the quick reply!

For now I went with an easier solution: I made a task on my Synology (but a simple cron job from another instance is also possible) and I access the page through wget. Works perfectly:

wget -O - http://[myipadres]:8055/gb/doBackup >/shareforlogfile/backuptogdrive.log

If the task does not run for some reason, I get an e-mail from my Synology.
There are many roads that lead to Rome. :wink:
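The same wget call could presumably also be wrapped in a shell_command inside Home Assistant if you prefer to keep everything in one place. An untested sketch; the command name is just a placeholder:

shell_command:
  google_backup_wget: 'wget -O - http://localhost:8055/gb/doBackup'

You would then call shell_command.google_backup_wget from an automation.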

Hello!

Is there any way for me to call this add-on on the fly with a custom configuration, specifically for the path?

I'm using it for the snapshots and it works great. However, I have a folder inside /share that I would like to back up as soon as I drop a recording from my camera into it, which is triggered by an MQTT message that I receive.

The ad hoc solution doesn't work for my case because it seems I can't call it from within Home Assistant, and because it replaces all the files even if they already exist in Google Drive.

Alternatively, could you provide a flag to disable overwriting existing files when doing an ad hoc backup? If that flag were available, it would be possible to make it work by passing an entire directory to the call.

samccauley: Your add-on works great, thank you for that. There's only one thing: I'm using more than one hassio instance and couldn't find a way to manually set the target location in Google Drive, so every save goes to the same folder. Usually I keep the files of each instance in a separate folder; right now they are custom-named in the same one, but I'm too lazy to move them myself. :rofl: Is there a workaround?

@balazsmark, the hassio instances are not sharing the add-on config, are they? The add-on config includes the backupDirID setting, where you specify which of your Google Drive folders to copy backups to. You must already be familiar with that setting; without it the add-on won't work at all. As a reminder, the instructions for how to set the value are here in the readme. Simply set this value differently for each hassio instance.
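So, hypothetically, each instance's add-on config would just carry its own folder ID (the IDs below are placeholders, not real values).

Instance A:

{
  "fromPattern": "/backup/*.tar",
  "backupDirID": "folder-ID-for-instance-A"
}

Instance B:

{
  "fromPattern": "/backup/*.tar",
  "backupDirID": "folder-ID-for-instance-B"
}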