Check the folder ID you gave the add-on during setup. It can only upload to that folder.
As for the name, I thought about looking into this, but given that I back up once per day it's easy to figure out which backup aligns with which day by looking at the timestamp.
If you look in the /backup folder of your hassio install, you will see that the snapshot file names in that folder are exactly the same as in your Google Drive.
Great idea! So far I've been using the folder watcher component to tell when the config was updated, and trigger a backup, then SyncToy (on Windows) to sync it to Drive using an eventwatcher command to launch the sync.
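For anyone who wants to keep that event-driven piece entirely inside Home Assistant, something like this should work (a sketch, untested; it assumes folder_watcher is already configured to watch /config, and that the add-on's REST command is set up as rest_command.google_backup, as elsewhere in this thread):

# Sketch: trigger the Google Drive backup whenever folder_watcher
# reports a modified file. Assumes folder_watcher watches /config
# and rest_command.google_backup is defined per the readme.
automation:
  - alias: Backup config on change
    trigger:
      - platform: event
        event_type: folder_watcher
        event_data:
          event_type: modified
    action:
      - service: rest_command.google_backup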
This doesn't seem to be working for me. It looks like it's working, but I don't see any backup files in my Drive folder. I currently have 5 .tar files in my backup folder on my hassio system, so there are files to move.
I run the service "rest_command.google_backup" in the dev tools. Here is my log after doing that:
GB_DEBUG = false
INFO:root:No local_settings to import
INFO:oauth2client.client:Refreshing access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.client:Refreshing access_token
WARNING:googleapiclient._helpers:build() takes at most 2 positional arguments (3 given)
INFO:googleapiclient.discovery:URL being requested: GET https://www.googleapis.com/discovery/v1/apis/drive/v3/rest
INFO:googleapiclient.discovery:URL being requested: GET https://www.googleapis.com/drive/v3/files?q=name%3D%27dbc6dcea.tar%27+and+%27My Folder ID%27+in+parents+and+trashed+%3D+false&spaces=drive&fields=files%28id%2C+name%29&alt=json
INFO:root:Backing up /backup/dbc6dcea.tar to My Folder ID
INFO:googleapiclient.discovery:URL being requested: POST https://www.googleapis.com/upload/drive/v3/files?alt=json&uploadType=resumable
It has been about an hour since I started the service and nothing has changed in my Drive folder.
Invalid config for [automation]: required key not provided @ data['action']. Got None required key not provided @ data['trigger']. Got None. (See ?, line ?). Please check the docs at https://home-assistant.io/components/automation/
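(For anyone else hitting this error: it means the automation entry is missing its required trigger: and action: keys, which is usually an indentation problem. A minimal valid automation has this shape; the trigger and action values below are placeholders, adapt as needed:)

# Minimal valid automation shape; trigger and action here are
# placeholders, swap in your own.
automation:
  - alias: example
    trigger:
      - platform: homeassistant
        event: start
    action:
      - service: persistent_notification.create
        data:
          message: 'example automation ran'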
Purge does remove the backups from /backup/, though the snapshots are still visible (0 MB in size) in the frontend. Any way to have those removed as well?
First of all, great plugin. I am pretty new to Hass, but one of the first things to get right is a good backup.
I have implemented your script and tested it manually. That works.
In automation the snapshots are created as well, but I cannot get an automation to do a scheduled upload.
In my configuration I have:
# Google Backup
rest_command:
  google_backup:
    url: 'http://localhost:8055/gb/doBackup'
    timeout: '300'
This starts successfully, but in the log files I get:
[08/May/2019 07:15:00] code 400, message Bad request version ('<garbled TLS handshake bytes>')
[08/May/2019 07:15:00] You're accessing the development server over HTTPS, but it only supports HTTP.
I have set up DuckDNS with Let's Encrypt, so my installation is served over HTTPS.
But the URL to /gb/doBackup doesn't work over HTTPS?
What am I doing wrong? I want to get a successful backup, of course.
@Clubeddie, you won't be able to call the backup service through your HTTPS endpoint (sorry, I know that'd be preferable). Make sure you're really going through HTTP. You can test your REST command via Home Assistant's Services Development Tool. Once you get it working through that tool, it should be straightforward to incorporate it into an automation.
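Once it runs cleanly from the Services tool, an automation along these lines should handle the scheduled upload (a sketch, assuming the rest_command from the post above):

# Sketch: nightly upload via the REST command defined earlier.
automation:
  - alias: Scheduled Google Drive backup
    trigger:
      - platform: time
        at: '03:00:00'
    action:
      - service: rest_command.google_backup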
For now I went with an easier solution: I made a task on my Synology (but a simple cronjob from another machine would also work) and I access the page through wget… works perfectly.
Is there any way for me to call this add-on on the fly with a custom configuration, specifically for the path?
I'm using it for the snapshots and it works great. However, I have a folder inside /share that I would like to back up as soon as I drop a recording from my camera into it, which is triggered by an MQTT message that I receive.
The adhoc solution doesn't work for my case because it seems I can't call it from within Home Assistant, and because it also replaces all the files even if they already exist in Google Drive.
Alternatively, could you provide a flag to disable overwriting existing files when doing adhoc? If that flag were available, it would be possible to make this work by passing an entire directory to the call.
samccauley: Your add-on works great, thank you for that. There's only one thing: I'm using more than one hassio and couldn't find a way to manually set the target location in Google Drive, so all the saves go to the same folder. Usually I keep the files for each instance in a separate folder; right now they are just custom-named within the same one, and I'm too lazy to move them myself. :) Is there a workaround?
@balazsmark, the hassios are not sharing the add-on config, are they? The add-on config includes the backupDirID setting, where you specify which of your Google Drive folders to copy backups to. You must be familiar with that setting already; without it the add-on won't work at all. As a reminder, the instructions for setting the value are in the readme. Simply set this value to a different folder ID for each hassio.
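In other words, each instance's add-on configuration gets its own Drive folder ID. A sketch (the folder IDs below are placeholders; the exact options format follows the readme):

# hassio instance 1, add-on config:
backupDirID: 'folder-id-for-instance-1'
# hassio instance 2, add-on config:
backupDirID: 'folder-id-for-instance-2'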