Add-on: Home Assistant Google Drive Backup

This add-on works perfectly :smiley: But I noticed that the file size of the snapshots is much smaller than the files created manually from within Home Assistant. Why is this?
The one created manually is around 450 MB, and the one created by this add-on is around 45 MB.

It uses the same mechanism under the hood to make the snapshots in both situations, so most likely either you’re making partial snapshots with the addon (you would have had to configure this) or for some reason your snapshots got a lot smaller all of a sudden.

Hello.
I’ve been using this great add-on for months.
For the past couple of weeks the upload to Gdrive hasn’t been working anymore.

From the log I get the following:

Traceback (most recent call last):
  File "/app/backup/coordinator.py", line 109, in _sync
    self._buildModel().sync(self._time.now())
  File "/app/backup/model.py", line 127, in sync
    self._syncSnapshots([self.source, self.dest])
  File "/app/backup/model.py", line 205, in _syncSnapshots
    from_source: Dict[str, AbstractSnapshot] = source.get()
  File "/app/backup/drivesource.py", line 86, in get
    self._info.drive_folder_id = self.getFolderId()
  File "/app/backup/drivesource.py", line 79, in getFolderId
    return self._getParentFolderId()
  File "/app/backup/drivesource.py", line 204, in _getParentFolderId
    self._folderId = self._validateFolderId()
  File "/app/backup/drivesource.py", line 230, in _validateFolderId
    return self._findDriveFolder()
  File "/app/backup/drivesource.py", line 238, in _findDriveFolder
    for child in self.drivebackend.query("mimeType='" + FOLDER_MIME_TYPE + "'"):
  File "/app/backup/driverequests.py", line 173, in query
    response = self.retryRequest("GET", URL_FILES + "?" + urlencode(q), is_json=True)
  File "/app/backup/driverequests.py", line 304, in retryRequest
    send_headers = self._getHeaders(refresh=refresh_token)
  File "/app/backup/driverequests.py", line 80, in _getHeaders
    "Authorization": "Bearer " + self.getToken(refresh=refresh),
  File "/app/backup/driverequests.py", line 140, in getToken
    resp = self.retryRequest("POST", URL_AUTH, is_json=True, data=data, auth_headers=self._getAuthHeaders(), cred_retry=False)
  File "/app/backup/driverequests.py", line 325, in retryRequest
    raise e
  File "/app/backup/driverequests.py", line 312, in retryRequest
    response = self._request_client.request(method, url, headers=send_headers, json=json, timeout=self.config.get(Setting.GOOGLE_DRIVE_TIMEOUT_SECONDS), data=data, stream=stream)
  File "/usr/lib/python3.8/site-packages/requests/api.py", line 60, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/lib/python3.8/site-packages/requests/sessions.py", line 533, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/lib/python3.8/site-packages/requests/sessions.py", line 646, in send
    r = adapter.send(request, **kwargs)
  File "/usr/lib/python3.8/site-packages/requests/adapters.py", line 514, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='www.googleapis.com', port=443): Max retries exceeded with url: /oauth2/v4/token (Caused by SSLError(SSLError(1, '[SSL: WRONG_VERSION_NUMBER] wrong version number (_ssl.c:1108)')))

Any suggestions on how to solve this?
Thanks

Never seen that one before. This would imply that googleapis.com (the host for Google Drive’s public API) is not serving you valid HTTPS. I’d try the following:

  • Restart the addon or the host machine, if you haven’t already. This will actually solve most problems.
  • Make sure you don’t have “Google Drive IP Address Override” specified in your addon settings, or if you do, make sure it is actually Google Drive’s IP.
  • If you’re using a non-standard DNS configuration on your home network (e.g. Pi-hole, your own DNS server, etc.) try disabling it and restarting the addon to see if that’s the problem. A quick way to test the connection directly is shown just below.
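For example, this little script (purely a diagnostic sketch, not part of the addon) checks whether a machine on your network can complete a normal TLS handshake with www.googleapis.com. A “WRONG_VERSION_NUMBER” error usually means whatever is answering on port 443 isn’t actually speaking TLS, e.g. a proxy, a captive portal, or a DNS override pointing at a plain-HTTP server:

    import socket
    import ssl

    # Diagnostic sketch: confirm we can resolve Google's API host and
    # complete a TLS handshake with whatever answers on port 443.
    host = "www.googleapis.com"
    addr = socket.gethostbyname(host)
    print(host, "resolves to", addr)

    context = ssl.create_default_context()
    try:
        with socket.create_connection((addr, 443), timeout=10) as sock:
            with context.wrap_socket(sock, server_hostname=host) as tls:
                print("TLS handshake OK, negotiated", tls.version())
    except ssl.SSLError as err:
        print("TLS handshake failed:", err)

If that fails on the same network the addon runs on, the problem is in your network path to Google, not in the addon.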

If you keep seeing problems I’d recommend creating an issue on the project’s GitHub Issue tracker so we can figure it out there.

I have tried all the following:

  • restart the add-on
  • restart the raspi
  • verify I do not have the “Google Drive IP Address Override” specified
  • try to specify Google Drive’s IP instead of leaving it blank
  • disable Pi-hole
  • connect to a different Google account

None of them (or various combinations of them) worked.

The strange thing is that until a couple of weeks ago everything worked perfectly.

Did a binary sensor get removed from this? I used to have a sensor on my home screen that showed me stats about the latest backup etc., but it seems to have disappeared now, which is a shame.

There haven’t been any (intentional) changes to the binary sensor. Sometimes it can take a few minutes after Home Assistant is restarted for the sensor to show up, because of how Home Assistant works. You might just be seeing that, but if not, open a bug and I can dig into it with you there.

Thanks, it seems to be back now so not sure what happened there!

Does this keep its “own set” of snapshots in addition to manual ones? I want to be 100% clear so I don’t delete something unwanted.

What I want to do: let the add-on back up every week and keep only 10 backups on the drive. I don’t want the add-on to make any local backups, and I want the local backups I create myself to never get deleted by the add-on.

No, it tells HA to create the backup (a normal one, similar to when you do it manually) and then uploads it to Google Drive.

The backups it creates are saved locally first before being uploaded. As far as I can tell, the minimum number of snapshots the add-on can leave locally is 1.

The add-on will only make file changes to the backups it creates, so any manual backups are untouched.

Thanks, so just setting it to do 1 local backup and 10 Google backups every 7 days should work then, and I will make my manual backups “on the side”.

@ATWindsor
What @sparkydave says is correct, except that the addon will upload and clean up (i.e. delete) snapshots you create manually too. As far as the addon is concerned, there is no difference between the snapshots it creates and the ones you create manually.

Here is what I think you could do to accomplish what you described. When you first run the addon, open the settings before connecting Google Drive and:

  • Set “Snapshots in Home Assistant” to 1
  • Set “Snapshots in Google Drive” to 10
  • Set “Days Between Snapshots” to 7
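Those three settings correspond to option keys in the addon’s raw configuration too, if you’d rather edit that directly instead of using the settings UI. The key names below are from memory, so double-check them against the addon’s documentation before pasting anything (they map, in order, to the three settings above):

    {
        "max_snapshots_in_hassio": 1,
        "max_snapshots_in_google_drive": 10,
        "days_between_snapshots": 7
    }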

Then when you want to create a snapshot manually, either:

  • Do it from within the addon (ACTIONS > Snapshot Now) and select the option “Keep Indefinitely in Home Assistant”
    or
  • Create the snapshot through Home Assistant (like you do now) and after the snapshot is created, open the addon and select ACTION > Never Delete on the newly created snapshot. If you forget to do this the snapshot will get deleted from Home Assistant when a new one is created on your 7 day schedule.

Just like it sounds, “Never Delete”/“Keep Indefinitely” will keep a snapshot around forever until you delete it manually. Those snapshots also won’t count toward the snapshots kept in Home Assistant (1 in your case).

Oops, bad advice from me. Was this always the case? Maybe I’m thinking of the old Google Drive add-on that this one became far superior to.

Snapshot time of day question?
I have my add-on set to 12:00 but the snapshots are taken later and later every day.
Yesterday’s snapshot was at 12:53, the day before at 12:52, and the day before that at 12:51, so it adds 1 minute every day.
Why doesn’t it take the snapshot at the specified time?

This may well be answered, although searching didn’t turn up results.
Is there a way to ignore certain files, i.e. the database file and log files, as you can with .gitignore?
Seems that the snapshots can get pretty large if you include the .db and .log files.

Any way we can use our own API information for Google?

Edit: My bad, I can’t read. Thank you for allowing us to use our own!

@CBPetrovic You can make a partial snapshot that ignores one of the 4 main folders (ssl, config, share, local addons) but you can’t exclude specific files in those folders. What I’ve seen most people do is either:

  • Keep all the files they want to ignore in “share” and then exclude that folder in their partial snapshots (there’s a rough scripted example of this below).
  • Use an addon for the database (like MariaDB or InfluxDB) and then exclude that addon from the snapshot (assuming you just want to exclude the DB).
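If you ever want to script that first approach instead of clicking through the UI, here’s a rough sketch. It assumes the hassio.snapshot_partial service (present on supervised installs) and a long-lived access token, and the folder identifiers are what I believe the Supervisor expects, so verify them against the docs before relying on this:

    import requests

    # Placeholders: use your own Home Assistant URL and a long-lived token.
    HA_URL = "http://homeassistant.local:8123"
    TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"

    # Ask Home Assistant for a partial snapshot that includes everything
    # except the "share" folder, where the files you want skipped would live.
    resp = requests.post(
        f"{HA_URL}/api/services/hassio/snapshot_partial",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={
            "name": "partial_without_share",
            "folders": ["homeassistant", "ssl", "addons/local"],
        },
        timeout=30,
    )
    resp.raise_for_status()
    print("Partial snapshot requested")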

I think that is the time that the snapshot is completed. If you are including the database and don’t have daily purging of the database, then that file will be getting bigger each day, making the snapshot take longer to complete.
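If purging does turn out to be the issue, one way to keep the database trimmed is to call the recorder.purge service on a schedule (from cron, a script, or an automation). This is just a sketch using the same REST pattern as the earlier example, with placeholder URL and token:

    import requests

    # Placeholders: point these at your own instance and token.
    HA_URL = "http://homeassistant.local:8123"
    TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"

    # Trim the recorder database to the last 7 days and repack the file so
    # the .db (and therefore the snapshot) stops growing day after day.
    requests.post(
        f"{HA_URL}/api/services/recorder/purge",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"keep_days": 7, "repack": True},
        timeout=60,
    ).raise_for_status()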

No, it does not take 50 minutes to complete. The snapshot is only 80 MB and takes under 30 seconds to snapshot AND upload if I do it manually.

So, I moved the .db file into its own folder and excluded that.
Works fine; it reduced the size of the backup file by almost half.

Thanks for the tip/hint