Add-on: Home Assistant Google Drive Backup

Ugh, that’s a tricky one. This solution won’t be clean, and it will definitely require some work on your part to glue it all together to your liking, but I think it could be functional, and you come across to me as the kind of person who might like to hack something together. I kinda went off the deep end figuring this out, but here is my best crack at a solution.

Home Assistant doesn’t expose an API for inspecting or deleting backups (AFAIK, someone correct me if I’m wrong), and moreover you’d want to know about the status of backups in Google Drive too. The easiest way I can think of to get that data is to query this addon’s webserver directly. I never intended it to be used that way, but there isn’t any reason you can’t, because behind the scenes all of Home Assistant’s docker containers share an internal network, including Home Assistant itself. I write this with the caveat that this is the API the addon’s web interface uses to get its data, so it may change in the future without notice for new features or maintenance. I don’t go out of my way to muck around with it or anything, I just sometimes need to make changes to make things work.

So you’d do the following:

  1. Create a sensor that pulls the state of all your backups into the attributes of an entity in Home Assistant:
    In your configuration.yaml:

    sensor:
      - platform: rest
        resource: http://cebe7a76-hassio-google-drive-backup:8099/getstatus
        name: Backup Info
        value_template: "unused"
        json_attributes: ["backups"]
        unique_id: c7b9fa82-1e82-4be5-9e66-5d068d617501
    

    Restart Home Assistant, or just reload the rest entities. You should then be able to see a list of all your backups in the backups attribute of sensor.backup_info. The format should be self-evident, if verbose. The one downside of this is that the data won’t list what folders and addons are in a backup stored only in Google Drive, because the backup would need to be read to determine that (which would be very expensive and slow). It sounds like for your use case that won’t be a problem, because you’ll always have anything stored in Google Drive also stored in Home Assistant. This could also potentially use a lot of space in your sensor database, especially if your backup list is large or you poll frequently, so you may want to exclude it from the database or only update it manually when needed using the homeassistant.update_entity service. There is other data you could pull from the /getstatus request too (last backup timestamp, settings, etc), but I doubt it would be useful for your use case.
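    If you do want to keep that attribute data out of the database, a rough sketch (untested, adjust to your setup) is to exclude the sensor from the recorder and refresh it on your own schedule. You’d probably also want to raise the rest sensor’s scan_interval way up so the automation is the only thing refreshing it:

```yaml
# Keep the (potentially large) backups attribute out of the recorder database
recorder:
  exclude:
    entities:
      - sensor.backup_info

# Refresh the sensor on demand instead of relying on frequent polling
automation:
  - alias: "Refresh backup info nightly"
    trigger:
      - platform: time
        at: "03:30:00"
    action:
      - service: homeassistant.update_entity
        target:
          entity_id: sensor.backup_info
```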

  2. Also in your configuration.yaml, create a rest command that lets you tell the addon to delete a backup from Google Drive or Home Assistant (or both):

    rest_command:
      delete_backup:
        url: http://cebe7a76-hassio-google-drive-backup:8099/deleteSnapshot
        method: POST
        headers:
          accept: "application/json, text/html"
        payload: '{"slug": "{{ slug }}","sources": ["{{ sources | join("\", \"") }}"]}'
        content_type: "application/json; charset=utf-8"
    

    Restart Home Assistant for it to register the rest command. You’d call it with something like this:

    service: rest_command.delete_backup
    data:
      slug: 61a4639f
      sources:
        - GoogleDrive
        - HomeAssistant
    

    It takes some parameters in the data field to indicate where to delete:

    • slug: This is how Home Assistant identifies backups; each one has a unique slug. You should be able to see the slug listed as part of the data for each backup in your new sensor.backup_info.
    • sources: Where to delete the backups from; you can include both places or just one depending on your needs.

So that gives you a way to see what your backups are and delete them. From there you’d have to make a script or automation in Home Assistant that inspects the backups and chooses which ones to delete. You’d probably want to use a python script for this, because I imagine it’s going to be quite complicated, and you can only do so much with templates/automations before it turns into a mess. You could use templates and automations, I suppose. Just my intuition.
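To sketch the idea, here’s the kind of selection logic such a script might contain. This is a rough, untested example: the dicts mimic the shape of the backups attribute from the rest sensor above, and the field names ("slug", "date") and ISO date format are assumptions you should verify against your own sensor.backup_info data.

```python
from datetime import datetime

def backups_to_delete(backups, keep=4):
    """Return the slugs of all but the newest `keep` backups."""
    # Dates are assumed to be ISO 8601 strings, e.g. "2023-06-01T03:00:00+00:00"
    ordered = sorted(backups, key=lambda b: datetime.fromisoformat(b["date"]),
                     reverse=True)  # newest first
    return [b["slug"] for b in ordered[keep:]]

# Example: with keep=2, the two oldest of these four get flagged for deletion
example = [
    {"slug": "61a4639f", "date": "2023-06-01T03:00:00+00:00"},
    {"slug": "aa11bb22", "date": "2023-05-25T03:00:00+00:00"},
    {"slug": "cc33dd44", "date": "2023-05-18T03:00:00+00:00"},
    {"slug": "ee55ff66", "date": "2023-05-11T03:00:00+00:00"},
]
print(backups_to_delete(example, keep=2))  # ['cc33dd44', 'ee55ff66']
```

Each slug that comes back would then get passed to the rest_command.delete_backup service.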
Keep in mind that the information you get from the sensor can be stale, depending on when you polled it and for a few seconds after changes while the addon resyncs, so you might need to account for that. If you’re intending to manage the deletion of backups, you’ll probably also want to set “Backups in Home Assistant” and “Backups in Google Drive” both to zero in the addon so it doesn’t try to compete with you for deletions.

Most important of all: While you’re developing code for this keep a good full backup safe somewhere else, you’re very likely to write a bug while testing things that deletes everything at some point. I speak from painful experience.


Thank you for the elaboration.
It seems pretty complex though.
Initially, I was thinking about something like grouping files using regexp and then deleting older files group by group. Seems it would fulfill my needs.

I have to read your way a few times more to better understand the flow.

Thank you.

Hi!
Is there any service available to launch a full Google Drive Backup?
Thanks!

I expect that the addon does not have its own service to manage that, but that it uses the core service: hassio.backup_full


@piggyback is correct, if you call the hassio.backup_full (or hassio.backup_partial) service the addon will notice the new backup when it’s done and upload it to Google Drive. This addon doesn’t actually create its own backups, it just asks Home Assistant to make them, so this is basically equivalent to triggering a new backup from within the addon’s web-ui.
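So a scheduled backup is just a call to that core service from an automation; a minimal sketch (the name template is only an illustration):

```yaml
automation:
  - alias: "Nightly full backup"
    trigger:
      - platform: time
        at: "01:00:00"
    action:
      # The addon will notice the finished backup and upload it to Google Drive
      - service: hassio.backup_full
        data:
          name: "Nightly {{ now().strftime('%Y-%m-%d') }}"
```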


Need some help. My backups are not being named. All I see in my Google Drive is 4 files with the name of just “.tar”. The backups have different sizes and dates, so it appears that everything is working except for the naming. Doing a manual backup works perfectly and names the file as it should. I have not changed any settings. All is default from the time of install. Can’t seem to figure out how to fix this. Any help is appreciated.

Do you have your backups configured to happen at 2 AM? I had some people run into a similar issue once and we never got to the bottom of it.

Backup is scheduled for 3am. Just changed it to 11pm and will run tonight. I will report back tomorrow. Thank you for all you have done on this addon. Really appreciate your hard work.

This worked. Full backup completed last night and named properly. Must not like 3am for some reason either.

Hi All,

I think I know the answer to this but as I’m struggling to find a definitive answer and as I’m a bit nervous I thought it would not hurt to ask…

I have a Google account with 100GB of space and I was thinking of backing up all my family’s HA installs and even some friends’. Currently I just back up my own instance and have all my photos backed up too.

I assume that by design there is no way any savvy person on any of the installs can see anybody else’s files or my pictures, even if they can access the root files?

I understand they will see the registered email address and remaining space, anything else?

I assume that after the connection is made between HA & Google there is no way the details can be used again for any other purpose?

There are a number of ways to interpret your questions, so what I’ll try to answer is:

  1. When the addon authenticates with Google what permissions does it have?
  2. If multiple addons back up to the same account, what can they see?

For reference, most of this is also explained here. When the addon authenticates with Google Drive it requests the drive.file permission scope, which is the narrowest permission something that puts files in Google Drive can have. It lets the addon see:

  1. Basic account info (I think that’s just the email, but it might also be the name you gave Google).
  2. High level Drive account metadata (space free, space used, space used by trash, etc)
  3. Read/write access to any files created under this scope.

The credentials specific to an installation are only stored locally on disk in a json file (eg on your home assistant machine). The ramifications of this are:

  1. A tech savvy person with access to one of those json credential files (eg access to the Home Assistant file system) and knowledge of Google Drive’s API has permission to see/modify any files (eg backups) created by the addon, but no other files in your Google Drive. The credential file is kind of like a password that only works for the files created with that password.
  2. Moreover, if you have multiple instances of the addon uploading to the same Google Drive account, the credentials for one can be used to see the backups of other instances. This is because of how Google handles permissions for the drive.file scope: it’s scoped on a per-{app, user} basis, where “app” in this case means the addon.

If you’d like to avoid #2 (shared permissions between instances on the same account) the only way is to follow these instructions to create your own credentials with Google, ensuring that you create a separate Google project for each instance you want to authenticate.


Many thanks for your reply. I’ll read the links in detail, thanks for pointing me to them.

I think I understand you …

So even though I have separate folders defined for multiple instances to back up to the fact I use the same Google Project (which happens by default) means they are all linked?

So following this correctly means I’m completely safe? I’ve used Google API’s and Projects etc before so that makes sense.

Happy I asked this now and did not assume.

It would mean one instance couldn’t see anything from the other or your other files, despite being on the same account. So yes, safe, if that’s what you mean.


Given 2023.6’s native NAS support, does your add-on apply the generational settings to local/NAS HA backups or only to those stored in Google Drive? If it’s currently Google Drive only, I’d love to see an option to apply generational settings to local backups too.

Right now the addon is network-share unaware, and thankfully the mount integration in HA is pretty friendly about backward compatibility, so everything gets stored on local disk, even if you have backups configured to default to a share. In other words, it works just like it always did.

I’m working on exposing network share options in the addon. My current plan is to:

  1. Use the network share you have configured for backups by default.
  2. Let you override the share the addon creates backups on.

It looks to me like the API in the new version already exposes enough information to do so.

For generational backups specifically: from the addon’s perspective, backups on a share look identical to those on disk; HA doesn’t differentiate between them in the API (afaict). So the addon will work the same as it does today, applying the generational settings to backups stored on the NAS as if they were on disk.

Ok sounds good! Look forward to a future update!

Am I to understand that the backup is still created on the local device, even if I have it set to a network share?

In other words, does the new 2023.6 functionality work the same way as this add-on, in that the file from the local storage device is simply copied to the share rather than created on the remote share?

I’m a big fan of this add-on and have been using it for a long time. After installing Home Assistant 2023.6.1, however, it has stopped working.


I tried re-syncing and also deselecting the ‘Manually specify the backup folder’ option.
The folder is accessible using the supplied link.
I am at a bit of a loss now. Can anyone help?
Thanks

I’d reauthorize the addon with Google Drive (Actions > Reauthorize Google Drive). If you have multiple google accounts, make sure you sign in with the one you intend to use for backups.

The error means Google Drive denied access to the folder or account. Unfortunately Google doesn’t explain why access was denied, it just says “no”, so it’s difficult to know why this happened. There are a million little reasons Google might shut off access, including if it suspects some part of your account has been compromised, which they’ve tended to be a bit overzealous about lately.

If that still doesn’t work I’d reinstall the addon. This won’t delete anything in Google Drive and it should find your previous backups once you reconnect it. You’ll need to copy your settings from the old one to the new one though.

If that still doesn’t work there is most likely something wrong with your Google account. Log in to https://account.google.com and make sure you don’t have any issues it warns you about.

@CaptTom
I didn’t implement the share feature in HA so I can’t speak confidently, but it definitely needs some local space to store parts of the backup before it creates the final archive on the share. Sometimes that can be as large as or larger than the final backup.

My comments above are about how I’m going to make this addon integrate with shares. Given the way they added it to the API, the addon works “just like it used to” right now, creating backups locally, because it doesn’t specify which share the backup should be created on.