I have a curious situation; I wonder if anyone else has come across it.
HA core-2021.12.9 on an Intel NUC.
I only recently discovered this addon and installed it yesterday. I did a Full Backup after deleting any previous backups, snapshots and other stuff sitting in the file system that might bloat a baseline full backup. That backup came across to Google Drive at almost 30GB! Hmmm…
I downloaded it back to my PC and unpacked it; the data payload is a shade under 3GB. Something in there is padding out 27+GB with empty space.
I used to have the motioneye addon running for my CCTV system but hadn't used it in a while. I had already cleared out all my CCTV video and images, but when I unpacked the full backup, that addon's .tar file had a size of 0 bytes. That piqued my interest.
I fired up the Glances addon and looked at my File Sys size: 47GB and change, which isn't anywhere close to what I'm using. I deleted the motioneye addon and, behold, the File Sys size dropped to 16GB.
Something in the motioneye addon was holding a stack of empty space without actually reporting it as used space. Bear that in mind if you are seeing massive backup file sizes that you do not expect.
Hi!
I have a question: when I restore a backup, none of my addons get installed. Is that correct? I did a full backup.
Thanks in advance!
BR,
Sebastian
Yes, it should restore everything. Unfortunately, restoring some parts can fail and Home Assistant doesn't always surface the error. You can check the supervisor logs after an attempted restore to see if anything failed (Configuration > Supervisor > System in the Home Assistant UI). You can still restore just the add-ons, or only some of the add-ons, from your full backup by telling Home Assistant to restore it as a "Partial Backup". Checking the supervisor logs for errors is the first thing I do after restoring a snapshot for this very reason; very often one or more add-ons will fail to restore, and it's only sometimes actually my fault.
Love the addon, feels good to have a backup in the cloud
Anybody having an issue with MariaDB? Every time there's a backup, MariaDB locks, which is to be expected, but then disk storage starts rising, sensors stop reacting and flatline, and I need to reboot the core to get everything working again.
I have an automation to reboot 30 mins after the backup finishes, but it kind of feels like a workaround… (sketch below)
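For reference, here's roughly what that workaround automation looks like. This is a minimal sketch: the sensor.snapshot_backup entity and its 'backed_up' state are assumptions on my part, so check your own entity and its states in Developer Tools first.

```yaml
# Workaround sketch: restart Home Assistant core 30 minutes after a
# backup completes. sensor.snapshot_backup and the 'backed_up' state
# are assumptions; verify both in Developer Tools before using this.
automation:
  - alias: "Restart core 30 min after backup completes"
    trigger:
      - platform: state
        entity_id: sensor.snapshot_backup
        to: "backed_up"
        for: "00:30:00"
    action:
      - service: homeassistant.restart
```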
Are you including the DB in the backup? I also use MariaDB, but I don't include the MariaDB add-on in my backups. I haven't had a single corruption yet.
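In case it helps, excluding it looks roughly like this in the addon's options. The option names and the core_mariadb slug are from memory of the addon's docs, so treat them as assumptions and verify them against the addon's configuration tab and your own add-on slugs:

```yaml
# Addon options sketch: skip the MariaDB add-on (and optionally some
# folders) when creating backups. 'exclude_addons' takes add-on slugs;
# find yours on each add-on's info page. Option names are assumptions.
exclude_addons: "core_mariadb"
exclude_folders: "media,share"
```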
I don't understand how to write the 'last backup' template correctly. I've been reading the documentation here and I don't really understand what needs to be specified:
Template variable warning: 'default' is undefined when rendering '{{ as_timestamp(state_attr("sensor.snapshot_backup", "last_snapshot")) | timestamp_custom("%d.%m.%Y %H:%M", default) }}'
Template warning: 'as_timestamp' got invalid input 'None' when rendering template '{{ as_timestamp(strptime(state_attr("sensor.snapshot_backup","last_snapshot"), '%d.%m.%Y %H:%M' )) | timestamp_custom("%d.%m.%Y %H:%M") }}' but no default was specified. Currently 'as_timestamp' will return 'None', however this template will fail to render in Home Assistant core 2022.1
Template warning: 'strptime' got invalid input 'None' when rendering template '{{ as_timestamp(strptime(state_attr("sensor.snapshot_backup","last_snapshot"), '%d.%m.%Y %H:%M' )) | timestamp_custom("%d.%m.%Y %H:%M") }}' but no default was specified. Currently 'strptime' will return 'None', however this template will fail to render in Home Assistant core 2022.1
as_datetime converts the date string to a datetime object. as_timestamp converts the datetime object to a UNIX timestamp (the number of seconds since the epoch). timestamp_custom converts the UNIX timestamp to your preferred representation.
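Putting that chain together, a working version of your template could look like the sketch below. It assumes (based on your warnings) that the addon exposes sensor.snapshot_backup with a last_snapshot attribute holding a parseable timestamp; the second argument to as_timestamp is the default the warnings are asking for:

```yaml
# Template sensor (2021.12-style) for the time of the last backup.
# as_timestamp's second argument is the default, which keeps the
# template from failing before the first backup has run.
sensor:
  - platform: template
    sensors:
      last_backup:
        friendly_name: "Last backup"
        value_template: >-
          {{ as_timestamp(state_attr('sensor.snapshot_backup', 'last_snapshot'), 0)
             | timestamp_custom('%d.%m.%Y %H:%M') }}
```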
By accident I found out that my Hass.io Google Drive Backup got stuck at version 0.100.0. When setting up a new Home Assistant machine on an RPi 4, I saw that version 0.105.2 was installed.
In version 0.100.0 there is an 'auto update' switch; in V0.105.2 there isn't…
And in V0.105.0 it's called Home Assistant Google Drive Backup, no longer Hass.io…
This auto update switch is turned on in my current HA, but clearly it doesn't update. Now of course I don't want to uninstall it, because I'm afraid of losing everything, including my configuration.
What is the correct way to update my current V0.100.0 to the latest version 0.105.2?
Edit:
After switching this add-on off, an update button appeared after a while.
The strange thing now is that one machine has an 'auto update' switch and the other doesn't, and both machines are on version 0.105.2…
I've had some users report this in the past but haven't been able to figure out why. Home Assistant is responsible for determining when a new version of the addon is available (by checking GitHub), but sometimes it just doesn't notice. If you see it again, it might be useful to check the supervisor logs for errors. The auto update switch is also something that Home Assistant manages on its own. If you're able to provide any other information about why it sometimes doesn't appear, I'd like to know.
Just wanted to know, what is your typical backup file size? Currently, a full backup of mine is around 400 MB. It's not a big deal for me because I have the space on both the HA RPi (running a 500 GB SSD) and my 2 TB Google Drive. BUT, when the backup runs, it uses resources on the RPi for a good 6 to 7 minutes (about 35% CPU usage, up from a standard 4 to 6%) AND my memory swap usage also jumps up by about 1.5% each time the backup runs. Now, if a 400 MB backup file size is on the large side, I can definitely do a partial backup. My biggest addon is probably MariaDB, as my database is 1.2 GB and growing. Does this backup also include the database somehow? I would assume not, since then my backup would be more than 1 GB, unless it does compression? I do NOT want it to back up the database if that is the case.
The backup is compressed, and if you're doing a full backup then it includes the database. Depending on what your sensors are like, it can be normal to see a very high compression ratio for the database.
A lot of people get concerned when usage spikes during a backup, but… is it that bad for some reason? I schedule mine for the middle of the night and it goes unnoticed (see the options sketch below).
The backup on my primary machine is about 2GB compressed, though that's mostly because I have a lot of frequently updating sensors, which bloats the database. Based on what I've seen users say, I think it's most common for backups to be under 1GB.
I suspect that the high CPU and memory usage on the Raspberry Pi doesn't have as much of an impact on performance as saturating the disk write bandwidth. I've noticed that the HA UI gets very clunky when something is heavy on disk writes, though I don't know why.
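For the scheduling I mentioned above, the relevant addon options look roughly like this. This is a sketch from memory of the addon's docs, so verify the option names in the addon's configuration tab:

```yaml
# Addon options sketch: one backup per day, pinned to a quiet hour so
# the CPU/disk spike goes unnoticed. Option names are assumptions.
days_between_snapshots: 1
snapshot_time_of_day: "03:00"
```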
You're right, it doesn't really concern me, and you've made me worry about it even less.
All is good now.
Maybe as a future feature (you've probably already thought of this): you could offer a couple of options for database addons like MariaDB. For example, people who have large DBs might want to back up the addon's configuration but not necessarily the DB itself. Like you, I have quite a few sensors reporting; in, say, a year from now, using an SSD, I can easily see the DB being 30GB… probably a bit much for my Pi to compress.