The addon doesn’t keep any temporary files (except a few kB of configuration). Home Assistant actually creates the backups, though the temporary files it creates shouldn’t persist longer than a few hours after a backup is finished. It’s difficult to provide more guidance without knowing exactly what your machine is doing; a lot of things can cause disk usage to blow up. Here are some things that might help:
Try restarting the machine (don’t restart HA, reboot the whole machine). If disk usage doesn’t drop, it’s unlikely to be a problem with temporary files.
If you have something that keeps using more space (another addon, the sensor database, etc.) then you should see your backups get larger over time. You can click on a backup in the web UI to see what’s taking up the space.
Make sure you aren’t keeping more backups around in Home Assistant than you’re able to fit on your hard drive. The increase in the graph you’re showing could be explained by just keeping more backups over time.
It looks like you might be keeping a very long sensor history (2 weeks or more, based on the graph). If you have a lot of sensors (or some sensors that update very frequently) it’s pretty easy for the database to grow out of control. You can limit the size of the history by configuring the recorder component, especially the purge_keep_days option, as in the example below.
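For example, a minimal recorder block in configuration.yaml might look like this (3 days is just an illustration; pick a window that suits you):

```yaml
# configuration.yaml
recorder:
  purge_keep_days: 3   # keep 3 days of history instead of the default 10
```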
I just released a new version of the addon (0.110.2) with some bug fixes. Most notable among them is a fix for a DST transition issue. If you:
Are in a European timezone that observes DST transitions (e.g. CET/CEST).
Have your backups configured to start during the DST transition hour (between 2:00 AM and 3:00 AM).
Have backups that take less than an hour to complete.
Then it’s important to update to the latest version before the DST transition on March 26th, or you risk losing some backup history. The issue manifests during the DST transition hour, during which the addon continuously thinks a new backup needs to be created, which can “crowd out” the history of backups you maintain. This bug has been present for several years and was only identified very recently; it has likely been causing unnoticed problems for anyone in a timezone that observes DST. Europe is the next region that will go through a DST transition.
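For the curious, here’s a minimal sketch (not the addon’s actual code) of why that hour is treacherous: on 2023-03-26, wall-clock times between 2:00 and 3:00 simply don’t exist in timezones like Europe/Berlin, and naive datetime math around them doesn’t round-trip.

```python
# A minimal sketch (not the addon's actual code) of the DST pitfall.
from datetime import datetime
from zoneinfo import ZoneInfo

berlin = ZoneInfo("Europe/Berlin")

# 2:30 AM on 2023-03-26 never happens: clocks jump from 2:00 to 3:00.
scheduled = datetime(2023, 3, 26, 2, 30, tzinfo=berlin)

# Python normalizes the nonexistent time rather than raising an error,
# so converting to UTC and back yields a *different* wall-clock time.
round_trip = scheduled.astimezone(ZoneInfo("UTC")).astimezone(berlin)
print(scheduled)   # 2023-03-26 02:30:00+01:00
print(round_trip)  # 2023-03-26 03:30:00+02:00

# Comparisons like this are where "is the next backup due yet?" logic
# can go wrong during the transition hour and fire repeatedly.
print(scheduled == round_trip)  # False, despite being the same instant
```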
If you don’t already see the update, go to the add-on store in Home Assistant, and select “Check for Updates” from the menu at the top-right.
I keep a longer history, but the database is only 1.3 GB (the SSD is 128 GB in total).
Purging and rebooting are not helping. The backup files have a realistic size of about 5-6 GB overall.
You’ll need to identify exactly where space is getting used to figure this out. Home Assistant’s architecture makes this difficult.
If it’s in one of the main directories (ssl, config, media, etc.) then you can install the “SSH and Web terminal” addon, disable protection mode for it, and run a command like this:
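```bash
# Directory sizes in MB for everything visible at the root, smallest first
du -m -d 1 / | sort -n
```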
That will list the sizes of the root directories it can see, ordered by size in MB. If something stands out, you can dig deeper into a directory with something like:
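```bash
# Same thing, one level down (using /config as an example)
du -m -d 1 /config | sort -n
```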
If you don’t see anything there, then the “host” operating system is the problem. This is where things get more difficult, as Home Assistant is rather antagonistic about giving you access to the host OS. How you do it will depend on what hardware you’re using:
If you have SSH access to the host (e.g. you installed HA manually on Debian or in a VM) then you can just SSH in and run the commands above.
If you’re running on a Raspberry Pi (or similar) you can turn it off and look through the SD card/hard drive on a different Linux computer to see what files are on it. I’m not aware of an easier way to inspect the filesystem for this kind of system.
I’m new to HA, and just found your add-on. It looks truly awesome! Great to see such robust community add-ons. I tried the Samba add-on from another author, and it was quite stripped down, and its sensor had constant issues.
So that brings me to my question: do you have any plans to add a Samba backup option in addition to Google Drive? I have a Synology NAS with vast amounts of space, and would like much longer retention on my NAS, pushing only recent backups to Google Drive to help save storage space. I’m not suggesting another plugin…as maintaining two similar code bases would suck. But merely adding an option to use a Samba target in your existing codebase.
@sabeechen - is there any size limit for a single backup? I’m going to turn off local DB purging; yearly data growth in my case is around 2 GB. I wonder if using SQLite is a viable option for me.
@DuckDuck25 I’d like to support a bunch of different storage providers (Samba among them), but most likely it won’t happen. Handling just one well is tricky; the addon does lots of stuff in the background to overcome any hiccups it hits with Drive, and each storage provider has its own set of quirks to manage. Samba, notably, is an absolute mess to interface with. I have to keep this maintainable, as I’m the only dev and only have my free time to develop it. If it happens in the future, I’ll likely make it a new project, one that talks with Home Assistant as an integration rather than a supervisor addon. We’ll see.
@trance There is no size limit, though the database gets locked (i.e. no writes) while it’s backed up, so that’s a consideration. I’ve had users whose database takes so long to back up that HA’s buffer of writes overflows and they lose data. 2 GB shouldn’t be anything to worry about, but it all depends on how fast your storage, CPU, etc. are. SD cards can be pretty slow, for example.
Thanks for the answer. I have HA running on an RPi 4 with an SSD, and I just did a test to see where the bottleneck is. It seems that backup speed is mostly limited by the compression step, which surprisingly is single-threaded. The CPU load during the backup is 26-27%, which indicates that only 1 of the 4 CPU cores is used. IMO that’s wasted potential: the backup could be up to 4 times faster, and the change is either adding some command-line parameters or moving to something like pigz (see the sketch below). Maybe a good idea for a small feature request to the HA dev team?
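For reference, here’s a rough way to compare the two on any Linux box (assuming pigz is installed; this is not how Home Assistant invokes compression internally):

```bash
# Single-threaded gzip, similar to what the backup does today
time tar -czf backup-gzip.tar.gz /config

# Multi-threaded compression by handing tar off to pigz
time tar -I pigz -cf backup-pigz.tar.gz /config
```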
After going through this thread and doing a couple of HA restores, I wrote a blog post on the various restore methods (under 1 GB, over 1 GB, etc.) and some other tips I discovered for when you have SSL set up: Home Assistant: Restoring your Configuration
A beginner question that has surely been answered somewhere, but searching for “encrypt” didn’t find me the answer:
So far I’ve been using another add-on to store backups on my own Samba server.
But I thought that for a worst-case scenario it might also be good to have the backups in a cloud outside of my home. Are the backups that this add-on creates encrypted in any way, or is the standard Home Assistant backup encryption considered strong enough? I read that you cannot simply open the backup in an editor, but is it all in all good enough to prevent Google from actually opening the backup?
I read that you cannot simply open the backup in an editor
You can’t open the encrypted backups in standard tools (Home Assistant uses a non-standard format), but I wrote a little utility that decrypts them with your password:
Are the backups that this add-on creates encrypted in any way, or is the standard Home Assistant backup encryption considered strong enough?
I’m not a cryptography expert by any measure, and there are a lot of ways to shoot yourself in the foot when doing your own cryptography, but the way Home Assistant does backup encryption looks fine to me; it’s pretty boilerplate 128-bit AES. The curious can see how they do it here. If I were to criticize their technique (warning, technical), it’s inadvisable to reuse the same IV for every encrypted backup by deriving it deterministically from the password. While this wouldn’t allow easy decryption of your data, it makes it almost trivial for an attacker to tell whether two backups have the same (or very similar) data, as the sketch below shows. This might be considered a vulnerability, but I fail to see how someone could use that information to do harm in the context of your backups. All that being said, I’d wager (though I’m not sure) there is some cryptographic benefit to having the IV known only to someone who already has the key, especially if they only had one backup and were trying to brute-force decrypt it. I guess it’s a tradeoff?
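To make the IV point concrete, here’s a toy sketch (my own illustration, not Home Assistant’s backup code) showing that with the same key and a fixed IV, identical plaintexts encrypt to identical ciphertexts, which is exactly the equality an observer could spot:

```python
# Toy demonstration of IV reuse; not Home Assistant's actual backup code.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(16)      # 128-bit AES key
fixed_iv = b"\x00" * 16   # a deterministic IV, reused for every "backup"

def encrypt(plaintext: bytes, iv: bytes) -> bytes:
    encryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    return encryptor.update(plaintext) + encryptor.finalize()

backup_a = b"identical backup"  # exactly one 16-byte AES block
backup_b = b"identical backup"

# Reused IV: an observer can tell the two backups hold the same data.
assert encrypt(backup_a, fixed_iv) == encrypt(backup_b, fixed_iv)

# Random IVs: the ciphertexts differ even though the plaintexts match.
assert encrypt(backup_a, os.urandom(16)) != encrypt(backup_b, os.urandom(16))
```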
but is it all in all good enough to prevent Google from actually opening the backup
Pretty much, I think so, if you have a good password. I can also say pretty confidently that Google is uninterested in the contents of anyone’s backups unless you’re trying to hide cp in it.
These are created automatically by Home Assistant when you upgrade an addon or Home Assistant itself. The “core” in the name indicates it’s a backup of Home Assistant’s config data. By default this addon ignores them, but your options are fourfold:
(Not recommended) Ignore them and let them pile up.
(Not recommended) Somehow always remember to de-select the “Create a backup” option when you update things.
(Not recommended) Un-check “ignore upgrade backups” in the addon settings and upload them to Google Drive. This will let them take the place of full backups, which most people wouldn’t want.
(Recommended) Configure the addon to delete them automatically by setting the option “Delete ignored backups after…” to something like 1 week. That way you have them for a while, but they don’t pile up forever.
And should I stop addons before I back them up?
No; unless you have a very special use case, it’s almost always better to let Home Assistant handle that for you.
Is it possible to exclude some components from the backup? Recently there have been stability problems with the UniFi add-on, and someone mentioned it’s because of a conflict with Google backup.
Can per-addon retention be implemented? I would like to store n backups of each addon regardless of how frequently backups are created. Currently, frequent backups of one component (or of the whole of HA) wipe out the old backups made for a single component (created prior to its update).
I created an FR some time ago but have gotten no response so far.
In general I don’t plan to add further logic that complicates which backups are included or ignored, or how they’re requested. It would be a useful feature, but it would require a lot of customization and configuration to be broadly useful, and I have to prioritize the maintainability of the project as I have increasingly less time to devote to new development. If I bite off too much, the stability and polish of what currently works will suffer.
That being said, you can accomplish a lot of customized logic by setting “Days Between Backups” to zero, which disables automatic backup creation by the addon. You can then write automations in Home Assistant that call hassio.backup_partial with whatever schedule and parameters you like, as in the sketch below. The addon will copy anything to Google Drive if you configure it to, and that’s how I advocate most people get to a custom or more complicated workflow.
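As an illustrative sketch (the schedule and addon slug are placeholders; the Developer Tools services tab shows the exact service signature on your install):

```yaml
# configuration.yaml (illustrative): request a partial backup of one
# addon every night. "core_mosquitto" is an example slug; find yours
# on the addon's info page.
automation:
  - alias: "Nightly partial backup"
    trigger:
      - platform: time
        at: "03:30:00"
    action:
      - service: hassio.backup_partial
        data:
          name: "Nightly backup"
          addons:
            - core_mosquitto
```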
Thank you for the honest answer. I understand and respect your decision.
Could you point me to how I can control file retention? I mean, I would like to keep at least the very last backup of each component untouched (to be able to roll back in case the new version of an addon fails).
From what you wrote above, I can control when the Google Backup addon creates backups. But what about the removal of old backups (locally and in the cloud)?