The initial release of the auto backup system was meant to be very basic and just get all the pieces in place. There are already folks working on integrations that will expand its functionality. We’ll have to see what lands next month and what the core devs do with the current functionality.
I don’t think a “backup” is required at all. What would be better is a Git repository of user-created data. As a user makes changes, the complete history of all configuration changes is saved, and they can go back in time to any point.
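As a rough sketch of what I mean (the /config path is just an assumption; adjust to your install):

```bash
# One-time setup: turn the config directory into a Git repository.
cd /config
git init
git add -A && git commit -m "initial snapshot"

# On every change (manually, via cron, or via an automation), record a snapshot:
git add -A && git commit -m "snapshot $(date -Iseconds)"

# Going back in time to any point:
git log --oneline                # pick the commit you want
git reset --hard <commit-hash>   # restore the tracked files to that state
```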
There is certainly no need to save anything that was downloaded, like old copies of Home Assistant OS. Maybe log files could optionally be backed up, but those are much less important than config files.
It is also pointless to encrypt backups that are stored on the same file system as the live system. Any sensitive data is already exposed. So in the case of backing up to a local location, just skip the encryption and use Git.
In fact, a “backup” on the local filesystem is not even a backup.
Offsite backup would be very easy: just push the Git repository to a server. And yes, hide Git from non-technical users with a GUI.
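For example (hypothetical server and repo path; assumes a bare repo was already created there with `git init --bare`):

```bash
# One-time: register the offsite server as a remote.
git remote add offsite ssh://backup@example.com/srv/ha-config.git

# Offsite backup is then a single, easily scripted command:
git push offsite --all
```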
Databases and historical data…
Cute idea for very technical users like you and me; bad idea for regular people.
What the majority need is the ability to do point-in-time restores, with “smart” checkpoints (N times per TIMEUNIT, a checkpoint before upgrades, etc.), and they don’t care about the technology. They do care about full recovery, though, and Git isn’t very suitable for e.g. the recorder/statistics data.
(If you meant “Git-like” as in commits that are change-deltas, and being able to do fast and effortless pushes and checkouts, rather than “use Git for backups!”, that’s a better idea.)
Perhaps not for you. But some people might need a particular version, for whatever reason, and they could have a slow internet connection, or need to be able to restore when no internet connection is available, or find that the particular HAOS or add-on versions are no longer available online. It’s easy to assume a lot of things - I have a low-latency FiOS link at home, and downloading a few gigabytes takes no time. A few years ago, I worked on an IoT project where some team members thought an African project member was lazy, until they realised he had to get on a bicycle and do a 2x15 km round trip every time they fucked up the firmware and needed him to do a reset.
This is true, but keep in mind there are a lot of ways to run HA.
With the existing system, you could run on a RasPi off an SD card, backing up to a NAS. You could run on an HA Yellow’s eMMC with NVMe storage. You could do several other forms of local backup, and add whatever remote you wanted on top.
Bad idea. You’re seeing Git as a hammer that makes everything a nail. One really big thing it doesn’t address, and a very legitimate reason for the HA devs to do the forced encryption, is encryption at rest. While I don’t agree with the way they implemented it, you really should have encryption at rest and zero-knowledge encryption for backups.
Sune, closing the other thread was not an invitation to continue in this thread. Please take a break.
I’m sorry that you see a post about why a technical solution is a bad idea as a continuation of the “communication issues” theme in the now-closed backup topic, but OK.
I specifically made this topic (before the backup topic was closed!) because I wanted to avoid adding more noise there.
Git doesn’t encrypt repos internally, but a git-bundle is a single monolithic file, like a tarball, and can therefore easily be encrypted with GPG or OpenSSL.
An encrypted git-bundle would have similar pros and cons to the current encrypted backups - but with the richer history (and complexity) that Git repos provide.
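A minimal sketch of that workflow (filenames are arbitrary; GPG shown here, OpenSSL would work the same way):

```bash
# Pack the entire repo (all refs and history) into one file:
git bundle create ha-config.bundle --all

# Encrypt it; symmetric GPG prompts for a passphrase:
gpg --symmetric --cipher-algo AES256 ha-config.bundle

# Restore later: decrypt, then clone straight from the bundle:
gpg --decrypt ha-config.bundle.gpg > ha-config.bundle
git clone ha-config.bundle restored-config
```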
The SSL folder issue is fixed in the latest release (2025.1.3), I see.
@NYZack
“Always include SSL folder in backups”
I see some other approved fixes/improvements as well, but they are not in this release.
I’m trying to migrate to a new system. I set up a supervised system on the same version; I also tried with the OS version installed, with the same result.
I have created 4 backups and restored 8 times now, but always without history. A restore always results in an unresponsive restore page, never a restart by itself.
I use MariaDB for history; the backup file is 5 GB, as it always was, so the logging data must be in there.
But whatever I try, my history is never available on the new system. Does anyone have this same issue? (And solved it?)
I have the same problem with InfluxDB and Grafana. I’m using a VM so I don’t brick my running instance; I don’t know if that has anything to do with it?
Also, my automations.yaml and gmail.yaml were not restored from the backup, which prevented HA from running, to the point that I could not even get to the backup menu to do another restore.
At the moment I would not be able to restore to a fully functioning system if I replaced my hardware or SSD, and I would probably lose all my years of data!
It is definitely not working if you use MariaDB for history logging. I tried a lot of different backup and restore methods without good results.
I ended up installing MariaDB on the new system (both the same version) and shutting down the database on both systems.
Then I used ‘rsync’ to copy the database folder from the old system to the new one, overwrote everything, and rebooted the OS.
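For anyone in the same spot, this is roughly what I did (paths are examples and will differ for the MariaDB add-on; both databases must be stopped and on the same version):

```bash
# With MariaDB stopped on BOTH systems, copy the raw data directory:
rsync -a --delete /path/to/mariadb/data/ root@new-system:/path/to/mariadb/data/

# Then reboot the new system (or restart the MariaDB container).
```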
Whether backups include an external DB depends on the RDBMS used. HA can back up the default internal DB (SQLite), which also happens to reside in the config directory.
Technically you could configure your external DB to also host its files in the config directory, but that would be odd (and likely a bad idea).
When choosing another option, you should be comfortable in the role of the database administrator, including making backups of the external database.
It might really be a rather bad idea to back up a database’s files at the file level only; that likely means an inconsistent database if changes are still being written while the backup is taking place. Databases need to know that a backup is being executed, and during that time write only into a write-ahead log, redo log, or whatever it’s called in the DBMS being used.
And the backup tool has to be able to execute online backups for the DBMS.
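For MariaDB/MySQL specifically, a consistent online backup of InnoDB tables can be taken with a single-transaction dump; something like this (database name and user are placeholders):

```bash
# Consistent online dump: the read runs in one transaction while writes continue.
mysqldump --single-transaction --databases homeassistant \
  -u backup_user -p > homeassistant.sql

# Restore on the target system:
mysql -u backup_user -p < homeassistant.sql
```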
I already said that it’s likely a bad idea, but you could do it if the HA service and DB have been stopped. So, technically it’s all possible, but you should know what you’re doing. This is also only relevant if your DB is on the same host.
I know; that’s why I stopped the containers before moving the files. I was forced to move to a new system due to hardware failure, so this was my only choice, as the HA backup functions don’t do a correct backup/restore of the Core add-on MariaDB including data.
At present, the new backup functionality isn’t particularly useful to me.
Good:
The ability to do encrypted backups, although I would agree a simple command-line decryption tool should probably be available, even if it’s just a simple script.
Bad:
This system is basically limited to local-only backups for me. I still need to manually download the backups and send them off to my on-site and offsite backup locations.
I don’t have a local unencrypted NAS, as I get >650 MB/s transfers via scp/SSHFS. My offsite servers are only accessible via SSH as well.
An SSH storage backend would make a world of difference to my situation.
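In the meantime I work around it with something like this (host and paths are just examples; on HAOS installs the backups live in /backup):

```bash
# Push finished backups to an SSH-only offsite server:
rsync -a -e ssh /backup/ user@offsite.example.com:/srv/ha-backups/
```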
Am I the only one that can’t restore? It says my encryption key is not correct, but it’s the one I saved when I created the automatic backup?
What to do???
The backups seem to be only partial backups, which makes them worthless for disaster recovery. I was not able to use one to restore a new system during onboarding.
I’m wondering where people save their keys.
I would usually store this in vaultwarden, but since I have installed vaultwarden as an add-on in HA, that would not be very smart.
I don’t trust cloud based password managers.
For now I have stored a copy of the key in OneDrive, but I’m wondering if that’s the best place for it.
One copy on my network share, one copy on my laptop, and one copy on a portable hard drive. Pretty much how/where I store anything that it would cost me to lose. Once bitten, twice shy.