Actually it’s important.
NC ignored GDPR when they introduced “Analytics”, and I remember a similar discussion where informed users expressed their concerns at the time. Two years ago, maybe three?
Maybe they took that on board a few years back and are now addressing it. More likely, though, their lawyers have suddenly woken up to Amazon, Apple, Meta and Google all being successfully sued by the EU for contravening these laws, which they, as US companies, thought didn’t apply to them.
However maybe GDPR with this release should be in a different thread.
They really didn’t. It was opt-in from day 1, and they made sure to explain that your data is aggregated and not personally-identifiable.
I’m literally looking at the analytics page right now and there’s no data exposed which could remotely be associated to my (or your) particular installation.
Maybe, but it was discussed as an issue back then and wasn’t addressed in the forum. Hey ho.
It was clearly explained when it launched, together with a link to the documentation and the open-source code so anyone could verify what was actually being sent.
There was an accompanying forum post with the release, same as with all releases, so all people had to do was to read the release notes.
Yes all true. However in the blog after the release there was a discussion about whether, if people opted in, it would contravene GDPR.
That’s all I remember and it doesn’t alter the fact that GDPR is important for companies.
Anyway this is all irrelevant to this topic now.
TIL about the summary feature… thank you. That was very helpful.
thank you for this
Thanks for this as well. Didn’t know the summary feature was there. Much more easily digestible.
There is also another serious side effect of this whole change with the backups: We cannot set the “default backup location” with the old usual way (3 dots in backups GUI) without completely configuring backups in HA itself. I’m using exclusively the Google Drive Backup add-on for my backups, which uses the “default” location for its backups to a SMB share. The “default” backup location of HA could easily be set before 2025.1. But now what is the “default” backup location if the user does not go through the HA backup procedure? The option is missing in the backups GUI and can only be configured while setting backups through HA itself. So I had to specifically change the backup location in the add-on itself from “default” to my network storage. I don’t know where the backups would go if I left the “default” setting in the add-on.
Maybe you should just refer to this, to make clear what someone could do with the data they could get from an unencrypted backup stored in the cloud. Unfortunately it is in German, because it is a talk from a Chaos Computer Club convention (maybe they have subtitles).
What a dumpster fire this backup thing has turned into. It almost rivals the “beautiful login page” saga back in 2022. Yeah, this change was counterintuitive and poorly thought out.
...
Long ago I started updating only to the last release of the month.
(Don’t tell them I told you.) Sorry, but my HA prod system is too critical to play with. I’m not touching any release with this enlightened backup mess with a ten-foot pole. The crucial feature of backups is accessibility: you don’t want to be fumbling through encrypted, monolithic backups when SHTF.
I most certainly won’t be doing any core updates until/unless these mandatory encryption and hard-coded time issues are reverted. Thank god I was able to roll back to 2024.12.5 without too much work. I absolutely was not expecting this sort of issue when I updated. No off switch: incredible. My backups occur when I want them to occur, and are saved into a Veracrypt container. This solution works, and I don’t need or want anything different. I appreciate the software, which has made my life so much easier, but I would really suggest doing a poll on changes that affect the entire user base.
If you re-read the thread, you can still make backups unencrypted and on your schedule.
I think at this point, everyone needs to take a deep breath, read the full thread, and await a more in-depth response from the team once things calm down on their side.
Re-read 874 posts? How does anyone get anything done in HA unless it’s a full-time job? Very glad to have my system in a stable state with no need for any of the recent new features.
Is this too much to ask?
It’s responses like these that make the topics unnecessarily long, because the same things need to be repeated again and again.
After reading through this whole thread, I’ve got a couple of suggestions that might help future efforts like this on HA’s side, as well as give more time for feedback on an important feature such as this. I don’t think I saw them mentioned before either.
Experimental Features/Feature Flags
As a developer I can empathize with wanting to get features out, even in a basic MVP form, but I feel HA has grown enough and moves fast enough that having more time for features to bake outside the beta windows would be beneficial. Adding a panel in Core/Supervisor for advanced users to opt in to upcoming features would be useful, especially when designing a new feature to replace an existing one, and even more so for critical features such as backups. That would allow HA to ship features that aren’t fully fleshed out, while still enabling merging and shipping without the dreaded long-lived development branches. It could be a featured part of the announcement posts and could even pop a notification (or a new area of settings) to let users know there are upcoming features they can try out and provide feedback on. It could even include information such as the expected launch release, much like the deprecation warnings that give (usually) ample time for external developers to be notified and make changes.
Personally I would love to see this integrated into the stack as a way to test new features as they are being developed. That said, I would be completely against using it for A/B testing, as HA is a private affair, and even my own personal HA can’t access the internet except through a proxy that only allows connections to specific hosts.
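As a sketch of what I mean, the registry backing such an opt-in panel could be very small. Everything here is invented for illustration (names, fields, structure); it is not HA’s internal API:

```python
from dataclasses import dataclass


@dataclass
class ExperimentalFeature:
    """One entry in a hypothetical opt-in feature registry."""
    name: str
    description: str
    expected_release: str  # e.g. "2025.4", mirroring deprecation warnings
    enabled: bool = False


class FeatureFlags:
    """Minimal sketch of the opt-in panel's backing store."""

    def __init__(self) -> None:
        self._features: dict[str, ExperimentalFeature] = {}

    def register(self, feature: ExperimentalFeature) -> None:
        # Features default to disabled; users must explicitly opt in.
        self._features[feature.name] = feature

    def opt_in(self, name: str) -> None:
        self._features[name].enabled = True

    def is_enabled(self, name: str) -> bool:
        # Unknown features are treated as disabled, so code can guard
        # new paths with a single check.
        feat = self._features.get(name)
        return feat is not None and feat.enabled
```

New code paths would then be wrapped in `if flags.is_enabled("new_backups"): ...`, and the `expected_release` field is what the announcement post or notification could surface.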
SecureTar changes
SecureTar seems to be an HA project, started by an NC dev, and it follows HA’s development style and release cadence. There are a couple of security issues with how it’s used in the Supervisor that lead to some concerns, as well as with how the core package handles encryption.
- In the Supervisor, the IV is always the same, derived from the key. Every cryptography expert will warn loudly that you should never reuse the same key + IV pair, especially in CBC mode. Incidentally, the SecureTar package handles this correctly internally by randomly generating a new IV and prepending it to the tar file for internally encrypted files. The Supervisor, however, breaks this rule and reuses the same IV each time. This needs to be changed.
- The key material (and, in the Supervisor, the IV) is generated by running SHA-256 100 times and taking the first 16 characters of the digest. This is not secure; a configurable, purpose-built key-derivation function such as Argon2, PBKDF2, or scrypt should be used instead.
- AES-CBC is generally not recommended anymore when other options are available. At minimum, using AES in CTR mode or AES-GCM would alleviate the IV issues, as would using an AEAD cipher such as ChaCha20-Poly1305.
- SecureTar has no way of indicating its mode and cipher selections. It should be updated to add this information as a header or trailer on the file, to allow for future upgrades while remaining compatible across future security changes. An example of this is how the /etc/shadow password database has evolved over the years and uses well-known prefixes to indicate the parameters used to validate the password, like $2b$12$DfR07XM1et8bz.kY3LgckOhjT6n...
Since this update I miss the partial backups: when an add-on was updated, a backup of just that add-on was made, so you could restore only it. There was always an option to create a backup with an add-on update; this is now gone.
There is also another serious side effect of this whole change with the backups: We cannot set the “default backup location” with the old usual way (3 dots in backups GUI) without completely configuring backups in HA itself.
I also stumbled upon this because I have to set the storage location to local for Samba backup. If you then also activate the auto backup in the native backup configuration, Google Backup will delete the backups created by HA.
Meanwhile .2 was released and I want to shout about something I really love: the release notes for the patch are greatly improved! A small quote:
- Bump pysuezV2 to 2.0.3 (@jb101010-2 - #135080) (suez_water docs)
The addition of that little link at the end to the docs of the affected HA component is huge. It’s not even that I need the docs, but I need to know which integration is affected, to know if it affects me. Previously for half of the entries it would not be clear at all. I love this change.