This is absolutely terrible news for me, as it pretty much breaks my entire HA setup. I see in the PR that the entity filter parameter is now required due to the possible performance hit of fetching all entities, and that “This was only used by the old history panel”. Well, if you have a documented REST API, people might actually use it. Assuming it was only used by X kind of ignores the fact that the API is, well, an API.
As for the performance hit: well yes, I do want to fetch the history for all my entities, so any performance hit is a deliberate choice of mine. That’s why I fetch it once every now and then and have a whole bunch of scripts working against that data offline, instead of having each script fetch the history data it wants.
It worked perfectly for years, and I see no alternative in the current API that can be used as a replacement. I fetch the history (first and last entries, i.e. the minimal_response option) for all entities because I don’t know beforehand which entities I have or want to process. If this is a performance hit due to a heavy call in the backend, then I think I, as the user, should be the one deciding whether it’s a reasonable performance hit. I get that this should not be done unless a component or user explicitly asks for it, but that’s something different from just nuking the possibility for all of us.
I guess I could fetch all my X thousand entities one by one or in batches and then rebuild the merged history from it. Very ugly, and not great for performance.
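For what it’s worth, this is roughly what that batching workaround looks like. It’s only a sketch: it assumes the documented REST endpoints /api/states and /api/history/period with the filter_entity_id and minimal_response parameters, and HA_URL / TOKEN are placeholders for your own instance and long-lived access token.

```python
# Sketch of the batched workaround: enumerate entities via /api/states,
# then pull minimal history in chunks via /api/history/period.
# HA_URL and TOKEN are placeholders.
from datetime import datetime, timedelta, timezone

import requests

HA_URL = "http://homeassistant.local:8123"
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# 1. List every entity currently in the state machine.
entity_ids = [s["entity_id"] for s in
              requests.get(f"{HA_URL}/api/states", headers=HEADERS).json()]

# 2. Fetch history in batches, since an unfiltered call is no longer allowed.
#    Batching also keeps the URL (filter_entity_id list) at a sane length.
start = (datetime.now(timezone.utc) - timedelta(days=1)).isoformat()
history = []
BATCH = 50
for i in range(0, len(entity_ids), BATCH):
    chunk = entity_ids[i:i + BATCH]
    resp = requests.get(
        f"{HA_URL}/api/history/period/{start}",
        headers=HEADERS,
        params={
            "filter_entity_id": ",".join(chunk),
            # flag parameter: intermediate points only carry state/last_changed
            "minimal_response": "",
        },
    )
    history.extend(resp.json())
```

It works, but as said above it is an ugly replacement for what used to be a single call.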
Okay, that completely kills an important feature of my history explorer card.
I’m doing an initial query without entity_id, with a very short query period, to get a list of all entities that are stored in the db, so that users get only relevant entities in the entity selector UI. That will not work anymore with this change. Is there any other API that will let me get the list of recorded entities?
Edit: oh wait, is that just for the REST API, or is the WS API affected too?
Wouldn’t providing an API endpoint that returns a list of all entities being recorded be something to consider? It would make such queries much more efficient (not only for my own use case).
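Until something like that exists, a stopgap for scripts running next to HA (it obviously doesn’t help a frontend card) is to read the list straight from the recorder database. A rough sketch for the default SQLite setup; it assumes the 2023.4+ schema where entity ids live in a states_meta table, with a fallback for older schemas:

```python
# Sketch: list the entity_ids actually present in the recorder database.
# Assumes the default SQLite recorder (home-assistant_v2.db); DB_PATH is a placeholder.
import sqlite3

DB_PATH = "/config/home-assistant_v2.db"

with sqlite3.connect(DB_PATH) as conn:
    try:
        # 2023.4+ schema keeps entity ids in states_meta.
        rows = conn.execute("SELECT entity_id FROM states_meta").fetchall()
    except sqlite3.OperationalError:
        # Older schema: entity_id still lives on the states table.
        rows = conn.execute("SELECT DISTINCT entity_id FROM states").fetchall()

recorded_entities = sorted(r[0] for r in rows)
print(f"{len(recorded_entities)} entities are being recorded")
```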
I’ve been interested in migrating away from MariaDB as well, but the only thing keeping me from doing so is that the Nginx Proxy Manager addon requires MariaDB.
Wow, that was quick! Honestly, having that would help a ton. It may be somewhat of a niche feature, but this would make a lot of users of my card happy, as it would make certain awkward workarounds (especially around wildcards and dynamic history list population) a thing of the past. Gonna try this asap.
Edit: I tested it and it works great. I commented my rationale about why I think this should be merged into Core on the PR. Thanks so much for drafting this, I’ve been waiting for such an API call for a long time.
It’s basically an advanced history panel. It presents users with an interactive way to create their history layout, and in order to do this, users are given a list of entities that are recorded in the database (and only those). While the hass object obviously gives a list of all entities in the state machine, there is no way to query whether an entity is actively being recorded in the database (i.e. not excluded by the user), as far as I’m aware.
Not quick and easy necessarily, but free. You can create your own self-signed certificates, with the caveat that you then need to install the signatures to both the supervisor and core as trusted sites. HA has started reporting these changes as manual modifications, thus breaking official support. Also, it’s a pain because EVERY upgrade of either supervisor or core requires re-installation of said signatures.
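If anyone wants to go that route, here is roughly how such a self-signed certificate can be generated. This is just a sketch using Python’s cryptography package (the hostname, key size and validity period are examples); the trusting/installation pain described above still applies.

```python
# Sketch: generate a self-signed certificate for local SSL.
# Uses the "cryptography" package; hostname and validity are just examples.
import datetime

from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.x509.oid import NameOID

HOSTNAME = "homeassistant.local"  # adjust to your setup

key = rsa.generate_private_key(public_exponent=65537, key_size=4096)
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, HOSTNAME)])
now = datetime.datetime.now(datetime.timezone.utc)

cert = (
    x509.CertificateBuilder()
    .subject_name(name)
    .issuer_name(name)  # self-signed: subject == issuer
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(now)
    .not_valid_after(now + datetime.timedelta(days=3650))
    .add_extension(x509.SubjectAlternativeName([x509.DNSName(HOSTNAME)]), critical=False)
    .sign(key, hashes.SHA256())
)

with open("privkey.pem", "wb") as f:
    f.write(key.private_bytes(
        serialization.Encoding.PEM,
        serialization.PrivateFormat.TraditionalOpenSSL,
        serialization.NoEncryption(),
    ))
with open("fullchain.pem", "wb") as f:
    f.write(cert.public_bytes(serialization.Encoding.PEM))
```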
I’ve lobbied for a text field in settings (advanced) allowing one to place signatures there, which HA would then import as trusted, but no luck so far. I’ll probably try to bring it up again during the next “why the hell…?” event.
wth rarely leads to anything anyway. Big fuss about nothing.
So you are saying that this Self-signed certificate for SSL/TLS - Home Assistant 中文网 guide does not work? It looks like a permanent and update-safe approach. I wanted to try it tomorrow, but won’t bother if I have to redo it after every update.
Same experience here: after the update, the entities exposed to Google Assistant are no longer what they were before. Besides the list of exposed entities being lost after the update, I’ve encountered two more issues:
1 - aliases no longer seem to be exposed to Google Assistant: asking my Google Home to switch on a light using the light’s alias results in “Ik begrijp het niet” (“I don’t understand”), whereas if I ask it to switch on the same light using the entity’s name, the light is switched on.
2 - I used to have a helper (input_boolean) exposed to Google Assistant; after the update, I no longer find this helper in the list of items I can expose. Is this no longer supported?
Yes, that should work just fine for accessing Home Assistant (in fact, my configuration uses that). Disregard my comments; those were about getting HA to trust self-signed certs so that HA-sourced traffic can use SSL locally. If you’re targeting commercial devices with their own certs, those should already be trusted.
It sounds like the performance of the internal DB has really improved since ‘back in the day’ when I moved to MariaDB in a Docker container. I would love to use a single, unified backup solution that includes the DB, primarily for situations where bugs in new releases have me downgrading.
There is a post below which says an input_boolean helper is not exposed either, probably set in YAML.
I’ve looked at the ffmpeg integration, and it also does not offer a unique_id option, and probably many more integrations don’t either…
I use a camera with ffmpeg to access the feed on a Google Nest Hub. If I roll to 2023.5.X, I will probably lose that camera on the Nest Hub.
It is a bit controversial to make adding devices to any voice assistant depend on whether they have a unique_id or not. First, all HA integrations should support the unique_id option by default, whether they are set up in YAML or not.
Does anybody know how I can list all my entities that don’t have a unique_id?
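One way to approximate that list: entities without a unique_id cannot be added to the entity registry, so comparing the state machine against the registry should flag them. A rough sketch over the REST and WebSocket APIs; it assumes the config/entity_registry/list command, the websockets and requests packages, and placeholder HA_URL / TOKEN values.

```python
# Sketch: list entities that are in the state machine but not in the entity
# registry; in general those are the ones lacking a unique_id.
import asyncio
import json

import requests
import websockets

HA_URL = "homeassistant.local:8123"          # placeholder
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"       # placeholder


async def registry_entity_ids() -> set[str]:
    """Fetch all entity_ids known to the entity registry via the WS API."""
    async with websockets.connect(f"ws://{HA_URL}/api/websocket") as ws:
        await ws.recv()  # auth_required
        await ws.send(json.dumps({"type": "auth", "access_token": TOKEN}))
        await ws.recv()  # auth_ok (or auth_invalid)
        await ws.send(json.dumps({"id": 1, "type": "config/entity_registry/list"}))
        result = json.loads(await ws.recv())
        return {entry["entity_id"] for entry in result["result"]}


def state_machine_entity_ids() -> set[str]:
    """Fetch all entity_ids currently in the state machine via REST."""
    resp = requests.get(
        f"http://{HA_URL}/api/states",
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    return {s["entity_id"] for s in resp.json()}


registered = asyncio.run(registry_entity_ids())
no_unique_id = sorted(state_machine_entity_ids() - registered)
print("\n".join(no_unique_id))
```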