Has anyone found a way to make Assist distinguish between areas and rooms?
I have a number of areas defined. Some are rooms and some are storage areas which have their own temperature sensors. If I ask “Which is the coolest room in the house?” I get a storage area. I really don’t want to go and sit in a cupboard.
The areas have labels “room”, “storage”, “garden”.
I’m using the OpenAI integration with default settings (so gpt-4o-mini). I’ve tried a variety of elaborate prompts and descriptions in the intent script (as in Friday’s Party).
In context, I don’t have a distinction… They’re all… Just there… I have a whole virtual floor for the ai to see with its ‘areas’ including the back porch now split into three deck areas (I wanted independent music control and that’s one where I split out an area)
If I ask for temperature I'm only getting areas that have a temp sensor, so the rest are immaterial?
So what’s not working?
I have areas which are rooms and areas which are IT cupboards - both have temp. We’re having a heatwave at the moment - the coolest place is an IT cupboard!
Ahh… Yeah, I don't have the temps for that stuff attached to an area. (I also don't generally ask for temps.)
You’re into custom intents Jack?
Here’s how I’d do it if I HAD to (and I’d avoid it): copy the current temp intent and customize it to filter only areas with label X attached, where label X is “room”, and the part that resolves the area uses this filtered list instead of all areas. Poof.
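An untested sketch of what that copied-and-filtered intent might look like in `configuration.yaml`. `CoolestRoom` and the trigger sentence are made up for illustration; `label_areas` and `area_entities` are the standard template helpers, and the filtering logic here is my guess at the approach, not a known-working recipe:

```yaml
# Hypothetical custom intent: answer "coolest room" using only
# areas carrying the label 'room', ignoring storage cupboards.
conversation:
  intents:
    CoolestRoom:
      - "which is the coolest room [in the house]"

intent_script:
  CoolestRoom:
    speech:
      text: >
        {% set ns = namespace(best=none, temp=none) %}
        {% for area in label_areas('room') %}
          {% for s in area_entities(area) | select('match', 'sensor\.') %}
            {# only consider temperature sensors with a numeric state #}
            {% if state_attr(s, 'device_class') == 'temperature'
                  and is_number(states(s)) %}
              {% set t = states(s) | float %}
              {% if ns.temp is none or t < ns.temp %}
                {% set ns.temp = t %}
                {% set ns.best = area %}
              {% endif %}
            {% endif %}
          {% endfor %}
        {% endfor %}
        The coolest room is {{ ns.best }} at {{ ns.temp }} degrees.
```

Because the area loop starts from `label_areas('room')`, anything labelled “storage” never enters the comparison, which is the whole point.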
That was the road I was going down when I ended up in the cupboard. Labelling the temp sensors as either “room” or “storage” worked first time, but I’ll give it another go - your approach would work for any kind of entity assigned to an area.
The broader picture is that I have not made OpenAI my default conversation agent, but it does have permission to control HA. I’m developing a single all-purpose intent script that uses OpenAI as the conversation agent to interrogate the system (I have another using Google AI to - well, Google - outside world questions). Otherwise it’s all custom and local. The point of this massive workaround is to send responses to my Sonos speakers.
Assist is automatically aware of them, and there seems to be a built-in alias “room” (which is annoying). You can add labels to areas, but Assist insists that the labels don’t exist (even with Friday’s index tool). There’s nothing there to expose to Assist. It’s all a bit odd.
Edit: OK, cracked it. Special index tools in the prompt to identify rooms, storage areas and outside areas.
Below is a list of areas with the tagging label 'room'. Use these areas when speaking about entities associated with rooms in the house.
rooms: {{ label_areas('room') }}
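Since the cracked version apparently uses index lines for storage and outside areas too, I'd guess the sibling lines look like this (assuming the same `label_areas` helper and the labels from earlier; the exact wording is illustrative):

```
Below is a list of areas with the tagging label 'storage'. These are cupboards and stores, not rooms people sit in.
storage areas: {{ label_areas('storage') }}

Below is a list of areas with the tagging label 'garden'. These are outside areas.
outside areas: {{ label_areas('garden') }}
```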
It’s a bit of a black art, isn’t it?
Another edit: …and it only works about 60% of the time.
OK, now bear with me. I’m stumbling along at Friday’s party a loooong way behind you and understanding about one word in ten - so I’ll ask stupid questions here.
About descriptions in intents. Am I right in thinking that:

- The AI reads them
- If there is anything embedded - a template, for example - the AI will expand it and extract its value
- They’re actually an extension of the prompt, but apply only to the intent in which they appear?