Hi,
I was playing around with a custom integration for my city's public transport, mainly based on (i.e., copy-and-pasted from) the Dublin Bus integration: Dublin Bus - Home Assistant
The integration is working: I have some devices for the bus stops and routes near my home.
Now I want to integrate that with conversation, using OpenAI, just because my daughter said, “uhhh, that's cool, but can I just ask Alexa for the next bus?”… OK, no Alexa, but even better with Home Assistant… The thing is, right now my integration, like Dublin Bus does, exposes some information in attributes, like “Due in”. How can I instruct OpenAI to use that attribute? Or is it better to create single devices per stop/route, with sensors like “due in”?
BTW, I managed to get it working. I mean, asking for the next bus works… but it simply says “next bus will arrive in ”, and it doesn't give me any information about the route. I can't ask for a specific route; it's somewhat limited…
Could you create a template sensor with the output you want and use that state when you ask for the next bus?
Something like this?
Next bus is {{ state_attr('sensor.parada_silva_y_almafuerte', 'Route') }} and is due in {{ state_attr('sensor.parada_silva_y_almafuerte', 'Due in') }} minutes at {{ state_attr('sensor.parada_silva_y_almafuerte', 'Due at') }}
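To make that a proper template sensor, a minimal sketch of the YAML configuration could look like the following. This assumes your entity is `sensor.parada_silva_y_almafuerte` with the `Route`, `Due in`, and `Due at` attributes you mentioned; the sensor name `next_bus_summary` is just an example:

```yaml
# configuration.yaml (or a file included from it)
template:
  - sensor:
      - name: "next_bus_summary"
        state: >
          Next bus is route {{ state_attr('sensor.parada_silva_y_almafuerte', 'Route') }}
          and is due in {{ state_attr('sensor.parada_silva_y_almafuerte', 'Due in') }} minutes
          at {{ state_attr('sensor.parada_silva_y_almafuerte', 'Due at') }}
```

Then expose `sensor.next_bus_summary` to Assist, and the conversation agent sees the whole sentence as the sensor's state instead of just a bare number. Note that a sensor state is limited to 255 characters, so keep the template short.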
Thanks, I’ll try that… but I also want to understand how conversation integrations work. The “instructions” field: how is it used? How often is it parsed? What is better for this kind of integration, a device with several sensors or one sensor with attributes? How are the sensors consumed by the conversation agent? Is only the value exposed? Could any method or attribute be used to provide a friendlier response? I'm trying to avoid the duplication of having a sensor plus a template sensor just to provide the text to conversation…