Alexa-recognizable Input Helper linked to ADT panel

Hello,

I’ve set up my ADT via the Envisalink YAML config and integration, which creates a read-only panel that I can adjust in HA just fine, but I want to create one that’s propagated to Alexa via the linked skill & Lambda integration. The ADT panel has 4 states (home, away, night & disarmed). I’ve created virtual devices with on/off and those show up in Alexa, but an input helper does not appear as an Alexa-controllable device. I want to be able to use voice commands to set the states on the ADT panel. Is there a way to create a device with varying value attributes that you can link to entity states like I’m suggesting, or a way to let an input select do this? In the helper I’ve chosen to expose it, but that isn’t doing anything; maybe that toggle is just for HA Cloud and not for the linked-skill method? I’m looking for the latter.
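
Side note: with the manually linked skill + Lambda route, which entities Alexa can see is controlled by the `alexa: smart_home:` filter in `configuration.yaml`; the helper’s expose toggle only applies to HA Cloud. A minimal filter sketch (the panel entity ID below is just an example, not necessarily what Envisalink creates for you):

```yaml
# configuration.yaml — minimal sketch for the manually linked Alexa Smart Home skill.
# The entity ID below is an example; substitute your Envisalink panel's actual entity.
alexa:
  smart_home:
    locale: en-US
    filter:
      include_entities:
        - alarm_control_panel.home_alarm
      include_domains:
        - input_boolean
```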

As far as I know, Alexa only recognises input_boolean helpers, so you would need one for each ADT state. But if I’ve understood you correctly, what you actually need to do is launch scripts in HA from voice commands, which you can do with Alexa routines. Wouldn’t that work?
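
If you go that route, the helpers themselves would just be a set of booleans, something along these lines (names are placeholders):

```yaml
# configuration.yaml — one input_boolean per panel state (names are placeholders)
input_boolean:
  adt_home:
    name: ADT Home
  adt_away:
    name: ADT Away
  adt_night:
    name: ADT Night
  adt_disarmed:
    name: ADT Disarmed
```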

It’s not, no. At least not through the GUI; scripts or automations aren’t letting me control the boolean helpers so that turning Away on turns Home, Night & Disarm off (and vice versa for all the other combinations), or then call into the panel entity to control the actual device.

NVM, I figured out a way to do this via an automation. It’s been 5 days of me learning HA and so many things are configurable via the GUI. I love it. SmartThings shouldn’t be called that at all; I should have moved away from it a long time ago.
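
One way such an automation can be wired up, as a sketch (entity IDs are examples, only the away case is shown, and the arm/disarm calls are the standard `alarm_control_panel` services; repeat the pattern for the other states):

```yaml
# automations.yaml — sketch of the "away" case only; entity IDs are examples.
# The same pattern repeats for home, night and disarm.
- alias: "ADT - away boolean arms the panel"
  trigger:
    - platform: state
      entity_id: input_boolean.adt_away
      to: "on"
  action:
    # keep the state booleans mutually exclusive
    - service: input_boolean.turn_off
      target:
        entity_id:
          - input_boolean.adt_home
          - input_boolean.adt_night
          - input_boolean.adt_disarmed
    # then arm the actual panel
    - service: alarm_control_panel.alarm_arm_away
      target:
        entity_id: alarm_control_panel.home_alarm
```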