After getting many “Sorry, I couldn’t understand that” messages from the built-in conversation agent, I think a set of sentences to help with discovery would be useful.
The conversation agent is basically a read-eval-print loop, like the Python shell or a router command line, except it is a voice-eval-speech loop. However, as a relatively new interface, it currently lacks some of the built-in discovery features that modern text shells tend to have.
I was thinking of having some commands like:
(help | what can you do) → a short overview of the supported sentences, listing a few or referring to Talking to Assist - Sentences starter pack - Home Assistant
Since the likelihood of a command being supported depends heavily on the names of areas, rooms, and devices, I think it would help for the conversation agent to have commands that list which areas, rooms, and entities are exposed.
list areas, list rooms → I bet each of those is short enough to sit through being spoken aloud
list devices → might be too long to read out in full and would need to be filtered, e.g. list devices in (room | area)
and also maybe commands like:
list scenes in (room | area)
list playlists
If those are available, I haven’t found them yet.
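In the meantime, here is a rough, untested sketch of what I mean, using custom sentences plus intent_script. The ListAreas intent name and the file name are just made up, and I haven’t verified the templates:

```yaml
# config/custom_sentences/en/list_areas.yaml (sketch, untested)
language: "en"
intents:
  ListAreas:            # made-up intent name
    data:
      - sentences:
          - "list [the] areas"
          - "what areas are there"
```

```yaml
# configuration.yaml (sketch, untested)
intent_script:
  ListAreas:
    speech:
      text: >-
        The areas are {{ areas() | map('area_name') | join(', ') }}.
```

The same pattern should work for the other commands by swapping in other template helpers (states, area_entities, and so on), assuming those cover whatever needs to be listed.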
And after writing this, I wonder if a blueprint could handle a prototype of this.
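Something like the sketch below might be all the blueprint would need: a conversation sentence trigger plus set_conversation_response. Untested, and the input names are made up:

```yaml
# Blueprint sketch (untested; input names are made up)
blueprint:
  name: Assist discovery sentence (sketch)
  description: Answer a custom Assist sentence with a templated response.
  domain: automation
  input:
    sentence:
      name: Trigger sentence
      selector:
        text:
    response:
      name: Response (may contain a template)
      selector:
        text:
          multiline: true

trigger:
  - platform: conversation
    command: !input sentence

action:
  - set_conversation_response: !input response

mode: single
```

Filling it in with "list areas" and the same areas template as above would be a quick way to prototype the idea without touching configuration.yaml.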