I would like to be aware of what state a voice assistant is in: listening, processing, or responding. I am using an Assist pipeline with VOIP, and while I can tell when a session has started, I cannot react to it or show loading states during the conversation.
More advanced: it would be great to have the previously interpreted statement and response exposed as sensors to react to.