Has anybody attempted to build a custom Alexa skill that lets us interact with HA’s Assist pipeline via Alexa’s wake word & built-in microphone/speaker? Here’s what I mean:
- User says “Alexa, assist” to invoke our custom Alexa skill.
- User then says, “Turn on the kitchen lights.” This triggers the custom skill to send Alexa’s voice recording to the Home Assistant voice pipeline you’ve already configured in HA (a sketch of this hand-off follows the list).
- HA voice pipeline uses the Whisper STT engine to transcribe the spoken command into text, which the pipeline’s conversation agent then interprets.
- HA voice pipeline makes a local service call to turn on the kitchen lights.
- HA voice pipeline uses Piper’s TTS engine to generate an audio response in .wav format.
- HA makes a service call back to Alexa Media Player with Piper’s .wav file, which causes the Echo’s built-in speaker to say, “Turned on lights.” (A sketch of this call also follows below.)
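For the skill-to-HA hand-off in steps 3–5, the simplest path I know of is HA’s conversation REST endpoint, which accepts text rather than raw audio (streaming audio into the Assist pipeline is possible over the WebSocket API, but that’s more involved). A minimal sketch, assuming a long-lived access token and a placeholder URL:

```python
# Hedged sketch: hand a transcribed command to HA's conversation/Assist agent
# over the REST API and get the spoken reply back as text.
# HA_URL, TOKEN, and the language are placeholders/assumptions.
import requests

HA_URL = "http://homeassistant.local:8123"      # assumption: default HA host/port
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"          # created under your HA user profile

def send_to_assist(text: str, language: str = "en") -> str:
    """POST a command like 'Turn on the kitchen lights' to HA and return
    the conversation agent's plain-text reply (e.g. 'Turned on light')."""
    resp = requests.post(
        f"{HA_URL}/api/conversation/process",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"text": text, "language": language},
        timeout=10,
    )
    resp.raise_for_status()
    result = resp.json()
    # As of recent HA versions the reply is nested under response.speech.plain.speech
    return result["response"]["speech"]["plain"]["speech"]

print(send_to_assist("Turn on the kitchen lights"))
```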
This implementation would effectively enable Alexa to turn on your lights locally via HA’s Assist Pipeline.
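For step 7, the Alexa Media Player integration exposes per-device notify services (e.g. notify.alexa_media_<device>), and the common pattern is to push the response text and let Alexa’s own TTS speak it, rather than shipping Piper’s .wav (playing a local .wav would mean serving it over HTTP and using media_player.play_media). Inside HA you’d normally fire this from an automation or script; the rough sketch below drives the same service call over HA’s REST API just to stay in Python. Device name, URL, and token are placeholders:

```python
# Hedged sketch: make an Echo speak via Alexa Media Player's notify service,
# called through HA's REST service-call API. Entity/device names are assumptions.
import requests

HA_URL = "http://homeassistant.local:8123"
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"

def speak_on_echo(message: str, device: str = "kitchen_echo") -> None:
    """Ask notify.alexa_media_<device> to speak `message` on that Echo,
    using Alexa's own TTS voice instead of a Piper-generated .wav."""
    resp = requests.post(
        f"{HA_URL}/api/services/notify/alexa_media_{device}",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"message": message, "data": {"type": "tts"}},
        timeout=10,
    )
    resp.raise_for_status()

speak_on_echo("Turned on the kitchen lights.")
```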
NOTE: Hypothetically, you could combine steps 2 & 3 by using Amazon’s cloud STT in your custom skill instead of Whisper. That would make you more dependent on the Amazon cloud, but it may well be faster (the skill handler sketch after this note takes that approach).
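As far as I can tell, a custom skill’s backend only ever receives the text Alexa recognized (via intent slots), not the raw recording, so in practice you’d likely end up with this combined approach anyway. A hedged sketch of that handler using the Alexa Skills Kit SDK for Python — the AssistCommandIntent name, its command slot (which would probably need something like the AMAZON.SearchQuery slot type), and the HA URL/token are all assumptions:

```python
# Hedged sketch of the skill's backend (e.g. an AWS Lambda) built with ask-sdk-core.
# Amazon has already done the STT, so we just forward the recognized text to HA
# and speak HA's reply back through the Echo. All names below are assumptions.
import requests
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name, get_slot_value

HA_URL = "http://homeassistant.local:8123"   # must be reachable from the Lambda
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"

class AssistCommandHandler(AbstractRequestHandler):
    """Forwards the user's utterance to HA's conversation API and speaks the reply."""

    def can_handle(self, handler_input):
        return is_intent_name("AssistCommandIntent")(handler_input)

    def handle(self, handler_input):
        # The 'command' slot carries whatever the user said after invoking the skill
        command = get_slot_value(handler_input, "command") or ""
        resp = requests.post(
            f"{HA_URL}/api/conversation/process",
            headers={"Authorization": f"Bearer {TOKEN}"},
            json={"text": command, "language": "en"},
            timeout=8,
        )
        resp.raise_for_status()
        reply = resp.json()["response"]["speech"]["plain"]["speech"]
        return handler_input.response_builder.speak(reply).response

sb = SkillBuilder()
sb.add_request_handler(AssistCommandHandler())
lambda_handler = sb.lambda_handler()
```

One practical catch: HA would have to be reachable from wherever the skill backend runs (e.g. via Nabu Casa or a reverse proxy), since the Lambda sits in Amazon’s cloud.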
SECOND NOTE: If your HA voice pipeline were configured to use ChatGPT as the conversation agent, it would mean you could talk to ChatGPT via your Echo Dot.