Looking into getting a Voice Assistant Preview Edition and connecting it to my home cinema, which is a Denon AVR-X3500H, but I'm just wondering how that would work.
I guess I could get a 3.5 mm-to-coax cable and connect it that way, but I don't see how that would make the voice assistant play its output over the AV receiver (without me switching it on or going to the right input).
Is something like that impossible, perhaps?
If using the Denon as a media player entity from HA, it does work as expected and takes over the sound output, so maybe something like that?
I set up a universal media player that represents the AV stack. When the living room VPE goes active, I pause the playing media and save the state of the universal player, then switch to the 'friday' input; when the VPE goes idle, I flip back to the prior state.
Works great EXCEPT when we get a false wake word! The TV sets her off… The show you're watching suddenly pauses, the LED ring is spinning, and your SO is staring at you with the 'fix it' glare.
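For anyone who wants to try the same pattern, a rough sketch of the snapshot/restore automation is below. All entity ids, the scene id, the 'friday' source name, and the assist_satellite state names are placeholders from my description, not copy-paste config; check them against your own entities.

```yaml
# Sketch only: entity ids, scene id, source name, and satellite
# state names are assumptions - substitute your own.
automation:
  - alias: "VPE active: snapshot AV stack and switch input"
    trigger:
      - platform: state
        entity_id: assist_satellite.living_room_vpe
        to: "listening"
    action:
      # Snapshot the universal player so we can restore it later
      - service: scene.create
        data:
          scene_id: av_prior_state
          snapshot_entities:
            - media_player.living_room_av
      - service: media_player.media_pause
        target:
          entity_id: media_player.living_room_av
      # Flip the receiver to the input the VPE is wired into
      - service: media_player.select_source
        target:
          entity_id: media_player.living_room_av
        data:
          source: "friday"

  - alias: "VPE idle: restore prior AV state"
    trigger:
      - platform: state
        entity_id: assist_satellite.living_room_vpe
        to: "idle"
    action:
      # Restore whatever the snapshot captured
      - service: scene.turn_on
        target:
          entity_id: scene.av_prior_state
```

The `scene.create` / `scene.turn_on` pair is what does the "save state / flip back" part; everything else is ordinary media player service calls against the universal entity.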
Hint: get a ground loop isolator for your VPE - you WILL need it on the Denon. (It whines like a turbo spooling whenever the LED goes if you don't…)
Oh BTW H… Music Assistant and divorcing Amazon fixed a lot of my gripes with Spotify on the Denon.
Nathan, my apologies, but I have not read your Friday trials-and-tribulations thread since I found it a week or two ago. Picking that thread up from the beginning seems daunting for someone starting out. There is a lot of information and a lot of rabbit holes for me specifically to get caught in.
With that said, I am unfamiliar with binding the PE and an output source into a universal player. This has been on my list to learn. To cheat a little: would you, in the briefest of overviews, explain how this might work?
I know I can go into the UI and create a helper which binds the entities into a universal player, but what benefits would this have over referencing the entities individually, other than saving a little bit of coding?
Because then you use normal media player controls whenever you want to use it, without having to custom-write every automation. You control one media player and represent it as such on cards, and your AI sees it the same way you do. It's about creating a single AUTHORITATIVE place to manage that media space and dressing it with context.
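To make the "single authoritative entity" idea concrete, here's a sketch of what a universal player wrapping the two entities might look like. The universal platform is YAML-configured; the entity ids below are placeholders, and the `commands`/`attributes` overrides shown are just one illustrative way to route control and state through the Denon.

```yaml
# Sketch of a universal player wrapping the receiver and the VPE.
# Entity ids are placeholders - substitute your own.
media_player:
  - platform: universal
    name: living_room_av
    unique_id: living_room_av
    children:
      # First active child wins, so order matters
      - media_player.denon_avr_x3500h
      - media_player.living_room_vpe
    commands:
      # Route power commands to the receiver
      turn_on:
        service: media_player.turn_on
        target:
          entity_id: media_player.denon_avr_x3500h
      turn_off:
        service: media_player.turn_off
        target:
          entity_id: media_player.denon_avr_x3500h
    attributes:
      # Report the receiver's state as the stack's state
      state: media_player.denon_avr_x3500h
```

Cards, automations, and the assistant then all talk to `media_player.living_room_av` instead of juggling the children individually.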
Thanks for your reply Nathan. Do you have a good reference for this bit of your explanation so that I might read more and learn? I am most interested in the save state bit.
VPE goes active, I pause playing media and save state of the universal. Switch to 'friday' input, then when it goes idle, flip back to prior state.
It's a riff on this: Redirect Voice PE Replies to Sonos. You may not use it exactly, but you can get enough from it on how to trigger when the VPE is active. Then either build a script or build it into your universal player.
Lol, I have been bantering back and forth with Snapjack on that thread, as we have both been working on ways to circumvent the built-in intent pipeline.
The problem is that the PE calls the intent pipeline explicitly at the start of the conversation, and the pipeline is explicitly designed to return a response directly to the entity that recorded the voice input.
When you try to output the response to a secondary entity rather than the recording entity, things get muddled. You end up trying to override the HA native pipeline, and things go awry quickly.
Your idea of a universal player is something I had stored in the back of my mind, but it would solve this problem by bundling the PE and the output source (two different entities) into one helper entity, which the pipeline would see as the same input and output source. If it works, brilliant.
Now I need to find some tinkering time amongst food prep, mowing, family, dogs, other life to-do items today. Ha!
P.S. This would all be made MUCH easier if the native pipeline had a built-in method for redirecting the output response to a different source rather than the input source.