I’ve used a similar configuration to get Google Home to respond to specific queries. We only use it for “where is dad?”, “where is mom?”, and “where is daughter?”, but I’m going to expand it to other sensor queries. The problem I had is that there was no way to specify which Google Home device should respond, so you had to have all of them respond. With 7 of these things around the house, that was less than ideal. I’ve shortened this example to two Google Homes just to keep it succinct. Also, using this with location tracking is a very complex endeavor; it could be greatly simplified if the intent was just to have a sensor reading spoken from a specific Google Home (more on that at the end).
This is a hacky, but so far reliable, way to have the Google Home that was asked be the only one that responds. It requires the Google Assistant integration with HA, Google Home Routines, and a music service integrated with Google Home. I use Google Play Music because I have a subscription. The one caveat is that you have to be able to upload personal music tracks.
For this example, I’m using the Dad finder with the Google Maps location sharing component, and I have a dummy Google account set up on the HA server. I’m sharing all family locations with this account (in the words of Dr. Zzz, “it isn’t stalking if it’s your family”). A rough sketch of that device_tracker setup is below, and then we’ll get to the script that will give us the answer.
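This is only a minimal sketch, assuming the username/password form of the google_maps device_tracker platform; the secret names are placeholders for the dummy account’s credentials:

device_tracker:
  - platform: google_maps
    username: !secret dummy_google_username  # the dummy account that all family locations are shared with
    password: !secret dummy_google_password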
Where’s dad
script:
  dad_finder:
    alias: 'dad finder'  # when imported into Google Assistant, this is the name Google Home will recognize as the script
    sequence:
      - delay: '00:00:08'  # longer delay needed than xbmcnut's 2 second recommendation; we have to allow enough time for the google home device selection automation to work
      - service: tts.google_say
        data_template:
          entity_id: '{{ google_home_device }}'  # this is the wildcard that will get the actual google home device from an automation we'll set up later
          message: >
            {% if is_state('device_tracker.google_maps_XXXXXXXXXX', 'not_home') %}
              {# Dad isn't in a pre-defined HA zone, so this reads out the address info #}
              Dad was last seen at {{ states.device_tracker.google_maps_XXXXXXXXXX.attributes.address }} at {{ as_timestamp(states.device_tracker.google_maps_XXXXXXXXXX.attributes.last_seen) | timestamp_custom('%I:%M %p') }}.
            {% else %}
              {# Dad IS in a pre-defined HA zone, so the name of that zone will be read #}
              Dad was last seen at {{ states('device_tracker.google_maps_XXXXXXXXXX') }} at {{ as_timestamp(states.device_tracker.google_maps_XXXXXXXXXX.attributes.last_seen) | timestamp_custom('%I:%M %p') }}.
            {% endif %}
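One quick note: tts.google_say assumes the Google TTS platform is already configured. If it isn’t, it’s just:

tts:
  - platform: google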
You will need to set up a Google Routine with every permutation you would expect your family to use for “Where’s dad?”. For whatever reason, we actually have to use last names in this query when we ask by formal name, otherwise we get a nonsensical web response. YMMV. Since wildcards aren’t allowed in routines so far, you’ll need to set up a routine (and script) for each member of the family you want to be able to track. And make sure the script domain is included in your Google Assistant integration with HA (sketch below).
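If you’re using the manual google_assistant integration, that part looks something like this (a sketch only; the project_id and the other exposed domains here are placeholders for your own setup):

google_assistant:
  project_id: your-ha-project-id  # from your Actions on Google project
  exposed_domains:
    - script  # needed so 'dad finder' and the other finder scripts show up in Google Assistant
    - switch
    - light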
You’ll want a Google Routine that does two things when asked “Where’s dad?”: first, play a short music track you uploaded to Google Play Music on the Google Home you just asked, and then run the script above.
I named the mp3 that I uploaded to GPM “Location Finder dad”. This music track is simply 9-10 seconds of recorded silence. Shorter may work if you have speedy internet, but 9-10 seconds worked perfectly for me.
I had to upload 3 separately recorded music tracks (all 9-10 seconds of silence, one for each member of the family) to EACH user’s GPM account. Even though we have a GPM family account, personal tracks can only be played in “my library”, which is tied to the individual user. When I tried to upload the exact same recording, just with a different name for each family member, GPM recognized them as the same track (due to exactly matching duration) and only kept one.
You see in the routine that the first step is “Play Location Finder dad in my library”. I know this seems needlessly convoluted, but welcome to the bleeding edge (and I sincerely hope that someone much further along with HA, Google Home, etc. responds to this by saying that I have waaayyyyy over-engineered this, and here’s the simple solution).
The next part of this is to set up a virtual sensor for each Google Home device that you want to be able to answer. Since we’re looking for the media title, we’ll need a value template that extracts the media_title attribute.
#Answering GH sensor for triggering automations
- platform: template
  sensors:
    office_home_answer:
      value_template: '{{ states.media_player.office_home_2.attributes.media_title }}'
- platform: template
  sensors:
    den_mini_answer:
      value_template: '{{ states.media_player.den_mini.attributes.media_title }}'
And now for the last part. This is the automation that populates entity_id: ‘{{ google_home_device }}’ in the script. It simply says that the last GH to play “Location Finder dad” is the active Google Home. You will need one automation per GH device per family member tracked, so a total of 6 automations in this example (I had 21: 7 GH devices x 3 people tracked).
#Select the office home for dad
- alias: Office Home dad finder to respond
  trigger:
    - platform: state
      entity_id: sensor.office_home_answer
      to: 'Location Finder dad'
  action:
    - service: script.dad_finder
      data:
        google_home_device: 'media_player.office_home_2'
#Select the den mini for dad
- alias: Den Mini dad finder to respond
  trigger:
    - platform: state
      entity_id: sensor.den_mini_answer
      to: 'Location Finder dad'
  action:
    - service: script.dad_finder
      data:
        google_home_device: 'media_player.den_mini'
#Select the office home for mom
- alias: Office Home mom finder to respond
  trigger:
    - platform: state
      entity_id: sensor.office_home_answer
      to: 'Location Finder mom'
  action:
    - service: script.mom_finder
      data:
        google_home_device: 'media_player.office_home_2'
#Select the den mini for mom
- alias: Den Mini mom finder to respond
  trigger:
    - platform: state
      entity_id: sensor.den_mini_answer
      to: 'Location Finder mom'
  action:
    - service: script.mom_finder
      data:
        google_home_device: 'media_player.den_mini'
#Select the office home for daughter
- alias: Office Home daughter finder to respond
  trigger:
    - platform: state
      entity_id: sensor.office_home_answer
      to: 'Location Finder daughter'
  action:
    - service: script.daughter_finder
      data:
        google_home_device: 'media_player.office_home_2'
#Select the den mini for daughter
- alias: Den Mini daughter finder to respond
  trigger:
    - platform: state
      entity_id: sensor.den_mini_answer
      to: 'Location Finder daughter'
  action:
    - service: script.daughter_finder
      data:
        google_home_device: 'media_player.den_mini'
As stated in the first paragraph, this is about as complex as it gets. If you were looking for a simple sensor response:
You’d need a script for each of those sensors, just like xbmcnut lays out, with some minor but important changes:
script:
  speak_garage_status:
    alias: 'Ask Google for Garage Status'
    sequence:
      - delay:
          seconds: 8  # xbmcnut's 2 second delay may be enough. If you get large gaps of silence, then shorten this time. GH had to go to the cloud to get location data, but sensor info is likely local
      - service: tts.google_say
        data_template:
          entity_id: '{{ google_home_device }}'  # this is the wildcard that will get the actual google home device
          # your sensor message goes here, per xbmcnut's original script; the garage entity below is just an example
          message: "The garage door is {{ states('cover.garage_door') }}."
You would record an 8-10 second mp3 for each sensor you’d want GH response capability for. You’d then have to upload those mp3s to each user’s GPM (or whatever service you’re using) library. Note: this may not be required if your music service allows you to share tracks across family members. This was truly a hassle to figure out, and annoying, especially since we have a GPM family account we’re paying for.
As to the Google Home setup, the same applies to setting up the garage sensor routine. Set up every permutation your family may use to ask about the garage status. Then set up the Garage Status routine to 1) play the mp3 above (I’m calling it Garage Status), and then 2) run the garage status script.
We’d use the same logic for the virtual sensors setup we did for the location finder, so to stick with xbmcnut’s example:
#Answering GH sensor for triggering automations
- platform: template
  sensors:
    kitchen_home_answer:
      value_template: '{{ states.media_player.kitchen_home.attributes.media_title }}'
- platform: template
  sensors:
    insignia_speaker_answer:
      value_template: '{{ states.media_player.insignia_speaker.attributes.media_title }}'
Then we’d need the automations. But since we’re calling a single script for the sensor, we only need one automation per GH:
#Select the kitchen home for garage status
- alias: Kitchen Home to respond to garage status request
  trigger:
    - platform: state
      entity_id: sensor.kitchen_home_answer
      to: 'Garage Status'  # this would be whatever the title of the mp3 we uploaded is for garage status
  action:
    - service: script.speak_garage_status
      data:
        google_home_device: 'media_player.kitchen_home'
#Select the insignia speaker for garage status
- alias: Insignia Speaker to respond to garage status request
  trigger:
    - platform: state
      entity_id: sensor.insignia_speaker_answer
      to: 'Garage Status'
  action:
    - service: script.speak_garage_status
      data:
        google_home_device: 'media_player.insignia_speaker'