Have Google home read out sensor state when asked?

I have greatly simplified my ask_google.yaml, and it can be found here: Using scripts directly with Google/Alexa. Google still says “Activating Pool Status” etc., hence the 2-second delay, but otherwise it’s very simple and works well.

I’ve used a similar configuration to get Google Home to respond to specific queries. We only use it for “where is dad?”, “where is mom?”, and “where is daughter?”, but I’m going to expand it to other sensor queries. The problem I had is that there was no way to specify which Google Home device should respond, so all of them had to respond. With seven of these things around the house, that was less than ideal. I’ve shortened this example to just two Google Homes to keep it succinct. Also, using this with location tracking is a very complex endeavor; it could be greatly simplified if the intent were just to have a sensor reading spoken from a specific Google Home (more on that at the end).

This is a hacky, but so far reliable, way to have the Google Home that was asked also be the only one that responds. It requires the Google Assistant integration with HA, Google Home Routines, and a music service integrated with Google Home. I use Google Play Music because I have a subscription. The one caveat is that the service has to let you upload personal music tracks.

For this example, I’m using the Dad finder with the Google Location component, and I have a dummy Google account set up on the HA server. I’m sharing all family locations with this account (in the words of Dr. Zzz, “it isn’t stalking if it’s your family”). So, starting with the script that will give us the answer:
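For reference, the location-sharing tracker for the dummy account can be sketched roughly like this (the `google_maps` platform name and the secret key are assumptions based on the legacy YAML setup of that era; newer HA versions configure this differently):

```yaml
# Sketch: device_tracker fed by Google location sharing.
# The dummy account is the one all family locations are shared with.
device_tracker:
  - platform: google_maps
    username: !secret dummy_google_username
```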

Where’s dad

script:
  dad_finder:
    alias: 'dad finder' #when imported into Google Assistant, this is the name Google Home will recognize as the script
    sequence:
      - delay: '00:00:08' #longer delay needed than xbmcnut's 2-second recommendation; we have to allow enough time for the Google Home device-selection automation to work

      - service: tts.google_say
        data_template:
          entity_id: '{{ google_home_device }}' #this is the wildcard that will get the actual Google Home device from an automation we'll set up later
          message: >
            {# Note: inside a block scalar, "#" comments would be read aloud, so Jinja comments are used instead #}
            {% if is_state('device_tracker.google_maps_XXXXXXXXXX', 'not_home') %}
              {# if Dad isn't at a pre-defined HA zone, then this will give address info #}
              Dad was last seen at {{ states.device_tracker.google_maps_XXXXXXXXXX.attributes.address }} at {{ as_timestamp(states.device_tracker.google_maps_XXXXXXXXXX.attributes.last_seen) | timestamp_custom('%I:%M %p') }}.
            {% else %}
              {# if Dad IS in a pre-defined HA zone, then the name of that zone will be read #}
              Dad was last seen at {{ states('device_tracker.google_maps_XXXXXXXXXX') }} at {{ as_timestamp(states.device_tracker.google_maps_XXXXXXXXXX.attributes.last_seen) | timestamp_custom('%I:%M %p') }}.
            {% endif %}

You will need to set up a Google Routine for every permutation your family might use for “Where’s dad?”. For whatever reason, we actually have to use last names in this query when we ask by formal name; otherwise we get a nonsensical web response. YMMV. Since wildcards aren’t allowed in routines so far, you’ll need to set up a routine (and script) for each member of the family you want to be able to track. And make sure that in your Google Assistant integration with HA you have scripts in the included domains.
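Exposing scripts in the manual Google Assistant integration looks roughly like this (the project_id value is a placeholder; if you use the Nabu Casa cloud instead, the equivalent filtering lives under `cloud:`):

```yaml
# Sketch: make sure the script domain is exposed to Google Assistant.
google_assistant:
  project_id: your-project-id   # placeholder for your own Actions project ID
  exposed_domains:
    - script
```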


You’ll want a Google Routine that does two things when asked “Where’s dad?”: play a short music track you uploaded to Google Play Music on the Google Home you just asked, and then run the script above.

I named the mp3 that I uploaded to GPM “Location Finder dad”. This music track is simply 9-10 seconds of recorded silence. Shorter may work if you have speedy internet, but 9-10 seconds worked perfectly for me.

I had to upload 3 separately recorded music tracks (all 9-10 seconds of silence, one for each member of the family) to EACH user’s GPM account. Even though we have a GPM family account, personal tracks can only be played in “my library”, which is tied to the individual user. When I tried to upload the exact same recording, just with a different name for each family member, GPM recognized them as the same track (due to the exactly matching duration) and kept only one.

You see in the routine that the first step is “Play Location Finder dad in my library”. I know this seems needlessly convoluted, but welcome to the bleeding edge (and I sincerely hope that someone much further along with HA, Google Home, etc. responds to this by saying that I have waaayyyyy over-engineered this, and here’s the simple solution).

The next part of this is to set up a virtual sensor for each Google Home device that you want to have the capability to answer. Since we’re looking for the media title, we’ll need a value template to extract the media_title attribute.

#Answering GH sensor for triggering automations
  - platform: template
    sensors:
      office_home_answer:
        value_template: '{{ states.media_player.office_home_2.attributes.media_title }}'

  - platform: template
    sensors:
      den_mini_answer:
        value_template: '{{ states.media_player.den_mini.attributes.media_title }}'

And now for the last part. This is the automation that will populate entity_id: ‘{{ google_home_device }}’ for the script. It simply says the last GH to play “Location Finder dad” is the active Google Home. You will need an automation per media device per family member tracked, so a total of 6 automations in this example (I had 21: 7 GH devices × 3 people tracked).

#Select the office home for dad
- alias: Office Home dad finder to respond
  trigger:

    - platform: state
      entity_id: sensor.office_home_answer
      to: 'Location Finder dad'

  action:
    - service: script.dad_finder
      data:
        google_home_device: 'media_player.office_home_2'


#Select the den mini for dad
- alias: Den Mini dad finder to respond
  trigger:

    - platform: state
      entity_id: sensor.den_mini_answer
      to: 'Location Finder dad'

  action:
    - service: script.dad_finder
      data:
        google_home_device: 'media_player.den_mini'

#Select the office home for mom
- alias: Office Home mom finder to respond
  trigger:

    - platform: state
      entity_id: sensor.office_home_answer
      to: 'Location Finder mom'

  action:
    - service: script.mom_finder
      data:
        google_home_device: 'media_player.office_home_2'


#Select the den mini for mom
- alias: Den Mini mom finder to respond
  trigger:

    - platform: state
      entity_id: sensor.den_mini_answer
      to: 'Location Finder mom'

  action:
    - service: script.mom_finder
      data:
        google_home_device: 'media_player.den_mini'

#Select the office home for daughter
- alias: Office Home daughter finder to respond
  trigger:

    - platform: state
      entity_id: sensor.office_home_answer
      to: 'Location Finder daughter'

  action:
    - service: script.daughter_finder
      data:
        google_home_device: 'media_player.office_home_2'


#Select the den mini for daughter
- alias: Den Mini daughter finder to respond
  trigger:

    - platform: state
      entity_id: sensor.den_mini_answer
      to: 'Location Finder daughter'

  action:
    - service: script.daughter_finder
      data:
        google_home_device: 'media_player.den_mini'

As stated in the first paragraph, this is about as complex as it gets. If you were looking for a simple sensor response:

You’d need a script for each of those sensors, just like xbmcnut lays out, with some minor but important changes:

script:
  speak_garage_status:
    alias: 'Ask Google for Garage Status'
    sequence:
      - delay:
          seconds: 8 #xbmcnut's 2-second delay may be enough; if you get large gaps of silence, shorten this time. GH had to go to the cloud for location data, but sensor info is likely local
      - service: tts.google_say
        data_template:
          entity_id: '{{ google_home_device }}' #this is the wildcard that will get the actual Google Home device
          message: >
            The garage door is {{ states('cover.garage_door') }}. {# cover.garage_door is a placeholder, substitute your own garage sensor entity #}

You would record an 8-10 second mp3 for each sensor you’d want GH response capability for. You’d then have to upload those mp3s to each user’s GPM (or whatever service you’re using) library. Note: this may not be required if your music service allows you to share tracks across family members. This was truly a hassle to figure out, and annoying, especially since we’re paying for a GPM family account.

As to the Google Home setup, the same applies to setting up the garage sensor routine. Set up every permutation your family may use to ask about the garage status. Then set up the Garage Status routine to 1) play the mp3 above (I’m calling it Garage Status), and then 2) turn on the garage status script.

We’d use the same logic as the virtual sensor setup we did for the location finder, so to stick with xbmcnut’s example:

#Answering GH sensor for triggering automations
  - platform: template
    sensors:
      kitchen_home_answer:
value_template: '{{ states.media_player.kitchen_home.attributes.media_title }}'

  - platform: template
    sensors:
      insignia_speaker_answer:
value_template: '{{ states.media_player.insignia_speaker.attributes.media_title }}'

Then we’d need the automation. But since we’re calling a script for the sensor we would be able to create just the one instance for each GH:

#Select the kitchen home for garage status
- alias: Kitchen Home to respond to garage status request
  trigger:

    - platform: state
      entity_id: sensor.kitchen_home_answer
      to: 'Garage Status' #this would be whatever the title of the mp3 we uploaded is for garage status

  action:
    - service: script.speak_garage_status
      data:
        google_home_device: 'media_player.kitchen_home'

#Select the insignia speaker for garage status
- alias: Insignia Speaker to respond to garage status request
  trigger:

    - platform: state
      entity_id: sensor.insignia_speaker_answer
      to: 'Garage Status'

  action:
    - service: script.speak_garage_status
      data:
        google_home_device: 'media_player.insignia_speaker'

Probably a little hacky, but I use a routine for reading out today’s weather and the like. I have motion sensors in close proximity to the Google Homes, and so use the last activated motion sensor to determine which one to send the message to. A binary sensor unfortunately doesn’t store this data, so an input_select needs to be updated to store that value.
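A minimal sketch of that idea, assuming made-up entity names (two motion sensors, two Google Homes): each motion trigger writes its paired media player into an input_select, which TTS scripts can then read.

```yaml
# Hypothetical entities: adjust the sensor and media_player names to your setup.
input_select:
  last_active_google_home:
    options:
      - media_player.kitchen_home
      - media_player.den_mini

automation:
  - alias: Track last active Google Home by motion
    trigger:
      - platform: state
        entity_id: binary_sensor.kitchen_motion
        to: 'on'
      - platform: state
        entity_id: binary_sensor.den_motion
        to: 'on'
    action:
      - service: input_select.select_option
        data_template:
          entity_id: input_select.last_active_google_home
          option: >
            {% if trigger.entity_id == 'binary_sensor.kitchen_motion' %}
              media_player.kitchen_home
            {% else %}
              media_player.den_mini
            {% endif %}
```

A TTS script could then target `states('input_select.last_active_google_home')` instead of a hard-coded media player.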


Hi all,

I’ve set up a person sensor, and I would like to know if it is possible to have Google declare its actual state (for example, not_home) and also the last zone entered.

Thank you

I must say I admire the creativity of the solution, combining Routines, sensors, automations, and scripts to achieve this. I wish there were a simpler solution (like “google_home_device” being set directly as a variable when the routine calls a script in HA).

I am no expert yet, but couldn’t this be extended by using data_template in the automations?

That way, if I understand this correctly, it could also be used to track which user is talking to the device, simply by setting another variable in the automation. Let’s say that you have a “Location Finder Dad” file that is ID3-tagged with “Artist = Ben” in your library, “Artist = Wife” in your wife’s library, and “Artist = Kid” in your kid’s library. You could add sensors with some magic like:

sensors: 
  office_home_answer:
    value_template: '{{ states.media_player.office_home_2.attributes.media_title }}'
  office_home_user:
    value_template: '{{ states.media_player.office_home_2.attributes.media_artist }}'
  den_mini_answer:
    value_template: '{{ states.media_player.den_mini.attributes.media_title }}'
  den_mini_user:
    value_template: '{{ states.media_player.den_mini.attributes.media_artist }}'

And then use the sensor-values in the templates for automation and script:

automation: 
  trigger:
    platform: state
    entity_id: sensor.den_mini_answer
    to: 'Location Finder Dad'
  action:
    service: script.dad_finder
    data_template: 
      google_home_device: 'media_player.den_mini'
      google_home_user: "{{ states('sensor.den_mini_user') }}"

And:

script: 
  dad_finder:
    sequence:
      - service: tts.google_say
        data_template:
          entity_id: '{{ google_home_device }}'
          message: >
            {% if google_home_user == 'Ben' %}
              something
            {% elif google_home_user == 'Wife' %}
              something else
            {% endif %}

I have not tested this yet, as I only have one device and am currently just setting this up, but is there any reason it should not work?


Could you share the example please? 🙂

Hi, I have a working setup of the Google Assistant integration with my HA (Docker).
I’ve linked it with the Google Home Android app and populated some sensors and switches that can be seen in the app. Switches can be managed (on/off/brightness) without issues.
The problem is that SENSORS don’t show their current values. I can see only the entity but not the states (for Puerta) / values (for temperature and humidity). (see image)
I have the component in debug, and I can see the updates to the API.

Any help with solving this problem?

2020-01-07 14:56:49 DEBUG (MainThread) [homeassistant.components.google_assistant.http] Response on https://homegraph.googleapis.com/v1/devices:reportStateAndNotification with data {'requestId': 'XXXXXXXXXXXXXXXXXXXXXXXX', 'agentUserId': 'AAAAAAAAAAAAAAAAA', 'payload': {'devices': {'states': {'sensor.termostato_temperature': {'online': True, 'thermostatTemperatureAmbient': 20.1}}}}} was {
  "requestId": "XXXXXXXXXXXXXXXXXXXXXXXX"
}



I logged in, hell, I created an account, just to say: you, sir, are an absolute genius. As a software engineer and someone who is just starting to taste the power of HA, I applaud your efforts.

Now the question is… do I set this up now, or wait for a simpler solution…

The question is: is a simpler solution available?


Sam, is there a simpler solution? Following this thread, I am more confused.

Is there any other solution to this? Would it be possible to change the Home Assistant Google Assistant integration to see which device triggered the script in HA?