Have Google home read out sensor state when asked?

Not quite sure what you mean when you say Google can ‘see’ your switch but Home Assistant can’t. You could look into creating a template switch for your command line switch, which may be better and simpler than my method; template switches show up in both Google and Alexa via the HA cloud.

You need to get to a point where HA knows the state of your ‘filtro_piscina’. Once it does, you can use my method to query Google by asking it to turn on a fake switch. That fake switch could be the new template switch, which I’m going to try at some point.

It will be something like eWeLink, which has native Google Assistant support but isn’t supported by HA without setting up IFTTT and binary switches/sensors, or flashing with Tasmota.

The Meross switch has native Google Home support but no Home Assistant support.
So, as I wrote, I have to use Google Assistant Webserver with HA in order to create a switch in HA using a text command. That gives me the possibility to use the Meross in HA.
But this method gives no feedback about whether the switch is on or off.

So how can I “ask” GA via GH? You wrote about a script for the garage, but you didn’t publish it. Could you post it?

Thanks a lot.

The garage script is exactly the same as the pool one, just with a different entity ID. Have Google home read out sensor state when asked? - #24 by xbmcnut. When I did this, Google and Alexa could only control switches, so you need to get a switch into Home Assistant that you can voice control, then run a script when that switch is activated. I believe Google (and Alexa) may now be able to control input_booleans, and if so, that would be easier.

So the steps are:

  1. Have the device you need to get the feedback on reporting its state to Home Assistant.

  2. Create a fake switch that you can turn on with voice control. (My method, input_boolean if controllable with Google/Alexa or maybe a template switch).

  3. Use that fake switch to run a script that grabs the value from your device and reads it aloud using the TTS engine.

I just played around with creating a bogus switch and it works.

switch:
  - platform: template
    switches:
      pool_status:
        friendly_name: "Ask Google for the Pool Temperature"
        value_template: "{{ is_state('input_boolean.pool_status', 'on') }}"
        turn_on:
          service: script.pool_status
        turn_off:
          service: script.none

There is no input_boolean defined and no script called script.none, yet no errors are thrown and it works a treat. Turning on this switch reads out the pool temperature, and the switch turns itself off, ready to use again. Much easier than setting up the whole demo platform.

Welcome to any ideas on a better way to set up a fake switch.


Furthermore, I have confirmed that input_booleans work directly with Google Home and Alexa via the HA cloud. No need for fake switches at all. Here is a high tide example that gets Google or Alexa to tell me the time of the next high tide. Cool.

input_boolean:
  next_high_tide:
    name: Next high tide by Google
    icon: mdi:swim

script:
  speak_next_high_tide:
    alias: 'Ask Google for next High Tide time'
    sequence:
    - service: tts.google_say
      entity_id:
        - media_player.kitchen_home
        - media_player.gaming_room_home
      data_template:
        message: >-
          The next high tide is at {{ as_timestamp(states.sensor.onetaunga_bay_tides.attributes.high_tide_time_utc) | timestamp_custom("%H:%M") }}

automation:
  - alias: 'Check Next High Tide'
    trigger:
      - platform: state
        entity_id: input_boolean.next_high_tide
        from: 'off'
        to: 'on'
    action:
      - delay:
          seconds: 2
      - service: script.turn_on
        entity_id: script.speak_next_high_tide
      - service: input_boolean.turn_off
        entity_id: input_boolean.next_high_tide

Then add the following to your cloud config:

cloud:
  google_actions:
    filter:
      include_entities:
        - input_boolean.next_high_tide

  alexa:
    filter:
      include_entities:
        # - switch.pool_light
        - input_boolean.next_high_tide

Lastly, say “OK Google, sync my devices” to add the input_boolean, and discover devices in Alexa to have it added there. Then give the device a nickname in each platform and create a routine in Google so you can ask for the high tide using different phrases.


Is there a way to turn on the input_boolean without it responding “sure, turning on…”? I thought that was the point of using a boolean rather than just exposing the script to GH. Otherwise you could just make sure that scripts are exposed to Google like this.

google_assistant:
  project_id: something_something
  api_key: <its mine and I'm not telling>
  exposed_domains:
    - switch
    - light
    - group
    - script

With scripts you can just say “Turn on…” or “activate…” and GH responds with “sure, activating…”, but I want just my TTS without that part.

Not sure what you’re looking for, but if you want a customized request and response, you could use IFTTT, and say something like “what is the current water temperature?”.

As for the state of certain devices, you should be able to just ask, “are my kitchen lights on?” and GH will respond with the state.

My end goal is to ask Google “What is the battery percentage of my car …” and have it respond only with the sensor value. I was trying to avoid using IFTTT because it adds an extra level of complication.


Anyone clever enough to figure out how I could add a condition to my TTS automation to detect which Google Home speaker was just activated, and to speak back only to that one?

I have greatly simplified my ask_google.yaml, and it can be found here. Using scripts directly with Google/Alexa. Google still says “Activating Pool Status” etc., hence the 2 s delay, but otherwise it’s very simple and works well.
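For readers who haven’t followed the link, a directly exposed script of that shape might look like the following minimal sketch. The entity IDs (`sensor.pool_temperature`, `media_player.kitchen_home`) are assumptions for illustration, not the contents of the actual ask_google.yaml:

```yaml
# Minimal sketch of a script exposed directly to Google/Alexa.
# Entity names here are placeholders; substitute your own.
script:
  pool_status:
    alias: 'Pool Status'
    sequence:
      - delay:
          seconds: 2  # let Google finish saying "Activating Pool Status" first
      - service: tts.google_say
        entity_id: media_player.kitchen_home
        data_template:
          message: >-
            The pool temperature is {{ states('sensor.pool_temperature') }} degrees.
```

Saying “OK Google, activate Pool Status” would then run the script, and the delay keeps the confirmation phrase from talking over the TTS.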

I’ve used a similar configuration to get Google Home to respond to specific queries. We only use it for “where is dad?”, “where is mom?”, “where is daughter?”, but I’m going to expand it to other sensor queries.

The problem I had is that there was no way to specify which Google Home device should respond, so all of them had to respond. With seven of these things around the house, that was less than ideal. I’ve shortened this example to two Google Homes to keep it succinct. Also, using this with location tracking is a very complex endeavor; it could be greatly simplified if the intent were just to have a sensor reading spoken from a specific Google Home (more on that at the end).

This is a hacky, but so far reliable, way to have the Google Home that was asked be the only one that responds. It requires the Google Assistant integration with HA, Google Home routines, and a Google Home-integrated music service. I use Google Play Music because I have a subscription. The one caveat is that you have to be able to upload personal music tracks.

For this example, I’m using the dad finder with the Google Location component, and I have a dummy Google account set up on the HA server. I’m sharing all family locations with this account (in the words of Dr. Zzz, “it isn’t stalking if it’s your family”). So, starting with the script that will give us the answer:

Where’s dad

script:
  dad_finder:
    alias: 'dad finder' # when imported into Google Assistant, this is the name Google Home will recognize as the script
    sequence:
      - delay: '00:00:08' # longer than xbmcnut's 2 second recommendation; allows time for the Google Home device-selection automation to run
      - service: tts.google_say
        data_template:
          entity_id: '{{ google_home_device }}' # wildcard filled in by an automation we'll set up later
          # If Dad isn't in a pre-defined HA zone, read the address; if he is, read the zone name.
          message: >
            {% if is_state('device_tracker.google_maps_XXXXXXXXXX', 'not_home') %}
              Dad was last seen at {{ states.device_tracker.google_maps_XXXXXXXXXX.attributes.address }} at {{ as_timestamp(states.device_tracker.google_maps_XXXXXXXXXX.attributes.last_seen) | timestamp_custom('%I:%M %p') }}.
            {% else %}
              Dad was last seen at {{ states('device_tracker.google_maps_XXXXXXXXXX') }} at {{ as_timestamp(states.device_tracker.google_maps_XXXXXXXXXX.attributes.last_seen) | timestamp_custom('%I:%M %p') }}.
            {% endif %}

You will need to set up a Google routine with every permutation you would expect your family to use for “Where’s dad?”. For whatever reason, we actually have to use last names in this query when we ask by formal name…otherwise we get a nonsensical web response. YMMV. Since wildcards aren’t allowed in routines so far, you’ll need to set up a routine (and script) for each family member you want to track. And make sure your Google Assistant integration with HA has scripts in the included domains.


You’ll want a Google routine that does two things when asked “Where’s dad?”: first, play a short music track you uploaded to Google Play Music on the Google Home you just asked, and then run the script above.

I named the mp3 that I uploaded to GPM “Location Finder dad”. This music track is simply 9-10 seconds of recorded silence. Shorter may work if you have speedy internet, but 9-10 seconds worked perfectly for me.

I had to upload 3 separately recorded music tracks (all 9-10 seconds of silence, one for each member of the family) to EACH user’s GPM account. Even though we have a GPM family account, personal tracks can only be played in “my library”, which is tied to the individual user. When I tried to upload the exact same recording, just with a different name for each family member, GPM recognized them as the same track (due to exactly matching duration) and only kept one.

You see in the routine that the first step is “Play Location Finder dad in my library”. I know this seems needlessly convoluted, but welcome to the bleeding edge (and I sincerely hope that someone much further along with HA, Google Home, etc. responds to this by saying that I have waaayyyyy over-engineered this, and here’s the simple solution).

The next part of this is to set up a virtual sensor for each Google Home device that you want to have the capability to answer. Since we’re looking for media title, we’ll need to use a value template to extract attribute media_title.

# Answering GH sensors for triggering automations
  - platform: template
    sensors:
      office_home_answer:
        value_template: '{{ states.media_player.office_home_2.attributes.media_title }}'
      den_mini_answer:
        value_template: '{{ states.media_player.den_mini.attributes.media_title }}'

And now for the last part. This is the automation that populates entity_id: ‘{{ google_home_device }}’ in the script. It simply says the last GH to play “Location Finder dad” is the active Google Home. You will need one automation per media device per family member tracked, so a total of 6 automations in this example (I had 21: 7 GH devices × 3 people tracked).

#Select the office home for dad
- alias: Office Home dad finder to respond
  trigger:

    - platform: state
      entity_id: sensor.office_home_answer
      to: 'Location Finder dad'

  action:
    - service: script.dad_finder
      data:
        google_home_device: 'media_player.office_home_2'


#Select the den mini for dad
- alias: Den Mini dad finder to respond
  trigger:

    - platform: state
      entity_id: sensor.den_mini_answer
      to: 'Location Finder dad'

  action:
    - service: script.dad_finder
      data:
        google_home_device: 'media_player.den_mini'

#Select the office home for mom
- alias: Office Home mom finder to respond
  trigger:

    - platform: state
      entity_id: sensor.office_home_answer
      to: 'Location Finder mom'

  action:
    - service: script.mom_finder
      data:
        google_home_device: 'media_player.office_home_2'


#Select the den mini for mom
- alias: Den Mini mom finder to respond
  trigger:

    - platform: state
      entity_id: sensor.den_mini_answer
      to: 'Location Finder mom'

  action:
    - service: script.mom_finder
      data:
        google_home_device: 'media_player.den_mini'

#Select the office home for daughter
- alias: Office Home daughter finder to respond
  trigger:

    - platform: state
      entity_id: sensor.office_home_answer
      to: 'Location Finder daughter'

  action:
    - service: script.daughter_finder
      data:
        google_home_device: 'media_player.office_home_2'


#Select the den mini for daughter
- alias: Den Mini daughter finder to respond
  trigger:

    - platform: state
      entity_id: sensor.den_mini_answer
      to: 'Location Finder daughter'

  action:
    - service: script.daughter_finder
      data:
        google_home_device: 'media_player.den_mini'

As stated in the first paragraph, this is about as complex as it gets. If you were looking for a simple sensor response, you’d need a script for each of those sensors, just as xbmcnut lays out, with some minor but important changes:

script:
  speak_garage_status:
    alias: 'Ask Google for Garage Status'
    sequence:
    - delay:
        seconds: 8 # xbmcnut's 2 second delay may be enough; if you get large gaps of silence, shorten this. GH had to go to the cloud for location data, but sensor info is likely local.
    - service: tts.google_say
      entity_id: '{{ google_home_device }}' # wildcard filled in by the answering automation
      data_template:
        message: "The garage door is {{ states('cover.garage_door') }}." # example message only; cover.garage_door is a placeholder for your own entity

You would record an 8-10 second mp3 for each sensor you want GH response capability for. You’d then have to upload those mp3s to each user’s GPM (or whatever service you’re using) library. Note: this may not be required if your music service allows you to share tracks across family members. This was truly a hassle to figure out, and annoying, especially since we’re paying for a GPM family account.

As for the Google Home setup, the same applies to the garage sensor routine. Set up every permutation your family may use to ask about the garage status. Then set up the Garage Status routine to 1) play the mp3 above (I’m calling it Garage Status), and 2) turn on the garage status script.

We’d use the same logic for the virtual sensors setup we did for the location finder, so to stick with xbmcnut’s example:

# Answering GH sensors for triggering automations
  - platform: template
    sensors:
      kitchen_home_answer:
        value_template: '{{ states.media_player.kitchen_home.attributes.media_title }}'
      insignia_speaker_answer:
        value_template: '{{ states.media_player.insignia_speaker.attributes.media_title }}'

Then we’d need the automation. But since we’re calling a script for the sensor we would be able to create just the one instance for each GH:

#Select the kitchen home for garage status
- alias: Kitchen Home to respond to garage status request
  trigger:

    - platform: state
      entity_id: sensor.kitchen_home_answer
      to: 'Garage Status' #this would be whatever the title of the mp3 we uploaded is for garage status

  action:
    - service: script.garage_status
      data:
        google_home_device: 'media_player.kitchen_home'

#Select the insignia speaker for garage status
- alias: Insignia Speaker to respond to garage status request
  trigger:

    - platform: state
      entity_id: sensor.insignia_speaker_answer
      to: 'Garage Status'

  action:
    - service: script.garage_status
      data:
        google_home_device: 'media_player.insignia_speaker'

Probably a little hacky, but I use a routine for reading out the day’s weather and the like. I have motion sensors in close proximity to the Google Homes, and use the last activated motion sensor to determine which one to send the message to. A binary sensor unfortunately doesn’t store this data, so an input_select needs to be updated to store that value.
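As a rough sketch of that idea (all entity names below are hypothetical, not from the post above), the input_select could be kept up to date like this:

```yaml
# Hypothetical sketch: remember the speaker nearest the last motion event.
# Entity names are assumptions; substitute your own sensors and players.
input_select:
  last_active_speaker:
    options:
      - media_player.kitchen_home
      - media_player.office_home

automation:
  - alias: 'Remember nearest speaker (kitchen)'
    trigger:
      - platform: state
        entity_id: binary_sensor.kitchen_motion
        to: 'on'
    action:
      - service: input_select.select_option
        data:
          entity_id: input_select.last_active_speaker
          option: media_player.kitchen_home
```

A TTS script can then target the stored speaker with entity_id: "{{ states('input_select.last_active_speaker') }}", so only the room that last saw motion answers. One such automation is needed per motion sensor/speaker pair.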


Hi all,

I’ve set up a person sensor, and I would like to know if it is possible to have Google declare its actual state (for example, not_home) as well as the last zone entered.

Thank you

I must say I admire the creativity of this solution, combining routines, sensors, automations, and scripts. I wish there were a simpler way (like “google_home_device” being set directly as a variable when the routine calls a script in HA).

I am no expert yet, but couldn’t this be extended by using data_template in the automations?

That way, if I understand this correctly, it could also be used to track which user is talking to the device, simply by setting another variable in the automation. Say you have a “Location Finder Dad” file that is ID3-tagged with “Artist = Ben” in your library, “Artist = Wife” in your wife’s library, and “Artist = Kid” in your kid’s library; you could then add sensors with some magic like

sensors:
  office_home_answer:
    value_template: '{{ states.media_player.office_home_2.attributes.media_title }}'
  office_home_user:
    value_template: '{{ states.media_player.office_home_2.attributes.media_artist }}'
  den_mini_answer:
    value_template: '{{ states.media_player.den_mini.attributes.media_title }}'
  den_mini_user:
    value_template: '{{ states.media_player.den_mini.attributes.media_artist }}'

And then use the sensor-values in the templates for automation and script:

automation:
  - trigger:
      - platform: state
        entity_id: sensor.den_mini_answer
        to: 'Location Finder Dad'
    action:
      - service: script.dad_finder
        data_template:
          google_home_device: 'media_player.den_mini'
          google_home_user: "{{ states('sensor.den_mini_user') }}"

And:

script:
  dad_finder:
    sequence:
      - service: tts.google_say
        data_template:
          entity_id: '{{ google_home_device }}'
          message: >
            {% if google_home_user == 'Ben' %}
              something
            {% elif google_home_user == 'Wife' %}
              something else
            {% endif %}

I have not tested this yet, as I only have one device and am currently just setting this up, but is there any reason it should not work?


Could you share the example please? 🙂

Hi, I have a working setup of the Google Assistant integration with my HA (Docker).
I’ve linked it with the Google Home Android app and populated some sensors and switches that can be seen in the app. Switches can be managed (on/off/brightness) without issues.
The problem is that sensors don’t show their current values. I can see only the entity, but not the state (for Puerta) or the values (for temperature and humidity).
I have the component in debug, and I can see the updates going to the API.

Any help about the solution for this problem??

2020-01-07 14:56:49 DEBUG (MainThread) [homeassistant.components.google_assistant.http] Response on https://homegraph.googleapis.com/v1/devices:reportStateAndNotification with data {'requestId': 'XXXXXXXXXXXXXXXXXXXXXXXX', 'agentUserId': 'AAAAAAAAAAAAAAAAA', 'payload': {'devices': {'states': {'sensor.termostato_temperature': {'online': True, 'thermostatTemperatureAmbient': 20.1}}}}} was {
  "requestId": "XXXXXXXXXXXXXXXXXXXXXXXX"
}



I logged in (in fact, I created an account) just to say: you, sir, are an absolute genius. As a software engineer and someone who is just starting to taste the power of HA, I applaud your efforts.

Now the question is… do I set this up now, or wait for a simpler solution…

The question is: is a simpler solution available?


Sam, is there a simpler solution? Following this thread, I am more confused.