Hi all, I couldn’t find anything in the community on this so creating a new topic, any pointers greatly appreciated.
I currently have Google Home voice assistant in my Home Assistant setup, via Nabu Casa, and mostly it works. However, there are a number of sensors that I’d like to have voice control over, but they are marked as “not supported by this voice assistant”, and I don’t understand how to troubleshoot this.
For example, I have a Kitchen Temperature sensor which is supported in Google, e.g. I can say “Hey Google what is the kitchen temperature?”.
However, I would like to expose my blood glucose measurement so that I or my family can say “Hey Google, what is Julian’s blood sugar?”. This is provided by the LibreLinkUp custom integration (GitHub: jrsmile/librelinkup). However, the sensor says it is not supported. I wondered whether it was because Google could not interpret the units (mmol/L), so I have experimented with template sensors, including one that uses degrees centigrade as its unit, and with input_numbers, but none of these are supported either.
So, I guess what I’m asking is: does anyone know the criteria for whether a sensor is supported by the Google Assistant integration, have suggestions on what I’m doing wrong, or know of logs or config files I can interrogate to troubleshoot this?
I can provide YAML for the various sensors and/or log entries if of use. Thanks in advance.
Version info: (though I’ve had this issue for around a year across various versions)
In desperation I’ve been looking at the HA source code where this message appears and tagging one of the developers @piitaya – Paul, if you could explain where I could find the logic for why some sensors are unsupported by Google, or point me to some code that might explain it, I would be really grateful.
Thanks for your reply, Paul. I now have it working. TL;DR: to expose a template sensor, you need to add a device_class to the entity via the customize: section, in addition to the template definition itself.
Specifically, I’m trying to expose an entity that is measured in mmol/L, and I figured the units were why it couldn’t be exposed. I therefore created a template sensor that exposes the value as a temperature rather than a glucose reading, with the following:
```yaml
- sensor:
    - name: My Glucose As Temperature
      unique_id: my_glucose_as_temperature
      unit_of_measurement: "°C"
      state_class: measurement
      state: >-
        {{ states('sensor.julian_lawson_glucose_measurement') | float }}
```
If I look at this sensor in Developer Tools/States, it looks the same as my Kitchen Temperature entity (which is exposed successfully to Google), except for not having a device class:
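For reference, this is the customize entry that made the difference for me. It goes in configuration.yaml (the entity ID is from my setup, so adjust it for yours):

```yaml
# configuration.yaml — assign a device_class so the Google Assistant
# integration recognises the entity as a temperature sensor
homeassistant:
  customize:
    sensor.my_glucose_as_temperature:
      device_class: temperature
```

After a restart the entity shows device_class: temperature in Developer Tools/States and becomes exposable to Google.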
In case someone finds this topic hoping for a solution: the workaround is to create a script that uses TTS to have Google Assistant/Home read the state of the sensor, then create a routine in Google Home.
Example:
I have a Tile on my dog’s collar.
Using Bermuda and/or ESPresense, with a BLE proxy or another device of the same kind, I can see where he is every time he wants to go outside.
The problem is that the Google Assistant integration, as @piitaya shared, is limited to specific values/states.
But by creating a script in HA, using a template that reads the state of the sensor through TTS - e.g. “The dog is in the {{ states('sensor.dog_area') }}” - the Google speakers can read the value/state of the sensor out loud.
After that you only have to expose that script, create a Routine/Automation in Google Home with whatever message you want to ask Google Assistant, and add the Scene=Script as the answer.
Ex:
“Hey Google, where’s the dog?”
“The dog is in the ______”
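To make the above concrete, here is a sketch of such a script. The entity IDs (sensor.dog_area from Bermuda/ESPresense, the Nabu Casa cloud TTS entity, and the speaker) are assumptions from my setup; adjust them to yours:

```yaml
# scripts.yaml — sketch only; assumes a Bermuda/ESPresense area sensor
# called sensor.dog_area and a Nabu Casa cloud TTS entity
where_is_the_dog:
  alias: Where is the dog?
  sequence:
    - action: tts.speak
      target:
        entity_id: tts.home_assistant_cloud
      data:
        media_player_entity_id: media_player.living_room
        message: >
          The dog is in the {{ states('sensor.dog_area') }}
```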
Hi Ostracizado, this sounds great. One thing I don’t understand is how to get the TTS to respond to the Google device that asked the question. I’ve set up a script to read out my glucose on a specific device, but ideally I’d like to be able to send the TTS to the triggering device. Were you able to do this or did you just send to a specific device?
For reference if anyone else is trying this, here is the script I used:

```yaml
sequence:
  - action: tts.speak
    metadata: {}
    target:
      entity_id: tts.home_assistant_cloud
    data:
      cache: false
      message: >
        Your glucose is {{ states('sensor.julian_lawson_measurement') | round(1) }}
        and {{ states('sensor.julian_lawson_trend') }}
      media_player_entity_id: media_player.juba_squeezelite
alias: What is my blood sugar?
description: ""
```
I then exposed the script to Google Assistant (Script Settings->Voice Assistants->Expose). For some reason the script didn’t appear if I went to HA Settings->Voice Assistants->Expose.
I then went to Google Assistant Settings, New Routine, Household, Starter = “When I Say”, Add Action, Adjust Home Devices, Add Scenes, select my Script, Save.
Note that if you try doing this through the Google Home app it’s slightly different (thanks Google!): Add->Automation->Hamburger menu->Previous Household Editor then add starter and action.
This thread has some more commentary on getting this working if that is useful
EDIT: I updated the script to figure out where I am (using the Room Assistant integration) and send the message to the nearest speaker, or if I’m not near a speaker to send a TTS notification to my phone.
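For anyone wanting to do something similar, here is a sketch of the “nearest speaker” logic. It assumes a room-presence sensor (sensor.julian_room) whose state matches the keys in the mapping, plus a mobile-app notify service as the fallback; all entity and service names here are placeholders from my setup:

```yaml
# Sketch: pick a speaker based on a room-presence sensor, falling back
# to a phone notification when I'm not near any mapped speaker.
sequence:
  - variables:
      speakers:
        kitchen: media_player.kitchen_speaker
        office: media_player.office_speaker
      room: "{{ states('sensor.julian_room') }}"
      msg: >
        Your glucose is {{ states('sensor.julian_lawson_measurement') | round(1) }}
  - choose:
      - conditions: "{{ room in speakers }}"
        sequence:
          - action: tts.speak
            target:
              entity_id: tts.home_assistant_cloud
            data:
              media_player_entity_id: "{{ speakers[room] }}"
              message: "{{ msg }}"
    default:
      - action: notify.mobile_app_julian_phone
        data:
          message: "{{ msg }}"
```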