Updated to include area mapping. Thanks to @TheFes for the code.
Updated to add null characters to the virtual bulb config.
Updated to generate a spoken error message. Thanks to @HairyPorter23 for the code.
The integration of Google Home (GH) and Home Assistant (HA) has a key limitation: it lacks “source speaker” metadata. When a GH voice routine requests sensor information from HA, it cannot identify which device made the request, so it cannot respond on the correct speaker.
This issue can be resolved by using Google’s Room-Awareness feature to activate location-specific input buttons. This enables a straightforward HA automation to determine the source room and respond to the appropriate speaker. A Virtual Template Light is utilised to manage multiple template text sensors, which contain the replies. These are indexed to the light’s brightness level.
A simple GH routine can be set up to trigger a “generic” input button and adjust the brightness of the virtual bulb.
Because Google Home is “room aware”, the correct input button is activated. This button initiates a single HA automation that identifies the room and selects the right speaker. The brightness level determines the correct response through the text sensors.
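To make the indexing concrete, here is a worked sketch of the maths (using the naming scheme described below): Home Assistant stores brightness internally on a 0-255 scale, so setting the bulb to 1% stores a brightness attribute of about 3; dividing by 2.55 and rounding recovers the index 1, which selects sensor.1_voice.

```yaml
# Brightness attribute 3 (≈1%): 3 / 2.55 = 1.18, which rounds to 1
{% set index = (state_attr('light.virtual', 'brightness') | int(0) / 2.55) | round(0) %}
{{ 'sensor.' ~ index ~ '_voice' }}  {# maps to sensor.1_voice #}
```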
Create the Entities in Home Assistant
You can use the Helper UI (Settings > Devices & Services > Helpers) to create most of these components, but the Virtual Light is the exception.
- Input Buttons: Create buttons (e.g., kitchen_answer, livingroom_answer) and assign them to their respective Areas in HA.
- Google Cast Speakers: Ensure your Google speakers are also assigned to the same HA Areas as their corresponding buttons.
- Template Sensors: Create sensors explicitly named 1_voice, 2_voice, 3_voice, etc., via the Template Sensor Helper. The number in the sensor name corresponds to the brightness percentage of the virtual bulb (e.g., 1% triggers sensor.1_voice). In the “State Template” box, enter your message logic (e.g., The car is at {{ states('sensor.car_battery') }} percent).
- Virtual Light: Define this in configuration.yaml. Note: You cannot use the Helper UI to create this light because the UI does not support creating a template light with the custom brightness settings and empty service calls required for this routing logic.
```yaml
template:
  - light:
      - name: "virtual"
        unique_id: "virtual"
        turn_on: []
        turn_off: []
        set_level: []
```
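If you prefer YAML over the Helper UI, an equivalent template sensor definition might look like the following (a sketch using the car-battery example above; adjust the name and source sensor to your setup):

```yaml
template:
  - sensor:
      - name: "1_voice"
        state: >
          The car is at {{ states('sensor.car_battery') }} percent
```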
Connect and Sync to Google Home
To make these entities visible in the Google Home app, you need to bridge them using one of the following:
- Nabu Casa (Home Assistant Cloud): The easiest “one-click” method.
- Matter Bridge: Uses the Matter router in Google devices to expose entities locally.
- Google Assistant integration: A free method using a Google Cloud project.
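For the manual Google Assistant integration route, a minimal configuration.yaml sketch looks something like the block below. The project ID and service-account filename are placeholders for values from your own Google Cloud project, and the entity list should match the buttons you created.

```yaml
google_assistant:
  project_id: your-gcp-project-id
  service_account: !include SERVICE_ACCOUNT.json
  expose_by_default: false
  entity_config:
    input_button.kitchen_answer:
      expose: true
    input_button.livingroom_answer:
      expose: true
    light.virtual:
      expose: true
```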
The Crucial Step: Once synced, go into the Google Home App and move each input_button into its corresponding room (e.g., input_button.kitchen_answer goes into the “Kitchen” room).
Set up the Google Home Routine
For every question you want to ask, create a routine in the Google Home app:
- Voice Starter: “What is the car battery level?”
- Action 1: Create a custom command, “Turn on answer” (Google’s room-awareness triggers only the button in your current room).
- Action 2: Turn on the virtual bulb and set it to 1% (change the percentage to match the target voice sensor).
Automation (with Room Mapping)
This automation routes the audio to the specific media player associated with the triggered button. Change input buttons as required.
I’m using tts.speak and tts.google_translate_en_com for the text-to-speech. Change or install as required.
```yaml
alias: Google Voice Response Handler
triggers:
  - trigger: state
    entity_id:
      - input_button.kitchen_answer
      - input_button.livingroom_answer
      - input_button.bedroom_answer
      - input_button.frontroom_answer
actions:
  - action: tts.speak
    target:
      entity_id: tts.google_translate_en_com
    data:
      media_player_entity_id: >
        {% set area = area_id(trigger.entity_id) %}
        {{ integration_entities('cast') | select('in', area_entities(area)) | first }}
      cache: true
      message: >
        {% set index = (state_attr('light.virtual', 'brightness') | int(0) / 2.55) | round(0) %}
        {% set entity = "sensor." ~ index ~ "_voice" %}
        {% if has_value(entity) %}
          {{ states(entity) }}
        {% else %}
          Error detected. No voice sensor found with Index value {{ index }}
        {% endif %}
  - action: light.turn_off
    target:
      entity_id: light.virtual
```
Understanding the Mapping:
The automation identifies the area_id of the triggered button and filters all cast integration entities to find one residing in that same area.
Note a key limitation: this logic is designed for areas with exactly one Cast speaker. Because the template uses the | first filter, if multiple speakers exist in one HA Area, the automation will always default to the first one found in the list rather than distinguishing between them.
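If an area does have more than one Cast speaker, one possible workaround (an untested sketch, not part of the original guide) is to drop the | first filter and pass the whole list, so the announcement plays on every Cast speaker in the room:

```yaml
media_player_entity_id: >
  {% set area = area_id(trigger.entity_id) %}
  {{ integration_entities('cast') | select('in', area_entities(area)) | list }}
```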
I hope the above is of some use. I have tried volume hacks and sending silent tones to identify the speakers, but I could never make these work reliably.
You may want to add an announcement in the GH routine saying something like ‘Please wait’ at the beginning to slow down the interaction, as sometimes Google speakers miss the first couple of words.
I don’t use the automation editor; I use Node-RED, so please let me know if the automation can be improved/simplified, as I used Google Gemini to help convert my Node-RED flows.

