Just after I saw the AI integration, this idea came to mind. Sadly, I don't really know how to code; I just use the visual way to create automations.
I am not sure if it is possible, but it would be amazing if my speakers could say the responses created by Google AI.
This is where I am at:
alias: Google AI tells history
description: ""
trigger:
  - platform: time_pattern
    hours: "2"
    minutes: "5"
    seconds: "30"
condition: []
action:
  - service: google_generative_ai_conversation.generate_content
    metadata: {}
    data:
      prompt: >-
        Pick a random date, and tell me about a significant event in history
        that happened on that day.
    response_variable: ai_response
  - service: tts.cloud_say
    entity_id: media_player.living_room
    data_template:
      message: null
mode: single
I have a lot of similar automations. Just use service: conversation.process for the prompt, and then use message: "{{ ai_response.response.speech.plain.speech }}" for the tts. That's it.
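Something like this, roughly (the agent_id and the media player are just examples, point them at your own entities):

action:
  - service: conversation.process
    data:
      # example agent_id: use whichever conversation agent runs Google Generative AI for you
      agent_id: conversation.google_generative_ai
      text: Pick a random date and tell me about a significant event in history that happened on that day.
    response_variable: ai_response
  - service: tts.cloud_say
    entity_id: media_player.living_room
    data:
      # conversation.process puts the spoken answer here
      message: "{{ ai_response.response.speech.plain.speech }}"
mode: single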
action:
  - service: google_generative_ai_conversation.generate_content
    data:
      prompt: Can you give me a random history fact?
    response_variable: generated_contenthistory
  - service: tts.cloud_say
    entity_id: media_player.living_room
    data_template:
      message: "{{ generated_contenthistory['text'] }}"
mode: single
This way I can edit it the way I want, and make it pull any answer by editing the prompt!
How do you capture the Google Generative AI generated text when you write the prompt directly in the Assist dialog window, rather than using the generate_content service?
I actually found the solution to this in a Reddit post.
See my automation below to capture the Google Generative AI generated response when you write the prompt directly in the Assist dialog window. Since the LLM response easily exceeds 255 characters, an Input text helper is not compatible, so in my automation I store the output as the value of a custom variable (using GitHub - enkama/hass-variables: Home Assistant variables component). That way I can add the LLM output as a task to Anylist (but it could be any other destination) at will: I have a Zigbee button that, on press, adds the last LLM answer to Anylist as a task, so I can easily follow up on it later.
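Roughly, the button side of this looks like the sketch below (not my exact automation: the entity IDs are placeholders, and I am assuming the Anylist integration's anylist.add_item service plus a hass-variables entity called variable.last_llm_answer).

alias: Add last LLM answer to Anylist
trigger:
  # placeholder: however your Zigbee button press shows up in Home Assistant
  - platform: state
    entity_id: sensor.office_button_action
    to: "single"
condition: []
action:
  - service: anylist.add_item
    data:
      # the custom variable holding the last stored LLM output; if the text is
      # longer than 255 characters it cannot live in the entity state, so it
      # may need to be read from an attribute with state_attr() instead
      name: "{{ states('variable.last_llm_answer') }}"
mode: single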