Hi! Can anyone please post a working solution for random text picks for TTS, working with the latest HA release? It looks like it should be possible to use helpers for this purpose, but I was unable to create a working YAML.
Can anyone share a working solution? Thanks!
You can use this type of code in your TTS message section:
message: >
  {{ ["Kingsley's flea treatment is due",
      "It's that time again, Kingsley wants his flea treatment chew",
      "Kingsley loves a good flea treatment. The time has come"] | random }}
Thank you for your sample! However, it will be a mess to handle, say, 50 text entries. I believe it's possible to utilize an input_select helper to work with an easily manageable list of entries.
You can split the text out so each one is on its own line, which would neaten it up. Using an input text helper should be doable, but I haven't tried, so I don't have an example. I'll see if I can figure it out though.
Something along these lines:
message: "{{states('input_text.tts_announcement_text') | random}}"
…but that is untested and likely not quite correct.
EDIT: that is definitely wrong… you would still need all 50 text items in that single input_text,
so it is no better than my previous example.
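Alternatively, an input_select helper might fit better here, since its options attribute is already a list and can be edited from the Helpers UI without a restart. An untested sketch, assuming a helper named input_select.tts_messages whose options are the candidate phrases:

```yaml
# Untested sketch: assumes an input_select.tts_messages helper whose
# options are the candidate phrases. The 'options' attribute is a list,
# so it can be piped straight into the random filter.
service: tts.cloud_say
data:
  entity_id: media_player.google_home_mini
  message: "{{ state_attr('input_select.tts_messages', 'options') | random }}"
```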
I’ve had some success using Macros to achieve this. Check out this thread:
Yeah, cool, but it's not really much easier than my option. I mean, you still can't edit the list via the UI, so a restart is likely required for any changes. I liked your idea of a text helper because it would mean editing from the frontend without any restart/reload required.
How can I make it not say the same option it said before? For example, if it says "Halloween 1", it shouldn't say that again for the next 10 minutes. Thanks!!!
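One possible approach (untested, and it only avoids an immediate repeat rather than a full 10-minute cooldown): remember the last pick in a hypothetical input_text.tts_last_message helper and filter it out before choosing, assuming the phrases live in a hypothetical input_select.tts_messages helper:

```yaml
# Untested sketch: avoids repeating the previous message by storing the
# last pick in a hypothetical input_text.tts_last_message helper and
# filtering it out of the candidate list before choosing again.
variables:
  pick: >-
    {{ state_attr('input_select.tts_messages', 'options')
       | reject('eq', states('input_text.tts_last_message'))
       | list | random }}
action:
  - service: tts.cloud_say
    data:
      entity_id: media_player.google_home_mini
      message: "{{ pick }}"
  - service: input_text.set_value
    target:
      entity_id: input_text.tts_last_message
    data:
      value: "{{ pick }}"
```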
So I took a slightly different approach to 'random': instead of building a list of predefined sentences, I ask OpenAI to generate the message for me. This way I have one automation, but an endless selection of random text. Here is sample code for entering/leaving a specific zone.
- alias: zone ChatGPT TTS notifications
  id: zone_chatgpt_tts_notifications
  mode: queued
  initial_state: true
  trigger:
    - platform: zone
      entity_id: person.mirek_malinowski, person.dorota_malinowska
      zone: zone.home
      event: enter
    - platform: zone
      entity_id: person.mirek_malinowski, person.dorota_malinowska
      zone: zone.home
      event: leave
    # More zones go here
  action:
    - service: conversation.process
      data:
        agent_id: id_redacted
        text: |
          {% if trigger.event == "leave" %}
          Prepare humorous notification about {{ trigger.to_state.attributes.friendly_name }} being now outside of {{ trigger.zone.attributes.friendly_name }}, using no more than 15 words and not using emoji.
          {% else %}
          Prepare humorous notification about {{ trigger.to_state.attributes.friendly_name }} being now close to {{ trigger.zone.attributes.friendly_name }}, using no more than 15 words and not using emoji.
          {% endif %}
      response_variable: chatgpt
    - service: tts.cloud_say
      data_template:
        entity_id: media_player.google_home_mini
        language: pl-PL
        message: |
          {{ chatgpt.response.speech.plain.speech | trim | replace('\"','') }}
That is great and far more flexible and scalable.
What integration are you using for OpenAI that enables the conversation.process service?
Would you consider pulling this chatgpt random script together into a ‘Community Guideline’ and adding it to the Cookbook we are creating here on the forums?
The Home Assistant Cookbook - Index.
I think it would be a fantastic addition, and since it is your code and idea, I didn't want to just copy you to get it there.
(It might also make a really nice script blueprint to share as you already have the code debugged…)
@Stiltjack FYI…
Using @mirekmal’s example, I created a script called Get ChatGPT response
with field values for parts of the chatgpt prompt so I can seamlessly reuse the script within my automations.
I couldn’t figure out how to pass the script response directly into another action, so I created an Input Text helper first to house the ChatGPT response each time.
alias: Get ChatGPT response
sequence:
  - service: conversation.process
    data:
      agent_id: id_redacted
      text: >-
        Prepare a {{tone}} notification about "{{subject}}" using no more than
        {{length}} words and not using any emoji.
    response_variable: chatgpt
    alias: ChatGPT Prompt
    enabled: true
  - service: input_text.set_value
    metadata: {}
    data:
      value: "{{chatgpt.response.speech.plain.speech | trim | replace('\\\"','')}}"
    target:
      entity_id: input_text.chatgpt_response
fields:
  subject:
    selector:
      text: null
    name: subject
    description: What do you want to do?
    required: true
  tone:
    selector:
      text: null
    name: tone
    description: Describes the tone you want to add to the ChatGPT request.
    required: true
  length:
    selector:
      text: null
    name: length
    description: How many words do you want to limit the response to?
    default: "15"
    required: true
icon: mdi:robot-excited
mode: single
In my automations, I first call the script above, then add the following template wherever I want to include the response:
"{{ states('input_text.chatgpt_response') }}"
Seems to work pretty well so far.
‘tone’ … Thanks for that, I have a new input select
- cheeky
- sarcastic
- informational
- bombastic
- disinterested
- scathing
- amorous
- angry
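If those tones live in an input_select helper (input_select.tts_tone is an assumed name here), one could pass a random one into the script on each call. Untested sketch, reusing the script from the earlier post:

```yaml
# Untested: picks a random tone from a hypothetical input_select.tts_tone
# helper and passes it to the Get ChatGPT response script from above.
service: script.get_chatgpt_response
data:
  subject: leaving the house
  tone: "{{ state_attr('input_select.tts_tone', 'options') | random }}"
  length: "15"
```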
I really like your script. Can you give a concrete example of an automation using it?
Thanks and apologies for the delay!
- I’ve been using it to wake up my kids with a new funny affirmation each morning.
- I send my wife and I different notifications when we leave the house.
- I have a laundry nag that gets increasingly dramatic if I haven’t moved the clothes from the washer to the dryer.
Basically, I’m looking to inject personality into the messages coming from my home.
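For a concrete shape, the wake-up case might look something like this (untested, and the notify entity is a placeholder):

```yaml
# Untested sketch of the wake-up use case. Entity IDs are placeholders.
# Calling the script as a service waits for it to finish, so the helper
# holds the fresh response by the time the notify step runs.
- alias: Morning affirmation
  trigger:
    - platform: time
      at: "07:00:00"
  action:
    - service: script.get_chatgpt_response
      data:
        subject: waking up for school
        tone: funny
        length: "15"
    - service: notify.mobile_app_kids_tablet
      data:
        message: "{{ states('input_text.chatgpt_response') }}"
```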
Has anyone done one of these to a local Ollama installation?
Do you have specific examples of using "{{ states('input_text.chatgpt_response') }}"?
I took Greg’s code and did it for Ollama and llama3.1 locally.
get_ollama_response:
  alias: Get ollama response
  sequence:
    - data:
        agent_id: conversation.llama3_1
        text: >-
          Prepare a {{tone}} notification about "{{subject}}" using no more than
          {{length}} words and not using any emoji.
      response_variable: llama3_1
      alias: ollama Prompt
      enabled: true
      action: conversation.process
    - metadata: {}
      data:
        value: '{{llama3_1.response.speech.plain.speech | trim | replace(''\"'','''')}}'
      target:
        entity_id: input_text.llama3_1_response
      action: input_text.set_value
  fields:
    subject:
      selector:
        text:
      name: subject
      description: What do you want to do?
      required: true
    tone:
      selector:
        text:
      name: tone
      description: Describes the tone you want to add to the Ollama request.
      required: true
    length:
      selector:
        text:
      name: length
      description: How many words do you want to limit the response to?
      default: '15'
      required: true
  icon: mdi:robot-excited
  mode: single
  description: Ollama response with llama3.1 data file
Sample outputs. These are from the trace so you can see input and output.
subject: What is the averate temperature in melbotne australia in the spring
tone: australian
length: 50
context:
  id: 01JDKJ1ETH2ZQR71QVJTMRP0BE
  parent_id: null
  user_id: c0fba61d419c44a1892abc47bf3065ae
llama3_1:
  response:
    speech:
      plain:
        speech: >-
          "G'day mate! In Melbourne, Australia during Spring (September to
          November), the average temperature ranges from 12 to 22 degrees
          Celsius. So grab your Akubra and enjoy the mild weather, but don't
          forget your jumper for those chilly mornings!"
and
length: '15'
subject: what is the average temperature inside
tone: '{{ ("grumpy", "happy", "pissed off") | random }}'
context:
  id: 01JDKH80J5M18DNF7B82Z3AFGB
  parent_id: null
  user_id: c0fba61d419c44a1892abc47bf3065ae
llama3_1:
  response:
    speech:
      plain:
        speech: >-
          You're looking grumpy today: Current indoor temp is 20 degrees
          Celsius.
extra_data: null
I’m new to all this but the Ollama script was created to help me get my llama 3.2 set up to do these notifications. (Thanks again, Sir Goodenough!!)
In case you are asking (because I was slightly confused at first), the
"{{ states('input_text.chatgpt_response') }}"
comes from a helper created in the Helpers section of the Devices & Services page. You will want to create a text helper from that settings page and call it something close to "chatgpt response" (mine is "Ollama response", for example). Once you have everything created, you can call the script in your automation and use the line above in your notification action's message. It will force the action into YAML mode, and it should look something like this:
action: notify.mobile_app_missys_iphone
metadata: {}
data:
  message: "\"{{ states('input_text.ollama_response') }}\""
So my automation actions look like this.
Now what I am trying to do is get the notifications to be more relevant to the calendar event instead of super random, but this is all a great base.
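One untested idea for the calendar angle: a calendar trigger exposes trigger.calendar_event, so the event's summary could be fed in as the subject (calendar.family and the script/helper names are placeholders based on my earlier snippets):

```yaml
# Untested idea: feed the calendar event's summary into the script so the
# generated message references the actual event. calendar.family is a
# placeholder entity.
- alias: Calendar event notification
  trigger:
    - platform: calendar
      event: start
      entity_id: calendar.family
  action:
    - service: script.get_ollama_response
      data:
        subject: "the upcoming event: {{ trigger.calendar_event.summary }}"
        tone: cheeky
        length: "20"
    - service: notify.mobile_app_missys_iphone
      data:
        message: "{{ states('input_text.ollama_response') }}"
```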
Massive thanks to everyone in this thread - y’all are the real MVPs.