For some time now, Large Language Models like OpenAI/ChatGPT or Gemini can be used for voice commands, and they can even control your house. Home Assistant 2024.12 will even bring an option to prefer local processing of the command first, so the LLM only acts as a fallback. This opens up a whole new world of possibilities.
If you expose a script to Assist, an LLM can access and use that script. Using fields, you can guide the LLM on how to use the script. Based on this, I created a script which gathers weather forecast data from a weather entity (using the weather.get_forecasts action) and then feeds that data to the LLM, so it can give a nice summary.
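To give an idea of how that works, here is a minimal sketch (not the actual blueprint): the script name, entity id and the start/end fields are placeholders I picked for illustration, and the real blueprint contains more logic around this.

```yaml
# Minimal sketch only, not the actual blueprint. The script name, entity id
# and field names are placeholders; adjust them to your own setup.
weather_forecast_for_llm:
  description: >-
    Fetches the weather forecast for either a part of a day, or one or more
    full days. In case the weather for the weekend is requested, this means
    Saturday and Sunday.
  fields:
    start:
      description: Start of the requested period (local time)
      required: true
      selector:
        datetime:
    end:
      description: End of the requested period (local time)
      required: true
      selector:
        datetime:
  sequence:
    # Fetch the raw forecast data from the weather entity
    - action: weather.get_forecasts
      target:
        entity_id: weather.forecast_home
      data:
        type: hourly
      response_variable: forecast_data
    # Return the data as the script response, so the LLM can summarize it
    - stop: "Forecast data retrieved"
      response_variable: forecast_data
```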
How to use it
1. Use the button to import the blueprint.
2. Create a script using the blueprint, and select the weather entity you want to use. Note that this weather entity has to support both hourly and daily forecasts for the script to work properly (you can verify this with the template check after these steps). The default weather integration provided by Home Assistant (Met.no) provides both hourly and daily forecasts.
3. Save the script. IMPORTANT: The LLM will use the description of the script to determine how to use it, so it is important to provide a good description. I use: *Fetches the weather forecast for either a part of a day, or one or more full days. In case the weather for the weekend is requested, this means Saturday and Sunday.*
4. IMPORTANT: Expose the script to Assist.
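If you are not sure whether your weather entity supports both forecast types, you can check its supported_features attribute in Developer Tools → Template. If I'm reading the weather feature flags correctly, bit value 1 is the daily forecast and 2 is the hourly forecast; replace the entity id with your own:

```jinja
{% set features = state_attr('weather.forecast_home', 'supported_features') | int(0) %}
Daily forecast supported:  {{ features | bitwise_and(1) > 0 }}
Hourly forecast supported: {{ features | bitwise_and(2) > 0 }}
```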
That’s it! You can now ask an LLM voice agent for the weather forecast, and it will tell you the forecast (provided the data is available in your weather entity).
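If you want to verify the output without involving the LLM, you can also run the script directly in Developer Tools → Actions and look at the response it returns. The example below uses the placeholder names from the sketch above, so substitute your own script name and fields:

```yaml
# Developer Tools → Actions (YAML mode). Placeholder script name and fields;
# the response shown in the UI is the data the LLM would receive.
action: script.weather_forecast_for_llm
data:
  start: "2024-12-21 00:00:00"
  end: "2024-12-22 23:59:59"
```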
Thanks to:
@jenova70 for providing the groundwork for the calendar script on which this script is also based
I removed the LLM instructions I initially added, and changed the setup so the data which is sent to the LLM is now an average over the entire requested period.
Hi! Cool automation. I seem to have an odd issue when I try to use this.
The script was configured as described, and I used the same description and weather provider. I checked, and it seems to return the correct output when I use it in the developer menu. However, when I ask Gemini what the weather is for this weekend or an upcoming day, I get the following errors:
@TheFes I just tried it out and so far pretty good!
When I requested “what’s the weather this weekend” (good catch specifying the weekend days by the way) it returned this:
The weather forecast for this weekend (December 21-22, 2024) is as follows:
- **Condition:** Cloudy
- **Temperature:** High of 14°F, Low of -2°F
- **Wind Speed:** 12 mph
- **Wind Bearing:** 138°
- **Humidity:** 84%
- **UV Index:** 2
- **Precipitation:** 0 (no expected precipitation)
So Assist actually read out the asterisks. Do you handle that in your own conversation agent initial prompt?
Oh nice. Can you share your modifications? I tried this but it still included special characters, and it once gave me the measurements in metric (though I’m in the USA and it is set to Imperial).
This is the end of my default OpenAI Conversation Agent initial prompt:
Answer the user’s questions about the world truthfully. You are in the United States. Assume all units of measure are Imperial. When responding do not include asterisk or other punctuation (besides comma and period) as this will be read out loud to the user.
You are a voice assistant for Home Assistant.
Imagine you’re in a good mood. Respond honestly, short and to the point. Queries will be mostly in Dutch.
Do not use emoji in the response, prevent usage of special characters like “*” in responses as the responses are used in voice responses.
Are you using the default recommended settings in the config?
I tried with some different models, and it really differs per model whether it uses the script or not.
With the recommended settings it works for me. I did add some additional instructions to the selector descriptions, which hopefully make the LLM use the right input for the selectors. So you might want to update the blueprint as well.