Introduction
For some time now, Large Language Models like OpenAI's ChatGPT or Google's Gemini can be used for voice commands, and they can even control your house. Home Assistant 2024.12 will even bring an option to prefer local processing of the command first, with the LLM acting as a fallback. This opens up a whole new world of possibilities.
If you expose a script to Assist, an LLM can access and use that script, and by using fields you can guide the LLM in how to use it. Building on this, I created a script which gathers calendar event data from one or more calendar entities (using the calendar.get_events action) and then feeds that data to the LLM, so it can give a nice summary. Only the events themselves (ordered by start time) are sent to the LLM, not the calendar they originate from.
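To give an idea of the mechanism, here is a rough sketch of what such a script can look like. This is not the blueprint itself: the entity ID, field names, and descriptions are illustrative placeholders, and the real blueprint also sorts the events and strips the source calendar before returning them.

```yaml
# Illustrative sketch only - import the blueprint for the real script.
alias: Get calendar events
description: >-
  Fetch calendar events from my calendar. In case the data for the
  weekend is requested, this means Saturday and Sunday.
fields:
  start_date:
    description: Start of the period to fetch events for (ISO timestamp).
    example: "2024-12-07 00:00:00"
  end_date:
    description: End of the period to fetch events for (ISO timestamp).
    example: "2024-12-08 23:59:59"
sequence:
  # Ask the calendar integration for all events in the requested window
  - action: calendar.get_events
    target:
      entity_id: calendar.family   # replace with your own calendar entity
    data:
      start_date_time: "{{ start_date }}"
      end_date_time: "{{ end_date }}"
    response_variable: calendar_response
  # Return the events as the script response, so Assist can hand
  # them to the LLM for summarizing
  - stop: ""
    response_variable: calendar_response
```

The field descriptions matter: they are what the LLM reads to decide which values to pass in when it calls the script.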
How to use it
- Use the button to import the blueprint
- Create a script using the blueprint and select the calendar entity or entities you want to use
- Save the script. IMPORTANT: The LLM uses the description of the script to determine how to use it, so it is important to provide a good description. I use:
Fetch calendar events from my calendar. In case the data for the weekend is requested, this means Saturday and Sunday
- IMPORTANT: Expose the script to Assist
That’s it! You can now ask an LLM voice agent for calendar entries and it will tell you what’s planned (provided the data is available in your calendar entity).
Thanks to:
@jenova70 for providing the groundwork for the script