🌦 Weather forecast access for LLM (e.g. ChatGPT or Gemini)

A question I have is: how does the script know which AI to use? It worked great with OpenAI, but when I switched to Gemini it doesn’t. The response is that it does not have access to the data source. I imported the blueprint again and deleted the first script.

Thanks

The script doesn’t know anything; it’s just a tool that is available for any LLM to use. It’s not tied to any of them.

Thanks. I do appreciate the work you put into this.

Update

Version 2.0

  • Reworked how the LLM determines the start and end of the period. It now sets a date and a time separately, and then the length of the period (e.g. 1 day or 6 hours)
  • Added all the descriptions of the fields used in the script to the blueprint settings, so you can change them when setting up the blueprint
  • Use a datetime comparison instead of a string comparison when grabbing the right items out of the total forecast, to avoid taking the wrong period when the forecast datetimes are in UTC (see the sketch after this list)
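
As a rough illustration of that last point, a datetime-based selection could look something like the template below. This is a minimal sketch, not the blueprint’s actual code: start_date, start_time and time_period_length are the script’s fields, while forecast is an assumed variable that already holds the list of entries returned by weather.get_forecasts, and a daily period is assumed.

{# Minimal sketch: select forecast entries by comparing datetimes, not strings,
   so UTC timestamps still land in the right local-time window #}
{% set start = (start_date ~ ' ' ~ start_time) | as_datetime | as_local %}
{% set end = start + timedelta(days=time_period_length | int(1)) %}
{% set ns = namespace(items=[]) %}
{% for item in forecast %}
  {% set t = item.datetime | as_datetime | as_local %}
  {% if start <= t and t < end %}
    {% set ns.items = ns.items + [item] %}
  {% endif %}
{% endfor %}
{{ ns.items }}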

I updated to the new blueprint and am getting errors.

Logger: homeassistant.helpers.template
Source: helpers/template.py:2758
First occurred: 2:39:04 PM (1 occurrences)
Last logged: 2:39:04 PM

Template variable warning: 'start_time' is undefined when rendering '{{ ((start_date ~ ' ' ~ start_time) | as_datetime | as_local).isoformat() }}'

Logger: homeassistant.components.script.fetch_weather_forecast_data
Source: helpers/script.py:2041
integration: Script (documentation, issues)
First occurred: 2:29:19 PM (7 occurrences)
Last logged: 2:47:18 PM

  • fetch_weather_forecast_data: Error executing script. Error rendering template for variables at pos 1: TypeError: unsupported type for timedelta days component: str
  • fetch_weather_forecast_data: Error executing script. Error rendering template for variables at pos 1: AttributeError: 'NoneType' object has no attribute 'tzinfo'
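
For reference, both messages point at the template that builds the period: timedelta() raises the first error when the period length arrives as a string, and as_local raises the second when as_datetime returns None because the combined date/time string did not parse. A hedged sketch of guards that would avoid both, with assumed variable names:

{# Cast the LLM-supplied length so timedelta() never sees a string #}
{% set length = time_period_length | int(1) %}
{# Only call as_local once the date/time has actually parsed #}
{% set start = (start_date ~ ' ' ~ (start_time | default('00:00:00'))) | as_datetime %}
{% if start is not none %}
  {{ (start | as_local).isoformat() }}
{% else %}
  invalid start date/time
{% endif %}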

When this happened, the LLM hallucinated and ran a different script for night (turning the lights off, the heat down, etc.). You can see in the logs that it was the LLM. I’ve disabled HA control in the LLM for now.

Hmm, which LLM are you using?

Llama 3.2 running locally on a dedicated GPU.

It didn’t hallucinate control in the previous version.

It shouldn’t hallucinate toggling devices when a script fails; I really doubt the script failing caused that. Anyway, in v2.1 I built in a check that all fields are set correctly by the LLM.

Update

Version 2.1

  • Check if all fields are set correctly by the LLM before continuing with fetching the forecast
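
Purely as an illustration of what such a check might look like (the actual blueprint may do it differently), a template condition over the script’s fields could be:

{# Hedged sketch: field names are taken from the script; 'hourly' as the
   second period type is an assumption #}
{% set valid =
     start_date is defined and start_time is defined
     and ((start_date ~ ' ' ~ start_time) | as_datetime) is not none
     and (time_period_length | int(0)) > 0
     and time_period_type in ['daily', 'hourly'] %}
{{ valid }}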

Great, thanks! Do I need to set up the script again after pulling the new BP?
Edit: I just set the script up again to be safe. Now it doesn’t appear to understand location.

Can you give me the trace? It probably made the same error, didn’t provide one of the required fields, and decided for itself that it was caused by a missing location.

this:
  entity_id: script.fetch_weather_forecast_data
  state: 'off'
  attributes:
    last_triggered: '2025-01-24T21:23:38.411322+00:00'
    mode: parallel
    current: 0
    max: 10
    friendly_name: fetch_weather_forecast_data
  last_changed: '2025-01-24T21:23:38.412779+00:00'
  last_reported: '2025-01-24T21:23:38.412779+00:00'
  last_updated: '2025-01-24T21:23:38.412779+00:00'
  context:
    id: 01JJD35ND2ZH02DC72ZP7HF75K
    parent_id: null
    user_id: cb4709a0939c45699106c329f3e36892
start_date: '2025-01-25'
start_time: '00:00:00'
time_period_length: 1
time_period_type: daily
context:
  id: 01JJD3AGVFH8SR3HBVTV4KYRK2
  parent_id: null
  user_id: cb4709a0939c45699106c329f3e36892
version: 2.1

@ckhyatt That actually seems fine to me. Can you download the entire trace (there’s a download button in the top right corner)?

You can share it using something like https://dpaste.org

https://dpaste.org/ivQTe

That link doesn’t work unfortunately

I think I found the issue; can you reimport the blueprint and try again?

https://dpaste.org/7ROdz

“I don’t have access to realtime forecast…”

@ckhyatt Ah, it provided the time period length as a string; I’ll make sure to cast that to an integer. That will be tomorrow though, time for bed now.
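
Presumably the fix amounts to something like the lines below, coercing the LLM-supplied value (e.g. the string '1') to an integer before it reaches timedelta(). Illustrative only; the variable name is taken from the trace.

{# Coerce the string to an integer, falling back to 1 if it can't be parsed #}
{% set length = time_period_length | int(1) %}
{{ timedelta(days=length) }}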

Thanks very much for all the work you are putting into this!