🌦 Weather forecast access for LLM (e.g. ChatGPT or Gemini)

Thanks for all of the amazing work on this. I think I’m almost there, but it isn’t recognizing my location. I’m using a script based on the stock blueprint and feeding it an OpenWeather integration. I’m currently using llama3.1, but I have tried other models. The script is exposed and I’ve given it a couple of aliases that I use to trigger it.

Any thoughts on the location thing?

I should add I’m trying to do this using Home Assistant Voice Preview if that matters.

@WrongHole Hi, the location thing is probably Ollama making up a reason for why it is failing. Can you upload a full trace of the script to https://dpaste.org?

Update

Version 2.2

  • Add a toggle to set whether multiple hours/days of forecasts should be sent as an average or as multiple items (a rough sketch of the idea follows below)
  • Additional checks that the data provided by the LLM is in the right format and of the right type.
  • (2.2.1) Fix for incorrect check
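
Not the blueprint’s actual code, just a minimal Jinja sketch of the idea behind that toggle, assuming `matching` is the list of forecast items already filtered to the requested period and `send_as_average` is the new option:

    {# Illustrative only - not the blueprint's code.
       `matching` = forecast items already filtered to the requested period,
       `send_as_average` = the new toggle. #}
    {% if send_as_average %}
      {# One aggregated entry: average the temperatures of all matching items #}
      {{ {'temperature': matching | map(attribute='temperature') | list | average | round(1)} }}
    {% else %}
      {# Multiple entries: hand every matching forecast item to the LLM as-is #}
      {{ matching | list }}
    {% endif %}

The average keeps the payload handed back to the LLM compact, while individual items give the model more detail to reason about.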

https://dpaste.org/JR6h7

My bad, fixed

Thanks very much!

Tomorrow works. Today doesn’t.

https://dpaste.org/tL6pr

@ckhyatt The LLM provided hourly as the forecast type, which resulted in the requested period lying entirely in the past (1 hour, from 00:00 to 01:00).
The weather forecast action doesn’t return forecasts for periods that are completely in the past.
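
To make the failure mode concrete, here is a small illustrative template (not the blueprint’s code, dates made up) showing a one-hour window that has already elapsed, which is exactly the situation the weather forecast action cannot serve:

    {# A 1-hour hourly-forecast window that lies entirely in the past #}
    {% set start = "2025-01-26T00:00:00+00:00" | as_datetime %}
    {% set end = "2025-01-26T01:00:00+00:00" | as_datetime %}
    {% if end <= now() %}
      Requested period is already over - asking for type "daily" would still cover today.
    {% endif %}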

The prompt I provided is “what’s the forecast for today”. Should it be phrased differently?

You’re not doing anything wrong; the LLM should have provided the type as daily instead of hourly.

I’m still learning how these LLMs think, but I already have some ideas that might work better.

Ah, got it. Amusingly, if I ask twice in a row I get two different responses, one slightly more verbose than the other. It understands “tomorrow”, at any rate. 🙂

With the latest version of HA, I got the error below:

offset-naive and offset-aware datetimes

Note that for the daily forecast, my weather forecasts use simple YYYY-MM-DD date strings (e.g. “2025-01-26”).
To fix this I reworked the check below, doing the date & time conversion the other way around:
from:

              {% if start | as_datetime <= item.datetime | as_datetime < end | as_datetime %}
               ...
              {% endif %}

to:

              {% if start <= item.datetime | as_timestamp | timestamp_local < end %}
               ...
              {% endif %}

              {% if start | as_datetime <= item.datetime | as_datetime | as_local < end | as_datetime %}
               ...
              {% endif %}

This will fix it as well; I will upload a new version.
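
For anyone hitting the same error, here is a minimal sketch (not taken from the blueprint) of why the original comparison breaks when a daily forecast item only carries a date string: a bare date parses into an offset-naive datetime, while a full ISO timestamp is offset-aware, and Python refuses to compare the two.

    {# Date-only string -> offset-naive datetime (no tzinfo) #}
    {{ "2025-01-26" | as_datetime }}

    {# Full ISO timestamp -> offset-aware datetime #}
    {{ "2025-01-26T00:00:00+00:00" | as_datetime }}

    {# Comparing the two raises:
       TypeError: can't compare offset-naive and offset-aware datetimes
       Converting one side first (e.g. with as_local, as in the snippet above)
       makes both sides offset-aware and the comparison valid. #}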

FYI: I’m using Google Generative AI as the LLM. It always responded that it did not have access to weather data. I then changed the prompt so that it includes the name of the script that should be used. Bingo, it now works.
Example in German:

Verwende das script llm_weather_forecasts um Wetterdaten zu erhalten.
(In English: “Use the script llm_weather_forecasts to get weather data.”)

Update

Version 2.3

  • Avoid template errors when the daily forecast only uses date strings rather than full datetime strings

Thanks for this BP. It’s really great.
I noticed on GitHub that you also developed a blueprint to use an LLM to control Music Assistant (LLM Script for Music Assistant voice requests). I tried to use it, but it somehow cannot connect to Music Assistant, and I also cannot select a default player. Is the BP supposed to be ready for general use?

Are you using the Music Assistant core integration, or still the custom (HACS) one?

I’m using the Music Assistant Server add-on.

Aha, just realized that there is now an integrated MA. Can I just switch from the add-on to the integrated one?