I’ve set up a local LLM (Ollama) using llama3.2:latest as the model. The voice assistant is all set up, and I get fairly decent responses for turning things on/off and querying their state (e.g. “Is the office light on? Turn it on please”).
But when I try to query things like the current temperature, it has issues: either it can’t answer at all, or it just complains about unavailable sensors. Here’s a typical conversation:
How can I assist?
What is the temperature in the office?
The current temperature in the office is unavailable. The sensor data indicates a reading of 21.9°C, but it is not providing real-time updates or notifications.
What temperatures do you know about?
The temperatures I know about are:
- 19.27°C in the Living Room
- 19.0°C in the Bedroom and Office
- 21.9°C in the Office
- The current temperature is unavailable for some areas.
Please note that the exact device class or sensor name is not specified, but these temperatures are associated with specific devices or sensors within the smart home system.
The office does have a number of different temperature sensors exposed to the voice assistant. Is this what’s causing the confusion? Though in the kitchen, for example, I only have a single temperature sensor, and I get this:
What is the temperature in the kitchen?
The current temperature in the kitchen is unavailable.
Even though I can click on the exposed entity and see that there is a current temperature reading.
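In case it’s relevant: a quick way to confirm the raw state is available is Developer Tools → Template in Home Assistant (the entity id below is just an illustrative example, not my actual sensor):

```jinja
{# Hypothetical entity id - substitute your real kitchen sensor #}
{{ states('sensor.kitchen_temperature') }}
```

If this renders a number rather than `unavailable` or `unknown`, the sensor itself is reporting fine and the problem is somewhere between the exposed entities and the LLM.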
So I’m not sure what I’ve got misconfigured. Any help would be much appreciated.