Hi, I’m looking to make an HA-integrated shopping list / to-do list that I can connect to ChatGPT. Of course, I want the regular to-do list functionality, with the list readable and modifiable by voice command or from the Home Assistant dashboard. But I also want the ChatGPT integration to sort the list by grocery aisle and translate it back and forth between different languages.
I installed the OpenAI Conversation integration, hoping to set it as my voice assistant and expose the Home Assistant To-Do List so I could control my shopping list. However, I was disappointed to find the voice assistant doesn’t have access to the items in the list, so it can’t do any sorting or translating.
After lots of Googling, I’m struggling to find another path forward. I was just wondering if anyone here has experience with this, or maybe has some keywords I can add to my Google searches.
But I’m VERY verbose in the prompt, telling the LLM what things are. The second image is a picture of the UI for a self-hosted Mealie instance - that is what drives the actual shopping list, integrated into HA as shopping (ToDo) lists, and that picture was the direct result of the ask.
What I mean by that: you can’t just assume that adding an LLM means it knows how to do things with no context. It may see the intents, but it may not know how to apply them.
What have you added to your prompt to explain to the LLM what each of your lists is and what it contains, or are you running the defaults?
What’s going on in your case is that the intent to read the list exists (see image), but your LLM hasn’t made the connection that that’s exactly what you asked for.
In mine I combat that with nearly eight pages (single-spaced) of typewritten instructions, and somewhere in there I explain to the AI that xyz are the shopping lists, that they contain items, and that it should use the standard intents to manage them, including adding and removing items from the list.
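To give a rough idea, here is a minimal sketch of the kind of list-specific instructions I mean. The entity IDs and list names are made up for illustration; HassListAddItem and the todo services are the built-in ones as far as I know, but check what your own install actually exposes to Assist:

```
The entities todo.mealie_groceries and todo.mealie_hardware are shopping lists
managed by Mealie. They contain the items we still need to buy.
When I ask about a shopping list, read its items before answering, and use the
standard to-do intents and services (for example HassListAddItem to add items,
or todo.get_items to read them) to manage the list rather than guessing from
memory.
```

The point isn’t the exact wording - it’s that the model needs to be told explicitly which entities are shopping lists and that the intents are the way to act on them.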
Interesting. How did you expose the list contents to Assist? I’m using the Anylist integration, which helpfully adds the list contents as attributes on the entity - but I can’t see how to expose those attributes to Assist. The only way I’ve managed it so far is to create a new template sensor that reads the attributes, and then add instructions to the prompt so the LLM knows to use a different entity to read the list contents. But this is not a satisfactory solution.
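For anyone curious, this is roughly what that workaround looks like - a minimal sketch assuming the Anylist entity is `todo.anylist_shopping` and that its items live in an attribute called `items` as a plain list of names (check the actual entity ID and attribute name in Developer Tools → States):

```yaml
# configuration.yaml - template sensor that mirrors the list contents
template:
  - sensor:
      - name: "Shopping List Contents"
        # Join the items from the Anylist entity's attribute into one string
        # so Assist can actually see the values once this sensor is exposed.
        state: >
          {{ (state_attr('todo.anylist_shopping', 'items') or []) | join(', ') }}
```

Sensor states are also capped at 255 characters, which is another reason this feels like a hack rather than a real solution.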
I’m having the same issue… I’m using the local to-do/shopping list and the Ollama API pointed at a local Ollama agent. The assistant will do a lot of things, and will even give the number of items like above, but it is adamant it can’t read the contents. Asking to add to a list has resulted in a hallucination, an admission of failure, or what I believe is an HA response that it failed to use the tool. Something similar happens when I ask to set light brightness.
Other things (blinds, lights on/off, etc.) work fine. But I’m struggling with lists and light brightness, so I’m not rolling it out to the family.
Edit: I’m running the default system prompt. I searched for hours but couldn’t find any documentation or working examples that seem to help with these. If you have them, sharing them would be helpful…