When using `conversation.process`, I can ask about the state of exposed entities. I'd like to tweak this to add the last-updated value of the sensors as well.
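Roughly, this is the kind of entity listing I'm hoping to end up with in the prompt. It's just a sketch using standard template syntax in a Developer Tools → Template test; the entity IDs are placeholders from my own setup, and I don't know which variables the integration actually exposes to its prompt template:

```jinja2
{# Sketch of the entity overview I'd like the prompt to contain. #}
{# Entity IDs are placeholders; replace with your own sensors. #}
{%- for s in expand('sensor.living_room_temperature', 'sensor.outdoor_humidity') %}
{{ s.name }}: {{ s.state }} (last updated {{ relative_time(s.last_updated) }} ago)
{%- endfor %}
```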
1. How is the prompt to the LLM constructed? (I have both Gemini and ChatGPT configured.)
2. How can I debug what is being sent to the LLM via the API?
(For #2, I've tried setting `homeassistant.components.conversation: debug` in my logger configuration, shown below, but I don't see anything in the logs.)
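For reference, this is the logger setup I tried. The two integration-specific entries are my guesses at the right logger namespaces, based on the integration domain names, so they may not be correct:

```yaml
# configuration.yaml — logger settings I tried
logger:
  default: warning
  logs:
    homeassistant.components.conversation: debug
    # Guesses at the per-integration loggers for the two agents I have configured:
    homeassistant.components.openai_conversation: debug
    homeassistant.components.google_generative_ai_conversation: debug
```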
Any pointers to the relevant code or docs on how to modify the prompt template?