Hmm, I’m not sure I totally understand. My automation uses an LLM to analyze the image and generate an image description. This gets saved to a response variable, which is then called by an action to send a notification to my mobile phone app (HA app).
I’m pretty sure there’s a max length, but I’m not sure what it is. The solution I’ve seen most people use is to either reduce the max token count in the LLM response or tell it in the prompt that you have limited space (e.g., “In 255 characters or less”).
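A rough sketch of both approaches in one automation (the `llmvision.image_analyzer` service, trigger entity, and notifier name are all illustrative placeholders; adjust to whatever LLM integration and device you actually use):

```yaml
# Hypothetical sketch: constrain the LLM output in the prompt and via
# max_tokens, then truncate the response as a safety net before notifying.
alias: Notify with image description
trigger:
  - platform: state
    entity_id: binary_sensor.front_door_motion  # illustrative trigger
    to: "on"
action:
  - service: llmvision.image_analyzer  # assumed LLM analysis service
    data:
      max_tokens: 60  # keep the response short at the source
      message: "Describe this image in 255 characters or less."
    response_variable: description
  - service: notify.mobile_app_my_phone  # HA companion app notifier
    data:
      # Truncate as a fallback in case the LLM overruns the limit
      message: "{{ description.response_text[:255] }}"
```

The key used to read the response (`response_text` here) depends on which integration you use, so check what your service actually returns in its response variable.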