Why are my notification descriptions getting cut off?

Hi all, I set up an automation with LLM Vision to interpret a snapshot from my video doorbell, then send that to ChatGPT for the text description.

I regularly find that my descriptions are cut off (see attachment).

Does anyone know how to fix this? Adjusting the token size doesn't seem to make a difference, and I think it's a pop-up sizing issue.

Are you storing the message in an entity?

State values are limited to 255 characters.

Hmm, I'm not sure I totally understand. My automation uses LLM Vision to analyze the image and generate the description. This gets saved to a response variable, which an action then uses to send a notification to my phone via the HA companion app.

I'm pretty sure there's a max length, but I'm not sure what it is. The solution I've seen most people use is to either reduce the max token count in the LLM response or tell the model it has limited space (e.g. "In 255 characters or less") in the prompt.


There is indeed a max length, and it's going to vary from device to device. It's handled at the OS level, so apps have no way of knowing when truncation happens.

The best fix is to tell the LLM to keep it short, e.g. two sentences.
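For example, something along these lines in the automation. This is just a rough sketch: I'm assuming the `llmvision.image_analyzer` action and a `response_text` key in the response variable, and the entity IDs, trigger, and notify service are placeholders you'd swap for your own:

```yaml
# Sketch only - entity IDs, trigger, and notify target are placeholders.
# The key idea is constraining length in the prompt itself, since the
# notification gets truncated at the OS level regardless of token limits.
alias: Doorbell snapshot description
trigger:
  - platform: state
    entity_id: binary_sensor.doorbell_motion
    to: "on"
action:
  - service: llmvision.image_analyzer
    data:
      image_entity: camera.doorbell
      max_tokens: 100
      message: >-
        Describe what you see at the front door in at most two short
        sentences (under 255 characters total).
    response_variable: description
  - service: notify.mobile_app_my_phone
    data:
      title: Doorbell
      message: "{{ description.response_text }}"
```

Constraining it in the prompt tends to work better than only lowering `max_tokens`, because a hard token cutoff just chops the sentence mid-word, while the prompt instruction gets the model to write a complete short description.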


Thanks! I'll give it a try!