New Assist with LLM combination: lots of tokens?

In the Home Assistant settings, under Voice assistants, you can configure which entities are exposed to each conversation agent.

@Spegeli and @rossk Thanks for the explanations, I thought a feature like that was already built in. Sounds like a valid feature request to me… :laughing: I’m not using ChatGPT, so until now I could only read up on this, not test it myself (maybe I should test it… :thinking:)

Thanks!

Look for a HACS integration called Fallback Conversation Agent. It won’t use OpenAI for requests like that; it will only reach out to OpenAI when local Assist can’t complete the command.

None noticeable

Since 2024.6 these templates are not needed, as Home Assistant will append the states of the exposed entities to your prompt automatically. So using this template will double your token count.
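For context, the now-redundant prompt templates looked roughly like this (an illustrative sketch from memory, not an exact copy; the `exposed_entities` variable and field names are assumptions based on common community templates):

```jinja
An overview of the devices in this smart home:
```csv
entity_id,name,state
{% for entity in exposed_entities %}
{{ entity.entity_id }},{{ entity.name }},{{ entity.state }}
{% endfor %}
```
```

With 2024.6 and later, a template like this duplicates the entity list the integration already appends, so every request pays for those tokens twice.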

You can see the full prompt that is sent by enabling debug logging for the OpenAI integration.
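If you prefer doing this in YAML rather than through the UI, debug logging can also be turned on in `configuration.yaml` (a sketch; the logger domain `homeassistant.components.openai_conversation` is an assumption based on the integration's name):

```yaml
logger:
  default: warning
  logs:
    # Log the full prompts and responses exchanged with OpenAI
    homeassistant.components.openai_conversation: debug
```

The output then shows up in the Home Assistant log, so you can count exactly how many tokens each request is spending.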


Yeah, but only in the official OpenAI integration. In the Extended one you still need this, because the Assist option is missing there.

The new gpt-4o-mini is WAY cheaper to use.
And also smarter/better than 3.5-turbo.