Voice assistant fallback to an LLM-based agent

Today I received my Voice Assistant PE and I am playing with it, pretty happy so far.
Something I do not fully understand is how the fallback works. I set the assistant to Extended OpenAI with “Prefer local commands”.

When I ask something that Assist doesn’t know how to handle, it just says “Sorry, I didn’t understand” without falling back to the LLM. What am I doing wrong?

What does the debug say when you give it a command it didn’t understand? The debug tools can help you see whether the command was interpreted correctly and whether it was handled locally or by the LLM. Are you sure the LLM works properly? If you disable local handling, what happens?
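
If you want to rule out the LLM side, you can also call the agent directly from Developer Tools → Actions with `conversation.process`. Rough sketch only; the `agent_id` below is a placeholder, pick your actual Extended OpenAI agent from the selector:

```yaml
# Send a test phrase straight to the conversation agent, bypassing the
# voice pipeline, and inspect the response it returns.
action: conversation.process
data:
  text: "How many lights are on in the living room?"
  # Placeholder id - select your Extended OpenAI conversation agent here.
  agent_id: conversation.extended_openai_conversation
```

If that call returns a sensible answer, the LLM agent itself is fine and the issue is in the pipeline/fallback settings rather than the integration.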