Ollama (LLM) not falling back to remote AI, and local assistant not aware of devices' areas and types

I have a local Ollama instance and the Ollama integration set up and mostly working. I can ask Home Assistant to list all my devices and all my home areas. However, two capabilities are not working:

  1. When I ask "What is my kitchen temperature?" it says "I am not aware of any temperature sensor in your smart home..." If I ask "what is the value of **-**-**-**-**-** Temperature" it gives me the correct value.

  2. When I ask it "Who is the 23rd US president?" it says "Not any." It looks like it is not falling back to a remote AI. How do I enable this? (A quick way to reproduce both questions outside the voice UI is sketched right after this list.)
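
For anyone wanting to reproduce both questions outside the voice UI, here is a minimal sketch against Home Assistant's conversation REST API. The hostname, port, and token are assumptions: it expects HA on the default port at `homeassistant.local` and a long-lived access token (created under your user profile page), so adjust both to your setup.

```python
import requests

# Send both test questions to Home Assistant's conversation API.
# Assumptions: HA reachable at homeassistant.local:8123 and TOKEN is
# a long-lived access token created under your user profile.
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"
URL = "http://homeassistant.local:8123/api/conversation/process"

for question in (
    "What is my kitchen temperature?",
    "Who is the 23rd US president?",
):
    resp = requests.post(
        URL,
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"text": question, "language": "en"},
        timeout=60,
    )
    # The spoken reply lives under response.speech.plain.speech.
    print(question, "->", resp.json()["response"]["speech"]["plain"]["speech"])
```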

For point 1, I need to ask "what is the temperature inside the kitchen?"
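
If you also want the original phrasing to work, Home Assistant lets you map custom sentences onto built-in intents. A minimal sketch, assuming an area actually named "kitchen" containing a sensor with device_class temperature (the file name itself is arbitrary):

```yaml
# config/custom_sentences/en/kitchen_temperature.yaml
# Maps "what is my kitchen temperature" onto the built-in HassGetState
# intent. Assumes an area named "kitchen" with a temperature sensor.
language: "en"
intents:
  HassGetState:
    data:
      - sentences:
          - "what is my kitchen temperature"
        slots:
          area: "kitchen"
          domain: "sensor"
          device_class: "temperature"
```

You may need to restart Home Assistant for new sentence files to load.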

Did the link below not work for this?

Sort of. Given the video, I think "tell me a joke" is processed remotely; I assumed it was purely local. Learned something new. Running "tell me a joke" in debug mode shows:

    Natural Language Processing    19.34s ✅
    Input                          tell me a joke
    Prefer handling locally        true
    Processed locally              false

It just can't answer this question for some reason: "Who is the 23rd president of the United States?"

    Natural Language Processing    0.01s ✅
    Input                          who is the 23rd president of the united states
    Response type                  query_answer
    Prefer handling locally        true
    Processed locally              true
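
To rule out the model itself, you can also query Ollama's HTTP API directly, bypassing Home Assistant's intent matching entirely. A minimal sketch, assuming Ollama on its default port and a pulled model named "llama3" (substitute whatever model you actually use):

```python
import requests

# Ask Ollama directly, with no Home Assistant in the loop.
# Assumptions: default Ollama port, model "llama3" already pulled.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",
        "prompt": "Who is the 23rd president of the United States?",
        "stream": False,  # return one JSON object instead of a stream
    },
    timeout=120,
)
print(resp.json()["response"])
```

If this answers correctly, the model is fine and the problem is purely in how the pipeline routes the sentence.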

Yeah, I recall seeing this question recently, and the answer is that HA has a built-in sentence/intent that catches “who is” questions, so they get processed locally.
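
If that's what's happening here, one workaround is to send the text straight to the Ollama conversation agent with an explicit agent_id, skipping the pipeline's built-in sentence matching. A sketch for Developer Tools → Actions; the agent_id below is a placeholder, so look up the actual id of your Ollama agent (e.g. under Settings → Voice assistants):

```yaml
# Developer Tools -> Actions: route the question directly to the
# Ollama conversation agent instead of the default Assist pipeline.
# agent_id is a placeholder; substitute your actual Ollama agent id.
action: conversation.process
data:
  agent_id: conversation.ollama
  text: "Who is the 23rd president of the United States?"
```

Alternatively, turning off "Prefer handling commands locally" in the pipeline settings should route everything to the LLM first, at the cost of slower responses for simple device commands.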