Cannot select available models in Ollama for alternative conversation agent

I have tried a number of models in Ollama. The first one I ran is selectable as a conversation agent in Home Assistant, but the ones I added later do not appear. I can query them directly in Ollama and they work fine.
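You can confirm the server side yourself by asking Ollama's /api/tags endpoint which models it has pulled. A minimal sketch, assuming the default address http://localhost:11434 and Python on hand:

```python
# Minimal check: ask the Ollama server which models it actually has pulled.
# Assumes the default address http://localhost:11434 -- adjust if yours differs.
import json
import urllib.request

OLLAMA_TAGS_URL = "http://localhost:11434/api/tags"  # Ollama's "list local models" endpoint

with urllib.request.urlopen(OLLAMA_TAGS_URL) as resp:
    data = json.load(resp)

for model in data.get("models", []):
    print(model["name"])
```

In my case all of the models are listed there, so the gap is on the Home Assistant side.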

Restarting Home Assistant doesn’t help.

Is Home Assistant fussy about the model you use, or is there something else happening?

FYI qwen2.5:7b is the only model that works.

I am running:

  • Core 2024.12.5
  • Supervisor 2024.12.0
  • Operating System 14.1
  • Frontend 20241127.8

Here are the other (working) models I have installed in Ollama.

(screenshot of the installed Ollama models)

I have exactly the same issue, except the first model I chose was deepseek-r1:7b, which the voice assistant apparently told me doesn’t support ‘tools’.

So I downloaded the 8b version and removed 7b, but in HA, Ollama is still only showing 7b.
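If you want to check whether a model advertises tool support before wiring it up, recent Ollama builds report a capabilities list from the /api/show endpoint (the field name here is an assumption based on newer Ollama releases; older builds may not return it). A rough sketch:

```python
# Rough sketch: ask Ollama whether a model advertises "tools" support.
# Assumes a recent Ollama build whose /api/show response includes a
# "capabilities" list; older builds may not include this field.
import json
import urllib.request

OLLAMA_SHOW_URL = "http://localhost:11434/api/show"
MODEL = "deepseek-r1:8b"  # model name to check

req = urllib.request.Request(
    OLLAMA_SHOW_URL,
    data=json.dumps({"model": MODEL}).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    info = json.load(resp)

print(MODEL, "capabilities:", info.get("capabilities", "not reported by this Ollama version"))
```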

EDIT: Turns out you have to go into the Ollama integration and click the add service button to add another model.

You have to add the Ollama server again in the Ollama integration for every model you have. After you add the server’s address, it will prompt you to select the model.
I did it yesterday, adding 4 models.