LLM Vision analyzer fails with local LLM

In LLM Vision I added an entry with the name Ollama_local. I use this name in the LLM Vision image analyzer, but I get an error: "Failed to perform the action llmvision.image_analyzer. Provider config not found for entry_id: Ollama_local"

What should the name of the provider be?

What do you mean by this, specifically? A picture?

I assume the blueprint?

Need a lot more information.

What are you doing, and where do you see this when it pops up?

I use the integration named "LLM Vision" (see second picture). I was able to add the entry "Ollama local" (see first picture).
This is the part of my automation, the LLM Vision "Image Analyzer" action, where it should use the Ollama LLM:

```yaml
remember: false
include_filename: false
target_width: 1280
detail: low
max_tokens: 100
temperature: 0.2
expose_images: false
provider: Ollama_local
model: llama3.2:latest
message: >-
  Describe what you see in one sentence. If you see a person, describe
  what he/she looks like. + What is the current system date and time?
image_file: /config/www/images/snapshots/reolink.jpg
```
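For reference: the error message complains about an `entry_id`, which suggests the `provider` field may expect the config entry ID that Home Assistant assigns to the provider (a long hexadecimal string), rather than the display name "Ollama_local". A minimal sketch of the same action under that assumption, where the ID below is a made-up placeholder (selecting the provider from the dropdown in the automation UI and then switching to YAML view should reveal the real one):

```yaml
# Sketch only: the provider value below is a made-up placeholder,
# NOT a real entry ID. Pick the provider in the UI to get yours.
provider: 01234567890abcdef01234567890abcd
model: llama3.2:latest
image_file: /config/www/images/snapshots/reolink.jpg
```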

When I run my automation, an error appears in the traces: "Error: Provider config not found for entry_id: Ollama_local"

When I open http://192.168.2.14:11434 in a browser, I get the message "Ollama is running", so the server itself is reachable.