Ollama Integration LLM Default Temperature Setting

Since the HA Ollama integration does not have a configuration option for the LLM model temperature, I was wondering what the default temperature setting is. Or does it default to the value in the Modelfile for the LLM configured in the Ollama integration? IMO, a configuration option for the LLM temperature would be useful in the Ollama integration.
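
For context, the only place I currently know to pin the temperature is the Modelfile itself. A minimal sketch, assuming you build a custom model to point the integration at (model name and value are just examples, not anything the integration ships):

```
FROM llama3.2
PARAMETER temperature 0.7
```

Then `ollama create my-assistant -f Modelfile` and select `my-assistant` in the integration's model dropdown.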

It seems that no temperature is set explicitly, at least in the chat message.

But that might be a good enhancement, actually.
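
For anyone curious what "set explicitly" would look like: Ollama accepts per-request options alongside the messages, which (as I understand it) override the Modelfile parameters, which in turn override Ollama's built-in default. A minimal sketch with the `ollama` Python client; the model name and temperature value are placeholders, not what the integration actually sends:

```python
import ollama

# Per-request options take precedence over the Modelfile PARAMETER values;
# when omitted, Ollama falls back to the Modelfile, then its built-in default.
response = ollama.chat(
    model="llama3.2",  # placeholder model name
    messages=[{"role": "user", "content": "Turn on the kitchen lights"}],
    options={"temperature": 0.2},  # example value only
)
print(response["message"]["content"])
```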

Thanks for the response. When I get a chance, I will post something in the feature requests section.
