Ollama LLM API support

Hi all, could the Ollama integration add support for LLM API calls like the OpenAI conversation integration? Many models and platforms now support function calling via the OpenAI API, and it would be cool to run LLM function calling completely locally using Ollama and a model like Phi-3 Instruct.
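For reference, here's roughly what that looks like against Ollama's OpenAI-compatible endpoint today. This is a minimal sketch, not HA integration code: it assumes Ollama is running on its default port with a tool-capable model like llama3.1 pulled, and the `get_weather` tool is made up for illustration.

```python
import json
from openai import OpenAI

# Ollama exposes an OpenAI-compatible endpoint; the api_key is unused
# by Ollama but required by the client library.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

# A hypothetical tool definition, purely for illustration.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="llama3.1",  # assumes a model that supports tool calling
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)

# If the model chose to call a tool, the call arrives as structured
# JSON arguments instead of free text.
for call in response.choices[0].message.tool_calls or []:
    print(call.function.name, json.loads(call.function.arguments))
```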

+1 :grinning: :grinning: :grinning: :grinning:

But you can't control much with it…

I'm trying to find a solution where I could say "send message {my_message}" and it would send that message verbatim, not try to run it through the AI…

Looks like support was added in release 2024.8!