Hi all, could the Ollama integration be extended to support LLM function calls, the way the OpenAI conversation integration does? Many models and platforms now support function calling via the OpenAI API, and it would be cool to run LLM function calling completely locally using Ollama and a model like phi3 instruct.
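
For illustration, here's a rough sketch of the kind of call I mean, using the `openai` Python client pointed at Ollama's OpenAI-compatible endpoint. The tool definition and model tag are just assumptions for the example, and tool support varies by model:

```python
# Sketch only: assumes Ollama is running locally and exposing its
# OpenAI-compatible API, and a model that supports tool/function calling.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible API
    api_key="ollama",  # required by the client, ignored by Ollama
)

# A hypothetical tool definition in the OpenAI tools format
tools = [{
    "type": "function",
    "function": {
        "name": "turn_on_light",  # hypothetical function name
        "description": "Turn on a light in a given room",
        "parameters": {
            "type": "object",
            "properties": {
                "room": {"type": "string", "description": "Room name"},
            },
            "required": ["room"],
        },
    },
}]

response = client.chat.completions.create(
    model="phi3:instruct",  # model tag is an assumption
    messages=[{"role": "user", "content": "Turn on the kitchen light"}],
    tools=tools,
)

# If the model decided to call a function, the call shows up here
print(response.choices[0].message.tool_calls)
```

The point being: everything above runs against a local Ollama instance, so the integration could trigger function calls without any cloud API.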