Hello, I've been going crazy for the last 5 days trying to figure out how to link Ollama with my Home Assistant. I cannot get it to accept the URL of either the Raspberry Pi 5 or the local computer. I'm a novice at coding and don't know much about it, but I'm learning. I've tried using Docker, I've tried Ubuntu, I've tried .tar files, and everything else that ChatGPT and YouTube tutorials have offered. I can get Ollama to run locally, and it says it's running at the 127.0.0.1 URL, but the Home Assistant integration will not accept any of the URLs I give it. Has anyone else had this problem? Can someone help me please? Thank you!
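One common cause of this symptom (works on 127.0.0.1, but other machines can't connect) is that Ollama only listens on the loopback interface by default, so Home Assistant running on another box or in a container can't reach it. A sketch of the usual fix, assuming Ollama was installed with the official Linux installer as a systemd service (the IP 192.168.1.50 below is a placeholder for your Ollama machine's LAN address):

```shell
# Assumption: Ollama runs as a systemd service (official Linux install).
# By default it listens only on 127.0.0.1:11434, so other machines
# on the network cannot reach it.

# Add an override telling the service to listen on all interfaces:
sudo systemctl edit ollama.service
# In the editor that opens, add these lines, then save and exit:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0:11434"

# Reload systemd and restart Ollama so the override takes effect:
sudo systemctl daemon-reload
sudo systemctl restart ollama

# From another machine on the LAN, check that it now responds
# (replace 192.168.1.50 with the actual IP of the Ollama machine):
curl http://192.168.1.50:11434
```

If the `curl` check works, use that same `http://192.168.1.50:11434` URL (with your real IP) in the Home Assistant Ollama integration. If Ollama runs in Docker instead, the equivalent is passing `-e OLLAMA_HOST=0.0.0.0 -p 11434:11434` to `docker run`.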
What is Ollama?
This may not be the best answer, as there aren't enough details to tell exactly which part you are struggling with.
Have a look at LLMVision. It ties all the AI / LLM models together in one nice way to control them, so if you have multiple models, or just want to swap between different models for a test, there is minimal need to change your automations.