I have been working on an add-on that basically lets the user configure devices to charge them / use them, doing weather forecasting with data received from sensors… The thing is, now I want to create a section where the user can talk to and ask things of an LLM. Can anyone tell me if it's possible, or point me at anything I can use to do it? I work using Docker, but I don't know if I can configure, for example, a local LLM like Ollama. I have also thought about making API calls, but that would cost money in the long run, right? Can anyone tell me how to proceed?
I have tried to start by adding Ollama in the Devices & Services option, but it says "Unexpected error". When I check http://localhost:11434 in any browser, it says Ollama is running.
Are you sure you really want to do this?
Do you want AI or just LLMs that just reproduce AI slop?
Experience has shown that LLMs, especially in the extremely fast paced field of Home Automation, often produce suggestions that are embarrassingly wrong and hopelessly outdated.
Having said that, others have tried and a quick search of these forums should give you some ideas.
Yes, there's no harm in trying. Plus, it would be for automating things that are small enough to do well!
I have tried to look for similar questions, but some of the answers don't fit my case. I can't seem to find a good tutorial to follow, and the issue here is the connectivity part. It might be because of ports, but I don't really know.
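One likely cause of the "Unexpected error" is Docker networking: inside a container, `localhost` refers to the container itself, not the host where Ollama is listening on port 11434. A browser on the host reaches it, but the add-on's container cannot. The sketch below is a minimal way to probe which address works from inside the container; the candidate hostnames are assumptions that depend on your setup (`host.docker.internal` needs Docker Desktop or an `extra_hosts` entry on Linux, and on Linux Ollama may also need `OLLAMA_HOST=0.0.0.0` to accept non-local connections):

```python
# Hypothetical connectivity probe: run this inside the add-on's container
# to see which address actually reaches Ollama. All candidate URLs are
# guesses to adapt to your own network setup.
import urllib.request
import urllib.error

CANDIDATES = [
    "http://localhost:11434",             # only if Ollama runs in this same container
    "http://host.docker.internal:11434",  # Docker Desktop, or extra_hosts on Linux
    "http://172.17.0.1:11434",            # default docker0 bridge gateway on Linux
]

def first_reachable(urls, timeout=2.0):
    """Return the first URL that answers an HTTP GET, or None if none do."""
    for url in urls:
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                if resp.status == 200:
                    return url
        except (urllib.error.URLError, OSError):
            continue  # connection refused, timeout, DNS failure: try the next one
    return None

if __name__ == "__main__":
    hit = first_reachable(CANDIDATES)
    print(hit if hit else "no endpoint reachable from this container")
```

Whichever URL the probe finds (if any) is the one to enter in the integration's host field instead of `localhost`.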

