With the recent Ollama integration into Home Assistant, I've been exploring its capabilities and finding it quite good. However, I believe there's even more potential if we could run Ollama directly as an addon on the same hardware. Currently, I'm using an Asus Chromebox 3 with an Intel® Core™ i7-8550U processor and 16GB of RAM, and running Ollama locally as an addon has been a positive experience. By leveraging small language models like tinyllama, tinydolphin, phi, etc., I'm achieving quick response times of 2-3 seconds from my Assist device, an esp32-s3-box paired with the new Ollama integration.
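For anyone wanting to sanity-check those 2-3 second response times outside of Assist, here is a minimal Python sketch that times a request against Ollama's standard `/api/generate` endpoint. The host `homeassistant.local` and port 11434 (Ollama's default) are assumptions; adjust them to wherever your addon is listening.

```python
import time
import requests  # assumes the 'requests' package is installed

# Assumed address: Ollama's default port is 11434; swap in the host/port
# your addon actually exposes on the Home Assistant box.
OLLAMA_URL = "http://homeassistant.local:11434/api/generate"

payload = {
    "model": "tinyllama",                      # any small model you have pulled
    "prompt": "Turn on the kitchen lights.",   # sample Assist-style request
    "stream": False,                           # single JSON object, no streaming
}

start = time.monotonic()
resp = requests.post(OLLAMA_URL, json=payload, timeout=60)
resp.raise_for_status()
data = resp.json()
elapsed = time.monotonic() - start

print(f"Round trip in {elapsed:.1f}s: {data['response']}")
# Ollama also reports its own model-side timing, in nanoseconds:
print(f"total_duration: {data['total_duration'] / 1e9:.1f}s")
```

The wall-clock time measured here includes network overhead, so it should roughly match what the esp32-s3-box sees, minus speech-to-text.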
Perhaps voting for your own request would be a good idea. I did.
I've just put together such an addon: [GitHub - SirUli/homeassistant-ollama-addon: Provides an Home Assistant addon configuration for Ollama](https://github.com/SirUli/homeassistant-ollama-addon).
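Once an addon like this is running, models still need to be pulled before the Home Assistant Ollama integration can use them. A small sketch using Ollama's `/api/pull` endpoint, again assuming the addon serves the API at `homeassistant.local:11434` (adjust to your setup):

```python
import json
import requests  # assumes the 'requests' package is installed

# Assumed address: the addon exposing Ollama's API on the default port.
OLLAMA_PULL = "http://homeassistant.local:11434/api/pull"

# Pull a small model; the endpoint streams JSON progress lines.
with requests.post(
    OLLAMA_PULL, json={"name": "tinyllama"}, stream=True, timeout=None
) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines():
        if line:
            status = json.loads(line)
            print(status.get("status", status))  # e.g. download progress, "success"
```

After the pull reports success, the model should show up as selectable when configuring the Ollama integration.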