Hi everyone,
I’m planning to upgrade my setup (currently a Raspberry Pi 4) and would like some advice on hardware that can smoothly run Home Assistant OS alongside Ollama for local AI models.
Right now, I’m considering using Proxmox to host both Home Assistant OS and Ollama in separate VMs or containers, but I’m open to other configurations too.
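For context, here’s roughly how I imagine running Ollama inside one of those VMs via Docker Compose — just a sketch, not a tested config (GPU passthrough / `deploy.resources` settings omitted, and the volume name is my own placeholder):

```yaml
services:
  ollama:
    image: ollama/ollama        # official Ollama image
    ports:
      - "11434:11434"           # Ollama's default API port
    volumes:
      - ollama-data:/root/.ollama  # persist downloaded models across restarts

volumes:
  ollama-data:
```

Home Assistant OS would run as its own full VM next to this, since it expects to own the whole machine.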
What I’m looking for:
• Reliable performance for Home Assistant OS (Zigbee, automations, dashboards, etc.)
• Enough CPU/GPU power for running Ollama locally with small or medium models (roughly the 3B–8B parameter range)
If you’re currently running this kind of setup, I’d love to know:
• What hardware you’re using (Mini PC, NUC, server, etc.)
• How well it performs
• Any tips or caveats with virtualization and resource allocation
Thanks in advance for your recommendations!