Run LLM on another machine?

I want to use the Home Assistant smart speakers, but I don’t want to switch away from running Home Assistant on a Raspberry Pi 4.

I don’t think I’d get much benefit from upgrading, and I’d have to find a suitable rackmount and PoE solution. Not in the cards for me.

On the other hand, the Home Assistant smart speakers optionally let you use a cloud LLM. Does that mean I could instead run the LLM on my NAS, which has an EPYC server CPU with plenty of horsepower? Or could I connect a USB Coral to my Raspberry Pi and go that route?

What are my options for running the LLM without using the Raspberry Pi but keeping the entire thing local in my network?

Hi Sawtaytoes,

Look at Ollama.

Having RAM isn’t that important; it’s VRAM that you need. A good graphics card will do.
You can play with it without disturbing your local HA system.
Search the forum for ‘Local LLM for dummies’.
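Once Ollama is running on the other machine, Home Assistant (or anything else on the LAN) just talks to its HTTP API. A minimal sketch of querying it from Python — the NAS IP here is hypothetical, the port is Ollama’s default (11434), and it assumes a model has already been pulled (e.g. `ollama pull llama3.2`):

```python
import json
import urllib.request

# Hypothetical LAN address of the NAS running Ollama (default port 11434).
OLLAMA_URL = "http://192.168.1.50:11434"

def build_request(prompt, model="llama3.2"):
    """Build a non-streaming request for Ollama's /api/generate endpoint."""
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

def ask(prompt, model="llama3.2"):
    """Send the prompt to the LAN Ollama server and return the reply text."""
    with urllib.request.urlopen(build_request(prompt, model)) as resp:
        return json.loads(resp.read())["response"]
```

In Home Assistant itself you’d normally skip the code and just add the Ollama integration, pointing it at the NAS’s URL — the Pi then only forwards requests, so the heavy lifting happens on the other box.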