Best hardware to run voice locally?

I’m currently running Home Assistant on a Blue I bought a few years ago, and I’m using Alexa for voice. I want to switch to Home Assistant Voice and run everything locally. This will be a dedicated system with no Docker or VM usage. What is a good prebuilt system that gets good voice performance without being overkill?

Hello morikaweb,

If you are going to generate TTS and STT and run an LLM locally, there is no such thing as overkill…
You will need all the GPU and CPU power you can afford.


Thanks, that’s kinda what I thought. I suppose my real question is which companies are good?

I am looking at something like an HP EliteDesk, a Beelink, or a Minisforum system, but I’m not sure which company is more reliable. Also, which is better for this job, Intel or AMD? I’m very new to this voice/LLM stuff, so any advice is appreciated.

Hi @morikaweb, it really depends on what you want. If you want to run serious LLMs locally, you’ll need a powerful GPU like the 5090 or a workstation GPU. However, if you prefer to rely on commercial LLMs like ChatGPT, there are two options:

  1. Run the voice pipeline on an iGPU – go with the Intel N100. Most people would agree that the N100 is the most cost-effective option for running Whisper and Piper locally. 16 GB of RAM is enough even for the large Whisper/Piper models.
  2. Run the voice pipeline on an NVIDIA GPU – this provides the best performance, with blazing-fast STT and TTS. It also depends on the model size, but 8 GB of VRAM will be more than enough for most models. However, you’ll need to build your own PC, which can be costly and power-hungry.
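As a rough sanity check on those RAM/VRAM figures, here is a small Python sketch that estimates Whisper model footprints from their published parameter counts. The 2-bytes-per-parameter (fp16) and 2x-overhead factors are loose assumptions for illustration, not measured values:

```python
# Published Whisper model sizes, parameters in millions.
WHISPER_PARAMS_M = {
    "tiny": 39,
    "base": 74,
    "small": 244,
    "medium": 769,
    "large": 1550,
}

def fits(model: str, memory_gb: float,
         bytes_per_param: int = 2, overhead: float = 2.0) -> bool:
    """Return True if the model's estimated footprint fits in memory_gb.

    bytes_per_param=2 assumes fp16 weights; overhead is a rough factor
    for activations, decoding buffers, and runtime bookkeeping.
    """
    need_gb = WHISPER_PARAMS_M[model] * 1e6 * bytes_per_param * overhead / 1e9
    return need_gb <= memory_gb

# Under these assumptions, even "large" needs only ~6 GB, which is
# consistent with "8 GB of VRAM is more than enough for most models".
print(fits("large", 8))   # True
print(fits("large", 4))   # False
```

The same arithmetic explains why 16 GB of system RAM comfortably covers the CPU/iGPU route as well.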

Thanks again. Based on this and my own research, I have decided to start with a Beelink EQ14. If I switch to a higher-spec system down the road, the Beelink will make a great work PC.

I also know the built-in LAN sucks, but I can fix that with a dongle. So I just wanted to confirm that these specs should be good?

Beelink EQ14