QNAP NAS with Docker or Ubuntu VM on the NAS or standalone computer - Opinions wanted

So I’m running HA right now on a dedicated computer running Ubuntu in a Python venv. I just ordered a QNAP 918+ NAS with the main goals of storage and a Plex server. I see that I could run HA on the NAS in a Docker container. I “think” I could also create an Ubuntu VM and install HA there in a venv, like what I currently have. Or maybe I should just leave things as they are. :slight_smile:

I know there is a good-sized group of people who are big Docker proponents, but I often see people having issues with Docker, such as: maybe you installed TensorFlow in your Docker container, and now when you upgrade you have to install TensorFlow again? Or SSL certs? Again, since I’m not running Docker I don’t know for sure, but I feel like those sorts of posts pop up fairly frequently.

I have also seen issues with the NAS where, if you restart the NAS, you have to log on via SSH and run some commands to get your Z-Wave/Zigbee sticks working again. I have power outages every month or so (usually super short), so it would be a pain to have to “fix” my Z-Wave like that.

I know that most NAS devices also offer VM capabilities, so if I did not want a Docker install, I might be able to run an Ubuntu VM and a Python venv. Not sure if anyone is doing that or if it’s just a waste.

Or maybe there really is no good reason to move my HA install to the NAS, other than that it would be fun to figure out. Are there any really good reasons to run HA from a NAS, other than just consolidating?

Really just trying to see what people are doing and what pros/cons they have run into.

Thanks in advance

If they are doing that, they don’t understand Docker.

What about them? They get hosted on the Host OS and mapped through with bind mounts.
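
For instance, a rough sketch of what that mapping can look like (the host paths, image tag, and device node here are assumptions for illustration, not taken from this thread):

```shell
# Hypothetical example: config and certs live on the host (here a
# QNAP-style share) and are bind-mounted into the container. The
# Z-Wave stick is passed through as a device, so it survives
# container rebuilds.
docker run -d --name home-assistant \
  --net=host \
  -v /share/Container/homeassistant:/config \
  -v /share/Container/certs:/certs:ro \
  --device /dev/ttyACM0:/dev/ttyACM0 \
  homeassistant/home-assistant:latest
```

Because the config and certs are just files on the host, blowing away and recreating the container loses nothing.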

Ha, maybe I don’t understand the HA Docker setup either. :slight_smile:
This is how I think it works: all of my configs, automations, etc. live in a config directory. When a new version of HA comes out and the Docker image is updated, I blow away (or rename) the old container and get a fresh new one, which is basically a VM with an OS and Home Assistant installed. Sometimes you might need to install a Python package for something not 100% HA-related; for example, the Vizio component requires an install to run a command that gets a key to use with the component. That would be lost when I blow away my container, right?

Do you have Docker running on a NAS? I assume you don’t want to install these kinds of tools on the host OS? But maybe you do?

Thanks for the answers, just trying to learn.

Yes, on the host.


Not really. It’s not a VM, and not a full OS.

In the case of the vizio component, you need to do that exactly ONE time and you don’t have to do it ON the same machine running HA. You can use literally any machine on the network running Python. Most of the components that require running a script to get a key work this way. You don’t install things in Docker.

I run it on a NUC. It doesn’t matter where you run Docker; you don’t need to install anything on the host for that.

What tools?

I was speaking about tools like the Vizio component. I was just trying to think of examples; maybe that was a bad one. I can think of several other things in the past that were maybe “one time” between updates. The Alexa media component, which is pretty popular, had a point in time where some people needed to download an updated version of the code, and when I recently added an ecobee thermostat to the HomeKit controller, it required me to upgrade the Python homekit library to 0.14 to get pairing to work. That version looks like it’s coming in 0.93 of HA.

Edit: Maybe a better example for me would be running a forked version of OpenZWave to add functionality that might be missing (Schlage locks have issues with clearing user codes).

I was asking a question because you said you install the certs on the host OS. I’m using Let’s Encrypt, and it was pretty easy to set up, but I was not sure how that might work with the NAS and/or Docker.

In Docker, if you need to add things to the core image, you create a Dockerfile, build FROM the official image, and script it to add your components. You never install things directly in a running Docker container.
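
As a concrete sketch of that pattern (the extra package and its version are placeholders, not something this thread prescribes):

```dockerfile
# Build FROM the official image and layer your extras on top,
# instead of installing inside a running container.
FROM homeassistant/home-assistant:latest

# Placeholder example: pin an extra Python dependency a component
# needs (e.g. a newer homekit library, or a forked openzwave build).
RUN pip3 install --no-cache-dir homekit==0.14
```

Rebuild with `docker build -t my-homeassistant .` after each upstream release and your extras come along automatically.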

I said the host HOSTS THE FILES, not that you INSTALL anything on the host. You can use Docker to generate your certs. The certs themselves are merely files in a directory.
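
For example, a sketch using the official certbot image (the domain and host path are placeholders):

```shell
# Run certbot once from a container; the resulting certs are just
# files in the bind-mounted host directory, which HA can then read.
docker run --rm -p 80:80 \
  -v /share/Container/letsencrypt:/etc/letsencrypt \
  certbot/certbot certonly --standalone -d example.duckdns.org
```

Nothing gets installed on the host; the container exits when it’s done and only the cert files remain.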

Thank you for the info. Overall, I was looking for some feedback from people running Home Assistant on a NAS using Docker. I appreciate all the useful information about Docker; it shows me that I still have a bit to learn about it if I ever choose to go that route. I’m sure as you learn more it becomes easier and easier and eventually second nature. For me, at this exact moment in time, it’s more complex than just running in a Python venv. But I’m sure with a bit of time it would not be.

You’ve also made me realize that overall my initial question really should have just been:

Those of you running HA on your NAS (QNAP/Synology), how do you like it? What are the pros/cons? If you already had a dedicated NUC or similar, would you move your HA install to the NAS?

It really doesn’t matter where Docker is running; it’s all the same concepts and implementation. In fact, your NAS is just running a Linux distro, and the Docker running on it is just like the one on any desktop or server.

I have a NAS. I wanted a device dedicated to Home Assistant and the software stack that surrounds it for my needs, so I did not want to put it on my NAS.
