Add-ons in separate containers or through Home Assistant?

Hi HA community. I’m new to Home Assistant and have been using it for about a month now. My current installation is Home Assistant Supervised on an RPi4, booting from a 64 GB USB stick instead of a MicroSD card. I installed it using IOTstack, which also installs Docker and Portainer and can be used to install Node-RED, Mosquitto, and a large variety of other containerized apps.

My question is this: is there any advantage to installing the add-ons outside of HA in separate Docker containers, or is it better to just install them through HA? I like the idea of being able to use the Pi for other purposes in addition to HA, but it’s really not necessary, as I also have a Linux machine running Docker with containers for other things (media server, ERP, website, etc.).

I will be moving to a new home in the next few months and want to go all-in with HA, so I’m trying to find the best software setup for my RPi4. As I mentioned above, my main question is whether to keep all add-ons in separate containers or just install them all through HA.

Thanks!

If there is an add-on for anything you want to install, then you might as well use the add-on. The only time a non-add-on container is a benefit is when no add-on is available. The add-ons make things easier to install because you don’t need to figure out any Docker config, and you end up with (almost…) the same thing in the end. The only real difference is that an add-on might not be as up to date as the regular container.

3 Likes

Thanks @finity. I was thinking this might be the case, but when I found the IOTstack project, I started wondering why you would want to install all of these separately and whether there was a benefit I was missing. I am leaning (very much so) towards letting Home Assistant install and manage all the add-ons.

There is a benefit to having your add-ons run separately from your HA install, and that’s redundancy. If your lights are all Zigbee/Z-Wave and your HA instance goes down, so does control of your lights. By running zigbee2mqtt and/or zwavejs2mqtt on separate servers (along with MQTT on its own instance as well), you create redundancy: you can still operate most devices even if HA goes down.
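As a rough sketch of what that split looks like: zigbee2mqtt is a single container that can live on any box the Zigbee coordinator is plugged into, and it finds the broker via its own configuration.yaml in the data volume. The device path and host directory below are assumptions, so adjust them to your hardware:

  zigbee2mqtt:
    container_name: zigbee2mqtt
    image: koenkk/zigbee2mqtt
    volumes:
      - /srv/zigbee2mqtt:/app/data    # broker address lives in the configuration.yaml stored here
    devices:
      - /dev/ttyACM0:/dev/ttyACM0     # hypothetical path to your Zigbee coordinator
    restart: unless-stopped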

Another benefit is ease of deployment as your HA environment grows (and it WILL grow; ask me how I know :wink: ). Having some of your core services outside of HA lets you redeploy that stack to another machine quickly if something happens or you want to move to faster hardware. This is my basic docker-compose.yml file:

version: '3'
services:
  homeassistant:
    container_name: home-assistant
    image: homeassistant/home-assistant:latest
    volumes:
      - /srv/homeassistant:/config    # HA config kept on the host for easy backup/migration
    environment:
      - TZ=America/New_York
    restart: unless-stopped
    network_mode: host                # host networking for discovery and device access
    depends_on:
      - eufy-bridge
      - mariadb

  mariadb:
    container_name: mysql
    image: mariadb
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: [password]
    volumes:
      - /srv/mariadb:/var/lib/mysql   # persist the recorder database on the host
    ports:
      - '3306:3306'

  adminer:
    image: adminer
    restart: always
    ports:
      - 8081:8080                     # Adminer serves its web UI on 8080 inside the container

  code-server:
    image: ghcr.io/linuxserver/code-server
    container_name: code-server
    environment:
      - PUID=1000
      - PGID=1000
      - SUDO_PASSWORD=[password]
    volumes:
      - /srv/code-server:/config
      - /srv:/config/files
      - /etc:/config/etc_files
    ports:
      - "8443:8443"
    restart: unless-stopped

  eufy-bridge:
    container_name: eufy-bridge
    image: matijse/eufy-ha-mqtt-bridge
    volumes:
      - /srv/eufy-bridge:/app/data
    restart: unless-stopped

Using this type of setup, I can move the containers anywhere I need to. If I notice that DB access is slow, I can just snip the mariadb/adminer containers out of this file and move them to another machine. After a simple change in HA’s configuration.yaml to point to the new DB location, I’m back up and running in minutes.
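For reference, that pointer is the recorder integration’s db_url in configuration.yaml. A minimal sketch, with a hypothetical host IP and database user, and the same [password] placeholder as above:

recorder:
  db_url: mysql://homeassistant:[password]@192.168.1.50:3306/homeassistant?charset=utf8mb4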

I’m not saying that running everything in HA is necessarily a bad thing (especially if you take regular snapshots), but having your services segmented out allows for a lot more safety.

2 Likes

Thank you @code-in-progress for this explanation. I’m new to networking, Docker, Linux, etc. I started all of this about six months ago because of an interest in home automation, and I’ve convinced myself that I will go all-in on home networking and automation. I currently have an old dual-core PC that I have repurposed as a server with Ubuntu Server 20.04, with a few Docker containers running on it: a Plex server, Dolibarr (an ERP system), Nginx Proxy Manager, WordPress, Portainer, and a few other experiments. It might make sense to run my HA instance this way, with HA on the RPi4 and the add-on containers separate so they can be moved to the server if need be. It sounds much more complicated for a noob like me, but now that you have mentioned it, I will most likely try it this way, at least to see if I can get it done. Thanks.

1 Like

We were all noobs once and there are a lot of VERY knowledgeable people (@finity is most definitely in that list) willing to help you out. Ask LOTS of questions (even the stupid ones).

Since you already have that system set up as a server, yeah, I’d at least move a few things onto it; MQTT would be perfect there, along with some other add-ons that don’t require a lot of CPU/network horsepower. Just be aware that if you have Plex set up to transcode, you WILL see some performance issues, as Plex is just plain power-hungry. I have Plex running on my Unraid NAS because of how much CPU it uses.

2 Likes

Thanks for that.

But there sure are a lot of times (most…?) when I definitely don’t think I’m anywhere close to that. :confounded:

:laughing:

But to the OP: if you have experience running Docker, I see very little benefit to running a supervised install of HA.

The only one I can really think of is the snapshot functionality.

But I don’t use it myself, either, and I just make regular copies of my config directory.

And there are a few downsides. The biggest one is the automatic, non-optional updates of the Supervisor, which happen whenever it decides to run them.

There have definitely been a few times when people have woken up to a dead HA system from an unexpected failed supervisor update in the middle of the night.

2 Likes

Thanks for that. I am still learning Docker and was thinking I would use the IOTstack project to deploy all my add-ons in separate containers, but that’s a little more complicated for me than just installing them through HA. I’m not 100% sure about getting all the containers to communicate with each other. I was thinking of using a container like Duplicati to make copies of the configs and upload them to Google Drive (sketch below). I may have to try out both options and see which one works best for me. Thanks again!
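A minimal sketch of what that Duplicati container could look like, assuming the linuxserver.io image and the /srv host paths used earlier in this thread (the backup jobs and the Google Drive destination themselves are configured in Duplicati’s web UI):

  duplicati:
    container_name: duplicati
    image: ghcr.io/linuxserver/duplicati
    environment:
      - PUID=1000
      - PGID=1000
    volumes:
      - /srv/duplicati:/config              # Duplicati's own settings
      - /srv/homeassistant:/source/ha:ro    # HA config mounted read-only as a backup source
    ports:
      - "8200:8200"                         # web UI for setting up jobs and destinations
    restart: unless-stopped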

1 Like

I guess it depends on what containers you want to run, but I have over 25 containers running and I haven’t had any issues getting HA to communicate with the ones it needs to.

It’s mostly a matter of how you set up the ports in the containers and then use those ports to communicate.
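For example, here is a minimal sketch of a Mosquitto broker stanza (the host path is an assumption). Because it publishes port 1883 to the host, HA’s MQTT integration can simply point at the host’s IP on that port, regardless of which machine the broker container ends up on:

  mosquitto:
    container_name: mosquitto
    image: eclipse-mosquitto
    volumes:
      - /srv/mosquitto:/mosquitto/config
    ports:
      - "1883:1883"    # published to the host, so clients connect to <host-ip>:1883
    restart: unless-stopped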

I’m definitely no Docker expert, and I’ve really had almost no issues with it that I can recall.

1 Like

You also get ‘ingress’ functionality with the HA add-ons. The only HA-related container I don’t use the HA add-on for is Portainer, mainly because if HA is rebooting for some reason I can still log into Portainer to see what is happening on my system. Additionally, the HA add-on for Portainer is way out-of-date.
Personally, if there is an HA add-on, I would use it. Your snapshots will then include the add-on configurations as well. If you don’t want those benefits, then why even do a supervised install? The whole point of Supervised is being able to use the add-ons…

1 Like

exactly…
:wink:

1 Like