I am in somewhat of a conundrum with my HA installation. I have had it running for many years on an old PC. I followed the old installation method: installing Linux (I have Ubuntu 18.04), putting Docker on top, and then installing Home Assistant Supervised.
Now that 2022.11 has come along, I am left running a Supervised install of HA on Ubuntu using Docker that I can no longer update, due to it being unsupported.
I am in the process of replacing my old PC with a new one, and as part of this process I wanted to migrate my HA system to a proper container install without the Supervisor. But I am left with a conundrum: my current install has Zigbee2MQTT with a lot of devices and a lot of automations built on them.
Is there anyone who could advise a path for splitting my current install out into a properly supported container installation without having to basically start again? Obviously I understand that going container removes add-ons within HA, so all my current add-ons won't be there.
I run the container version without add-ons and run zigbee2mqtt in a Docker container, on Ubuntu 20.04 (Focal Fossa). The install instructions to set up zigbee2mqtt in Docker are here
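Not my exact setup, but a minimal compose stack for zigbee2mqtt along the lines of its Docker docs looks roughly like this; the serial device path and timezone are assumptions you will need to adjust:

```yaml
services:
  zigbee2mqtt:
    container_name: zigbee2mqtt
    image: koenkk/zigbee2mqtt
    restart: unless-stopped
    volumes:
      - ./zigbee2mqtt-data:/app/data    # holds configuration.yaml and the network state
      - /run/udev:/run/udev:ro
    devices:
      - /dev/ttyUSB0:/dev/ttyUSB0       # assumption: your coordinator's serial port
    environment:
      - TZ=Europe/London                # assumption: set your own timezone
```

The `/app/data` volume is where the configuration.yaml mentioned below ends up, so that is the directory you would drop your old config into.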
I actually use Portainer stacks to set up the Docker containers. Installing Portainer gives you a nice UI to manage the containers and upgrades (including the Home Assistant container).
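For reference, Portainer CE itself is typically started with a single docker run, per its install docs (volume name and port are its defaults):

```shell
# Create a named volume for Portainer's own data, then start the UI on port 9443
docker volume create portainer_data
docker run -d \
  -p 9443:9443 \
  --name portainer \
  --restart=always \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v portainer_data:/data \
  portainer/portainer-ce:latest
```

After that, everything else (HA, zigbee2mqtt, etc.) can be deployed as stacks from the Portainer UI.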
This video is also helpful for setting up a Home Assistant container install
Your biggest pain point, however, will be moving the configuration from your add-ons into the equivalent Docker containers. I have not personally done this, but for zigbee2mqtt you should be able to extract zigbee2mqtt's configuration.yaml file and put it into your new Docker volume. I assume you will be using the same coordinator, so your Zigbee network is stored there and devices will not need pairing again; but all the device/entity names are stored in that config file, along with the MQTT settings. If you don't move that file over, you would have to configure all of that and update the names from scratch so that entity names and automations stay in sync.
You will also need to configure MQTT. I am running Ubuntu and could never get the MQTT Docker container working right, so I ended up just installing MQTT on bare metal. I believe I followed this guide
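If you go the bare-metal route on Ubuntu, the broker install is just a couple of apt commands. A sketch; the user name is an example, and you still need to point Mosquitto at the password file in its own config:

```shell
# Install the Mosquitto broker and its command-line clients
sudo apt update
sudo apt install -y mosquitto mosquitto-clients

# Create a password file with a user for HA and zigbee2mqtt to log in as
# (mqtt_user is an example name)
sudo mosquitto_passwd -c /etc/mosquitto/passwd mqtt_user

# Start the broker now and on every boot
sudo systemctl enable --now mosquitto
```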
What other add-ons do you use? Since you will lose them, you will need to install either their equivalent Docker container or the program on bare metal in Ubuntu.
It will be difficult, but ultimately you would have a setup you have far more control over, and you wouldn't have to worry about the Supervisor whining about an unsupported install anymore.
Your other options would be to install Debian and go with the supervised install that is supported, or to install a VM on Ubuntu and then set up HAOS to run in the VM. I don't like the VM route personally since it uses more resources than a Docker install, but if your hardware supports it, that would be an option. I also don't like Supervised, for the reasons you are already running into right now. For me, the container install method is best and has worked out well; it was just more difficult to set up.
Hey Tim, thanks for the reply. I have literally just been watching those Home Automation Guy videos in his playlist; he even goes through MQTT on Docker.
Z2M was really the only one I was worried about, so I guess I could install HA core on the new box and then try Z2M as suggested: moving the config over, plugging in the coordinator, and seeing if it works. As it's new hardware, I can just power off the old server while testing, and if I hit a wall I can at least switch back while I troubleshoot.
The only other add-ons I use are Google Drive Backup and DuckDNS, but I think I can work around those.
Yes, you can always test it out. As long as you do not remove or add any new Zigbee devices on the new system, you should be able to move the coordinator/USB stick to the other machine and easily go back if you run into a problem.
For DuckDNS/SSL, I use the SWAG Docker container and followed this guide
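A sketch of the SWAG stack for a DuckDNS domain, based on linuxserver.io's documentation; the domain, timezone, and token are placeholders:

```yaml
services:
  swag:
    image: lscr.io/linuxserver/swag
    container_name: swag
    cap_add:
      - NET_ADMIN
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=Europe/London            # placeholder: your timezone
      - URL=yourdomain.duckdns.org  # placeholder: your DuckDNS domain
      - VALIDATION=duckdns
      - DUCKDNSTOKEN=your-token     # placeholder: token from duckdns.org
    volumes:
      - ./swag-config:/config
    ports:
      - 443:443
    restart: unless-stopped
```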
For backups, take a look at my post here. I use a cron script, and then use the default backup program in Ubuntu to create a "backup of the backup" on Google Drive.
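The cron script is roughly along these lines (paths and retention count here are examples, not my exact script):

```shell
#!/bin/sh
# Nightly Home Assistant config backup (example paths, adjust to your setup).
backup_ha_config() {
    src="$1"    # HA config dir (the one with configuration.yaml)
    dest="$2"   # where the dated archives go
    mkdir -p "$dest"
    stamp=$(date +%Y-%m-%d)
    # Archive the whole config dir, including hidden files/folders
    tar -czf "$dest/ha-config-$stamp.tar.gz" -C "$src" .
    # Prune: keep only the 7 newest archives
    ls -1t "$dest"/ha-config-*.tar.gz 2>/dev/null | tail -n +8 | xargs -r rm -f
}

# Example crontab entry to run it nightly at 03:00:
#   0 3 * * * /usr/local/bin/ha-backup.sh
```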
Another question: do you know if I can migrate my integrations over into a new HA install? Can I do a backup from my Supervised install and restore it onto the new one, or is that not available if I am not using Supervised?
There is no “restore” option in Home Assistant Container. So what you would need to do is take your backup from the other system, which should be a .tar file, and extract its contents to a folder on the new system. This .tar backup should be your entire Home Assistant config directory. For example, if you extract the contents of the backup to a folder called /home/d4rthpau/homeassistant, you would just need to map the container install to that volume in your docker run command/compose (Docker persists the config files in a volume you map to it). If you put/extract your files in the directory above, starting Home Assistant with the following compose would then seamlessly run the old config on the new system:
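A compose along those lines, modeled on the official Home Assistant Container documentation (the volume path is the example directory above):

```yaml
services:
  homeassistant:
    container_name: homeassistant
    image: ghcr.io/home-assistant/home-assistant:stable
    volumes:
      - /home/d4rthpau/homeassistant:/config   # the extracted backup contents
      - /etc/localtime:/etc/localtime:ro
    restart: unless-stopped
    privileged: true
    network_mode: host
```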
You can easily just copy over the contents of the HA config folder (the one that contains configuration.yaml), including all of the hidden folders (.storage, etc.). Nothing else outside of that is needed for a Container install.
As long as you get everything in there, the only issue you might have (and you may have it with every migration method you try) is an HA database error.
Worst case, if you do, just delete the db and restart HA; it will create a new one. You lose all your data, but at least HA will work again.
Okay, so I have been having problems with my Home Assistant install, and I think I have tracked it back to the SD card filesystem corrupting. But it's not COMPLETELY gone, so: if I can get this SD card mounted on another Linux system (I can't get the Windows Subsystem for Linux to play nice with me, so at this point I'm thinking I might just try my Steam Deck), is it possible to get to any backups I might have there, and if so, where would I look for them? I just wanna be able to get in, get the backup, and get out. Then I could take the new Home Assistant card I just made, un-tar the backup into the configuration folder there, and I should be good to go, yes?
(And what path would the active configuration folder live in? I have the card connected to my Deck, and there are like six different filesystems on it to mount… I'm guessing it's in data, system0, or system1, but where?)
There are two options to restore: either extract the .tar backup file like I mentioned, or just copy the contents of the HA config folder over like finity mentioned. When copying, note that some files have root permissions, so make sure you use sudo or they won't copy.
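For the copy route, the detail that trips people up is including the hidden folders; a `cp -a` with a trailing `/.` on the source grabs them and preserves permissions. A demo with throwaway directories just to show the behaviour (for the real migration, the source would be your old config dir and you would run the cp under sudo):

```shell
# Scratch directories standing in for the old and new config dirs
mkdir -p old_config/.storage new_config
echo demo > old_config/configuration.yaml
echo demo > old_config/.storage/core.config_entries

# -a preserves ownership, permissions, and timestamps; the trailing /.
# makes the copy include dotfiles/dotfolders like .storage
cp -a old_config/. new_config/

ls -A new_config    # shows both configuration.yaml and .storage
```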
As far as the SD card being corrupt: do you have a backup that is not on the SD card? You mentioned Google Drive, is there a backup there?
If the SD card is corrupt, any backup on the SD card will also likely be corrupt. All it takes is a couple of key files being damaged and unfortunately your Home Assistant config is lost. You might be able to try a repair or recovery utility.
Great, thanks much! I was looking in the right place, then. It's rather odd, because a file I had updated was not showing as updated in there, but I know there were problems with the Supervisor, and that may be why. At any rate, I am pretty sure this or one of the backups should be good to use. (I hadn't had occasion to touch this in a bit, and honestly this part seems fine on reading it; I suspect whatever corruption was going on was elsewhere in the system, and may have had to do with the power outages we have experienced recently.) I'm still going to move this to an A2 card when Amazon drops some at my door in a day or two anyhow, so I'll take these and see what I can make happen with them. Appreciate the help!
If you're going to run the new installation on an SD card, and keep the database on the SD card rather than using an external hard drive or SSD, I would change the recorder settings to only write data every 30 seconds, as mentioned here.
The default recorder setting of writing database updates to the card every second will eventually destroy any SD card.
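The setting in question is the recorder's `commit_interval` in configuration.yaml:

```yaml
# configuration.yaml — batch database writes to reduce SD-card wear
recorder:
  commit_interval: 30   # seconds between commits to the database (default: 1)
```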