How to Move from Raspberry Pi to Docker


#1

What files do I need to move from my Raspberry Pi to my new Docker installation (running on Dell Inspiron laptop)? Please be gentle, Linux novice here!! I already have HA up and running in the new Docker container, I just want to figure out what files/directories I need/can safely move over.

Thanks!


#2

Your config folder, with files such as configuration.yaml, automations.yaml, etc.


#3

Just make a full snapshot and restore it in the new Docker installation.
It should also include the config folder.
I’m speaking of the Hass.io installation method, which also runs in a Docker container.


#4

Do you include the subdirectories under config also (like .storage, etc)?

Thanks!


#5

Also, from my Mac or Windows machines, I can’t see the directories that start with “.” (like .storage). How do I get them?


#6

On a Mac, if you are SSHed into the Pi, in the terminal type

ls -la

This will show all the dot-folders (like .storage)

I would SCP the files to the named volume on the Docker machine with a command similar to the one below:

scp -r . <user>@<ip address of docker host>:/var/lib/docker/volumes/homeassistant/_data/

This assumes you are in the HA config folder on the Pi and that you have created a Docker volume called homeassistant. (Note the -r goes before the source, and copying “.” rather than “*” makes sure hidden entries like .storage come along.)
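Another way to make sure the hidden entries such as .storage come along is to tar the folder contents and ship the tarball across. A minimal sketch below demonstrates the dot-file behaviour with stand-in paths made by mktemp; on the Pi the source would be your actual HA config folder, and the destination path in the comments is the volume path assumed earlier in this thread:

```shell
# Demo: shell globs like * skip dot-entries, but tar'ing the directory
# contents (".") keeps them. Paths here are stand-ins made with mktemp.
src=$(mktemp -d); dst=$(mktemp -d)
mkdir -p "$src/.storage"
echo "name: home" > "$src/configuration.yaml"

tar -czf "$src.tar.gz" -C "$src" .   # "." matches hidden entries too
tar -xzf "$src.tar.gz" -C "$dst"
ls -la "$dst"                        # .storage made it across

# On the real machines you would ship the tarball over and extract it
# into the volume path, roughly:
#   scp <tarball> <user>@<ip address of docker host>:/tmp/
#   tar -xzf /tmp/<tarball> -C /var/lib/docker/volumes/homeassistant/_data/
```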

On the Docker host, create the volume with

docker volume create homeassistant

Then start Home Assistant with

docker run -d --name="ha" -v homeassistant:/config -v /etc/localtime:/etc/localtime:ro --net=host --restart always homeassistant/home-assistant:latest

And Bob’s your uncle :slight_smile:


#7

Why create a volume when you can mount a host directory directly? Using volumes seems like more of a risk when you don’t know how to manage Docker; storage on the host seems safer and more understandable.


#8

In Docker you mount a volume because it is independent of the Docker image and persistent. That way you can update HA without losing your settings.

Creating a volume merely creates a path where your HA data is saved, keeping it separate from the Docker image. It is still stored on the host.

For example, you could put the data under /home/<your user>/. It would remain even if you deleted the container.
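A minimal sketch of what “persistent” means here, assuming Docker is installed. It uses a throwaway volume name (`hademo`, an assumption for this demo) rather than the `homeassistant` volume from earlier in the thread, so nothing real is touched:

```shell
# Skip the demo if Docker isn't available on this machine.
command -v docker >/dev/null 2>&1 || exit 0
docker info >/dev/null 2>&1 || exit 0

# "hademo" is a throwaway volume name for this demo.
docker volume create hademo

# Write a file into the volume from a short-lived container...
docker run --rm -v hademo:/config alpine sh -c 'echo persisted > /config/marker.txt'

# ...that container is already gone (--rm), but the data survives:
docker run --rm -v hademo:/config alpine cat /config/marker.txt

# Where Docker keeps it on the host:
docker volume inspect hademo --format '{{ .Mountpoint }}'

# Clean up the demo volume.
docker volume rm hademo
```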


#9

You can do literally the same thing without creating a volume, by bind-mounting a directory on the host. You do not need to create a volume to store data outside the Docker image.
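For the bind-mount variant, a rough sketch (the demo directory is a mktemp stand-in; on a real setup it would be something like /home/<your user>/homeassistant, which is an assumption):

```shell
# Skip the demo if Docker isn't available.
command -v docker >/dev/null 2>&1 || exit 0
docker info >/dev/null 2>&1 || exit 0

# Bind-mount demo: any host directory can stand in for a named volume.
hostdir=$(mktemp -d)      # stand-in for e.g. /home/<your user>/homeassistant
docker run --rm -v "$hostdir:/config" alpine sh -c 'echo ok > /config/marker.txt'
cat "$hostdir/marker.txt" # the file sits right there on the host

# For Home Assistant, the run command from post #6 changes only in the
# -v source: a host path instead of the named volume, e.g.
#   docker run -d --name="ha" -v /home/<your user>/homeassistant:/config \
#     -v /etc/localtime:/etc/localtime:ro --net=host --restart always \
#     homeassistant/home-assistant:latest
```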


#10

Agreed. Docker best practice is to use volumes, though, especially if you are using stacks etc. (I do this for a living :wink: )

There is more than one way to skin a cat, though, and there is no right or wrong. It is opinions.


#11

As far as I can tell, it doesn’t make a huge difference if you’re not trying to put this in a Dockerfile.

However, if you’re using Docker to manage your system and don’t understand it, you should probably learn or go with something such as Hass.io that abstracts it away.

If you have the files on the host already and just want to use them as-is, mount the folder directly. If you don’t want to manage the directory containing those files on the host, use a named volume. If you don’t care either way: I generally use a named volume, just so I don’t have to worry about accidentally deleting the files if they lived in, e.g., ~ on the host and I decided to clean out the directory.

Also, I don’t think it’s technically recommended to copy files directly into the volume’s data directory, although I’ve done it many times (both in and out) with no issues. If you do run into issues, you would need to mount the volume into another container that provides access to it and copy the files through that container. Hass.io provides this through add-ons such as Samba, SSH, and the community IDE add-on.
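The “copy through another container” approach can be sketched like this. To stay safe it uses stand-in names (a `hacopy-demo` volume and mktemp directories, both assumptions for the demo); in real use the volume would be the `homeassistant` one from earlier in the thread and the host folder would hold the files from the Pi:

```shell
# Skip the demo if Docker isn't available.
command -v docker >/dev/null 2>&1 || exit 0
docker info >/dev/null 2>&1 || exit 0

src=$(mktemp -d)                      # stand-in for the folder of Pi files
echo "name: home" > "$src/configuration.yaml"
docker volume create hacopy-demo      # stand-in for the "homeassistant" volume

# Copy INTO the volume through a throwaway container:
docker run --rm -v hacopy-demo:/config -v "$src:/backup:ro" \
  alpine cp -a /backup/. /config/

# Copying OUT works the same way in reverse:
dst=$(mktemp -d)
docker run --rm -v hacopy-demo:/config:ro -v "$dst:/backup" \
  alpine cp -a /config/. /backup/
ls "$dst"

# Clean up the demo volume.
docker volume rm hacopy-demo
```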


#12

I manage a few Rancher clusters myself. :wink:

Based on the questions we see here in the forums regarding Docker confusion, I find it better to use the files directly on the host to avoid more confusion than necessary.


#13

Agree 100%. The problem with using Docker for HA is that we are using an enterprise platform in the home. All of a sudden the convenience of using containers is watered down, because it adds a layer of complexity for end users compared to a Raspberry Pi running hass.io.