Per-installation variant of gen_requirements script?

Hi all,

I’m trying to set up a Dockerized Home Assistant installation. To shave startup time on updates, I’d like to bake my configuration and the runtime-detected dependencies into my Docker image, but I’m not sure what the smartest way to do that is…

My first thought was to patch Home Assistant so that it starts up all components and then stops, but given that all of the pip magic is async, that seemed like a difficult path to take.

Then I wondered about a modified variant of the gen_requirements script: instead of going through the home-assistant sources, it would parse a given configuration, collect only the requirements that are actually needed, and install those through pip as part of the docker build phase.
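For what it's worth, here is a minimal sketch of what I mean. The REQUIREMENTS table below is a hypothetical stand-in for whatever lookup the real script would do against the home-assistant sources (e.g. reading each component module's requirements list), and the line-based config scan is just to keep the sketch dependency-free; a real version would use a proper YAML parser:

```python
# Sketch: derive pip requirements from a Home Assistant configuration
# at docker build time, instead of installing them on first start.

def config_domains(config_text):
    """Return top-level component domains from a configuration.yaml body.

    Only top-level keys matter here: each one names a component whose
    requirements we want baked into the image. (A real version would use
    a YAML parser; this string scan keeps the sketch self-contained.)
    """
    domains = []
    for line in config_text.splitlines():
        if line and not line[0].isspace() and line.rstrip().endswith(":"):
            domains.append(line.rstrip(": \n").split(" ")[0])
    return domains

# Hypothetical domain -> pip requirements table; the real script would
# resolve these from the home-assistant sources instead.
REQUIREMENTS = {
    "mqtt": ["paho-mqtt==1.3.1"],
    "http": ["aiohttp_cors==0.7.0"],
}

def collect_requirements(config_text):
    reqs = set()
    for domain in config_domains(config_text):
        reqs.update(REQUIREMENTS.get(domain, []))
    return sorted(reqs)

if __name__ == "__main__":
    sample = "homeassistant:\n  name: Home\nmqtt:\n  broker: localhost\nhttp:\n"
    # The output would be written to a requirements.txt that a RUN step
    # in the Dockerfile feeds to `pip install -r`.
    print("\n".join(collect_requirements(sample)))
```

The idea is that the Dockerfile copies the configuration in, runs this script, and pip-installs the result, so the image only ever contains what the config actually uses.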

I’m still not sure if either of those are really the right way to go, and would appreciate any insight or direction by folks who are more familiar than I am with this part of the project…


Just wondering: did you know there are official Docker images? See:

They install all requirements at build time, not run time, so there aren’t any delays due to updates at startup. When there’s a new release, it’s just docker pull homeassistant/home-assistant and then recreate your container.

Maybe I’m not exactly understanding your requirement, but why would you try to reinvent the wheel?

I’m aware of it, and my initial setup was based on that, with a baked in config. However:

  1. It includes a lot of modules that I don’t need (making it incredibly bloated!).
  2. AFAIK it only supports a linux-x64 image.
  3. It won’t include any non-standard custom components, should I happen to have any.