Computers may become unresponsive during compiling, or even crash if they're low on computing power and RAM. I do have a Windows gaming machine that I have installed ESPHome onto, and even that machine goes fan-mad during compiling.
I bet there are many who are happy when their hardware uses as little power as possible to run 24/7. It's quite normal that compiling code makes computers go mad, and it more or less demands as many cores as possible, plenty of RAM and SSDs. So compiling on a machine that is kept as light as possible for its main purpose is kind of upside down.
Anyway, I personally don't want to run Home Assistant and related components on a more power-hungry rig than necessary.
I have an RPi 3 that I have Home Assistant installed on, and I would love to have the ESPHome dashboard running there too with the ability to compile code. Alternatively, I could put the YAML files on a USB stick, insert the stick into the Windows machine, let that machine do the compiling for all the files, and then just move them back onto the RPi 3 where everything is made usable by the dashboard. Even better would be if ESPHome could contact another computer on the same network with ESPHome installed, forward the compiling to it, and get the compiled code back ready to upload wherever needed, and with that disable the possibility of even accidentally starting a compile on the RPi 3 or similar hardware.
I did once try to run a compile on the RPi 3, and while it didn't crash, it went 99% unresponsive until it was done, hours later. I guess the SD card speed is also a factor here?
When running a compile for the first time, it could warn the user that it may cause a freeze or a crash if the hardware is not on par with the recommendations: "Are you sure you want to run this compile? It may be too heavy for your rig…" -> "Don't warn me again? Click here to turn the warning off." There could also be a link to a list of commonly used hardware that isn't up to the task, and a mention of the key factors that make compiling hardware underpowered.
I have no idea if it's normal for an RPi 3 to take that long. I'm surprised you think it's still slow on a gaming computer. I run on a VM with only 2 cores and 6 GB allocated and it never takes more than a couple of minutes. That VM runs all my home automation stuff.
Then just browse to the web interface. It's completely independent from Home Assistant; it's the devices, once programmed, that talk to HA.
Once you're done configuring your devices, you can shut ESPHome off. It's not needed until you want to change the config or do an update. Keep a copy of your YAML files safe; they can't be reverse-engineered.
No, the gaming machine is only for doing the compiling. It does work fast there without the computer slowing down in other ways, but it still takes a lot of resources, as all of its fans go to max, almost like I'm playing some game that eats lots of resources. That's what I'm saying.
I want my ESPHome host machine to be able to show the dashboard 24/7, which again isn't available on Windows, and I'm not going to install a Linux VM on the Windows machine, and definitely not going to leave it on 24/7.
The RPi 3 may have locked up during compiling, and that's why it took so long? I did not watch it the whole time it stayed nearly unresponsive. But it certainly did not crash, as my PuTTY session never disconnected from it, and the Home Assistant app did eventually respond, with some lag.
Currently I have to use the Windows machine via the command line, which is a very manual way compared to having the dashboard to keep up with all the ESP devices. At least Home Assistant shows the firmware version on each, so I have some way to keep track of them.
If I could somehow move all of my ESP devices under the command of that RPi 3, I would be happy.
I use an RPi 4 8 GB where I compile all my ESP modules and have never had a single problem.
Everything works smoothly, including all other HA integrations.
I think the RPi 3 will always have a problem: the small RAM size.
So the RPi 3B+ will also have the same problem due to RAM? I'm fine if the compiling process takes a long time, but the unresponsiveness and lag affecting the whole system is not okay.
Yes, I think the RPi 3B+ will have the same problem.
The hardware is sufficient to run HA, but insufficient for ESP compilations and more demanding operations.
To add to this, there are likely many of us (maybe all) who have a secondary (or primary) computer capable of doing this sort of compiling with ease and that is only turned on every now and then, so the function I'm suggesting here could be really helpful. Those who have an RPi 4 with more RAM, or another more powerful computer, could gain a lot of speed if compiling could be forwarded to another machine, at best one with a Ryzen Threadripper or an Intel Xeon and 64 GB of RAM. I have no idea how this sort of function could be made to happen; maybe it's not about the ESPHome code itself but about all the underlying tooling that would need to allow it.
I was struggling with compile times on my Pi 4 and was investigating options.
I was chatting to a dude on Discord who said you can run ESPHome on Windows in a VM so you get the dashboard, but that you should also be able to run the dashboard via the HA add-on on your Pi at the same time.
I didn't investigate further as I ended up replacing my Pi 4 with an upgraded ancient laptop. Works great.
Here's the thread. You'll need to join the ESPHome Discord first if you haven't already.
One thing you could try is a new option (at the moment only in beta and dev) that allows limiting the number of cores used to compile:
compile_process_limit (Optional, int): The maximum number of simultaneous compile processes to run. Defaults to the number of cores of the CPU which is also the maximum you can set.
If you just use 1 core (instead of 4?) on your RPi you might be able to continue using HA "normally", in exchange for longer compile times from ESPHome.
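For reference, this is roughly how it looks in a node's YAML; the option lives in the core esphome: block, and the node name here is just a placeholder:

esphome:
  name: livingroom-node      # placeholder name
  compile_process_limit: 1   # run only a single compile process on the Pi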
I ran HA on an RPi 3 for a while. The limiting factor is not the CPU (okay, stuff takes a while) but actually the RAM.
1 GB is filled up extremely fast when running add-ons.
The default config for HA is to only have 200 MB of swap. That also runs out quickly, and when that happens you get into weird unresponsive states which sometimes recover on their own and sometimes don't.
The solution is to increase the swap size. Sadly there is no easy config option for this, but there is a hack.
Try this, it worked wonders for me!
I doubt that heavy swapping is the fastest way to kill an SD card. A quicker approach is definitely to just "deactivate" it physically - for example with the help of some pliers and a hammer.
But if you're short on tools, killing valuable flash cells just by hammering it with heavy writes will probably get the job done over time too.
Actually that isn't swap on storage (an SD card, for example) but instead a compressed zram, which is (ab)using CPU cycles instead of storage.
For that reason it might also be worth a try with the compile_process_limit mentioned in my last post.
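For anyone curious, here's a minimal sketch of what a compressed zram swap looks like on a plain Debian-style Linux box; it's only an illustration of the idea, not the exact hack from the linked post, and HAOS handles this differently:

sudo modprobe zram                                    # load the zram module
cat /sys/block/zram0/comp_algorithm                   # see which compression algorithms the kernel offers
echo lz4 | sudo tee /sys/block/zram0/comp_algorithm   # pick one (lz4 is a common choice)
echo 512M | sudo tee /sys/block/zram0/disksize        # uncompressed size of the zram device
sudo mkswap /dev/zram0                                # format it as swap
sudo swapon -p 100 /dev/zram0                         # enable it with a higher priority than disk swap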
I'm fine with this kind of solution too, as long as it doesn't kill the single core and all my compiling can be done within one day of pressing that overly sweet "Update all" button.
A raging wife in the dark hallway isn't fine when the HA motion-controlled lights come on 10 minutes too late; I could always tell her to keep a flashlight as backup.
Feel free to just do it and tell us about your mileage. If you want fast and furious updates of all your ESPHome nodes (if you have a critical amount of them), you surely avoid a Raspberry Pi 3 anyway.
To update my ~100 ESPHome nodes I just fire up my workstation and let the 16 cores do the work, together with 16 GB of memory.
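If it helps, here's a rough sketch of how that kind of batch update can be scripted from a workstation shell, assuming all the node YAMLs sit in one folder (add --device <hostname> if ESPHome asks you to pick an upload target):

for f in *.yaml; do
  [ "$f" = "secrets.yaml" ] && continue   # skip the shared secrets file if there is one
  esphome run "$f" --no-logs              # compile and upload each node, skip tailing logs afterwards
done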
Just do all your development work on another HA instance running on a Linux VM on your desktop system, and copy the config to the Raspberry Pi when you’re done making constant changes that require recompilation.
But if you’re like me, you never really get done making changes, so maybe that won’t work…
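Copying the config over can be as simple as scp; the source folder, hostname and destination path below are assumptions that depend on how SSH is set up on the Pi and which ESPHome add-on version is installed:

scp ~/esphome-dev/*.yaml root@homeassistant.local:/config/esphome/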
To speed up the ESPHome compilation process when Home Assistant runs on a Raspberry Pi, follow these steps:
Start by installing Home Assistant on your Raspberry Pi as usual.
Create a folder called “ESPHome” on your computer’s desktop (assuming you’re using Windows).
Open a command prompt (CMD) and navigate to the newly created folder by running cd desktop/esphome
In the CMD, install the necessary packages by running the following commands:
pip3 install wheel
pip3 install esphome
Confirm that the packages are installed correctly by checking the esphome version with:
esphome version
If you don’t see the esphome version, ensure that you installed the packages in the correct directory and added the installation location to your PATH.
With esphome successfully installed, you can now use it in the CMD. Navigate to your esphome folder again with cd desktop/esphome
Create a new esphome project by running:
esphome wizard <your_project_name>.yaml
This will generate a YAML configuration file (<your_project_name>.yaml) in the esphome folder. You can modify this file to include sensors and other configurations; a minimal example of what it might look like is sketched after these steps.
After editing the YAML file, compile the code with your faster PC by running:
esphome run <your_project_name>.yaml
Once the compilation is complete, you’ll be prompted to choose whether to upload the firmware “via USB” or “over the air.”
By following these steps, you can significantly reduce the compilation time for new firmware, avoiding the slow process on the Raspberry Pi.
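As a rough illustration, a freshly generated file might look something like this; the name, board and Wi-Fi details are placeholders, and the exact keys can differ between ESPHome versions:

esphome:
  name: my-project            # placeholder node name

esp8266:
  board: nodemcuv2            # placeholder board, pick yours

wifi:
  ssid: "MyNetwork"           # placeholder credentials
  password: "MyPassword"

logger:
api:
ota:
  - platform: esphome         # newer ESPHome versions need the platform key here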
For more detailed information, you can refer to the following resources:
This is basically what the second post said. Run the compiles on your regular desktop or laptop, instead of the machine running HA.
Oh, and if it’s a Windows desktop, use backslashes (\) in the path names, not the forward slashes (/) shown above.
You can still use the ESPHome dashboard in HA for everything else. Or use the one on your desktop or laptop while you're there, and the HA dashboard the rest of the time.
It’s really a simple process. If you do your development on your desktop or laptop anyway, running ESPHome there is the best option.
It would be so handy if I could send the compiling to another computer on the local network. Like if there was a setting where I could choose another computer with ESPHome installed to do the hard work. Transferring those files over there and back is not a big job at all compared to compiling.
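Until something like that exists, the closest manual equivalent seems to be compiling on the fast machine and letting it push the firmware over the air straight to the node from there; the file and host names below are placeholders:

esphome run livingroom-node.yaml --device livingroom-node.local   # compile on the fast PC, upload OTA to the node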