I’m running Home Assistant on a Pi 4. I have a lot of devices and I’m adding more and more automations. How do I tell whether I should upgrade from the Pi 4 to something faster?
Either when the UI becomes intolerably slow, or when CPU usage rarely drops below 90% (see Supervisor > System > System Metrics).
So the CPU stays around 6-15% and never goes over 25%, but it’s so slow. I’m wondering if it’s my Control4 integration, which added 300 devices and 600 entities.
Can you elaborate? What are you perceiving to be slow?
The Lovelace UI’s rendering speed? The time it takes a light to activate in response to some stimulus?
The UI is very slow, both on PC and on iOS. The iOS companion app sometimes won’t connect. Automations aren’t triggering, and some devices show up as unavailable.
Looking at average CPU usage is worthless; you need to look at per-core utilization. You have 4 cores, so 25% total CPU usage may very well be 100% usage of a process pinned to a single core. Even though you have a lot of total CPU headroom left, you’ll get none of it, because that process won’t use more cores.
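On Linux you can watch per-core load with `htop` (it shows one bar per core), or compute it yourself from `/proc/stat`. A rough sketch of the latter (Linux-only; the function names and the 0.5 s sample window are my own choices):

```python
# Sketch: per-core CPU utilization from /proc/stat (Linux only).
# A low average like "25% total" can hide one core pegged at 100%.
import time

def snapshot():
    """Return {core_name: (idle_jiffies, total_jiffies)} for each core."""
    stats = {}
    with open("/proc/stat") as f:
        for line in f:
            fields = line.split()
            # Per-core lines look like "cpu0 ..."; the aggregate is just "cpu".
            if fields[0].startswith("cpu") and fields[0] != "cpu":
                nums = [int(x) for x in fields[1:8]]
                idle = nums[3]  # 4th counter is idle time
                stats[fields[0]] = (idle, sum(nums))
    return stats

before = snapshot()
time.sleep(0.5)  # sample window
after = snapshot()

usage = {}
for core in before:
    d_idle = after[core][0] - before[core][0]
    d_total = after[core][1] - before[core][1]
    usage[core] = 100.0 * (1 - d_idle / d_total) if d_total else 0.0
    print(f"{core}: {usage[core]:.0f}% busy")
```

If one core sits near 100% while the others idle, you’re single-thread bound and the average is lying to you.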
Comparing single-threaded performance, an Intel NUC8i5BEH will probably be around 5 times faster than an RPi 4, and 6 times faster when multiple cores are under heavy load.
You also have to deal with IO performance: a NUC with a Samsung 960 Pro will have orders of magnitude faster IO than an RPi with an SD card. Even an RPi with an SSD still has about 1/10th the IO of a 960 Pro on a CPU that can handle the bandwidth.
I am betting you have an IO issue, and even so you’re close to maxing out how fast the CPU can deal with the data it gets. Back up all your data right now, then make a plan to upgrade to a faster system.
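Before buying anything, you can sanity-check the IO side with a crude sequential-write test run from the drive in question (the SD card’s filesystem on the Pi). A sketch; the 64 MB size and the `io_test.bin` filename are arbitrary, and the `fsync` is there so the number reflects the device rather than the page cache:

```python
# Crude sequential-write benchmark sketch; run from the drive to measure.
import os
import time

PATH = "io_test.bin"             # arbitrary scratch file name
CHUNK = os.urandom(1024 * 1024)  # 1 MB of incompressible data
MB = 64                          # total amount to write

start = time.monotonic()
with open(PATH, "wb") as f:
    for _ in range(MB):
        f.write(CHUNK)
    f.flush()
    os.fsync(f.fileno())         # force the data out to the device
elapsed = time.monotonic() - start
os.remove(PATH)

print(f"wrote {MB} MB in {elapsed:.2f}s -> {MB / elapsed:.1f} MB/s")
```

Single-digit MB/s is typical of a tired SD card; an SSD should be well into the hundreds. It’s not a rigorous benchmark (random-write performance matters more for the recorder’s database), but it will show an order-of-magnitude gap.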
What exactly is it attempting to render?
My RPi 3 easily renders a page of 50 badges and 40 assorted switches, input_booleans, sensors, etc. in the blink of an eye. That’s a bog-standard RPi 3 with a microSD card.
I have made some pages with fewer custom components and they seem faster, but at times the system is really slow. The UI can take minutes to load even a non-Lovelace view, like the configuration page. Is there a way to tell if a certain integration is slowing down the system?
Are you storing all the recorder data in the local SQLite db? If so, that could be part of the problem - with that many devices, you’ll be constantly writing a huge stream of data to the drive. The recorder is the service that keeps a log of stuff like “this light was turned on at 9:47am” and “this event was fired at 10:20pm”, and so on. It’s mostly interesting for historical stuff - like charts in Lovelace that show the temperature over the past 24 hours.

If you don’t mind losing some of that data in an experiment, you could switch to using SQLite in memory mode and give it a try. To do that, add this to your config:
```yaml
recorder:
  db_url: 'sqlite:///:memory:'
```
Note that when you do that, the recorder stops writing to your db and keeps everything in memory - so if you restart, you’ll lose that data when memory clears. I didn’t want to lose it, so I put together an integration that saves the data to BigQuery, but that’s not needed unless you’re really into your historical data.
If you don’t care about your state history at all, then you could just disable the recorder entirely.
In any case, I’d probably give switching to memory a try and see if that speeds things up.
I deleted the database and set the recorder to only record the light and switch domains, and this has made a huge improvement. Should I still turn on the in-memory database? I also had it set to purge daily, but it wasn’t purging.
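For anyone finding this later, a recorder section along those lines looks roughly like this (a sketch - `purge_keep_days` and the `include` filter are documented recorder options, but check them against the recorder docs for your HA version):

```yaml
recorder:
  purge_keep_days: 1   # keep one day of history, purge the rest nightly
  include:
    domains:
      - light
      - switch
```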
Yeah that sounds about right - I’m pretty sure that’s what was happening.
As for whether to do the memory thing - I actually (mostly) prefer it myself. It’s really better in every way except that you lose your history when you restart. But I’ve pretty much covered that with my BQ backup integration.
One other thing to think about: SD cards aren’t really made for continuous writing. That wears them out pretty fast, and you’ll find a lot of posts about folks using Pis with an SD card as the main drive and having them fail. If you take a lot of snapshots and stuff, it’s easy enough to replace the SD card of course, but it’s preferable to avoid the cost and hassle. The recorder is the thing that kills them, since it constantly writes a ton of data to the SD card. The general recommendation is to either get an SD card made to handle a ton of writes, or just use a USB hard drive made for that kind of workload.
But you can also solve that by shifting to using sqlite in memory. That’s actually the main reason I did it.
Do you recall the database’s size before you deleted it?
Well, I never actually deleted it - the file still just sits there in your config directory; the recorder simply stops writing to it. I think mine was around 700 MB when I switched to memory mode.
My question was directed to kwalkington who had indicated:
Ahh haha - sorry I missed that!
When I deleted it most recently, while I was experiencing slowness, it was 800 MB. That was about a week after I had deleted a database of about 3 GB and set it to purge daily, but over that week it wasn’t purging daily.
When you switch to SQLite in memory mode, does it eventually fill up all your RAM?
That’s an impressive number of entities from one integration. How many of these entities are frequently reporting data (like temperature, humidity, light levels, etc.) compared to entities that infrequently report status changes, like when something is turned on or a door is opened? I’m trying to get a sense of what contributed to a 3 GB database.
It’s never come up as an issue for me, but I guess you could set it to purge every day or two if you want.
A ton of them are from Philips Hue drivers. Control4 sets up multiple devices for every Hue group, so as bulb states change, those might be reporting data. The current Control4 Home Assistant integration brings in everything from the Control4 project, but I really only need about 50 virtual bulbs.