If iowait is going over 50%, I suspect your SD card has gone kaput.
It’s an SSD, so hopefully not. This is why I really want to know: is the Pi up to this job? There is a design flaw here; the logic works great, but the recorder (which I really like) brings it down. I will look into the NUC approach, or possibly use my laptop, as I’m due an upgrade. It would be good if Android devices could be used (easily), as I have loads of old phones with really good hardware.
What’s confusing is that it runs for hours, all night, and then throws a wobbler.
As others have said, I think it’s the recorder.
I have a Pi 3B (not Plus) running a pretty complex setup: a mixture of about 40 RFXCOM-controlled lights and temperature sensors, a lot of Z-Wave stuff (heater control, PIRs, smoke and leak detectors, door and window sensors, etc.), MQTT with power monitoring, LTE signal monitoring, BLE presence detection, multiple outdoor radar sensors, and an entire security system, including multiple 4K cameras streaming at 10 to 16 Mbps each through the Pi (recording is done on a dedicated NVR, though). I have zero performance problems. HA’s CPU utilization is below 1% normally and only goes up when streaming security footage. The system is super snappy and responds pretty much instantly to any triggers. Lovelace served through it is also super responsive. I’m running HA in a venv under Raspbian.
Everything runs on a SanDisk Extreme Plus SD card. Never had any problems; it’s rock solid. I excluded all entities from the recorder and re-enabled about 40 of them. The retention period is 5 days, and the db file is below 30 MB. All my devices are local: no cloud stuff, voice assistants or similar. Not sure if that makes a difference or not.
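For anyone curious, a recorder setup like the one described (exclude everything, re-include a handful, keep 5 days) could look roughly like this in `configuration.yaml` — the entity IDs below are placeholders, not from the actual setup:

```yaml
# configuration.yaml (sketch; entity IDs are examples only)
recorder:
  purge_keep_days: 5          # 5-day retention period
  include:
    entities:                 # only these ~40 entities get recorded
      - sensor.living_room_temperature
      - sensor.power_consumption
      - binary_sensor.front_door
```

Using an `include` whitelist means everything else is implicitly dropped, which keeps the database small and SD card writes low.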
So unless you’re running CPU-intensive things like TensorFlow, I really have a hard time understanding how people can have performance problems on a modern Pi. I mean, you read a lot of posts from people about how happy they are with their new NUC / Synology / repurposed laptop and how much faster everything runs. I can’t help wondering what exactly runs faster for them, and whether there might be just a bit of a placebo effect at play there.
I think it’s frontend response that people like. I’m not that fussed; I use this mainly for local control and Node-RED, and it makes a nice dashboard. Things like lights and energy saving work really well for me, and I have WAF with all of this. I have also utilized all of the crappy tech that I thought I would never get working, so for it to just fall over is a bit heartbreaking.

Luckily I have a good backup plan now, and I got back up and running fairly painlessly in a few hours. I have not restarted MariaDB, as I don’t know where it is; the standard database is in the config folder and can be deleted at any time if trouble happens. I have also set the recorder/logbook up in detail and cut out anything unimportant. Essentially, anyone new who throws a bit at this is doomed to fail, with the default settings ready to destroy its own hardware. There wasn’t anything wrong with my hardware; a fresh, more experienced install was all it took. Thanks @nickrout for the help, I have learned some new things.
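Trimming the logbook works the same way as the recorder. A minimal sketch of cutting out the unimportant stuff might look like this (the domains and entity here are illustrative, not the poster’s actual config):

```yaml
# configuration.yaml (sketch; excluded items are examples only)
logbook:
  exclude:
    domains:
      - automation       # hide automation-triggered noise
      - script
    entities:
      - sensor.last_boot # a typically uninteresting entity
```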
Indeed. That situation is clearly anything but ideal, especially since the Pi (and its default SD card) is promoted as a typical platform for HA.
Maybe the recorder should use a more restricted (and sane) default configuration that could be overridden for less common use cases. Do I really need to know the exact history of every single one of my light switches over one month? Probably not. Maybe certain entity classes could be excluded from the recorder by default.
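Excluding whole entity classes is already possible today via domain-level excludes; something like the following sketch approximates the “saner default” being suggested (the domains chosen are just examples):

```yaml
# configuration.yaml (sketch; excluded domains are examples only)
recorder:
  purge_keep_days: 5
  exclude:
    domains:
      - light            # no month-long history of every light switch
      - automation
      - updater
```

The difference from the whitelist approach above is that everything is recorded *except* the listed domains, which is closer to how a default-but-overridable policy would behave.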
Something like that, through the UI if possible; you never know what’s around the corner. My configuration.yaml is mostly commented out now, apart from my Broadlink entities and a few stragglers (and now the recorder config); most of it is done in the Integrations section now. It has moved so quickly since I have been using it; in comparison, I also use Emby, which moves at a snail’s pace. Love them both, though. I haven’t even linked them together yet…
I used to have the same problems you list when I was on a Pi. I went to a 2011 Mac mini with an SSD and have had no problems since. I’d suggest an Intel box. A NUC will be amazing, and those problems will be gone.
I am considering it; it becomes a big part of your life, not to forget the cheap geeky thrill of a light turning itself on/off, etc. I have read that even the cheapest will do a good job. What was migrating over like? Was it just a clean install and restore a backup? One thing I do want out of this is an energy usage history, so the recorder is vital to that.
I had to migrate from Hassbian to hass.io (Home Assistant Supervised), so I just moved my config directory over manually and spent a few days fixing issues, but it wasn’t too bad. These days, if you are moving from hass.io to hass.io, you should be able to snapshot and restore and be pretty much up and running (you may have to change the /dev/ path for some of your USB devices or something). I’d bet not more than a few hours for the migration to a NUC.
I have just used a backup after reinstalling my Pi, so yeah, pretty painless. How is yours installed? I was thinking of using the Ubuntu and Docker method (I have a used NUC on the way).
If you mean a supervised install, only Debian is supported.
I am using Ubuntu and HA Supervised but, as noted above, I will soon have to migrate to Debian. If I were to do it today, I’d follow this guide, but I’m going to wait until they release the official documentation and latest installer… It’s taking a long time, though.
You won’t have to. Ubuntu will still work; it just won’t be “officially” supported by the dev team. Using Debian 10 is obviously the preferred option, however.
I will go with Debian. At the moment it all looks like a bunch of gibberish, but my gut feeling is that I will be better off with Docker (which I assume is something like a virtual machine) in the long run, and it will open my technical world up a bit. I have a working Home Assistant, so I can take my time.
Yeah, but I don’t feel like playing the “wait and see what breaks” game. Ubuntu is definitely using components that are the wrong version and won’t be tested. I’d rather know that it will work 100%.
You should run Supervised. Supervised also uses Docker but gives you add-ons and snapshots, which are very useful and necessary (i.e. you would likely have to replicate the same things with a plain Docker install, but it would be a lot harder). In short, Supervised does everything Docker does, but also does more and makes it easier.
I take it this means hass.io with the Supervisor? That’s the plan. My NUC is here; it’s an old one with no OS: a dual-core 2.13 GHz Celeron, 4 GB RAM and a 120 GB SSD. I want to run it on Debian to start learning about a Linux operating system. I was under the impression (from the millions of Ubuntu guides) that you then install hass.io in Docker and manage it with Portainer. This really is all new to me, but my brain is going nuts with future possibilities if I can get hass.io up and running on Debian, e.g. a NAS and a media server.
Yes, that’s what I mean: install what used to be called hass.io on top of Debian (which is what the guide I sent describes).
As for the architecture of hass.io (now called Home Assistant Supervised, or Home Assistant OS), it’s a combination of things that run in Docker, plus some glue that runs outside to let you control the OS. But you really don’t need to worry much about that, as you will be able to manage everything via the HA UI, including installing and running Portainer to manage any additional Docker containers that are not already available as hass.io add-ons, or even the hass.io containers themselves (though this is not generally needed or recommended). If you want to learn more about the architecture, it’s here:
The first post here tells you what the different install methods do: Installation Methods & Community Guides Wiki
Then this post explains how to do a supervised install on Debian: Installing Home Assistant Supervised on Debian 10
Now you have all you need.
Thanks, two night shifts and then it’s on.