Another id ten tee thread.
Home automation is not a destination but a journey.
A journey full of dysentery, poison dart frogs and lost luggage.
I also had two broken SD cards in a short time. Since I added a UPS to my RPi it has been working flawlessly, with no issues for half a year now.
I can really recommend it!
I had lots of issues with SD cards and the odd one with the Pis. All seems stable now, and has been for months, but there are loads of fake SD cards out there which are crap, and for me these definitely play up with HA. I’ve had 16 GB, 64 GB and 256 GB cards that were all fake. I now tend to only buy them from reputable shops that are approved suppliers of the manufacturer, or from Amazon direct. I find the genuine SanDisk cards are fine, and I’ll be trying the Samsung and high-endurance cards in the future too. I only use the genuine Pi 2.5 A power supply. There are a lot of mini Linux / Windows PCs out there which are good, but the Pi is cheaper. It was trial and error, with lots of image backups via win32diskimager and config-file backups via WinSCP.
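If anyone wants to script the config-file side of that instead of copying by hand in WinSCP, a few lines of Python on the Pi itself will do it. Just a sketch; the paths here are made up, so point them at your own config directory and somewhere off the SD card:

```python
import tarfile
import time
from pathlib import Path

# Assumed paths: point these at your actual Home Assistant config
# directory and somewhere OFF the SD card (NAS mount, USB stick, ...).
CONFIG_DIR = Path("/home/homeassistant/.homeassistant")
BACKUP_DIR = Path("/mnt/backups")

stamp = time.strftime("%Y%m%d-%H%M%S")
archive = BACKUP_DIR / f"ha-config-{stamp}.tar.gz"

# Pack the whole config directory into a timestamped tarball.
with tarfile.open(archive, "w:gz") as tar:
    tar.add(CONFIG_DIR, arcname=CONFIG_DIR.name)

print(f"wrote {archive}")
```

Cron that nightly and you’ll always have a recent copy to fall back on.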
and lost hair!
I started out running Home Assistant on a Pi and then switched to a Rock64. Always had problems with SD cards crapping out, so I just made sure I always made backups.
A few months ago I moved Home Assistant over to my FreeNAS in a jail. No problems now and I’m never looking back.
Followed this guide: My (outdated) Quick Start for Home Assistant Core on FreeNAS 11.2
I had Hass.io running for a few weeks and it died. I was unable to recover the SD card. Having read this thread, I’m now concerned about my hardware. Many folks have commented on how they use VMs, Linux PCs, or hard drives. I don’t want to have a PC running all the time and would like to persist with the RPi solution.
Has anyone migrated a working HassIO to USB HDD on RPi? Can anyone point me to an idiot’s guide?
Thanks.
That was one of the reasons I wanted it on the Pi… I’d just get a couple of decent genuine SD cards (SanDisk / Samsung EVO, or better as they come out) and keep a backup. I have a fair amount set up, but 16 GB or 32 GB has been fine. I think I’ll back up now…
You can install Raspbian, then install Docker and do a generic Linux install of Hass.io - look for Dale3h’s script. It works great. I used that setup before I got my NUC.
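For anyone curious what that boils down to: once Docker is on Raspbian, launching the Home Assistant container is the core of it. This isn’t Dale3h’s script, just a rough sketch using the Docker SDK for Python (`pip install docker`), and the config path is an assumption:

```python
import docker

# Connect to the local Docker daemon.
client = docker.from_env()

# Launch the official Home Assistant container. Host networking lets it
# do device discovery; the volume maps your config directory into /config.
client.containers.run(
    "homeassistant/home-assistant:stable",
    name="homeassistant",
    detach=True,
    network_mode="host",
    privileged=True,  # for USB sticks like Z-Wave/Zigbee; drop if not needed
    restart_policy={"Name": "unless-stopped"},
    volumes={"/home/pi/homeassistant": {"bind": "/config", "mode": "rw"}},  # assumed path
)
```

The same thing works as a one-line `docker run` if you’d rather skip Python.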
I had no issues on my Raspberry Pi.
Now I am running it in a virtual machine on my NUC… no issues.
Please list the SD card brands and types so we know which ones not to use…
Running my Hass.io for over 18 months: same
Raspberry Pi, same SD card. No issues. Even if there were issues, it’s all backed up and super easy (and cheap) to restore back to how you had it.
First Pi, first SD, running for over a year. Don’t blame it on Hassio, that is definitely not the problem.
I mean, aren’t we over-engineering this issue a bit? Just keep a regular snapshot, a spare SD card and, if you’re really paranoid, a spare Pi. Okay, you spent an extra US$55 for complete redundancy vs. the electricity/space/effort/annoyance/work of putting this hobby on a more-horsepower-than-you-will-ever-need desktop/laptop/NUC/whatever.
That is another option, but as we know, choice is good. Each to their own.
But even more fundamental is having a good power supply.
I’ve been running on an RPi3 for 23 months, still on the very first SD card (SanDisk Extreme), without any issues so far. So this isn’t a general problem; it’s either down to the kind of hardware you buy (I haven’t heard of many cases where the cards die that quickly) or very, very bad luck. Just buy a quality brand SD card and I’d bet it runs at least a year without any issues.
And do backups… real backups.
Never used Hass.io as it seems like less control to me. Hassbian all the way!
Don’t give up… it’s all a learning curve… been running it for 2 years now and my family literally can’t live without it! Will do a video soon about full image backups with the empty space removed.
Check out my video for a starter…
Hassbian install guide
Lots of opinions here. I’ll just add mine.
I ran through a few SD cards when I first started with Home Assistant. One of the culprits was Node-RED. When I was playing with Node-RED on my Hassbian installation it killed an SD card. I moved to another Pi and Node-RED promptly killed one there too. It may have been that the logic I implemented was writing to the SD frequently.
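If you suspect something like that, you can watch the card’s write rate directly before it gets a chance to kill anything. A minimal sketch reading `/proc/diskstats` (Linux only; assumes the card shows up as `mmcblk0`, which it does on a stock Pi):

```python
import time

DEVICE = "mmcblk0"    # the SD card on a stock Pi; adjust if yours differs
SECTOR = 512          # /proc/diskstats counts 512-byte sectors
INTERVAL = 10         # seconds between samples

def sectors_written(dev):
    # Each line: major minor name reads ...; field 10 is sectors written.
    with open("/proc/diskstats") as f:
        for line in f:
            fields = line.split()
            if fields[2] == dev:
                return int(fields[9])
    raise RuntimeError(f"device {dev} not found")

before = sectors_written(DEVICE)
time.sleep(INTERVAL)
after = sectors_written(DEVICE)
rate = (after - before) * SECTOR / 1024 / INTERVAL
print(f"{rate:.1f} KiB/s written to {DEVICE}")
```

Run it while your flows are active; sustained writes of hundreds of KiB/s to an SD card are a red flag.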
Another issue is the power supply. The 5 V supply for the Pi should be capable of at least 3 A output. Never run a supply at its rated limit; they will brown out when they get hot. That brings me to the other issue with power supplies: many USB chargers are rated at 5 V but may only put out 4.8 V. For charging a battery that just doesn’t matter, but with a Pi the low voltage may cause errors, especially when writing to the SD.
If your mains power is subject to transients or blackouts, a battery backup will make a huge difference in the stability of your system.
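On the Pi specifically, the firmware keeps flags you can poll to see whether low voltage has ever actually hit. A small sketch around `vcgencmd` (ships with Raspbian; the bit meanings are from the Raspberry Pi documentation):

```python
import subprocess

# Ask the Pi firmware for its throttle flags; output looks like "throttled=0x50005".
out = subprocess.check_output(["vcgencmd", "get_throttled"], text=True)
bits = int(out.strip().split("=")[1], 16)

FLAGS = {
    0:  "under-voltage right now",
    1:  "ARM frequency capped right now",
    2:  "currently throttled",
    16: "under-voltage has occurred since boot",
    17: "frequency capping has occurred since boot",
    18: "throttling has occurred since boot",
}

active = [msg for bit, msg in FLAGS.items() if bits & (1 << bit)]
print("\n".join(active) if active else "power looks clean")
```

If “under-voltage has occurred since boot” shows up, fix the supply before blaming the SD card.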
I have been using HA since about version 0.40. I switched to an SSD and a 4 A power supply at around version 0.60, and my system has been rock solid ever since.
I strongly recommend using a small SSD (40 GB or 60 GB) with the Pi. They are cheap and robust, and they are faster than an SD card even though they operate over USB.
Same problem here… SD cards were just too slow on the Pi for stable use… So sometimes the Pi would just hang, and I’d get an angry wife calling to say the lights weren’t working automatically… etc etc…
So I bought an Intel NUC (NUC7i5BEH) with an SSD in it… and lots of RAM. Installed ESXi, then an Ubuntu Server VM for Home Assistant, one for InfluxDB and Grafana, one for MQTT and one for Node-RED… stable as can be! No more calls from the wife.
Brand is not the issue; counterfeit cards are the problem. SanDisk and Samsung make the best (in my experience), but if you shop for the cheapest price, you probably got a fake.
Also, buy the largest card your OS and budget allow. If your system can run on 16 GB, then buy a 64 GB card. How flash storage works is the reason. I wish I could quote the source, but here is a description of why bigger is better. Note this article was about SSDs replacing hard-disk drives, but I haven’t found anything that says SD cards are any different.
The performance of SSDs degrades over time. The reason goes something like this: NAND flash memory is made up of cells arranged in a hierarchical fashion. Individual cells are arranged into strings and, in turn, arrays. These arrays make up pages, at which point the overall number of cells typically clocks in at 32,000-128,000 per page. Those pages are arranged into yet larger blocks, which ultimately measure in megabytes.
While NAND memory can be read and written in pages, it can only be erased in whole blocks. The reason for that is complex, and involves voltage levels and minimizing errors. But the impact on performance is sobering.
Writing to empty memory cells is a breeze, but overwriting existing data is far more laborious. All data from the relevant block must be copied to cache memory, the full block erased, and then rewritten with the modified data.
As a drive approaches full capacity, and the availability of empty cells, arrays, pages, and blocks dwindles, it’s not hard to see how this impacts performance. But even when a drive has significant free capacity, performance can be dramatically compromised. Again, it comes down to the fact that only full blocks can be erased. The consequence is that, in the first instance, when pages within a block are deleted at operating system level, on the drive they’re only marked as dead or redundant. To actually erase those pages would require erasing the whole block, and thus caching and rewriting any pages that retain live data. It’s therefore expedient in the short term to simply mark pages as containing redundant data rather than erase them.
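To put rough numbers on that, here’s a back-of-the-envelope sketch. The geometry is made up but plausible; real flash varies:

```python
# Toy numbers to illustrate the block-erase penalty described above.
# The geometry is assumed for illustration; real flash varies.
PAGE_BYTES = 4096        # one programmable page
PAGES_PER_BLOCK = 256    # one erasable block = 1 MiB here

# Nearly full: no erased pages left, so updating a single page means
# copying the live pages out, erasing the block, and rewriting all of it.
writes_when_full = PAGES_PER_BLOCK
print(f"nearly full : {writes_when_full} page programs per logical write")

# Mostly empty: the controller writes the new version to a fresh page
# and just marks the old copy stale, deferring the erase.
writes_when_empty = 1
print(f"mostly empty: {writes_when_empty} page program per logical write")

# Worst case, one 4 KiB update costs a full megabyte of flash traffic:
traffic_kib = writes_when_full * PAGE_BYTES // 1024
print(f"worst-case traffic for one 4 KiB write: {traffic_kib} KiB")
```

A card with plenty of free blocks stays in the cheap case almost all the time, which is the whole argument for buying bigger than you need.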