I give up; it’s just not worth the time and effort.


#22

I also had two broken SD cards in a short time. Since I added a UPS to my RPi, it has been working perfectly, with no issues at all for half a year now :blush:

I can really recommend it!


#23

I had lots of issues with SD cards and the odd one with the Pis themselves. All seems stable now, and has been for months, but there are loads of fake SD cards out there which are rubbish, and for me these definitely cause trouble with HA. I’ve had 16 GB, 64 GB and 256 GB cards that were all fake. I now only buy them from reputable shops that are approved suppliers of the manufacturer, or direct from Amazon. I find the genuine SanDisk cards are fine, and I’ll be trying Samsung and high-endurance cards in the future. I only use the genuine Pi 2.5 A power supply. There are a lot of mini Linux/Windows PCs out there which are good, but the Pi is cheaper. It was trial and error, with lots of image backups via win32diskimager and config backups via WinSCP.


#24

and lost hair!


#25

I started out running Home Assistant on a Pi and then switched to a Rock64. I always had problems with SD cards crapping out, so I just made sure I always made backups.

A few months ago I moved Home Assistant over to my FreeNAS in a jail. No problems now, and I’m never looking back.

Followed this guide: My (almost) complete quick start to installing Home Assistant on FreeNAS 11.2 (Including AppDaemon/HA-Dashboard, Hass-Configurator, Mosquitto and TasmoAdmin)


#26

I had Hass.io running for a few weeks and it died. I was unable to recover the SD card. Having read this thread, I’m now concerned about my hardware. Many folks have commented on how they use VMs, Linux PCs, or hard drives. I don’t want to have a PC running all the time and would like to persist with the RPi solution.

Has anyone migrated a working Hass.io install to a USB HDD on the RPi? Can anyone point me to an idiot’s guide?

Thanks.


#27

That was one of the reasons I wanted it on the Pi… I would just get a couple of decent genuine SD cards (SanDisk / Samsung EVO or better as they come out) and keep a backup. I have a fair amount set up, but 16 GB or 32 GB has been fine. I think I’ll back up now…


#28

You can install Raspbian, then install Docker and the generic Linux install of Hass.io (look for Dale3h’s script). It works great; I used that setup before I got my NUC.


#29

I had no issues on my Raspberry Pi.
Now I am running it in a virtual machine on my NUC… no issues.


#30

Please list the SD card brands and types so we know which ones not to use…


#31

Running my Hass.io for over 18 months, same Raspberry Pi, same SD card. No issues. Even if there were issues, it’s all backed up and super easy (and cheap) to restore back to how you had it.


#32

First Pi, first SD card, running for over a year. Don’t blame it on Hass.io; that is definitely not the problem.


#33

I mean, aren’t we over-engineering this issue a bit? Just keep a regular snapshot, a spare SD card and, if you’re really paranoid, a spare Pi. Okay, you spent an extra $55 US for complete redundancy, versus the electricity/space/effort/annoyance/work of putting this hobby on a more-horsepower-than-you-will-ever-need desktop/laptop/NUC/whatever.


#34

That is another option, but as we know, choice is good. Each to their own.

But more fundamental is to have a good power supply.


#35

I’ve been running on an RPi 3 for 23 months, still on the very first SD card (SanDisk Extreme), without any issues so far. So this isn’t a general problem; it’s either the kind of hardware you buy (I haven’t heard of too many cases where the cards die that quickly) or very, very bad luck. Just buy a quality brand SD card and I’d bet it will run at least a year without any issues.

And do backups… real backups.


#36

Never used Hass.io, as it seems like less control to me :wink: Hassbian all the way :slight_smile:
Don’t give up… it’s all a learning curve… I’ve been running it for 2 years now and my family literally cannot live without it! I’ll do a video soon about full image backups with white space removed :wink:
Check out my video for a starter…
Hassbian install guide


#37

Lots of opinions here. I’ll just add mine.
I ran through a few SD cards when I first started with Home Assistant. One of the culprits was Node-RED. When I was playing with Node-RED on my Hassbian installation it killed an SD card. I moved to another Pi and Node-RED promptly killed one there too. It may have been that the logic I implemented was frequently writing to the SD.
Another issue is the power supply. The 5 V supply for the Pi should be capable of at least 3 A output. Never run a supply at its rated limit; it will brown out when it gets hot. That brings me to the other issue with power supplies: many USB chargers are rated at 5 V but may only put out 4.8 V. For charging a battery that just doesn’t matter, but with a Pi the low voltage may cause errors, especially when writing to the SD.
If your mains power is subject to transients or blackouts, a battery backup will make a huge difference in the stability of your system.
I have been using HA since about version 0.40. I switched to an SSD and a 4 A power supply at around version 0.60, and my system has been rock solid ever since.
I strongly recommend using a small SSD (40 GB or 60 GB) with the Pi. They are cheap and robust, and they are faster than the SD card even though they operate over USB.
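The low-voltage problems described above actually leave a trace: on the Pi, `vcgencmd get_throttled` reports a hex bitmask of current and past under-voltage/throttling events. Here is a small sketch that decodes that output; the helper name is mine, but the bit meanings come from the Raspberry Pi firmware documentation.

```python
# Decode the bitmask reported by `vcgencmd get_throttled` on a Raspberry Pi.
# Bit assignments are documented in the Raspberry Pi firmware docs.
FLAGS = {
    0: "under-voltage detected",
    1: "ARM frequency capped",
    2: "currently throttled",
    3: "soft temperature limit active",
    16: "under-voltage has occurred",
    17: "ARM frequency capping has occurred",
    18: "throttling has occurred",
    19: "soft temperature limit has occurred",
}

def decode_throttled(output):
    """Parse e.g. 'throttled=0x50005' into a list of active flag names."""
    value = int(output.strip().split("=")[1], 16)
    return [desc for bit, desc in FLAGS.items() if value & (1 << bit)]
```

A reading of anything other than `throttled=0x0` on a supposedly healthy setup is a strong hint the power supply (or cable) is the real culprit, not the SD card.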


#38

Same problem here… SD cards were just too slow on the Pi for stable use… So sometimes the Pi would just hang and I got an angry wife calling me that the lights don’t work automatically… etc., etc.…

So I bought an Intel NUC (NUC7i5BEH) with an SSD in it… and lots of RAM. Installed ESXi, then set up an Ubuntu Server VM for Home Assistant, one for InfluxDB and Grafana, one for MQTT and one for Node-RED… stable as can be! No more calls from the wife.


#39

Brand is not the issue; counterfeit cards are the problem. SanDisk and Samsung make the best (in my experience), but if you shop for the cheapest price, you probably got a fake.

Also, buy the largest card your OS and budget allow. If your program can run on 16 GB, then buy a 64 GB card. How an SSD works is the reason. I wish I could quote the source, but here is a description of why bigger is better. Note this article was about SSD drives replacing hard-disk drives, but I haven’t found anything that says SSD cards are any different.

The performance of SSDs degrades over time. The reason goes something like this: NAND flash memory is made up of cells. These cells are arranged in a hierarchical fashion. Individual cells are arranged into strings and, in turn, arrays. These arrays make up pages, at which point the overall number of cells typically clocks in at 32,000-128,000 per page. Those pages are arranged into yet larger blocks, which ultimately measure in megabytes.

While NAND memory can be read and written in pages, it can only be erased in whole blocks. The reason for that is complex, and involves voltage levels and minimizing errors. But the impact on performance is sobering.

Writing to empty memory cells is a breeze. But overwriting existing data is far more laborious. All data from the relevant block must be copied to a cache memory, the full block is erased, and then rewritten with the modified data.

As a drive approaches full capacity, and the availability of empty cells, arrays, pages, and blocks dwindles, it’s not hard to see how this impacts performance. But even when a drive has significant free capacity, performance can be dramatically compromised. Again, it comes down to the fact that only full blocks can be erased. The consequence is that, in the first instance, when pages within a block are deleted at operating system level, on the drive they’re only marked as dead or redundant. To actually erase those pages would require erasing the whole block, and thus caching and rewriting any pages that retain live data. It’s therefore expedient in the short term to simply mark pages as containing redundant data rather than erase them.
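The erase-before-write behaviour quoted above can be made concrete with a toy model. This is purely illustrative (real controllers cache, remap, and batch far more cleverly), but it shows why one logical overwrite in a full block costs a whole block’s worth of physical writes:

```python
# Toy model of NAND erase-before-write: pages can be written
# individually, but erasure only happens a whole block at a time.
PAGES_PER_BLOCK = 4

class Block:
    def __init__(self):
        self.pages = [None] * PAGES_PER_BLOCK  # None = erased/empty
        self.physical_writes = 0

    def write_page(self, index, data):
        if self.pages[index] is None:
            # Empty page: a single physical write suffices.
            self.pages[index] = data
            self.physical_writes += 1
        else:
            # Overwrite: copy the live pages aside, erase the whole
            # block, then rewrite everything with the modified page.
            live = list(self.pages)
            live[index] = data
            self.pages = [None] * PAGES_PER_BLOCK
            for i, d in enumerate(live):
                if d is not None:
                    self.pages[i] = d
                    self.physical_writes += 1
```

Filling the four pages costs four physical writes, but then overwriting just one page in the full block costs four more, which is exactly the amplification the article describes.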


#40

At least for Grafana and InfluxDB I would suggest to use a single VM and run those applications as Docket containers. Docker is not my best friend and I tend to stick to VMs as well. But at least those two applications run very reliable in Docker, and that saves a lot of resources. Full VMs just aren’t worth the effort for such simple applications (in my opinion).


#41

I assume that’s a typo and you meant SD cards?

Although the memory technology used in both is similar, a micro SD card’s space constraints limit the controller’s complexity. In other words, whereas SSDs have wear-leveling, SD cards typically don’t.

Wear-leveling tries to avoid repeated write/erase cycles to the same block of memory (thereby wearing it out prematurely compared to neighboring blocks). It spreads the wear-and-tear over all available free blocks. Therefore, having an SSD substantially larger than needed provides ample free space for wear-leveling to do its job.

Most micro SD cards don’t have room for sophisticated controllers capable of providing wear-leveling. So the ‘buy bigger’ advice isn’t particularly applicable to micro SD cards.

There are many articles on this topic. Here’s one:


FWIW, as an anecdotal data point, I’ve been using the same micro SD card in an RPi for over a year. It runs openHAB and Node-RED. OpenHAB logs each and every event (door open/closed, temperature, lights on/off, brightness increase/decrease, etc.). The RPi has a proper power supply and is protected by a UPS. In addition, I installed log2ram to minimize write cycles to the micro SD card (Silicon Power Elite 16 GB). Log2ram maps /var/log to RAM and periodically writes everything it logs to disk (I have it set to once per hour). I would expect similar reliability if used with Home Assistant.
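For reference, log2ram’s basic effect can also be approximated with a plain tmpfs mount. This is a simpler alternative, but note the trade-off: logs vanish on reboot and there is no periodic sync back to disk, which is precisely what log2ram adds on top. A sketch of such an entry (size value is an assumption; tune to your logging volume):

```
# /etc/fstab — keep /var/log in RAM to spare the SD card
tmpfs  /var/log  tmpfs  defaults,noatime,nosuid,mode=0755,size=64m  0  0
```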