I also had two broken SD cards in a short time. Since I installed a UPS for my RPi it has been working great, without any issues for half a year now.
I can really recommend it!
I had lots of issues with SD cards and the odd one with the Pis. All seems to be stable now, and has been for months, but there are loads of fake SD cards out there which are crap, and for me these definitely play up with HA. I've had 16 GB, 64 GB and 256 GB cards, all fake. I now tend to only buy them from reputable shops that are approved suppliers of the manufacturer, or from Amazon direct. I tend to find the genuine SanDisk cards are fine, and will be trying the Samsung and High Endurance cards in the future. I only use the genuine Pi 2.5 A power supply. There are a lot of mini Linux / Windows PCs out there which are good, but the Pi is cheaper. It was trial and error, and lots of backups (full images via win32diskimager, config files via WinSCP)
and lost hair!
I started out running Home Assistant on a Pi and then switched to a Rock64. Always had problems with SD cards crapping out. Just made sure I always made backups.
A few months ago I moved Home Assistant over to my FreeNAS in a jail. No problems now and I'm never looking back.
Followed this guide: My (outdated) Quick Start for Home Assistant Core on FreeNAS 11.2
I had Hass.io running for a few weeks and it died. I was unable to recover the SD card. Having read this thread, I'm now concerned about my hardware. Many folks have commented on how they use VMs, Linux PCs, or hard drives. I don't want to have a PC running all the time and would like to persist with the RPi solution.
Has anyone migrated a working Hass.io to a USB HDD on an RPi? Can anyone point me to an idiot's guide?
Thanks.
That was one of the reasons I wanted it on the Pi… I would just get a couple of decent genuine SD cards (SanDisk / Samsung EVO or better as they come out) and keep a backup. I have a fair amount set up but 16 GB or 32 GB has been fine. I think I'll back up now…
You can install Raspbian and then install Docker and a generic Linux install of Hass.io - look for Dale3h's script. It works great. I used that setup before I got my NUC.
I had no issues on my Raspberry Pi.
Now I am running it in a virtual machine on my NUC… no issues.
Please list the SD card brands and types so we know which ones not to use…
Running my Hass.io for over 18 months, same Raspberry Pi, same SD card. No issues. Even if there were issues, it's all backed up and super easy (and cheap) to restore back to how you had it.
First Pi, first SD, running for over a year. Don't blame it on Hass.io, that is definitely not the problem.
I mean, aren't we over-engineering this issue a bit? Just keep a regular snapshot, a spare SD card and, if you're really paranoid, a spare Pi. Okay, you spent an extra $55 US for complete redundancy, versus the electricity/space/effort/annoyance/work of putting this hobby on a more-horsepower-than-you-will-ever-need desktop/laptop/NUC/whatever.
That is another option, but as we know, choice is good. Each to their own.
But more fundamental is having a good power supply.
I've been running on an RPi3 for 23 months, still on the very first SD card (SanDisk Extreme), without any issues so far. So this isn't a general problem; it is either the kind of hardware you buy, as I haven't heard of too many cases where the cards die that quickly, or very, very bad luck. Just buy a quality brand SD card and I'd bet it will run at least a year without any issues.
And do backups… real backups.
Never used Hass.io as it seems like less control to me. Hassbian all the way!
Don't give up… it's all a learning curve… been running it for 2 years now and my family literally can't live without it! Will do a video soon about full image backups with the white space removed.
Check out my video for a starter…
Hassbian install guide
Lots of opinions here. I'll just add mine.
I ran through a few SD cards when I first started with Home Assistant. One of the culprits was Node-RED. When I was playing with Node-RED on my Hassbian installation it killed an SD card. I moved to another Pi and Node-RED promptly killed one there too. It may have been that the logic I implemented was writing to the SD frequently.
Another issue is the power supply. The 5 V supply for the Pi should be capable of at least 3 A output. Never run a supply at its rated limit; it will brown out when it gets hot. That brings me to the other issue with power supplies: many USB chargers are rated at 5 V but may only put out 4.8 V. For charging a battery that just doesn't matter. With a Pi, the low voltage may cause errors, especially when writing to the SD.
If your mains power is subject to transients or blackouts, a battery backup will make a huge difference in the stability of your system.
I have been using HA since about version 0.40. I switched to an SSD and a 4 A power supply at around version 0.60, and my system has been rock solid ever since.
I strongly recommend using a small SSD (40 GB or 60 GB) with the Pi. They are cheap and robust, and they are faster than an SD card even though they operate over USB.
Same problem here… SD cards were just too slow on the Pi for stable use… So sometimes the Pi would just hang and I got an angry wife calling me that the lights don't work automatically… etc. etc.
So I bought an Intel NUC (NUC7i5BEH) with an SSD in it… and lots of RAM. Installed ESXi, installed an Ubuntu Server VM for Home Assistant, one for InfluxDB and Grafana, one for MQTT and one for Node-RED… stable as can be! No more calls from the wife.
Brand is not the issue. Counterfeit cards are the problem. SanDisk and Samsung make the best (in my experience), but if you shop for cheap prices, you probably got a fake.
Also, buy the largest card your OS and budget allow. If your program can run on 16 GB, then buy a 64 GB card. How an SSD works is the reason. I wish I could quote the source, but here is a description of why bigger is better. Note this article was about SSD drives replacing hard-disk drives, but I haven't found anything that says SSD cards are any different.
The performance of SSDs degrades over time. The reason goes something like this: NAND flash memory is made up of cells. These cells are arranged in a hierarchical fashion. Individual cells are arranged into strings and, in turn, arrays. These arrays make up pages, at which point the overall number of cells typically clocks in at 32,000-128,000 per page. Those pages are arranged into yet larger blocks, which ultimately measure in megabytes.
While NAND memory can be read and written in pages, it can only be erased in whole blocks. The reason for that is complex, and involves voltage levels and minimizing errors. But the impact on performance is sobering.
Writing to empty memory cells is a breeze. But overwriting existing data is far more laborious. All data from the relevant block must be copied to a cache memory, the full block is erased, and then rewritten with the modified data.
As a drive approaches full capacity, and the availability of empty cells, arrays, pages, and blocks dwindles, it's not hard to see how this impacts performance. But even when a drive has significant free capacity, performance can be dramatically compromised. Again, it comes down to the fact that only full blocks can be erased. The consequence is that, in the first instance, when pages within a block are deleted at operating system level, on the drive they're only marked as dead or redundant. To actually erase those pages would require erasing the whole block, and thus caching and rewriting any pages that retain live data. It's therefore expedient in the short term to simply mark pages as containing redundant data rather than erase them.
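To put a number on that copy-erase-rewrite cost, here's a toy Python sketch; the block size and page counts are made up for illustration, not taken from any real drive:

```python
# Toy model of NAND write amplification. Pages can be programmed
# individually, but erasure happens per whole block, so overwriting
# one page in a partly-full block means: copy the surviving pages
# out, erase the block, then write everything back plus the change.
# The sizes here are illustrative, not from a real datasheet.

PAGES_PER_BLOCK = 64  # hypothetical

def physical_writes_for_one_overwrite(other_live_pages: int) -> int:
    """Physical page writes needed to overwrite one logical page.

    other_live_pages: pages in the same block (besides the target)
    that still hold live data and must survive the block erase.
    """
    if other_live_pages == 0:
        return 1  # block is otherwise empty: just program the page
    # copy-out + erase + rewrite: every surviving page gets written
    # again, plus the single page we actually wanted to change
    return other_live_pages + 1

if __name__ == "__main__":
    for live in (0, 16, PAGES_PER_BLOCK - 1):
        cost = physical_writes_for_one_overwrite(live)
        print(f"{live:2d} live neighbours -> {cost:2d} physical writes "
              f"(amplification x{cost})")
```

So a single changed page in a nearly full block can cost dozens of physical writes, which is exactly the degradation the quote describes.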
At least for Grafana and InfluxDB I would suggest using a single VM and running those applications as Docker containers. Docker is not my best friend and I tend to stick to VMs as well, but at least those two applications run very reliably in Docker, and that saves a lot of resources. Full VMs just aren't worth the effort for such simple applications (in my opinion).
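If it helps, here's roughly what that single-VM setup looks like driven from the Python Docker SDK (pip install docker) instead of a compose file; the image tags, host ports and volume paths are just placeholders I picked for the example:

```python
# Sketch: run InfluxDB and Grafana as containers on one VM via the
# Python Docker SDK (pip install docker). Image tags, host ports and
# volume paths below are placeholders -- adjust to your setup.
import docker

client = docker.from_env()

# InfluxDB, with its data directory persisted on the host
client.containers.run(
    "influxdb:1.8",
    name="influxdb",
    detach=True,
    restart_policy={"Name": "unless-stopped"},
    ports={"8086/tcp": 8086},
    volumes={"/opt/influxdb": {"bind": "/var/lib/influxdb", "mode": "rw"}},
)

# Grafana, likewise persisted so dashboards survive restarts
client.containers.run(
    "grafana/grafana",
    name="grafana",
    detach=True,
    restart_policy={"Name": "unless-stopped"},
    ports={"3000/tcp": 3000},
    volumes={"/opt/grafana": {"bind": "/var/lib/grafana", "mode": "rw"}},
)
```

A docker-compose file achieves the same thing declaratively; the point is simply that two lightweight containers replace two full VMs.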
I assume that's a typo and you meant SD cards?
Although the memory technology used in both is similar, a micro SD card's space constraints limit the controller's complexity. In other words, whereas SSDs will have wear-leveling, SD cards typically won't.
Wear-leveling tries to avoid repeated write/erase cycles to the same block of memory (thereby wearing it out prematurely compared to neighboring blocks). It spreads the wear-and-tear over all available free blocks. Therefore, having an SSD substantially larger than needed provides ample free space for wear-leveling to do its job.
Most micro SD cards don't have room for sophisticated controllers capable of providing wear-leveling. So the "buy bigger" advice isn't particularly applicable to micro SD cards.
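To illustrate what wear-leveling buys you, here's a deliberately over-simplified Python model (no real controller works like this) comparing a least-worn-block allocator against naively reusing the same block:

```python
# Over-simplified wear-leveling model: each write erases and reuses
# one free block. The wear-leveled allocator always picks the block
# with the fewest erases; the naive one hammers block 0.

def simulate(num_blocks: int, writes: int, wear_level: bool) -> list[int]:
    erase_counts = [0] * num_blocks
    for _ in range(writes):
        if wear_level:
            # choose the least-worn block so wear spreads evenly
            target = min(range(num_blocks), key=lambda b: erase_counts[b])
        else:
            target = 0  # no leveling: same block every time
        erase_counts[target] += 1
    return erase_counts

if __name__ == "__main__":
    print("leveled:", simulate(num_blocks=8, writes=800, wear_level=True))
    print("naive:  ", simulate(num_blocks=8, writes=800, wear_level=False))
    # leveled: 100 erases per block; naive: 800 erases on one block,
    # which hits a typical endurance limit 8x sooner
```

Doubling the free blocks halves each block's share of the erases, which is the "buy bigger" argument, but only when the controller actually levels the wear.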
There are many articles on this topic. Here's one:
FWIW, as an anecdotal data point, I've been using the same micro SD card in an RPi for over a year. It runs openHAB and Node-RED. OpenHAB logs each and every event (door open/closed, temperature, lights on/off, brightness increase/decrease, etc.). The RPi has a proper power supply and is protected by a UPS. In addition, I installed log2ram to minimize write cycles to the micro SD card (Silicon Power Elite 16 GB). Log2ram maps /var/log to RAM and periodically writes everything it logs to disk (I have it set to once per hour). I would expect similar reliability if used with Home Assistant.
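For anyone curious, the idea behind log2ram boils down to a few lines of Python (the real project is a shell script plus a systemd service; the paths and interval below are just assumptions matching my setup):

```python
# Stripped-down illustration of the log2ram idea (the real project
# is shell + systemd; this is just the concept): logs are written to
# a RAM-backed directory and periodically flushed to the SD card,
# turning constant small writes into one batched write per interval.
import shutil
import time

RAM_LOG_DIR = "/tmp/ram-logs"    # would be a tmpfs mount in practice
DISK_LOG_DIR = "/var/log.disk"   # persistent copy on the SD card
FLUSH_INTERVAL_S = 3600          # once per hour, as in my setup

def flush_logs() -> None:
    """Copy the RAM log directory over the on-disk copy."""
    shutil.copytree(RAM_LOG_DIR, DISK_LOG_DIR, dirs_exist_ok=True)

if __name__ == "__main__":
    while True:
        time.sleep(FLUSH_INTERVAL_S)
        flush_logs()
```

The effect is that thousands of tiny log writes become one batched write per hour, which is much kinder to the card.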