I use Home Assistant Core in Docker on a Raspberry Pi. Log files are written constantly (which makes sense). I’m afraid my SD card will soon wear out if it is being written to all the time.
Is there a solution for this?
Is it possible to save only the log files somewhere else?
Exactly! Running a Pi 24/7 on an SD card should be forbidden. Go for an SSD.
Also, it’s not just the log files. HA is a database-intensive application stack which will kill every SD card sooner or later. Any operating system will kill it. SD cards are best used in cameras, and only there.
Something like this is an easy switch. It doesn’t require a hub. I’ve been running one for 2 years straight now in two different Pis, and I am rather hard on it. Backups to disk every night, plus logs and databases and…
(Not all USB drives are created equal. Most are simply SD cards in USB form. This one isn’t. Instead of an eMMC/NAND controller, this device actually uses an SSD controller behind a SCSI-to-USB bridge, making it essentially a solid-state flash drive.)
Flash your Pi’s EEPROM to boot from USB and run an SSD of some sort.
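For reference, here is a sketch of that switch on a Pi 4, assuming Raspberry Pi OS and a reasonably recent bootloader (the exact raspi-config menu path varies by version, and the cloning tools named in the comments are just common options):

```shell
# Sketch: enable USB boot on a Raspberry Pi 4 (assumes Raspberry Pi OS).
# 1. Make sure the bootloader EEPROM is current:
sudo apt update && sudo apt full-upgrade
sudo rpi-eeprom-update        # reports whether an update is pending
sudo rpi-eeprom-update -a     # applies a pending update; then reboot
# 2. Set the boot order to try USB first:
sudo raspi-config             # Advanced Options -> Boot Order -> USB Boot
# 3. Clone the SD card to the SSD (e.g. with the SD Card Copier tool
#    or rpi-clone), shut down, pull the card, and boot from USB.
```

After that the SD card slot can stay empty; everything, including logs and the database, lives on the SSD.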
You’re in the minority. That’s a fact as well. Yes, good SD cards help, but failure is inevitable.
My point, though, is that there are better tools available at very similar price points, with actual controllers that make this less of an issue. Why use an SD card when you can get USB SSDs these days for a similar price?
I’ve probably gone through 50+ SD cards over the years since the original RPis came out. No, they were not cheapos. Running an OS on them is pretty rough. There are ways to make them last longer (read-only filesystem overlays, for example), but they aren’t built to run an OS 24/7.
Maybe update your ‘facts’. Modern high-endurance SD cards use the exact same memory chips as SSDs and the exact same wear-leveling algorithms. The main difference is IO speed on the Pi. Using an SSD on a Pi for HA is completely unnecessary and often brings more problems than it solves (and power usage!). If you really need more IO bandwidth and processing power, then a NUC with an SSD or HDD is the best next step.
I’ve used one. One single SD card since I started with Domoticz years ago, and I still use that same card with HA to this day. Something is wrong on your side. Buy quality SD cards, not fakes online. Use a quality power supply. Use a large SD card so that the wear leveling has enough headroom.
Logs are unproblematic. The amount of data written is insignificant, and the wear leveling makes sure the same cells aren’t hammered repeatedly. The history database is more of an issue and should either be reduced to a minimum or, better, outsourced to a different device (NAS, USB, etc.).
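Whether your own write volume really is insignificant is easy to check against the kernel’s per-device counters. A minimal POSIX-shell sketch (the device name `mmcblk0` is the usual SD card on a Pi; adjust to your system):

```shell
# Report MiB written to a block device since boot, read from
# /proc/diskstats. Field 10 is "sectors written", and the kernel
# always counts these sectors as 512 bytes regardless of the device.
# Usage: written_mib DEVICE [STATS_FILE]
written_mib() {
    awk -v dev="$1" '$3 == dev { printf "%.1f\n", $10 * 512 / 1048576 }' \
        "${2:-/proc/diskstats}"
}

# On a Pi the SD card usually shows up as mmcblk0:
[ -r /proc/diskstats ] && written_mib mmcblk0
```

Run it twice a few hours apart; the difference is your real write rate, logs and database included.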
Obviously I’m not going to argue. Google is your friend, and in my experience even reputable cards (ones recommended on the Raspberry Pi Foundation forums as well) always died in well under a year, usually 4–6 months max. Yes, I had some lucky ones last longer…
Enlighten us on this magical card that can last 5+ years.
Let me reiterate. You have an issue on your system. A serious one. That is not normal.
It’s a SanDisk Extreme Plus. Nothing special about it. There are considerably better ones on the market now. 5+ years is absolutely normal for quality SD cards, and has been for years.
I have a development system on another Pi, running a second dev instance of HA. It uses another SanDisk Ultra card, also several years old now. This is a heavily abused system: multiple HA updates and downgrades, it’s running X with a full IDE and g++ toolchain, I’m compiling code on it (compilers create tons of small temporary files), etc. Never had a single issue related to the SD card.
Yes, there are modern SD cards with improved reliability.
Anyway, I still won’t recommend using SD cards in any 24/7 environment.
At least once a week you can find a post here on the forum from a user who has lost his HA installation to errors on the SD card!
And many of them don’t have backups, or they made backups but never downloaded them off the SD card (this is still a big issue which the HA devs could easily fix:
after a backup, simply ask/remind the user to store the backup somewhere else, or even kick off the download dialog right away…)
And then you have the tens of thousands of HA users with systems working just fine on an SD card, who don’t post threads about how great everything works. The number one reason SD cards fail for people, by a very large margin, is cheap fake cards. People either cheap out on a vital component of their system and buy the cheapest one they can find on Ali, or they actually try to find better ones but don’t realize that pretty much all SD cards you can buy on Amazon, eBay, etc. are counterfeit. Buy quality cards, don’t cheap out on them, and buy them from a reputable vendor, like a large brick-and-mortar store.
Other reasons are bad power supplies (although those rarely damage the SD card hardware, unless it’s a cheap fake, but rather the filesystem) and choosing an SD card without enough space. If you think you’ll need 16 GB, buy a 64 GB card and give the controller room to work with. SSDs, despite using the same memory chips as high-endurance SD cards, typically have more ‘hidden’ area that gets swapped in to replace failed cells, thanks to the larger form factor.
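The headroom argument can be put into rough numbers. A toy back-of-envelope estimate; every figure below is an assumption, not a spec, and real endurance varies a lot between cards and write patterns:

```shell
# Back-of-envelope card lifetime estimate, to show why extra
# capacity buys headroom. All inputs are rough assumptions.
capacity_gb=64       # card size; this is what 'headroom' buys you
pe_cycles=1000       # program/erase cycles per cell (TLC ballpark)
daily_write_gb=2     # what HA writes per day (measure on your system)
amplification=5      # write amplification from small random writes

# Total raw endurance divided by effective daily writes:
days=$(( capacity_gb * pe_cycles / (daily_write_gb * amplification) ))
echo "~$(( days / 365 )) years at ${daily_write_gb} GB/day"
```

Note that `capacity_gb` sits in the numerator: doubling the card size doubles the estimate, which is the arithmetic behind buying a 64 GB card when 16 GB would hold the data.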
It works for me, it works for Tom, it works for the large silent majority of HA RPi users. There’s no reason it won’t work for you!
Oh and manage your history database if you store it on the card. Remove entities you don’t need. The fact that HA records everything by default is a really stupid design decision.
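For illustration, this is roughly what trimming the recorder looks like in `configuration.yaml`. A sketch: `purge_keep_days` and `exclude` are standard recorder options, but the domain and entity names are placeholders, and the config path is an assumption to adjust for your install:

```shell
# Append a trimmed recorder config to Home Assistant's
# configuration.yaml (path is an assumption; adjust to your setup).
CONFIG="${CONFIG:-./configuration.yaml}"
cat >> "$CONFIG" <<'EOF'
recorder:
  purge_keep_days: 7          # keep a week of history
  exclude:
    domains:
      - automation            # placeholder: domains you never chart
    entities:
      - sensor.example_uptime # placeholder: a chatty entity
EOF
```

Excluding a handful of noisy sensors often shrinks the database, and with it the write load, dramatically.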
This means that if you want to stick with fragile MLC SD cards, you want to reduce writes, and especially deletes. Moving the database elsewhere will reduce wear on the SD card and, in theory, extend the life of the cards. If you spend the time eliminating the significant writes, then you too can be in that ‘silent majority’.
Yes, the solution is actually very easy. You are lucky that you are in control of your host OS. Either use a flash-friendly filesystem (f2fs, to name one) or set the commit interval to a value that avoids hammering the SD card every 5 seconds (with heavy write amplification on top).
Best would be to utilize log2ram, but that’s probably out of reach here since everything runs inside Docker, I expect.
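For reference, the commit tweak is a single mount option. A sketch of the relevant `/etc/fstab` lines, assuming a stock ext4 root on Raspberry Pi OS (the PARTUUID is a placeholder for your own):

```
# Raise the ext4 journal commit interval from the default 5 s to 10 min:
PARTUUID=xxxxxxxx-02  /         ext4   defaults,noatime,commit=600  0  1
# Or keep /var/log in RAM entirely, log2ram-style (logs vanish on reboot):
tmpfs                 /var/log  tmpfs  defaults,noatime,size=64m    0  0
```

The trade-off: with `commit=600`, up to ten minutes of data can be lost on a power failure, so pick a value you can live with.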
I have SBCs that have been running on (cheap, A1-rated SanDisk/Samsung) SD cards for over 5 years - no problems.
Just because Raspberry Pi OS’s default filesystem settings are from the stone age (the era of spinning disks) and users don’t bother to change them, should it be forbidden?
Better would be for SBC OSes like Raspberry Pi OS or HAOS (at least in version 7) to either change their defaults to a flash-friendly filesystem (like f2fs) or simply change the ext4 commit interval to a sane value.
There is this “theory” that Raspberry Pi Trading Ltd actually has contracts with flash manufacturers… so these things don’t burn fossil fuels to run, they burn SD cards instead.
This is why I say “just run it all from a proper SSD”. Yes, the case can be made that someone can configure the OS/app correctly (reduce history/writes/databases/etc.), but at the end of the day the KISS method is to use a device you don’t need to fuss over as much.
I’d guess that a USB stick is not much better than the SD card, but at least you won’t burn out the boot/HA drive. If you want to move the database, it can be moved with this:
That is really the most write-intensive thing in HA. You’d be better off, though, running a MySQL/MariaDB container and pointing the recorder at that instead (the same link has the info to repoint it). The other main IO-intensive thing is backups. I don’t think you can configure those to use a different drive, so maybe back up less often.
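Repointing the recorder comes down to one option. A sketch: `db_url` is the standard recorder option, but the host, credentials, and database name are placeholders, and the config path is an assumption:

```shell
# Point the recorder at an external MariaDB/MySQL instance instead of
# the SQLite file on the SD card. Host, user, password and database
# name are placeholders for your own setup.
CONFIG="${CONFIG:-./configuration.yaml}"
cat >> "$CONFIG" <<'EOF'
recorder:
  db_url: mysql://hauser:hapassword@192.168.1.10/homeassistant?charset=utf8mb4
EOF
```

The database then needs to exist on the MariaDB side with matching credentials before HA starts.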
I do agree with others in the thread that the log file itself isn’t very write-intensive, and I wouldn’t worry about it. I believe you can move its location, but I’m not sure how. I remember seeing a PR for that a while ago.