Is `home-assistant.log` stored in RAM or on the SD card using HassIO?

I sometimes have a component that writes a message to the home-assistant.log file every 10 seconds, and I was wondering whether this file is stored in RAM or on the SD card?

If it’s on the SD card then I can reduce the number of messages going to the log, but if it’s in RAM then I can leave it.

Here’s my info:

arch: armv7
boot: true
custom: false
image: homeassistant/raspberrypi3-homeassistant
ip_address: 172.30.32.1
last_version: 0.94.1
machine: raspberrypi3
port: 8123
ssl: false
version: 0.94.1
wait_boot: 600
watchdog: true

SD card. /config/home-assistant.log


And there seems to be no shortage of debate/discussion as to how excessive logging affects SD card life by using up limited writes. I do know I’ve killed one card so far, but with one data point I don’t feel like I can extrapolate anything useful.

At any rate, there’s no good reason to push your luck by constantly writing to the log once you’ve debugged whatever it was that gave you trouble. Unless, of course, it’s a buggy component that can’t be muted – looking at you, BMW Connected Drive, throwing an error every time it notices I’ve popped open the sunroof!

I don’t know why HA wants to write so much stuff onto an SD card. I’ve killed three cards so far, and I switched to a NUC with an SSD, because … yeah.

It should disable logging and databases on an SD card. Maybe keep the database/logs in RAM and write them to the SD card once a day, but that’s it. Every couple of seconds is ridiculous for an SD card.

SD card. /config/home-assistant.log

@tom_l Ah OK, I’ll look at reducing the amount of info in the log. Some OSes for RPi systems with a read-only SD card force logs to go to RAM.


I’m already using an external database, MariaDB 10 hosted on a NAS so I figured the less writes to the SD card the better.

I may see if there’s some way to change the log file storage location to RAM, it gets wiped on reboot/restart already so it seems odd that it’s being stored on the SD card.


So far I’ve not killed an SD card using HassIO but even so I’m looking to write to the SD card only when needed.

The ‘logger’ component in HA can be set to only show fatal errors from the more verbose components.

I was getting errors from my media player every 11 seconds until I added this:

homeassistant.components.squeezebox.media_player: fatal
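Not my full config – just a minimal sketch of where that line lives, under the logger integration’s logs: mapping (the default: warning baseline is my own choice, adjust to taste):

```yaml
logger:
  default: warning   # baseline verbosity for everything else
  logs:
    # silence a chatty integration down to fatal errors only
    homeassistant.components.squeezebox.media_player: fatal
```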

Unfortunately, with the BMW Connected Drive component, having the sunroof popped open is a “critical” error, not just a source of fresh air on a nice day. Obviously a problem with the component, both because that’s not a critical error and because it’s a bug (that I’ve made a pull request in the upstream library to fix). But it’s a Home Assistant problem as well, since a buggy or simply overly verbose component can’t be completely silenced in the configuration. I’m sure this isn’t limited to just the example I’ve listed either.

The fact that having killed 2 SD cards now causes me to modify my driving behavior is a little crazy, IMHO. :crazy_face:

For what it’s worth, I’ve been running the default SQLite db completely in RAM. This recorder db_url option enables it:

recorder:
  db_url: 'sqlite:///:memory:'
  purge_interval: 1
  purge_keep_days: 7

On a restart, my Hassio install on a RasPi3 B+ has about 700 MB of memory free, and as time goes on, available RAM slowly reduces as events are recorded. What also really helps is to only record what you really need and exclude the majority of the domains and entities using the recorder’s include/exclude options.

I still do a lot of development with Hass and restart it quite often, so I’ve never gotten below 600 MB available. I’m pretty sure the scheduled purge will free it back up automatically if no restart is done. An automation could also be added to alert if free memory drops below a chosen threshold.
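As a rough sketch of that automation idea, assuming the systemmonitor sensor platform and a generic notify.notify service (the entity name and 200 MB threshold are illustrative, not tested values):

```yaml
sensor:
  - platform: systemmonitor
    resources:
      - type: memory_free

automation:
  - alias: Warn when free memory runs low
    trigger:
      - platform: numeric_state
        entity_id: sensor.memory_free
        below: 200   # MB - pick a margin that suits your Pi
    action:
      - service: notify.notify
        data:
          message: "Free memory is low; the in-memory recorder DB is at risk."
```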

Bonus - history and logbook are much quicker too! I’ve held off upgrading to a NUC because this is working so well.


That’s interesting. I didn’t realise it was possible. I can see why it wasn’t included in the docs. The memory alert would definitely be required or you’re going to (eventually) have a bad time.

I need to credit this post by @clyra for providing the first mention of this Hass in-memory database option that I have seen.

I’ve been running with this for well over a month on 2 busy Hass systems with no problems at all. The main downside is loss of history after a restart but since I only use the history/logbook for troubleshooting, that’s a non-issue for me, especially now that they display in a few seconds.

I guess if you keep up with the three-weekly update cycle, the frequency of restarts should be sufficient to prevent issues.

If I may…

I prefer using log2ram; keeping it in memory loses the history on each HA restart.

The tip I had mentioned was for writing the HASS SQLite database to memory, which is very fast and saves a lot of wear and tear on the SD card. The downside of course is that history is lost after a restart, but for me that trade-off is worth it. I have been running HASSIO in this mode for several months with no issues at all – on both a Pi 3B and a Pi 4B.

As I understand it, log2ram is intended for writing Linux system logs (/var/log) to memory. I don’t have any idea how write-intensive that log actually is. I suspect not very, but since I use HASSIO, I don’t think it is straightforward to install log2ram in any case.

The official HASS installation guide for Raspberry Pi now recommends an “A2” SD-Card that is designed for write-intensive I/O.

Shame there isn’t a way to save it back to the file on a shutdown.

There actually is a way. See the SQLite Backup API documentation, which describes how it can be done. It looks like it would be very straightforward to save the database on a shutdown through a script.

The tricky part is restoring the backup into the in-memory database immediately after HASS has initialized it on startup.

One issue with this technique is if HASS crashes rather than shutting down cleanly. This could be mitigated somewhat by automatic backups, nightly or whenever.

Seems like this would be a really good enhancement request, especially for Raspberry Pi users.
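As a rough illustration of the idea (not HA’s recorder code), Python’s built-in sqlite3 module exposes the SQLite Backup API as Connection.backup() on Python 3.7+. The events table here is a stand-in for illustration, not the real recorder schema:

```python
import os
import sqlite3
import tempfile

# Stand-in for the recorder's in-memory database (hypothetical schema).
mem = sqlite3.connect(":memory:")
mem.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, data TEXT)")
mem.execute("INSERT INTO events (data) VALUES ('state_changed')")
mem.commit()

# On shutdown: snapshot the in-memory DB to a file via the Backup API.
snapshot_path = os.path.join(tempfile.mkdtemp(), "snapshot.db")
disk = sqlite3.connect(snapshot_path)
mem.backup(disk)          # source.backup(target)

# On the next startup: restore the snapshot into a fresh in-memory DB.
restored = sqlite3.connect(":memory:")
disk.backup(restored)
print(restored.execute("SELECT data FROM events").fetchone()[0])
# prints: state_changed
```

The unresolved part is still the one noted above: hooking this into HASS so the restore runs right after the recorder creates its in-memory database.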


I think the restore should be run before Home Assistant starts, and just a nightly backup should be fine… Anyone know how to come up with a script?

Well, I’ve discovered that this is definitely not as straightforward as it first seemed. By default, SQLite’s “:memory:” option creates a private in-memory instance of the database that cannot be shared by other processes. There is an option to enable sharing (“file::memory:?cache=shared”), but all I could manage to do with it was create a local database on disk with that as the full name!

It turns out that the recorder uses the SQLAlchemy library to create the database, so you can’t simply define SQLite parameters and expect them to be used as-is. Bottom line – backing up/restoring a Home Assistant in-memory database is, I’m pretty sure, going to require some code changes to recorder.

Has anyone successfully mounted the database in RAM and regularly flushed it to disk?
Thanks

Not that I know of.
