Large homeassistant database files

I was having the same issue; I’m running Hassbian. You need to uncomment the bottom line in your /etc/apt/sources.list, then run “apt-get update” and finally “apt-get install libmariadbclient-dev”.

 $ cat /etc/apt/sources.list
deb http://raspbian.raspberrypi.org/raspbian/ stretch main contrib non-free rpi
# Uncomment line below then 'apt-get update' to enable 'apt-get source'
deb-src http://raspbian.raspberrypi.org/raspbian/ stretch main contrib non-free rpi

Hope this helps.

Just wondering: how much memory is your mysqld using?
This gives a nice overview of all processes and their memory use:

ps -eo size,pid,user,command --sort -size | awk '{ hr=$1/1024 ; printf("%13.2f Mb ",hr) } { for ( x=4 ; x<=NF ; x++ ) { printf("%s ",$x) } print "" }' | cut -d "-" -f1

Mine is using 600 MB, which is a lot of my Pi’s memory.

Thanks!

My DB grew 40 MB per hour until I found out which sensors caused it.
My Nodon switch created 35,000 entries per hour.

After excluding it, the DB now grows by only 0.7 MB/hour.
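
A minimal sketch of what such a recorder exclusion can look like in configuration.yaml (the entity IDs below are placeholders, not the actual Nodon entities; substitute whatever your own query flags as the noisiest):

recorder:
  exclude:
    entities:
      # placeholder entity IDs for the chatty device
      - sensor.nodon_switch_power
      - sensor.nodon_switch_energy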

What was the Nodon switch doing? How did you trace it back to that sensor?

Well, it totally spammed the log. I traced it by looking in ozw.log and then by running the query posted by @seren in Large homeassistant database files.

It was quite obvious that my two Nodon switches were spamming.
I tried to configure them to only report what I needed, but to no avail, so I sold them.

Hi all,

If anyone is using PostgreSQL and can help me with my purge issue, it would be much appreciated.

I’m having trouble now and then with the size of the HA database. I then go to the /dev-service tab and use this (it takes a while before you see the results):
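
(A recorder.purge call of roughly this shape; the keep_days and repack values here are just examples.)

service: recorder.purge
data:
  keep_days: 7
  repack: true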

Hi, since the events table accounts for 2/3 of the DB size (and it is not useful as historical data), is it possible to record only “states” and exclude “events”?

I believe that if you turn off the logger, you’ll remove the events. I’ve never done it, so take this with a grain of salt. The logger is what uses events inside your history.

Have a look at this post.
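
Related: the recorder’s automatic purge can also be tightened so the tables don’t grow unbounded. A minimal sketch (purge_keep_days is a standard recorder option; 7 is just an example value):

recorder:
  purge_keep_days: 7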

I compressed the tables a few months back. If I recall correctly, the events table went from 2.3 GB to 1.7 GB. This is the command I ran for each table:

ALTER TABLE homeassistant.events ROW_FORMAT=COMPRESSED;

How did you make this sensor?

Have a look at the SQL sensor.

There are examples for the database sizes when you are using Postgres or MySQL/MariaDB.
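
For Postgres, a sketch of such a sensor could look like this (assuming a database named homeassistant and a postgresql:// connection URL stored in secrets.yaml; pg_database_size returns bytes, hence the division down to MB):

- platform: sql
  db_url: !secret db_connection_url
  queries:
    - name: 'ha_db_size'
      query: "SELECT ROUND(pg_database_size('homeassistant') / 1024.0 / 1024.0, 1) AS value;"
      column: 'value'
      unit_of_measurement: MB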

Hi,

I followed https://www.home-assistant.io/integrations/sql/.

This is my sensor:

- platform: sql
  db_url: !secret db_connection_url
  queries:
    - name: 'ha_db'
      query: 'SELECT table_schema "database", Round(Sum(data_length + index_length) / 1024 / 1024, 1) "value" FROM information_schema.tables WHERE table_schema="homeassistant" GROUP BY table_schema;'
      column: 'value'
      unit_of_measurement: MB

where this is in secrets.yaml:
db_connection_url: mysql://<user in maria db>:<maria db password>@core-mariadb/homeassistant

Good luck!

Hi, I found another solution with the filesize integration, but thanks!

If your HA system is slowing down on an RPi3, don’t start fiddling around with your database.
The reason is very likely the limited RAM of your Pi.

So the real solution to this problem is to upgrade to an RPi4 with 4 or 8 GB of RAM. I have done that and my system flies! You can use my extensive installation guide to get HA running on an RPi4:

Is it possible to move the SQLite database to an external drive (away from the SD card) using a symlink on HA OS?
The terminal there has very limited options, so this is what I was wondering. And I guess even if it is possible, this would result in an unsupported state, wouldn’t it?

You can specify the location of the database so you wouldn’t even have to use a symlink.

Example:

recorder:
  db_url: sqlite:////home/user/.homeassistant/test
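
If the external drive is mounted somewhere Home Assistant can reach, the recorder can point straight at it. A sketch, where the /media path is only a placeholder for wherever your drive is actually mounted:

recorder:
  db_url: sqlite:////media/usb-drive/home-assistant_v2.db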

What’s the default path for SQLite on Home Assistant OS?

The SQL in this old post, to get info on which devices are causing the most events, didn’t work for me. Possibly because the post is so old, or possibly because I’m using MariaDB.
In any case, this worked for me:

SELECT verrijkt.entity_id,
       SUM(LENGTH(attributes)) AS size,
       COUNT(*) AS count,
       SUM(LENGTH(attributes)) / COUNT(*) AS avg
FROM (SELECT new_states.entity_id, events.event_id, new_states.attributes
      FROM events
      LEFT JOIN states AS new_states ON events.event_id = new_states.event_id) AS verrijkt
GROUP BY verrijkt.entity_id
ORDER BY size DESC;

I hope I did it right, because my SQL-writing days ended over a decade ago … :wink: