Large homeassistant database files

Well, it totally spammed the log. I traced it by looking in the ozw.log and then by running the query posted by @seren earlier in this thread.

It was quite obvious that my two Nodon switches were spamming.
I tried to configure them to only report what I needed, but to no avail. So I sold them.

Hi all,

If anyone is using PostgreSQL and can help me with my purge issue, it would be much appreciated.

I’m having trouble now and then with the size of the HA database. When that happens, I go to the /dev-service tab and use this (it takes a while before you see the results):
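The service call itself didn’t survive in the quote above, but as a sketch, the recorder integration exposes a recorder.purge service that can be called from the services tab. keep_days and repack are the documented fields; the values here are just examples:

service: recorder.purge
data:
  keep_days: 7
  repack: true

With repack: true, SQLite will also rewrite the database file to reclaim the freed space, which is why it can take a while.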


Hi, since the events table accounts for about 2/3 of the DB size (and it is not useful as historical data), is it possible to record only “states” and exclude “events”?

I believe if you turn off logger, you’ll remove the events. I’ve never done it, so take this with a grain of salt. Logger is what uses events inside your history.

Have a look at this post.
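For reference, the recorder integration also lets you limit what gets stored in the first place. A minimal sketch (purge_keep_days and exclude are documented recorder options; the domain and entity names below are made-up examples):

recorder:
  purge_keep_days: 7
  exclude:
    domains:
      - automation
    entities:
      - sensor.noisy_sensor

Excluding chatty entities up front is usually more effective than trying to clean them out of the database afterwards.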


I compressed the tables a few months back. If I recall correctly, the events table went from 2.3 GB to 1.7 GB. This is the command I ran for each table:

ALTER TABLE homeassistant.events ROW_FORMAT=COMPRESSED;
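To check the effect of the compression, a query like this against information_schema should show per-table sizes in MB (assuming the schema is named homeassistant, as in the ALTER statement above):

SELECT table_name,
       ROUND((data_length + index_length) / 1024 / 1024, 1) AS size_mb
FROM information_schema.tables
WHERE table_schema = 'homeassistant'
ORDER BY size_mb DESC;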

How did you make this sensor?

Have a look into the sql sensor.

There are examples for the database sizes when you are using PostgreSQL or MySQL / MariaDB.

Hi,

I followed https://www.home-assistant.io/integrations/sql/.

This is my sensor:

- platform: sql
  db_url: !secret db_connection_url
  queries:
    - name: 'ha_db'
      query: 'SELECT table_schema "database", Round(Sum(data_length + index_length) / 1024 / 1024, 1) "value" FROM information_schema.tables WHERE table_schema="homeassistant" GROUP BY table_schema;'
      column: 'value'
      unit_of_measurement: MB

where this is in secrets.yaml:
db_connection_url: mysql://<user in maria db>:<maria db password>@core-mariadb/homeassistant

Good luck!
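For the PostgreSQL question earlier in the thread, a rough equivalent of the query in that sensor (an untested sketch; pg_database_size is PostgreSQL’s built-in function for the total database size in bytes) would be:

SELECT ROUND(pg_database_size('homeassistant') / 1024.0 / 1024.0, 1) AS value;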


Hi, I found another solution with the filesize integration, but thanks!

If your HA system is slowing down on an RPi3, don’t start fiddling around with your database.
The reason is very likely the limited RAM of your Pi.

So the real solution to this problem is to upgrade to an RPi4 with 4 or 8 GB of RAM. I have done that and my system flies! You can use my extensive installation guide to get HA running on an RPi4:


Is it possible to move the SQLite database to an external drive (away from the SD card) using a symlink on HA OS?
The terminal there has very limited options, so this is what I was wondering. And I guess even if it is possible, this would result in an unsupported state, wouldn’t it?

You can specify the location of the database so you wouldn’t even have to use a symlink.

Example:

recorder:
  db_url: sqlite:////home/user/.homeassistant/test

What’s the default path for SQLite on Home Assistant OS?

The SQL in this old post, for finding which devices are causing the most events, didn’t work for me. Possibly because the post is so old, or possibly because I’m using MariaDB.
In any case, this worked for me:

SELECT verrijkt.entity_id,
       SUM(LENGTH(attributes)) AS size,
       COUNT(*) AS count,
       SUM(LENGTH(attributes)) / COUNT(*) AS avg
FROM (SELECT new_states.entity_id, events.event_id, new_states.attributes
      FROM events
      LEFT JOIN states AS new_states
        ON events.event_id = new_states.event_id) AS verrijkt
GROUP BY verrijkt.entity_id
ORDER BY size DESC;

I hope I did it right, because my SQL-writing days ended over a decade ago … :wink:

I use SQLite3.

Here’s how to connect, plus two sample queries:

From the Linux command prompt:

sqlite3 home-assistant_v2.db

.header on

.databases

SELECT entity_id, COUNT(*) as count FROM states GROUP BY entity_id ORDER BY count DESC LIMIT 90;

SELECT entity_id, COUNT(*) as count FROM events GROUP BY entity_id ORDER BY count DESC LIMIT 30;

SELECT entity_id, COUNT(*) as count FROM states GROUP BY entity_id ORDER BY count DESC LIMIT 90;
works great, but

SELECT entity_id, COUNT(*) as count FROM events GROUP BY entity_id ORDER BY count DESC LIMIT 30;

gives an error, because the events table has no such column (no such column: entity_id).
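Since the events table has no entity_id column, grouping by event_type instead should work. A sketch (event_type is a standard column in Home Assistant’s events table):

SELECT event_type, COUNT(*) AS count
FROM events
GROUP BY event_type
ORDER BY count DESC
LIMIT 30;

This tells you which kinds of events dominate; the states query above is still the right one for per-entity counts.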


For anyone struggling with database size: you can try using an add-on that shows what’s using the most space: Dbstats - Addon for database usage analytics
