Everywhere I read that this table should not consume much space and is therefore never cleaned up; long-term history is stored forever (which is nice).
I have a quite large (and messy?) setup, and the table is now 1.9 GB in size (that's not small, right, out of 4.8 GB total). It contains 12.8 million records. What is wrong here?
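(For reference, on a MariaDB/MySQL backend the per-table sizes can be checked with something like the query below; the schema name homeassistant is just a placeholder for your database name, and table_rows is only an estimate for InnoDB.)

-- Approximate size and row count per table (MariaDB/MySQL only;
-- 'homeassistant' is a placeholder for your database/schema name)
SELECT
    table_name,
    ROUND((data_length + index_length) / 1024 / 1024, 1) AS size_mb,
    table_rows
FROM information_schema.tables
WHERE table_schema = 'homeassistant'
ORDER BY (data_length + index_length) DESC;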
The states table was even worse, but I started cleaning a lot: excluding a lot from the recorder, disabling many entities, reducing update rates, etc. That table is now down from 4.4 GB (14.5M records) to 1.9 GB (3.5M records). I'm still cleaning, and the recorder history/purge settings will trim it further over time.
I also removed many entities from Developer Tools > Statistics that no longer had any states (old, removed, etc.). I can't clean anything else from there.
However, the number of records in the statistics table is only slightly reduced, from 13M to 12.8M.
How can I trim this statistics table (without removing important data)? Is there a way to figure out whether some specific values are bloating it?
I keep reading that this table should be very small; 1.9 GB, or about 40% of the total DB size, is not small. So I assume something is wrong?
I used the query below to count the number of rows per entity.
There are currently 1190 unique entities.
-- Rows per statistic in the long-term statistics table
SELECT
    statistics.metadata_id,
    statistics_meta.statistic_id,
    COUNT(*) AS row_count
FROM statistics
JOIN statistics_meta
    ON statistics.metadata_id = statistics_meta.id
GROUP BY
    statistics.metadata_id,
    statistics_meta.statistic_id
ORDER BY
    row_count DESC;
Some entities have 27K entries in this table, while others have only 3-5K.
27K hourly entries is roughly 27,000 / 24 ≈ 1,125 days, or about three years, which might be plausible and would match the introduction of long-term statistics. So I'm not sure whether this is strange or just normal.
What I do see is still some old/removed statistics, for example some linkquality entities that I disabled (and "fixed") in the statistics page. However, their values are still in the table.
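Would it be safe to remove those leftovers manually with something like the sketch below (after a backup, of course)? The LIKE pattern and the metadata id 123 are just placeholders, and I'm assuming the standard recorder schema here.

-- Find the metadata ids of statistics that no longer correspond to an entity
-- (the linkquality pattern is only an example)
SELECT id, statistic_id
FROM statistics_meta
WHERE statistic_id LIKE '%linkquality%';

-- With a backup in place, the rows for one such metadata id (123 as a placeholder)
-- could then be deleted; the statistics_short_term line only applies if that
-- table exists in your schema version.
DELETE FROM statistics WHERE metadata_id = 123;
DELETE FROM statistics_short_term WHERE metadata_id = 123;
DELETE FROM statistics_meta WHERE id = 123;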