Database size sensor?

Correct. No entities I’ve excluded from Recorder (to save storage) show up in the statistics_meta table.

True. But, there are some I haven’t excluded, because I want those in the events and states tables, but only for the duration I set in purge_keep_days.
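
For illustration, that split looks roughly like this in configuration.yaml (the entity names here are made up, and 7 days is just an example retention):

```yaml
# configuration.yaml - illustrative only; excluded entities never reach the
# events/states tables, everything else is kept for purge_keep_days.
recorder:
  purge_keep_days: 7
  exclude:
    entities:
      - sensor.some_chatty_sensor   # no history wanted at all
    entity_globs:
      - sensor.weather_*            # drop a whole family of sensors
```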

Unfortunately, data from all these non-excluded entities are automatically saved, apparently forever, in the statistics table.

If you see any “problem” when running the queries above - I mean entities with a high volume of rows in statistics and statistics_short_term - you could set up a job that deletes them from these tables, maybe once a day.
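
As a rough sketch of what such a daily job could look like (assuming MariaDB, that the mysql client is available where Home Assistant runs, and a made-up entity id - the join through statistics_meta is how the Recorder schema links an entity to its statistics rows):

```yaml
# configuration.yaml - hypothetical daily cleanup job. Adjust the credentials,
# database name and entity id to your setup, and test the SQL manually first.
shell_command:
  purge_noisy_statistics: >
    mysql -u homeassistant -pYOUR_PASSWORD homeassistant -e
    "DELETE s FROM statistics s
    JOIN statistics_meta m ON m.id = s.metadata_id
    WHERE m.statistic_id = 'sensor.my_noisy_power_plug';
    DELETE s FROM statistics_short_term s
    JOIN statistics_meta m ON m.id = s.metadata_id
    WHERE m.statistic_id = 'sensor.my_noisy_power_plug';"

automation:
  - alias: "Daily statistics cleanup"
    trigger:
      - platform: time
        at: "03:30:00"
    action:
      - service: shell_command.purge_noisy_statistics
```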

But for statistics it should not be a problem (only one value per hour) - in statistics_short_term you get a value every 5 minutes, but that is also kept only for the purge_keep_days time…

how do you make a sensor for MariaDB? thx

@milandzuris
You can try to use sql_json via HACS.

Here you can build sensors using a SQL query.
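
For example, a database-size sensor boils down to a query like the one below. I’m showing it with the built-in sql integration’s syntax as an illustration (sql_json’s own config keys may differ), and the db_url, user and database name (“homeassistant”) are placeholders you’d adjust to your install:

```yaml
# configuration.yaml - illustrative only; the db_url and database name are
# placeholders, and sql_json's keys may differ from the built-in sql integration.
sql:
  - name: "MariaDB size"
    db_url: "mysql://homeassistant:YOUR_PASSWORD@core-mariadb/homeassistant?charset=utf8mb4"
    query: >
      SELECT ROUND(SUM(data_length + index_length) / 1048576, 1) AS value
      FROM information_schema.tables
      WHERE table_schema = 'homeassistant';
    column: "value"
    unit_of_measurement: "MB"
```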

I’m using MariaDB

Yes… this integration works with MariaDB - check it via HACS → Integrations (if you are using HACS… if not, check the community forum).

just for the record … you are aware that the graph (of your DB) you shared shows 35,000 KB, right? … repacking and deleting at this stage seems a little “overkill”

oops, that was meant for @CaptTom

Very true. I was just testing to confirm how it works. I was once burned by a SQLite database on my SD card which was approaching 4 GB before I even knew it existed. I ruthlessly excluded all the entities whose history I have no use for (the vast majority, as it turned out). Now I’ve probably got one of the smallest Recorder databases around. Once bitten, twice shy and all that.

It was the steady growth of that otherwise lean database which alerted me to the ever-growing statistics table. Again, not a problem for me, just reporting what I found for anyone who might need it.

If you are not attached to your data you can move the recorder database into RAM (lost at every restart). There’s a post about it somewhere.
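
One way to do it (if I remember right, and assuming you really don’t mind losing all history on every restart) is to point the recorder at an in-memory SQLite database:

```yaml
# configuration.yaml - all history is gone on every restart with this setup.
recorder:
  db_url: "sqlite:///:memory:"
```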

ok, well, me, myself and I are quite new here and have had my worries about the DB, but it seems I have it stabilized around 600 MB. I still have a bunch of devices and some automations to integrate, so I guess my “worries” aren’t over yet :slight_smile: … and yes, from the start I was thinking of installing MariaDB, but removed it again until I’m more familiar with HA. I’m no big fan of a flat DB, especially if it’s only one single file … it would be great if the HA devs would consider maybe two, or maybe a virtual one, so there would be a dedicated DB for e.g. “forecasts” etc. (write-read-dump) … I have InfluxDB and Grafana here for my graphs

If your database is in the GB range you won’t notice the small increase due to long term statistics. The other Tom only noticed because his DB is in the 10s of MB size range.

ok, yeah, that sounds like a lot. Since I was considering this from the start, I have tried to configure e.g. the temp sensors to report every 60 seconds, same for the tracking (of switches etc.) - everything that isn’t essential; at least that should be “half” compared to every 30 seconds

I’ve found 3 minute averages for my indoor temperatures work fine. It depends on being able to configure your sensors to do that though (easy with ESPHome sensors).
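
In ESPHome that is just a filter on the sensor; a sketch along these lines (the DHT platform and pin are only an example, swap in your own hardware):

```yaml
# ESPHome sketch: sample every 15 s, but only publish a 3-minute average
# to Home Assistant, which keeps Recorder traffic low.
sensor:
  - platform: dht
    pin: GPIO4
    update_interval: 15s
    temperature:
      name: "Living Room Temperature"
      filters:
        - sliding_window_moving_average:
            window_size: 12   # 12 samples x 15 s = 3 minutes
            send_every: 12    # publish one averaged value per window
```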

I found that 2 of my plugs with energy sensors lost contact and the data became unavailable - not in my app, but in HA - so I had to set them back to 30 seconds, and then they came back (from the unavailable state). But I will try to test more with the others (especially the nonessential ones, and some of the temps) … the weird thing is, it was only the “data” from the sensors; I could still turn the devices on/off from HA … could that be because of the difference between Zigbee and WiFi devices (sensors)?