So I did a little search and didn't see an answer. I mean, there is plenty on how to maintain your database, and I'm more than confident in using all of that information to achieve my goal. But what is a sensible size for an install with 1000+ entities?
Let's roll back to earlier this month, when I finally found the time to swap my VirtualBox install over to a bare-metal install (the reason for this was to use a Coral for Frigate and to try to fix the huge CPU spikes that were causing instability in the system after 4 days of uptime).
So after the changeover, I started noticing history retrieval had slowed right down. After hunting for the usual database issues like duplications etc., I came up empty. I did notice my database (MariaDB, hosted externally on a NAS) was a whopping 15+ GB, and that seemed too large. Of course, Googling found all the wonderful posts people have already made about maintaining the database, so I got to work, and after 3 days managed to bring it down to ~6 GB. That still seemed a tad excessive, so I further shrunk the keep days to 5 and dropped it to 3 GB.
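For reference, shrinking the keep days is just the standard recorder option in `configuration.yaml`. Something like this sketch (the `db_url` host and credentials here are placeholders, not my actual setup):

```yaml
# configuration.yaml - recorder settings
# db_url below is illustrative; substitute your own NAS address and credentials
recorder:
  db_url: mysql://hass:password@192.168.1.10/homeassistant?charset=utf8mb4
  purge_keep_days: 5   # keep 5 days of state history instead of the default 10
  auto_purge: true     # nightly purge of rows older than purge_keep_days
```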
Now 3 GB seems reasonable compared to the 16 GB I started with, but it still seems pretty large, so I'm being more surgical now and looking specifically at what's contributing the most.
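To see which entities were contributing the most, I ran a per-entity row count directly against MariaDB. A sketch of the query, assuming a current schema where `states` references entity IDs via the `states_meta` table (older schemas stored `entity_id` directly in `states`):

```sql
-- Top 20 entities by row count in the states table
SELECT m.entity_id, COUNT(*) AS row_count
FROM states s
JOIN states_meta m ON s.metadata_id = m.metadata_id
GROUP BY m.entity_id
ORDER BY row_count DESC
LIMIT 20;
```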
POWER. Interestingly, the amount of data in the states table contributed by my templates, Shelly EM3 Pro, and Sungrow inverter is huge (40% of the row count). So I've further dropped sensors from these devices that aren't required, though many of them are of interest to me. And yes, I understand that these are converted to long-term statistics beyond the purge keep days window, but that's not really my question.
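Dropping those sensors was done with the recorder's exclude options. A sketch of the shape of it (the entity IDs below are made-up examples, not my actual ones):

```yaml
# configuration.yaml - stop recording high-churn power sensors
# entity IDs and globs are illustrative; substitute your own
recorder:
  exclude:
    entities:
      - sensor.shelly_em3_phase_a_current
      - sensor.sungrow_internal_temperature
    entity_globs:
      - sensor.*_reactive_power
```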
My question, as in the topic: what is a sensible database size for an install of my size? Which leads to: how far down this rabbit hole should I fall?
Note: database history retrieval can still be a tad slow at times, but it's not excessive and has improved.
Thanks for reading; looking forward to hearing about other people's experiences with database sizing.