Large statistics database (1,9G, 40% of total DB)

…Which is exactly what I wanted.

I saw that in the release notes. Thank you for the reminder. Some people might want to update to 2024.10.x before trying this.

If you actually read my thread, you'll see that's exactly what I did.

Please understand, this thread is about different ways to deal with a large amount of LTS data in the database. Mine isn’t the only way. But neither is yours. I only offered my experience for those who might be helped by it.

I never asked for LTS, and never asked that every single entity be retained. This was indeed “forced” on users, and this work-around is hardly intuitive. The great thing about HA is that we can all use it in different ways. That shouldn’t be discouraged.


I did read it. You presented it as something you discovered rather than something you were told about over two weeks ago.

Project decisions like this are made for the majority, and they will keep happening. Fortunately, in this case there was something you could do, but don't be surprised when the next one comes along. The developers have said many times that they are not interested in making everything configurable.


FYI, that workaround will likely produce repairs for every entity in 2024.10+, one to four hours after each restart.


I observe these repairs for only a few entities (belonging to the Traccar Server integration, which is broken) out of the ~30 to 40 entities I have customized with state_class: none.
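
For anyone following along, here is a minimal sketch of that customization in configuration.yaml, following the state_class: none approach described above. The entity IDs are placeholders, so substitute your own, and treat the exact syntax as an assumption to verify against the customize docs for your HA version.

```yaml
# Sketch of the customize workaround mentioned above (configuration.yaml).
# The entity IDs below are placeholders -- replace them with your own.
homeassistant:
  customize:
    sensor.example_power:
      state_class: none   # clear the state class so no new LTS is recorded
    sensor.example_humidity:
      state_class: none
```

A restart is needed for the change to take effect, and as noted above, expect repair issues for these entities on 2024.10+ after each restart.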


Thank you for the heads-up. I’ll be watching for that when I update in a few days.

If anyone has a better way to disable LTS recording, either by entity or altogether, feel free to chime in!

Hi, please check this feature request for more options to trim LTS (and the related discussion of the different options). Disabled entities are especially an issue, as they remain in LTS forever. (Please vote :wink: )