Snapshots very large (8 GB)

Hi, I’m looking for some help. My snapshots are very large, about 8 GB, but the database in my config folder is only 300 MB, so I guess that can’t be the cause. When I look at the Hass.io folders over Samba, my biggest file is the database (300 MB). I really don’t understand why the snapshot is so big, and its size causes all sorts of disk-space problems, both locally and on Google Drive.

Hope someone can help. Thanks!

Are you using Glances or something similar and feeding that data into InfluxDB?
My snapshots got big very fast when I had that running.
I even ran out of disk space on my NUC.

Extract one of the snapshot files to a directory on a Windows PC and inspect it with https://windirstat.net/ for an easy way to see exactly where the space is going. On Linux, use https://github.com/shundhammer/qdirstat instead.
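
If a graphical extract gives you trouble, the command line usually works too. A minimal sketch, assuming a snapshot named 1a2b3c4d.tar (your file’s actual slug will differ):

```bash
# Extract the snapshot archive into its own directory.
mkdir snapshot-contents
tar -xf 1a2b3c4d.tar -C snapshot-contents

# A snapshot holds one .tar.gz per add-on/folder; listing them by
# size usually points at the culprit without extracting further.
ls -lhS snapshot-contents
```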

Thanks. Yes! Same here. My NUC is completely full, so new snapshots fail. Did you uninstall Glances, and did that solve the problem? Or did you have to empty InfluxDB as well?

Thanks. I have tried that on a Mac, but I get all sorts of errors. I am now extracting the tar file on a Windows computer; hope that works. I will definitely use the tool you advised.

You can add retention policies or continuous queries to remove and downsample your data in InfluxDB…

See: https://docs.influxdata.com/influxdb/v1.7/guides/downsampling_and_retention/
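
As a rough sketch of the two pieces, assuming the Glances data lives in an InfluxDB 1.x database named glances (all names here are placeholders, so adjust them to your setup):

```sql
-- Keep raw Glances data for only a week, plus a second retention
-- policy that will hold a downsampled copy for a year.
CREATE RETENTION POLICY "one_week" ON "glances" DURATION 7d REPLICATION 1 DEFAULT
CREATE RETENTION POLICY "one_year" ON "glances" DURATION 52w REPLICATION 1

-- Continuously downsample every measurement to hourly means,
-- writing the result into the one-year retention policy.
CREATE CONTINUOUS QUERY "cq_hourly" ON "glances"
BEGIN
  SELECT mean(*) INTO "glances"."one_year".:MEASUREMENT
  FROM /.*/ GROUP BY time(1h), *
END
```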

Found the problem! It was the Glances database in InfluxDB. Thanks mcfroijd!
For now I have turned off Glances and deleted the Glances database in InfluxDB.
Do I also have to delete the database in Hass.io? I could not really find its location via the configurator. Does anyone know the path?

You can use the Developer Tools → Services menu; recorder.purge is the service you want.
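
For example, a minimal service call (keep_days and repack are the service’s parameters; 7 days is just an illustration):

```yaml
# Purge recorder history older than 7 days and repack the
# database file so the disk space is actually reclaimed.
service: recorder.purge
data:
  keep_days: 7
  repack: true
```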

In future, if you create a partial rather than a full snapshot, you should be able to exclude just the Glances DB.