Why is Z-Wave bloating my DB and how to reduce it?

As I explained in another topic here, I’ve recently noticed my Home Assistant DB growing very large. Looking into the DB tables, it seems that by far the most events and states are generated by my few (5) mains powered Z-wave devices. This includes 4 Fibaro wall plugs and an Aeotec Home Energy Monitor.

The point is that the mains powered Z-Wave devices are flooding the DB. I do not want to exclude them from the recorder because I would like to be able to plot graphs for things like the total power usage of the house over the last week. But I can’t leave it like it is either, because the DB is ridiculously large (over 2 GB for just 5 days of data).

Does anybody know why Z-Wave is flooding the DB so aggressively and how this can be reduced?


Thank you! That is actually a very good idea! More a workaround than a solution, but given the situation it’s still great. I’ll have to check how to control the frequency of saving template sensor values, I seem to remember that there is a "scan_interval" parameter or something.

Still, I’m curious if others have solved the Z-Wave bloating problem directly in Z-Wave.

Or, if for others Z-wave is not producing a ton of states and events, please also let me know about that, so that at least I know that there’s something wrong with my Z-Wave config…

Turns out this is not as easy to do as it sounded… scan_interval is ignored by template sensors, so it’s not possible to control how often a template sensor’s value is updated. Adding a template sensor which takes its value from another existing sensor, and having the recorder record the template sensor instead of the original, results in just as much saved data :frowning:
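For reference, the approach I tried looks roughly like this (a sketch using the old-style template sensor syntax; sensor.total_power and the template sensor name are placeholders for my actual entities):

```yaml
# Sketch: mirror a chatty sensor into a template sensor and record only the copy.
sensor:
  - platform: template
    sensors:
      total_power_recorded:
        friendly_name: "Total power (recorded)"
        unit_of_measurement: "W"
        value_template: "{{ states('sensor.total_power') }}"

recorder:
  exclude:
    entities:
      - sensor.total_power  # exclude the original, record only the copy
```

The problem is that because the template references sensor.total_power, it is re-evaluated on every state change of the source, so the recorded copy generates just as many rows as the original did.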

Of course, the easiest solution to this whole problem would be to be able to override the default recorder behavior of recording a sensor’s value whenever it changes by specifying an attribute like record_interval in the configuration.yaml. Does anybody know where this should be suggested to the devs?

When I’ve tried template sensors in the past they have updated whenever the sensor they are based on is updated.

Looks like there may be another way:

This template contains no entities that will trigger an update (as now() is a function), so we add an entity_id: line with an entity that will force an update - here we’re using a date sensor to get a daily update:

The example is a daily update but you may be able to do something similar, maybe an hourly update?
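If I understand that trick correctly, a minimal sketch would look like this (old-style template syntax; sensor.total_power is a placeholder, and sensor.date comes from the Time & Date integration):

```yaml
sensor:
  - platform: template
    sensors:
      daily_power:
        friendly_name: "Daily power"
        unit_of_measurement: "W"
        value_template: "{{ states('sensor.total_power') }}"
        # Specifying entity_id explicitly overrides the automatic entity
        # extraction, so only sensor.date (which changes once a day)
        # triggers an update of this sensor.
        entity_id: sensor.date
```

For an hourly update you would need an entity that changes once an hour instead of sensor.date.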

Thank you! Yes, indeed, it is possible to create a template sensor which does not auto-update, but is instead updated by an automation on a regular basis. I have not found a better way. However, this is a very complicated way to achieve something that should, in my opinion, be very simple. After all, the goal is simple: to not save the state of a sensor all the time. I would very much welcome built-in HA support for this.
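For anyone who wants to try this, here is a sketch of the combination (old-style template syntax; entity names are placeholders, and I believe an empty entity_id list suppresses automatic updates, though some people use a never-changing dummy entity instead):

```yaml
sensor:
  - platform: template
    sensors:
      power_hourly:
        friendly_name: "Power (hourly)"
        unit_of_measurement: "W"
        value_template: "{{ states('sensor.total_power') }}"
        entity_id: []  # no watched entities, so no automatic updates

automation:
  - alias: "Refresh power_hourly"
    trigger:
      - platform: time_pattern
        minutes: 0  # at the top of every hour
    action:
      - service: homeassistant.update_entity
        entity_id: sensor.power_hourly
```

With the original sensor excluded from the recorder, this stores one value per hour instead of one per state change.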

Yep I agree it’s far from ideal.

Some integrations have a ‘scan_interval’ setting:
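For example, polling platforms such as command_line accept it (a sketch; the command itself is just illustrative):

```yaml
sensor:
  - platform: command_line
    name: "CPU temperature"
    command: "cat /sys/class/thermal/thermal_zone0/temp"
    scan_interval: 300  # poll every 5 minutes instead of the default
```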

However it looks like this is being phased out:

These options are being phased out and are only available for single platform integrations.

Hopefully something will replace the ‘scan_interval’ setting, which would do away with the need for template sensors + automations.

Thank you! Indeed, scan_interval would work great, but unfortunately the template sensor integration does not have it. So the only way to update a recorded template sensor from an unrecorded real sensor is via an automation.

Best would be if we could avoid the whole template sensor hack and simply, optionally, specify for each real sensor how often we want the recorder to record its value.

Have you looked at the Statistics sensor yet?

Yep, I have. The problem with it is that it adds more values to the DB rather than reducing them. It computes things like the average, which is great, but the source sensor values will still be in the DB.
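For context, the statistics sensor I looked at is configured roughly like this (a sketch using the older syntax; sensor.total_power is a placeholder):

```yaml
sensor:
  - platform: statistics
    name: "Power stats"
    entity_id: sensor.total_power
    sampling_size: 100
    max_age:
      hours: 24
```

This adds a new entity with computed characteristics of the source sensor, but it does nothing to stop the recorder from also storing every raw state change of sensor.total_power.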