HA Recorder - PURGE configuration - any solution?

Hi,

I am a new user of HA and have a question about the HA Recorder configuration.

Is there any way to configure the HA recorder/DB purge to do the following:

  1. Record defined parameters in the DB for 10 days and then purge them (the default HA behavior)
  2. And at the same time record a few sensor parameters for a year before purging them?

I have a few power consumption sensors, and I would like to collect the whole history only for them.
For all other sensors, a history of the last 10 days would be enough for me.

That way, I think the size of my DB would be much smaller.

Right now, AFAIK, I would have to record all parameters in the DB for a year just to keep a full history of the few power consumption parameters that matter to me.

In other words: is there a way to exclude some important parameters from the DB purge?
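For context, the recorder integration only exposes a single, global retention period; there is no per-entity setting. A minimal sketch of what is configurable today (entity names are illustrative):

```yaml
# configuration.yaml - recorder keeps ONE retention period for everything it records
recorder:
  purge_keep_days: 10        # applies to all recorded entities, no per-entity override
  include:
    domains:
      - sensor               # example: only record the sensor domain
```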


No, it cannot be done. I would like this too.
Please see here for an alternative method.


Use InfluxDB for long-term data and the HA database for short-term data.

You can separately choose which entities should be stored in InfluxDB and which in the HA DB. I wrote a small guide on how I did exactly this in my setup here.
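A minimal sketch of this split, assuming an InfluxDB instance is already running (the host address and entity names are examples, not from the guide):

```yaml
# configuration.yaml - short-term history in the HA DB, long-term in InfluxDB
recorder:
  purge_keep_days: 10              # HA DB keeps only 10 days

influxdb:
  host: 192.168.1.10               # example address of your InfluxDB server
  include:
    entities:
      - sensor.power_consumption_1 # only these go to InfluxDB
      - sensor.power_consumption_2
```

InfluxDB keeps its data until its own retention policy removes it, so the two retention periods are fully independent.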


If you are willing to experiment with SQL a little, you can use a trigger to copy the state records that HA creates into an analytics table. I use PostgreSQL for my recorder database in HA and do this, but I believe the same capability exists in the default SQLite database (https://www.sqlitetutorial.net/sqlite-trigger/).

I just copy every state record into my analytics database, but it would be an easy extension of the trigger code to test whether the state's entity_id is in a lookup table and only copy records for the entities you are interested in. I found storage in my analytics database inexpensive enough that I keep all my state records.

Once you get the data out of HA's recorder database and let HA keep that database small, having all the data in the analytics database does not cause me any performance issues when using Jupyter and pandas for analysis. And you can certainly cull the analytics database manually or automatically as you see fit.
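A rough sketch of the lookup-table variant of this trigger, using Python's built-in sqlite3 module against an in-memory database. The table and column names (`states`, `analytics_states`, `tracked_entities`) are simplified stand-ins, not HA's actual recorder schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Minimal stand-in for HA's recorder "states" table.
cur.execute("""
CREATE TABLE states (
    state_id INTEGER PRIMARY KEY,
    entity_id TEXT,
    state TEXT,
    last_updated TEXT
)""")

# Long-term copy that the recorder's purge never touches.
cur.execute("""
CREATE TABLE analytics_states (
    state_id INTEGER,
    entity_id TEXT,
    state TEXT,
    last_updated TEXT
)""")

# Lookup table listing only the entities worth archiving.
cur.execute("CREATE TABLE tracked_entities (entity_id TEXT PRIMARY KEY)")
cur.execute("INSERT INTO tracked_entities VALUES ('sensor.power_meter')")

# Copy each new state row into the archive, but only when its
# entity_id appears in the lookup table.
cur.execute("""
CREATE TRIGGER archive_states AFTER INSERT ON states
WHEN NEW.entity_id IN (SELECT entity_id FROM tracked_entities)
BEGIN
    INSERT INTO analytics_states
    VALUES (NEW.state_id, NEW.entity_id, NEW.state, NEW.last_updated);
END""")

# One tracked and one untracked insert.
cur.execute("INSERT INTO states VALUES (1, 'sensor.power_meter', '1.2', '2023-01-01')")
cur.execute("INSERT INTO states VALUES (2, 'sensor.hallway_temp', '21', '2023-01-01')")

archived = cur.execute("SELECT entity_id FROM analytics_states").fetchall()
print(archived)  # → [('sensor.power_meter',)]
```

The same idea carries over to PostgreSQL, though there the trigger body goes into a separate trigger function rather than inline SQL.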

As an example (not that it is a particularly useful one), below is a plot of about one and a half years of readings from an SCE Zigbee whole-house power meter. It generates about 800 readings per hour, so the query against my analytics database, which contains 170,000,000 rows, yielded about 7,000,000 rows for this sensor. The query and plot in Jupyter took about 8 minutes to generate.

Good hunting!


Burningstone, your solution is the one for me.
I did not know it was possible to send parameters from HA to two different DBs.
Thank you very much.