Are there any step-by-step guides out there to do what Patrlki suggested, if there is no other way? I find it odd that there is no system where you could just list all entity types in InfluxDB, select one or several entities, and then delete all entries of that type from the database.
Seems like I wrote a bit over two months ago that my influxdb database size is almost 27 gigabytes. Well, today it’s almost 47 gigabytes so it’s clear that sooner or later my Home Assistant will get corrupted somehow, or at least my SSD will be full! So I have to do something fast…
I cannot understand why it is growing that fast. This is my setup:
Have you used data explorer to take a look at just how much stuff is stored? I have a weather station with about 16 entities; each reading is 4 or more records, including sql-style syntax for the units, value, icon, and device class, recorded every 30 seconds.
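A quick back-of-the-envelope check, using the numbers above (16 entities, roughly 4 records per reading, one reading every 30 seconds), shows how fast this adds up:

```python
# Rough record count for the setup described above (assumed numbers,
# not measured from any actual database).
entities = 16
records_per_reading = 4        # units, value, icon, device class
interval_s = 30

readings_per_day = 24 * 3600 // interval_s          # 2880 readings per entity
records_per_day = entities * records_per_reading * readings_per_day
print(records_per_day)                              # 184320 records per day
print(records_per_day * 30)                         # ~5.5 million per month
```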
So, for me, I need to set a retention period for “Hassbucket” of maybe a month, and filter content down to logical aggregate windows. But, the first step is to set entity_globs for your include and exclude to try to filter out information you will never have use for in the future.
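For reference, that filtering goes in the `influxdb:` section of `configuration.yaml`. A minimal sketch (the glob patterns and entity names here are made-up examples; adjust to your own setup):

```yaml
influxdb:
  include:
    domains:
      - sensor
    entity_globs:
      - binary_sensor.motion_*
  exclude:
    entity_globs:
      - sensor.*_battery      # example: drop battery-level chatter
      - sensor.camera_*       # example: drop camera-related sensors
    entities:
      - sensor.date_time
```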
I’ll post back my task here once I get it done and checked.
I have looked at InfluxDB Explorer (in the HA plugin) and I find it very difficult to understand and use, at least for someone who only has experience with SQL and relational databases. I know I have tons of unnecessary data in there, even though I have included basically only sensors; my surveillance cameras, weather station, etc. bring in a lot of extra stuff.
To me, the most logical place to handle the history of our entities would be Home Assistant itself. There is already a list of all your entities, so there could be an option to set how long each one stays in history: maybe two weeks for every entity as a default, and then you could change that time individually per entity (the most important entity records you could keep for a year, two years, or forever, for example). Maybe one day; until then we have to sort these out ourselves.
The reason to use an external historian is so you can go beyond the limits of SQLite for storage; time series databases are great for the ability to window fairly natively and intuitively. IMO, much of that advantage is lost if you are using it as a Hass add-on though because of resource limitations generally speaking.
Start by going to “Data Explorer” in InfluxDB, filter for one specific sensor, and look at 5 minutes of data in “raw data” mode. It helps to get a sense of just how much information Home Assistant is storing.
The main tool for consolidating data is aggregateWindow, which can down-sample data to a longer time window: mean, max, min, integral, etc. If Hass is exporting information at 30-second intervals, after a few days only 5-minute intervals might be meaningful. With scheduled tasks you can export the information to a separate bucket that keeps the 5-minute (or 15-minute, or whatever makes sense for a particular sensor) aggregates, and with pivots you can do min/max/mean for the interval.
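As a sketch of what such a scheduled down-sampling task can look like in Flux (the bucket names and the 5-minute window here are assumptions, not from any particular setup):

```flux
option task = {name: "downsample_ha_5m", every: 1h}

from(bucket: "Hassbucket")
    |> range(start: -task.every)
    |> filter(fn: (r) => r._field == "value")
    |> aggregateWindow(every: 5m, fn: mean, createEmpty: false)
    |> to(bucket: "Hassbucket_5m")
```

Running this as a task once per hour writes 5-minute means into the second bucket, which can then have a much longer retention period than the raw one.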
It took me a few months of playing with Influx to not feel completely lost; their community forum has staff that are helpful, but often you need domain-specific help to get better information.
(My data reduction task is still a work-in-progress. My first attempt ended up with an unrefined feel so I need to work a little harder to figure out how to properly manage information like rainfall data.)
Reducing sensor update frequencies will help as well, e.g. a temperature sensor once per minute instead of every few seconds. Some integrations allow you to select refresh rates. Don’t make them more frequent than practically required.
Last but not least: for template sensors, specific triggers can be used to limit the update rate (otherwise the sensor updates on every change of any entity used in the template).
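A trigger-based template sensor that updates at most every five minutes could look like this (the sensor names are placeholders):

```yaml
template:
  - trigger:
      - platform: time_pattern
        minutes: "/5"          # update at most every 5 minutes
    sensor:
      - name: "Outdoor temperature (5 min)"
        unit_of_measurement: "°C"
        device_class: temperature
        state: "{{ states('sensor.outdoor_temperature') }}"
```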
I’m also wondering how to deal with the large amount of sensor data from Home Assistant stored in InfluxDB.
I’m wondering if a first step shouldn’t be to run a few analysis queries to find out:
list of HA sensors whose values are stored in InfluxDB
number of data points per sensor
first data time per sensor
last data time per sensor
average data points per hour for a given sensor (or average sampling period)
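For InfluxDB 1.x, most of these questions can be answered with InfluxQL (the database name `homeassistant` and retention policy `autogen` are the common defaults and may differ on your setup):

```sql
-- question 1: list of entities whose values are stored
SHOW TAG VALUES ON "homeassistant" WITH KEY = "entity_id"

-- question 2: number of data points per entity, across all measurements
SELECT COUNT("value") FROM "homeassistant"."autogen"./.*/ GROUP BY "entity_id"

-- questions 3 and 4: first and last data time per entity
SELECT FIRST("value"), LAST("value") FROM "homeassistant"."autogen"./.*/ GROUP BY "entity_id"
```

Question 5 (average points per hour) can then be derived by dividing the count by the number of hours between the first and last timestamps.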
Unfortunately I’m quite new to InfluxDB and I’m missing the answer to the first question: getting the list of sensors whose data are stored in InfluxDB.
Some people are pointing to the InfluxDB docs about delete (Delete data | InfluxDB Cloud (TSM) Documentation)… well, it doesn’t help me much as I don’t know where I can run the influx delete command. I tried in the core-ssh terminal but I simply get command not found.
It should all be in my guide above. To see which entities are stored, you can use the web interface of InfluxDB (I think it was in the Explore option; I don’t use InfluxDB anymore). There you can check every measurement, for example with this select statement:
select * from homeassistant.autogen."%" where time > '2022-04-22' and time < '2022-04-24'
But deleting from there did not work for me. To access the InfluxDB container you need the add-on with elevated access; it’s called Advanced SSH & Web Terminal. Don’t forget to disable protection mode in its settings. From there you can log into your InfluxDB container:
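A sketch of what that can look like from the add-on’s terminal (the container name and credentials are examples; check `docker ps` for the real name on your system):

```shell
# From the Advanced SSH & Web Terminal add-on, with protection mode disabled:
docker exec -it addon_a0d7b954_influxdb influx -username homeassistant -password 'yourpassword'

# Inside the influx shell, deletes run per measurement, e.g.:
# > USE homeassistant
# > DELETE FROM "°C" WHERE "entity_id" = 'old_sensor' AND time < '2022-01-01'
```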
How to Get a Count of Records per Entity from InfluxDB Using Grafana
If you’re trying to understand the frequency of readings for different entities in InfluxDB, here’s a method I found useful using Grafana.
Steps:
Create a Chart in Grafana: Start by setting up a new chart visualization.
Add a Query for Each Measurement: For your first measurement, use the following query:
SELECT count("value") FROM "autogen"."state" WHERE $timeFilter GROUP BY "entity_id"::tag
Duplicate the Query for Other Measurements: Click on the “Duplicate query” icon and simply change “state” to the next value from the dropdown list. Repeat this for each measurement you’re interested in.
This approach gave me a clear overview of which entities had the most frequent readings. It’s a bit manual, but it got the job done!
Hello,
I’ve tried to use InfluxDBStudio, but I can’t figure out which address to enter in the connection settings for my Home Assistant InfluxDB (with HASSIO).
Any suggestions?
Thanks