Data in the time-series InfluxDB is not stored as “device” entities (it doesn’t work like a relational DB), so you would spend endless time finding the exact data you want to delete. Data is stored by timestamp/measurement/ID, where a given ID is assigned by InfluxDB when it sees a new “source” from a sensor (source = HA entity name). That means when you “rename” an entity in HA, InfluxDB adds it as a new source, gives it a new ID, and starts over from scratch. So basically, if you want to delete from InfluxDB, you have to define a timespan to delete from, add key fields, etc. etc. Retention = delete when too old; so if you had some devices from last year, their data would be gone (though not in your case, as you have no retention). The universe is said to be infinite, and so is database size (if you have enough disk space); retention policies and “selection policies” (choosing what gets recorded) are the means to keep a database in shape. So what you really want is a relational DB (e.g. MariaDB), where you can easily delete, and you can work with “entities/devices”. You can use that as a source for Grafana too, but you lose the performance of the time-series DB. Selection/retention is your best choice to keep the database at its minimum, unless you want to sit and hand-pick records every six months or so (in a relational DB).
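To illustrate the retention idea, here is a minimal InfluxQL sketch, assuming InfluxDB 1.x and the default Home Assistant database name; the policy name “six_months” and the 180-day duration are made-up examples:

```sql
-- Hypothetical example: make InfluxDB automatically drop data older than 180 days.
-- "homeassistant" is the usual HA database name; adjust if yours differs.
CREATE RETENTION POLICY "six_months" ON "homeassistant" DURATION 180d REPLICATION 1 DEFAULT
```

Once a retention policy is the DEFAULT, new writes land in it and old shards are expired automatically, which is exactly the “delete when too old” behaviour described above.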
I get the logic. Thanks for that. But I need the actions, the how-to.
@boheme61 thanks for your time. I can find that myself as well. I need some real guidance, no links to documentation. Please, although I appreciate your replies, I think I can do without them for now, to keep this post open for others.
ok, good luck
The documentation is the official guide.
People doing this: https://community.home-assistant.io/t/influxdb-removing-or-deleting-data/292637 are just following that documentation.
Guide and guidance are different words and have different meanings.
So many times I’ve been pointed to documentation and guides. Finding a manual for a Boeing 747 does not mean I can operate or repair it. But after watching, e.g., a YouTube video, I think I could do some level of maintenance. See what I mean? If I understood how to do it based on guides, I would not be asking for help here…
Did you get a solution? I’m facing the same problems…
I started with these 2 tutorials:
https://dummylabs.com/post/2019-01-13-influxdb-part1/
https://dummylabs.com/posts/2019-05-28-influxdb-part2/
(Found them somewhere here in a post)
Cleaning up your database is not a short job. I had a look in every measurement at what entities are stored and which ones I really needed. For the last years I had an include-all-and-exclude-some setup; this approach is really bad if you want to store your data for a long time.
To look into your database, you can for example see what entities are stored in one measurement with:
select * from homeassistant.autogen."%" where time > '2022-04-22' and time < '2022-04-24'
Paste this into the Explore tab of the InfluxDB add-on. Then you see all entities that were stored yesterday in the “%” measurement.
To see what measurements you have, use SHOW MEASUREMENTS in the Explore query.
A similar query for °C would be:
select * from homeassistant.autogen."°C" where time > '2022-04-22' and time < '2022-04-24'
When you find entities you want to get rid of, delete them. This was a bit tricky for me because DELETE FROM did not work in Explore in InfluxDB (I did not find the right syntax). So I logged into the container (see the dummylabs tutorial, part 2).
SSH onto your host and execute:
user@host:~$ docker exec -it addon_a0d7b954_influxdb influx -precision rfc3339
Connected to http://localhost:8086 version 1.7.2
InfluxDB shell version: 1.7.2
Enter an InfluxQL query
and then
> auth
username: homeassistant
password:
> use homeassistant
Using database homeassistant
Then you can delete data with:
delete from "%" where entity_id = 'your_sensor'
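An alternative worth trying, as a sketch assuming InfluxDB 1.x (‘your_sensor’ is a placeholder, as above), is DROP SERIES, which also removes the series from the index rather than just deleting its points:

```sql
-- Hypothetical alternative; run inside the influx shell after "use homeassistant".
-- Note: DROP SERIES does not accept a time condition in its WHERE clause.
DROP SERIES FROM "%" WHERE "entity_id" = 'your_sensor'
```

DELETE FROM keeps the (now empty) series in the index, so DROP SERIES is the cleaner choice when an entity is gone for good.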
So you have to go through every measurement, check what is stored, and delete everything you don’t need. Don’t forget to change your InfluxDB config to an include-only strategy with just the sensors you really need.
To change how much data is stored, I wrote a little tutorial which compresses data after 6 months and after 2 years into different retention policies to save space. For me, cleaning up shrank my DB from 4.5 GB to around 350 MB for 3 years of recording. For example, I do not need the ink levels of a printer I got rid of two years ago.
Guidance enough?
I do not remember where I found it, but I downloaded and installed a Windows InfluxDB reader, and that helped me to manually delete all “old” data of entities that are long gone already. Saved me around 2 GB in total. Not that much compared to what remains, and still too big to back up with the HA backup…
Do you have a name for the Windows-based InfluxDB reader? It may help me and others with the problems you faced… Thanks
It was literally the 1st hit in google.
It would be great if there were some simple way of handling database entries in Home Assistant. For example, a per-entity setting where you could choose whether to archive the entity’s state or value, and for how long you want to keep it (or forever).
I switched some time ago to a MariaDB + InfluxDB combination, and I have set up InfluxDB so that only the sensor and calendar domains are included. Still, my MariaDB is about 5.6 GB and my InfluxDB database is almost 27 gigabytes! InfluxDB is growing by 200 MB daily. I have set it up so that data stays forever (I want to keep my temperature data, for example), but it seems to me that the database will grow and grow until something breaks.
Same here.
But the solution is to include instead of exclude…
But “changing” that is somewhat…
Too lazy to read all the replies after seeing so many bad recommendations. The key is to (a) limit what you send to Influx to reasonable data, and (b) post-process the Influx data to down-sample the useful information you want to retain long-term. To do this you create tasks, and they will need to be specialized depending on what the data is: some counters increment daily, and you might want hourly or quarter-hourly breakdowns rather than semi-real-time data. You might want deltas between windows, or integrals. For some things you might just care about daily min/max/mean. Influx is really where you have the tools to pare down data.
Pro tip: put your pared-down data in a separate bucket from your base data, set a shorter retention period for the base data, and potentially “forever” retention for the pared-down values. You can also go three levels deep with buckets and paring tasks.
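As a rough sketch of such a task, assuming InfluxDB 2.x Flux; the bucket names “homeassistant” and “ha_longterm” and the schedule are placeholders, not anything from a real setup:

```flux
// Hypothetical downsampling task: every hour, write 5-minute means
// from the raw bucket into a separate long-retention bucket.
option task = {name: "downsample_ha", every: 1h}

from(bucket: "homeassistant")
    |> range(start: -task.every)
    |> filter(fn: (r) => r._field == "value")
    |> aggregateWindow(every: 5m, fn: mean, createEmpty: false)
    |> to(bucket: "ha_longterm")
```

With the raw bucket set to a short retention (say, 30 days) and “ha_longterm” set to never expire, you keep the detail where it matters and the history where it’s cheap.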
Are there any step-by-step guides out there to do what Patrlki suggested, if there is no other way? I find it odd that there is not some kind of system where you could just list all entity types in InfluxDB, select one or several entities, and then delete all entries of that type from the database.
Seems like I wrote a bit over two months ago that my influxdb database size is almost 27 gigabytes. Well, today it’s almost 47 gigabytes so it’s clear that sooner or later my Home Assistant will get corrupted somehow, or at least my SSD will be full! So I have to do something fast…
I cannot understand why it is growing that fast. This is my setup:
include:
  domains:
    - calendar
    - sensor
exclude:
  domains:
    - automation
    - binary_sensor
    - button
    - camera
    - cover
    - device_tracker
    - light
    - media_player
    - number
    - scene
    - switch
    - update
Same problem here, over 50 GB of data from half a year…
Have you used Data Explorer to take a look at just how much stuff is stored? I have a weather station with about 16 entities; each reading is 4 or more records, including SQL-style fields for the unit, value, icon, and device class, recorded every 30 seconds.
So, for me, I need to set a retention period for “Hassbucket” of maybe a month, and filter the content down to logical aggregate windows. But the first step is to set entity_globs for your include and exclude, to filter out information you will never have a use for in the future.
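For reference, a minimal sketch of what entity_globs could look like in the Home Assistant influxdb config; the glob patterns here are made-up examples, not my actual setup:

```yaml
influxdb:
  include:
    domains:
      - sensor
  exclude:
    entity_globs:
      # made-up example patterns; adapt to your own entity names
      - sensor.*_battery
      - sensor.printer_*
```

Exclusions are applied on top of the include list, so this records all sensors except the battery and printer entities matched by the globs.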
I’ll post back my task here once I get it done and checked.
I have looked at InfluxDB Explorer (in the HA plugin) and I find it very difficult to understand and use, at least for someone who has experience only with SQL and relational databases. I know I have tons of unnecessary data in there; although I have included basically only sensors, I have surveillance cameras, a weather station, etc., that bring in a lot of extra stuff.
To me, the most logical place to handle the history of our entities would be Home Assistant itself. There is already a list of all the entities you have, so there could be some sort of option to set how long each one stays in history: maybe two weeks for every entity as a default, and then you could change that time individually per entity (the most important records you could keep for a year, two years, or forever, for example). Maybe one day; until then, we have to sort this out ourselves.
The reason to use an external historian is so you can go beyond the limits of SQLite for storage; time-series databases are great for the ability to window fairly natively and intuitively. IMO, much of that advantage is lost if you run it as a Hass add-on, though, because of resource limitations generally speaking.
Start by going to “Data Explorer” in Influx, filter for one specific sensor, and look at 5 minutes of data in “raw data” mode. It helps to get a sense of just how much information Home Assistant is storing.
The main tool for consolidating data is aggregateWindow, which can down-sample data to a longer time duration: mean, max, min, integral, etc. If Hass is exporting information at 30-second intervals, after a few days only 5-minute intervals might be meaningful. With scheduled tasks you can export the information to a separate bucket that keeps the 5-minute (or 15-minute, or whatever makes sense for a particular sensor) values, and with pivots you can do min/max/mean for the interval.
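A sketch of the min/max/mean idea in Flux; the bucket name and entity_id are placeholders I made up:

```flux
// Hypothetical example: hourly min, max, and mean for one sensor,
// tagged with an "agg" column so the three streams stay distinguishable.
data = from(bucket: "homeassistant")
    |> range(start: -1d)
    |> filter(fn: (r) => r.entity_id == "outdoor_temperature" and r._field == "value")

union(tables: [
    data |> aggregateWindow(every: 1h, fn: min) |> set(key: "agg", value: "min"),
    data |> aggregateWindow(every: 1h, fn: max) |> set(key: "agg", value: "max"),
    data |> aggregateWindow(every: 1h, fn: mean) |> set(key: "agg", value: "mean"),
])
```

The set() calls add a distinguishing tag so that, once written to a downsampled bucket, you can still tell which aggregate each point represents.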
It took me a few months of playing with Influx to not feel completely lost; their community forum has staff that are helpful, but often you need domain-specific help to get better information.
(My data reduction task is still a work-in-progress. My first attempt ended up with an unrefined feel so I need to work a little harder to figure out how to properly manage information like rainfall data.)