Deleting the whole DB is a bit drastic… I’ll look into editing it directly then.
thanks
This is described in the database backend docs:
If you don’t want to keep certain entities, you can delete them permanently:
sqlite> DELETE FROM states WHERE entity_id="sensor.cpu";
Does that not delete the entity rather than the data?
All that exists in the database is history. No configuration data. You can delete the whole thing with no consequences.
I had to do the same today.
I’m just adding to this that I used the Hass.io SQLite Web add-on with the same DELETE query noted above.
The complete procedure to delete one sensor’s history for me was:
- go to the directory where home-assistant_v2.db is located
cd /home/homeassistant/.homeassistant
- open the database
sudo sqlite3 home-assistant_v2.db
- check if the events to be deleted are there, e.g., those of entity_id = 'sensor.sonoff_sht3x_0x44_temperature'
select event_data from events where json_extract(event_data, '$.entity_id') = 'sensor.sonoff_sht3x_0x44_temperature';
- delete the events
delete from events where json_extract(event_data, '$.entity_id') = 'sensor.sonoff_sht3x_0x44_temperature';
- check if the states to be deleted are there, e.g., those of entity_id = 'sensor.sonoff_sht3x_0x44_temperature'
select state from states where entity_id = 'sensor.sonoff_sht3x_0x44_temperature';
- delete the states
delete from states where entity_id = 'sensor.sonoff_sht3x_0x44_temperature';
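The steps above can also be sketched in Python with the standard-library `sqlite3` module. This is only a rough sketch against an in-memory stand-in schema (the real `home-assistant_v2.db` tables have many more columns, and newer HA schema versions differ), but the parameterized queries are the same idea:

```python
import sqlite3

# Stand-in for the real home-assistant_v2.db: an in-memory DB with a
# minimal states/events schema (the real tables have many more columns).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE states (entity_id TEXT, state TEXT)")
cur.execute("CREATE TABLE events (event_data TEXT)")
cur.executemany("INSERT INTO states VALUES (?, ?)",
                [("sensor.sonoff_sht3x_0x44_temperature", "21.5"),
                 ("sensor.other", "1")])
cur.execute("INSERT INTO events VALUES (?)",
            ('{"entity_id": "sensor.sonoff_sht3x_0x44_temperature"}',))

entity = "sensor.sonoff_sht3x_0x44_temperature"

# Check what would be deleted before actually deleting.
cur.execute("SELECT COUNT(*) FROM states WHERE entity_id = ?", (entity,))
print("states rows to delete:", cur.fetchone()[0])

# Delete the states, then the events whose JSON payload names the entity.
cur.execute("DELETE FROM states WHERE entity_id = ?", (entity,))
cur.execute("DELETE FROM events "
            "WHERE json_extract(event_data, '$.entity_id') = ?", (entity,))
conn.commit()
```

Stop Home Assistant before touching the DB file directly, or you risk corrupting it.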
In my case I wanted to clear some graphs
I needed to use sqlite3:
sqlite3 home-assistant_v2.db
and then I selected a specific timeframe from states:
sqlite> select * from states where entity_id='sensor.dust_particle_sensor' and last_updated < '2019-05-26 13:18:09.864454+00:00';
and deleted
sqlite> delete from states where entity_id='sensor.dust_particle_sensor' and last_updated < '2019-05-26 13:18:09.864454+00:00';
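The same timeframe delete can be done with a parameterized query. A minimal sketch, again using an in-memory stand-in for `home-assistant_v2.db` (the real `states` table has more columns):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for home-assistant_v2.db
cur = conn.cursor()
cur.execute("CREATE TABLE states (entity_id TEXT, last_updated TEXT)")
cur.executemany("INSERT INTO states VALUES (?, ?)",
                [("sensor.dust_particle_sensor",
                  "2019-05-20 00:00:00.000000+00:00"),
                 ("sensor.dust_particle_sensor",
                  "2019-05-27 00:00:00.000000+00:00")])

cutoff = "2019-05-26 13:18:09.864454+00:00"
# The stored timestamps are ISO-8601 strings, so plain string comparison
# matches chronological order here.
cur.execute("DELETE FROM states WHERE entity_id = ? AND last_updated < ?",
            ("sensor.dust_particle_sensor", cutoff))
conn.commit()
print("rows removed:", cur.rowcount)
```

Only the row older than the cutoff is removed; later history survives.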
To check which sensors are storing too many records, here's a small SQL query to help you figure it out (my router's average load was being stored every minute):
select entity_id, count(*) as nb_records from states group by entity_id order by nb_records DESC;
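If you'd rather run that ranking query from a script, here's a small sketch with Python's `sqlite3` against a toy stand-in table (point it at the real DB file instead of `:memory:` on your own setup):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for home-assistant_v2.db
cur = conn.cursor()
cur.execute("CREATE TABLE states (entity_id TEXT, state TEXT)")
cur.executemany("INSERT INTO states VALUES (?, '0')",
                [("sensor.router_load",)] * 3 + [("sensor.temp",)])

# Rank entities by how many rows they occupy in the states table.
for entity_id, nb_records in cur.execute(
        "SELECT entity_id, COUNT(*) AS nb_records FROM states "
        "GROUP BY entity_id ORDER BY nb_records DESC"):
    print(entity_id, nb_records)
```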
Thanks for the solution.
Is it possible to use wildcards?
The UniFi integration has created many sensors I would like to clean up, like:
sensor.apple_tv_rx
sensor.apple_tv_tx
sensor.ecobee_rx
sensor.ecobee_tx
It takes a very long time to delete them one by one.
Thanks
Hello all;
I am trying to delete some bad sensor data from my history. I have installed the SQLite Hass.io add-on. I can read the data using:
SELECT *
FROM "states"
where entity_id='sensor.humidity'
…but when I try to delete using:
DELETE
FROM "states"
where entity_id='sensor.humidity'
it throws an error about permissions.
attempt to write a readonly database
How do I remedy this?
Yes, “%” is a wildcard. So
DELETE FROM states WHERE entity_id LIKE 'sensor.apple_tv%';
Though I don’t think the states table is where all the data is (at least for me)
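To illustrate the wildcard behavior, here's a small sketch with Python's `sqlite3` on a toy stand-in table (the entity names are just examples from the question above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for home-assistant_v2.db
cur = conn.cursor()
cur.execute("CREATE TABLE states (entity_id TEXT, state TEXT)")
cur.executemany("INSERT INTO states VALUES (?, '0')",
                [("sensor.apple_tv_rx",), ("sensor.apple_tv_tx",),
                 ("sensor.ecobee_rx",), ("sensor.kitchen_temp",)])

# '%' matches any suffix, so one statement covers both the _rx and _tx
# sensors while leaving unrelated entities alone.
cur.execute("DELETE FROM states WHERE entity_id LIKE 'sensor.apple_tv%'")
print("rows removed:", cur.rowcount)
```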
You need to change the SQLite add-on config to allow writing. Where it says:
read_only: true
Change it to:
read_only: false
I know this is an old thread, but you never know if someone else might be running into the same issue.
Thanks for this guide. Sadly I found it too late.
I deleted the unneeded rows from the states table before I tried to delete from the events table. Maybe that's the reason for my error:
I have the same …
This helped me with my issue of deleting certain values. Thanks for sharing the code as well. I had no idea how to format to get what I wanted.
If anyone else finds this post years later, as I did, it is now possible to call the recorder.purge service, which is friendlier and safer than running SQL queries manually.
See the recorder integration docs for more info.
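For example, a purge service call looks roughly like this (field names as I remember them from the recorder docs; verify against the current docs for your HA version):

```yaml
# Purge all history older than 30 days and compact the database file:
service: recorder.purge
data:
  keep_days: 30
  repack: true
---
# Or drop all stored history for specific entities:
service: recorder.purge_entities
target:
  entity_id: sensor.humidity
```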
I ran the delete command from SQLite and it corrupted the db file =(
Does anybody know how to delete entries if you use MariaDB?
Shit, is that what I just did too?
DELETE FROM states WHERE entity_id='sensor.electricity_demand_max_monthly_1';
and all my history is gone for everything
I had that happen; it was caused by HA detecting DB corruption and recreating the db file.