See this post.
Or perhaps a service that you can call manually, from a script, or from an automation, just like the recorder.purge service. Something like a history.save or history.download service, where you can provide JSON data for parameters such as the export type (CSV, JSON, XML) and/or a list of entities.
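For illustration only (no such service exists today; the service name and parameters here are made up), calling it from a script via the REST API could look roughly like this:
# Hypothetical sketch: a "history.save" service does not exist in Home Assistant.
# This only illustrates what calling such a service via the REST API
# (POST /api/services/<domain>/<service>) might look like.
import requests

HA_URL = "http://homeassistant.local:8123"  # adjust to your instance
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"

payload = {
    "export_type": "csv",  # or "json" / "xml"
    "entity_id": ["sensor.outside_temperature", "binary_sensor.front_door"],
}

response = requests.post(
    f"{HA_URL}/api/services/history/save",  # made-up service endpoint
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
    timeout=10,
)
response.raise_for_status()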
The file notification solution has some limitations:
- With a trigger that fires every five minutes, you don’t get events that happen in between (e.g. a door that opens and closes again)
- The value template looks very nasty if you have lots of sensors.
A history component that stores the data in CSV files (one for each day) would be nice, just like the influxdb component stores the new state of each event for the included entities in InfluxDB. The component could gzip the previous day’s file to save storage.
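Just to sketch the idea (this is not an existing component, only a rough illustration of the file handling; the output directory is a placeholder):
# Rough sketch only, not a real Home Assistant component: append each state
# change to one CSV file per day and gzip the previous day's file.
import csv
import gzip
import shutil
from datetime import date, timedelta
from pathlib import Path

EXPORT_DIR = Path("/config/history_csv")  # hypothetical output directory

def append_state(entity_id: str, state: str, timestamp: str) -> None:
    # Append one state change to today's CSV file, writing a header for new files.
    EXPORT_DIR.mkdir(parents=True, exist_ok=True)
    today_file = EXPORT_DIR / f"{date.today().isoformat()}.csv"
    is_new = not today_file.exists()
    with today_file.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["timestamp", "entity_id", "state"])
        writer.writerow([timestamp, entity_id, state])

def gzip_yesterday() -> None:
    # Compress yesterday's CSV to save storage, then remove the original.
    yesterday = EXPORT_DIR / f"{(date.today() - timedelta(days=1)).isoformat()}.csv"
    if yesterday.exists():
        with yesterday.open("rb") as src, gzip.open(f"{yesterday}.gz", "wb") as dst:
            shutil.copyfileobj(src, dst)
        yesterday.unlink()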
Has this ever been addressed?
Yes, but the pull request was rejected.
Why was that? Can you share a link to the PR?
This may be possible already if you are able to pull values out of the database.
I have something similar in NodeRED that takes MQTT messages and saves them to an SQL database, but you could output CSV instead if you wanted.
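Outside NodeRED, a rough Python equivalent of that flow could look like this (assuming the paho-mqtt 1.x client API; broker, topic, and file name are placeholders):
# Sketch: subscribe to MQTT and append each message to a CSV file.
# Assumes the paho-mqtt 1.x callback API; adjust broker and topic to your setup.
import csv
from datetime import datetime, timezone

import paho.mqtt.client as mqtt

CSV_PATH = "sensor_log.csv"
TOPIC = "home/sensors/#"

def on_message(client, userdata, msg):
    # One row per MQTT message: timestamp, topic, payload
    with open(CSV_PATH, "a", newline="") as f:
        csv.writer(f).writerow([
            datetime.now(timezone.utc).isoformat(),
            msg.topic,
            msg.payload.decode(errors="replace"),
        ])

client = mqtt.Client()
client.on_message = on_message
client.connect("localhost", 1883)
client.subscribe(TOPIC)
client.loop_forever()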
+1 This would be incredibly useful for easily sharing data.
CSV export is a must. I am surprised it is not implemented yet! We already have all the values in the database, so why not export them instead of just looking at the graph?
Home Assistant is an open source project, so anyone can contribute. It will probably be added by whoever feels the itch most. Perhaps that’s you?
How about just extracting the desired data from the SQL database directly? I think there are even online converters for this.
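For example, with Python’s built-in sqlite3 and csv modules, something along these lines could work (the database path and entity_id are placeholders, and this assumes the older schema where the states table still has an entity_id column; newer versions use metadata_id instead):
# Sketch only: read one entity's history straight from the recorder database
# and dump it to CSV. Assumes the older schema with an entity_id column.
import csv
import sqlite3

DB_PATH = "/config/home-assistant_v2.db"  # adjust if your database lives elsewhere
ENTITY_ID = "sensor.your_sensor_id"

conn = sqlite3.connect(DB_PATH)
rows = conn.execute(
    "SELECT last_updated, state FROM states "
    "WHERE entity_id = ? ORDER BY last_updated",
    (ENTITY_ID,),
)

with open("history.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["last_updated", "state"])
    writer.writerows(rows)

conn.close()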
I believe the SQL returns json. So it’s a db within a db. Kinda
I don’t think File Notification will work: that only starts outputting data from the moment you set it up, but what I want is to dump out the data for a variety of sensors for the last month. If I had known a month ago that I’d want that data, I would have just fixed the problem I’m trying to fix now…
Found a relatively simple way to do this:
- Install the SQLite Web add-on (hassio-addons/addon-sqlite-web on GitHub) and open its Web UI
- Select “states” table
- Select “Query” tab.
- Enter the SQL query and click Execute; if it succeeds, the first 1000 rows are shown:
SELECT * FROM states WHERE entity_id = 'sensor.your_sensor_id'
- Click on “Export CSV”: the exported file will contain all rows, not just the first 1000
I believe this custom card supports exporting the shown data to csv:
Good pick-up. Yes, the History Explorer custom card works for CSV export.
NOTE: this no longer works as written above and you need to adjust the query, because the table structure has changed in newer versions of HA and you now need to know the metadata_id for your entity.
SELECT * FROM "states" WHERE metadata_id = ( SELECT metadata_id FROM states_meta
WHERE entity_id = "sensor.solaredge_m1_imported_a_kwh") ;
Oh, and the easiest way to find out the entity_id is to click the entity and then click its settings cog icon; it will be listed there.
Also, converting the Unix timestamp to a human-readable date/time can be done on the extract:
SELECT metadata_id, datetime(last_updated_ts, 'unixepoch') AS updated, state
FROM states
WHERE metadata_id IN (176, 163);
+1, I think this would be a great addition to the already great Entity History feature.
Thanks for updating this technique! Would you mind explaining how to convert the unix timestamp “on the extract”? I don’t have any real SQL experience, but following your initial instructions, I was able to get my entity’s state results. I just can’t figure out your next step of timestamp conversion. Thanks for your efforts.
As of 2024.2, CSV export is available in the history dashboard.