Import old energy readings for use in energy dashboard

I just wanted to add my wish for making CSV import and export of energy history available in the GUI. There are quite a few edge cases where someone may want to start from scratch but retain energy history, or move energy history between different installs easily.

I know it’s do-able via APIs, SQL, whatever, but adding CSV import and export via the GUI would be real nice. I’ve already had three experiences in 18 months of losing history and having to recover it, so for those less technically able it would also be quite reassuring to have the data tangibly accessible and recoverable.

On the import it could possibly have import (sum) and import (replace) for example. Or mapping fields to existing or new datasets. It could get complex!
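To make the two proposed import modes concrete, here is a minimal sketch (in Python, purely for illustration; the row shape and function name are assumptions, not anything Home Assistant ships): "replace" overwrites rows that share a start time, while "sum" adds the imported values on top of existing ones.

```python
def merge_stats(existing, incoming, mode="replace"):
    """Sketch of the two proposed import modes.

    "replace": an incoming row overwrites any existing row with the
    same start time. "sum": the incoming value is added on top of the
    existing row's running total instead.
    """
    merged = {row["start"]: dict(row) for row in existing}
    for row in incoming:
        if mode == "sum" and row["start"] in merged:
            merged[row["start"]]["sum"] += row["sum"]
        else:
            merged[row["start"]] = dict(row)
    return sorted(merged.values(), key=lambda r: r["start"])

# Same hour present on both sides: compare the two modes.
summed = merge_stats([{"start": "01:00", "sum": 5.0}],
                     [{"start": "01:00", "sum": 2.0}], mode="sum")
replaced = merge_stats([{"start": "01:00", "sum": 5.0}],
                       [{"start": "01:00", "sum": 2.0}], mode="replace")
```

Field mapping to existing or new datasets would sit on top of something like this, which is where the complexity comes in.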

I am interested in learning about the API that allows adding “history” at a given time in the past.

I should say "part-do-able" via API. I’m not aware of any way to import history this way, but was thinking of "export" of data this way.

Key point was via the UI.

Hi, everyone.

I’m interested in this topic too and I have some code that you may find useful.

“Historical Sensors” is a helper module to build sensors with historical data. It has the necessary code to import data, but the UI for uploading and importing CSV files is still missing. Once it’s figured out how to present the UI, the rest should be trivial.


I’m unfortunately getting an error when loading this integration. A shame, since it really sounds like just what I need.

homeassistant.requirements.RequirementsNotFound: Requirements for delorian not found: ['homeassistant-historical-sensor==0.0.2.dev1']

Hi @bertoxxulous

The ‘delorian’ integration is just a test demo for the module; there is no integration per se in my project.

The delorian integration depends on the homeassistant-historical-sensor module, so you have to install the module in Home Assistant’s venv by running pipenv install -e . or whatever suits you.

Anyway, I have made some changes to make delorian pull homeassistant-historical-sensor from github.

I hope it works for you.

I just stumbled on Spook by Frenck.
The description of its service calls looks quite interesting.
I have not yet tried it though.

As interesting as this is, I’d LOVE to get a detailed writeup on how to use this websocket API call, so I can write a method to interact with it. I don’t see any documentation on what it expects, examples of usage, anything.

The developer tools shows this for a service call.


For that to appear, as you said, you need to install the Spook integration (GitHub - frenck/spook: Spook 👻 Not your homie).

Anyhow, I am not sure how that may work.

I think Spook is the only way to do this.
It has been a long-standing decision by the developers that import of historic data was not something they wanted to go into.
Frenck has therefore added it to the Spook integration instead, but I guess this is done by a direct SQL call and not through some API that will be available elsewhere.

So after a few days, I finally figured out why I wasn’t getting this service call… I forgot to actually ADD the Spook integration (I added it in HACS, but forgot to do the integration piece). I can now see the screenshot provided above, and the notes in the “Statistics” field tell me what I need to know.

However, I still have questions about the Statistics ID and Source. I believe I can just use my energy sensor; is this correct? Will this overwrite my existing data (not the worst outcome), or create a new entity? Is the “Source” a free-form field?

The “Has a mean”, “Has a sum” and “Unit of measurement” fields are simple enough to ascertain, as well as what to provide with the statistics. I just don’t want to kill my existing database with nonsensical data.


@fversteegen, @ahochsteger, @gijbelsy, @slyoldfox

I created a script that can import historical data from the Dutch energy provider Eneco with their Toon thermostat.

The script should be easily adaptable for other providers that use a different export format. The processing logic to change the statistics stays the same.
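That shared processing logic boils down to turning a provider’s hourly readings into the cumulative sum series that long-term statistics store. A minimal sketch of that transformation (the field names, the starting offset, and the function are assumptions for illustration, not Eneco’s actual export format):

```python
from datetime import datetime, timedelta, timezone

def readings_to_stats(start, hourly_kwh, base_sum=0.0):
    """Convert hourly kWh readings into statistics-style rows.

    Each row carries the hour's start time and the running total
    ("sum"), which is what long-term energy statistics store.
    base_sum lets you continue from the last sum already recorded.
    """
    stats = []
    total = base_sum
    for i, kwh in enumerate(hourly_kwh):
        total += kwh
        stats.append({
            "start": (start + timedelta(hours=i)).isoformat(),
            "sum": round(total, 3),
        })
    return stats

# Three hourly readings, continuing from an existing sum of 100 kWh.
rows = readings_to_stats(
    datetime(2023, 5, 1, tzinfo=timezone.utc),
    [0.25, 0.30, 0.20],
    base_sum=100.0,
)
```

Adapting to another provider then mostly means changing how the hourly values are read out of its export file.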


Did you figure it out?

Could you provide some sample data which you feed into this node.js script as an example? Does the stat just increment the sum for each timestamp?

I got the stats loaded using the websocket recorder API. However, the following morning HA shows a return of the same cumulative amount to the grid. Why is that? I have no sources feeding energy data to HA apart from what I sent.

It’s not really documented, hence the nature of the Spook tools, but here’s some sample working data for the recorder.import_statistics service. It seems you may still need to do more data manipulation once you call this service, but hopefully this can get you started.

service: recorder.import_statistics
data:
  has_mean: false
  has_sum: true
  statistic_id: sensor.gas_meter
  source: recorder
  name: "NULL"
  unit_of_measurement: CCF
  stats:
    - start: "2023-05-01T00:00:00+06:00"
      state: 517.76
      sum: 0.94
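For those who asked about the WebSocket side: the same data can be sent as a recorder/import_statistics command. Here is a rough sketch of building that message in Python; the metadata/stats envelope and the helper are assumptions pieced together from the service sample above, not official documentation.

```python
import json

def build_import_statistics_msg(msg_id, statistic_id, unit, stats):
    """Build a recorder/import_statistics WebSocket message (sketch).

    The metadata fields mirror the service data above; "stats" is a
    list of rows with a start time, state and cumulative sum.
    """
    return json.dumps({
        "id": msg_id,               # must increase with each command
        "type": "recorder/import_statistics",
        "metadata": {
            "has_mean": False,
            "has_sum": True,
            "statistic_id": statistic_id,
            "source": "recorder",
            "name": None,
            "unit_of_measurement": unit,
        },
        "stats": stats,
    })

msg = build_import_statistics_msg(
    1, "sensor.gas_meter", "CCF",
    [{"start": "2023-05-01T00:00:00+06:00", "state": 517.76, "sum": 0.94}],
)
parsed = json.loads(msg)
```

The message would then go over an authenticated connection to /api/websocket; I haven’t shown the auth handshake here.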

So, i was finally able to get this to work!

I use Node-RED to import these statistics, using the following function:

var newmsg = { payload: {} };
var smmsg = { payload: {} };
var ps = msg.payload.energyData;

var sum_add = 0;

if (msg.sum != 'unknown') {
    sum_add = msg.sum;
}

// Parse one day's comma-separated readings into hourly entries,
// carrying the running total (sum) forward as we go.
// NOTE: parts of the original parsing logic were lost when this was
// pasted; this is a reconstruction of the recoverable structure.
function parseDT(rd) {
    var stats = [{ sum: sum_add }];
    var hr = 0;

    rd.forEach(function (e, i) {
        var val = e.split('-')[0];
        if (i > 0) {
            hr += 1;
        }
        if (stats.length - 1 == hr) {
            // An entry for this hour already exists; refresh its sum.
            stats[hr].sum = sum_add;
        } else {
            // New hour: add the reading (when valid) to the running
            // total, then push it as this hour's cumulative sum.
            if (val != '' && val != '-') {
                sum_add += parseFloat(val);
            }
            stats.push({ sum: sum_add });
        }
    });
    return stats;
}

var nm = []; // consumption statistics
var sm = []; // surplus/return statistics

ps.forEach(function (e) {
    var dt = new Date(e.DT);
    var dh = parseDT(e.RD.split(','));
    var ldt = dt;

    if (e.RT == "C") {
        for (var i = 0; i <= 23; i++) {
            dt.setHours(i);
            nm.push({ start: dt.toISOString(), sum: dh[i].sum });
        }
    } else {
        for (var i = 0; i <= 23; i++) {
            dt.setHours(i);
            sm.push({ start: dt.toISOString(), sum: dh[i].sum, last_reset: ldt.toISOString() });
        }
    }
});

newmsg.payload.stats = nm;
smmsg.payload.stats = sm;
return [newmsg, smmsg];

I get the data from SMT via its API as JSON, and I use JSONata to shape it into the recorder.import_statistics Call Service node payload:

   "name":"Energy Reading",


This is all on my dev box, and I still haven’t been able to determine how to get the state of a particular date/time, so that’s my next task.

I will post an update in my GitHub with my flow as soon as I get this working in production.

The statistics for any given sensor that has a sum at any given time can be found using this query:

select
  strftime('%d-%m-%Y %H:%M:%f', ss.start_ts, 'unixepoch') as start_dt,
  strftime('%d-%m-%Y %H:%M:%f', se.start_ts, 'unixepoch') as end_dt,
  se.[sum] - ss.[sum] as [difference],
  m.unit_of_measurement
from statistics ss
inner join statistics_meta m on m.id = ss.metadata_id
inner join statistics se on se.metadata_id = ss.metadata_id
  and se.start_ts > ss.start_ts
  and not exists (
    select * from statistics se2
    where se2.metadata_id = ss.metadata_id
      and se2.start_ts > ss.start_ts
      and se2.start_ts < se.start_ts)
where ss.start_ts <= (julianday('2023-07-06T00:00:00') - 2440587.5) * 86400.0
  and m.statistic_id = 'sensor.electricity_meter_energy_consumption_tarif_1'
order by ss.start_ts desc
limit 1;

I have written the query such that you can specify the entity_id of the sensor (stored in statistics_meta.statistic_id) and the date & time of the statistic (stored in statistics.start_ts or statistics_short_term.start_ts) and see the results in human readable format. If you’ve got the exact start_ts value and/or the metadata_id, you don’t need the conversion and/or the join into statistics_meta of course.

The values this query returns are:

  • start_dt - the start of the period the statistic is valid for (in UTC),
  • end_dt - the end of the period the statistic is valid for (in UTC),
  • difference - the amount added or subtracted (when < 0) in this statistic period.
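The julianday arithmetic in the WHERE clause just converts an ISO date to the Unix epoch seconds stored in start_ts. As a sanity check, the same conversion in Python (assuming the input is meant as UTC, as in the query):

```python
from datetime import datetime, timezone

def iso_to_start_ts(iso_string):
    """Equivalent of (julianday(iso) - 2440587.5) * 86400.0 in SQLite:
    seconds since the Unix epoch, with the input read as UTC."""
    dt = datetime.fromisoformat(iso_string).replace(tzinfo=timezone.utc)
    return dt.timestamp()

# The cut-off date used in the query above.
ts = iso_to_start_ts("2023-07-06T00:00:00")
```

If you already have the exact start_ts, you can of course skip the conversion entirely.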

Hi everyone,

I have recently created an add-on that uses the recorder/import_statistics route of the WebSocket API to import electric consumption data from french electric meters (AKA Linky).

Because these statistics are not tracked by a sensor, they are considered “external statistics” (there is a : instead of a . in the statistic_id)
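That naming convention is easy to check for: external statistics use "domain:object_id" with a colon, while sensor-backed ones use "sensor.object_id" with a dot. A trivial sketch (the "linky:" ID below is just an illustrative example, not my add-on’s actual ID):

```python
def is_external_statistic(statistic_id):
    """External statistics use "domain:object_id" (colon); statistics
    backed by an entity use "sensor.object_id" (dot)."""
    return ":" in statistic_id

ext = is_external_statistic("linky:energy_consumption")   # external
ent = is_external_statistic("sensor.gas_meter")           # entity-backed
```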

However, it looks like I cannot add a static price to my electric consumption (the radio buttons are disabled), and this seems to be by design according to this PR comment:

A price entity or a fixed price is only supported when the consumed gas is tracked by a sensor.

Where can I find more information about this ? I cannot find any other related documentation, and I have no idea how to fix this little problem…