Import old energy readings for use in energy dashboard

The Developer Tools page shows this for a service call.


For that to appear, as you said, you need to install the Spook integration (GitHub - frenck/spook: Spook 👻 Not your homie).

Anyhow, I am not sure how that may work.

I think Spook is the only way to do this.
It has been a long-standing decision by the developers that importing historic data was not something they wanted to take on.
Frenck has therefore added it to the Spook integration instead, but I guess this is done by a direct SQL call and not through some API that will be available elsewhere.

So after a few days, I finally figured out why I wasn’t getting this service call… I forgot to actually ADD the Spook integration (I added it in HACS, but forgot to do the integration piece). I can now see the screenshot provided above, and the notes in the “Statistics” field tell me what I need to know.

However, I still have questions about the Statistics ID and Source. I believe I can just use my energy sensor; is this correct? Will this overwrite my existing data (not the worst outcome), or create a new entity? Is the “Source” a random field?

The “Has a mean”, “Has a sum” and “Unit of measurement” fields are simple enough to ascertain, as is what to provide with statistics. I just don’t want to kill my existing database with nonsensical data.


@fversteegen, @ahochsteger, @gijbelsy, @slyoldfox

I created a script that can import historical data from the Dutch energy provider Eneco via their Toon thermostat.

The script should be easily adaptable for other providers that use a different export format. The processing logic to change the statistics stays the same.
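
To give an idea of the provider-independent part, here is a rough sketch (not the actual script; the one-reading-per-line CSV layout and file name are placeholders): read timestamped kWh deltas, keep a running sum, and build the stats array that recorder.import_statistics expects.

// Sketch only: turn lines of "ISO-timestamp,kWh" into the stats array
// for recorder.import_statistics. The input layout is a placeholder.
const fs = require('fs');

let sum = 0;
const stats = fs.readFileSync('export.csv', 'utf8')
    .trim()
    .split('\n')
    .map(line => {
        const [start, kwh] = line.split(',');
        sum += parseFloat(kwh); // running total since the first row
        return { start, sum };  // start must be the beginning of an hour
    });

// 'stats' is then sent along with the metadata fields (statistic_id,
// source, has_sum, unit_of_measurement, ...) in the service call.
console.log(JSON.stringify(stats, null, 2));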


Did you figure it out?

Could you provide some sample data which you feed into this Node.js script as an example? Does the stat just increment the sum for each timestamp?

I got the stats loaded using the websocket recorder API. However, the following morning HA shows a return of the same cumulative amount to the grid. Why is that? I have no sources feeding energy data to HA apart from what I sent.

It’s not really documented, hence the nature of the Spook tools, but here’s a sample of working data for the recorder.import_statistics service. It seems you may still need to do more data manipulation once you call this service, but hopefully this can get you started.

service: recorder.import_statistics
data:
  has_mean: false
  has_sum: true
  statistic_id: sensor.gas_meter
  source: recorder
  name: "NULL"
  unit_of_measurement: CCF
  stats:
    - start: "2023-05-01T00:00:00+06:00"
      state: 517.76
      sum: 0.94
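
For what it’s worth, here is my reading of these fields (my interpretation, not official documentation): state is the raw meter reading at the end of the hour, sum is the cumulative total the statistics engine tracks, and the energy dashboard charts the hour-to-hour differences of sum. A quick sketch with invented numbers:

// Hypothetical hourly deltas in CCF, continuing the sample row above
// (meter at 517.76, sum at 0.94). All values are invented for illustration.
const deltas = [0.12, 0.08, 0.25];
let state = 517.76, sum = 0.94;
const rows = deltas.map((d, i) => {
    state += d; // the meter keeps counting up
    sum += d;   // so does the statistics total
    return { start: `2023-05-01T0${i + 1}:00:00+06:00`, state, sum };
});
console.log(rows);
// The dashboard would show 0.12, 0.08 and 0.25 for these hours
// (the differences between consecutive sums).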

So, I was finally able to get this to work!

I use Node-RED to import these statistics, using the following function:

var newmsg = { payload: {} }
var smmsg = { payload: {} }
var ps = msg.payload.energyData

var sum_add = 0

// Carry the running sum over from a previous import, if one was supplied.
if (msg.sum != 'unknown') {
    sum_add = msg.sum
}

// Turn one day of 15-minute readings (4 per hour) into hourly running sums.
function parseDT(rd) {
    var stats = [{ sum: sum_add }]

    var i = -1
    var hr = 0
    var m = 0
    rd.forEach(function (e) {
        i += 1
        m = i % 4 // position within the current hour
        if (i == 0) {
            hr = 0
            m = 0
        } else if (m == 0) {
            hr += 1 // every fourth reading starts a new hour
        }
        var val = e.split('-')[0]
        // Readings 8-15 get an explicit hour mapping: empty or '-' values
        // shift which hour the interval is assigned to.
        if (i >= 8 && i <= 11) {
            hr = (val == '') ? 1 : 2
        }
        if (i >= 12 && i <= 15) {
            hr = (val == '-') ? 3 : 2
        }
        if (val != '' && val != '-') {
            sum_add += parseFloat(val)
        }
        if (stats.length - 1 == hr) {
            // Still in the current hour: update its running sum.
            stats[hr].sum = sum_add
        } else {
            // New hour: push a fresh entry with the running sum.
            stats.push({ sum: sum_add })
        }
    })
    return stats
}

var nm = [] // records for RT == "C"
var sm = [] // records for all other RT values

ps.forEach(function (e) {
    var dt = new Date(e.DT)
    var dh = parseDT(e.RD.split(','))
    var ldt = new Date(dt) // copy, so setHours() below doesn't shift last_reset

    if (e.RT == "C") {
        for (var i = 0; i <= 23; i++) {
            dt.setHours(i)
            nm.push({ start: dt.toISOString(), sum: dh[i].sum, last_reset: ldt.toISOString() })
        }
    } else {
        for (var i = 0; i <= 23; i++) {
            dt.setHours(i)
            sm.push({ start: dt.toISOString(), sum: dh[i].sum, last_reset: ldt.toISOString() })
        }
    }
})

newmsg.payload.stats = nm
smmsg.payload.stats = sm
return [newmsg, smmsg];

I get the data from SMT via its API as JSON, and I use JSONata to parse this into the recorder.import_statistics call in the Call Service node:

{
   "stats": [$.payload.stats],
   "statistic_id": "sensor.energy_reading",
   "source": "recorder",
   "has_mean": false,
   "unit_of_measurement": "kWh",
   "name": "Energy Reading",
   "has_sum": true
}

This is all on my dev box, and I still haven’t been able to determine how to get the state of a particular date/time, so that’s my next task.

I will post an update on my GitHub with my flow as soon as I get this working in production.

The statistics for any given sensor that has a sum at any given time can be found using this query:

select
    strftime('%d-%m-%Y %H:%M:%f', ss.start_ts, 'unixepoch') as start_dt,
    strftime('%d-%m-%Y %H:%M:%f', se.start_ts, 'unixepoch') as end_dt,
    se.[sum] - ss.[sum] as [difference],
    m.unit_of_measurement
from statistics ss
inner join statistics_meta m
    on m.id = ss.metadata_id
inner join statistics se
    on se.metadata_id = ss.metadata_id
    and se.start_ts > ss.start_ts
    and not exists (
        select *
        from statistics se2
        where se2.metadata_id = ss.metadata_id
          and se2.start_ts > ss.start_ts
          and se2.start_ts < se.start_ts
    )
where ss.start_ts <= (julianday('2023-07-06T00:00:00') - 2440587.5) * 86400.0
  and m.statistic_id = 'sensor.electricity_meter_energy_consumption_tarif_1'
order by ss.start_ts desc
limit 1;

I have written the query so that you can specify the entity_id of the sensor (stored in statistics_meta.statistic_id) and the date and time of the statistic (stored in statistics.start_ts or statistics_short_term.start_ts) and see the results in a human-readable format. If you’ve got the exact start_ts value and/or the metadata_id, you don’t need the conversion and/or the join to statistics_meta, of course.
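
A side note on the (julianday(...) - 2440587.5) * 86400.0 part: 2440587.5 is the Julian day number of the Unix epoch, so this just converts the ISO date to the Unix-timestamp format that start_ts is stored in. Outside SQLite you can compute the same value directly, for example:

// Same conversion in plain JavaScript: UTC ISO datetime -> Unix seconds,
// the format of statistics.start_ts.
const startTs = Date.parse('2023-07-06T00:00:00Z') / 1000;
console.log(startTs); // 1688601600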

The values this query returns are:

  • start_dt - the start of the period the statistic is valid for (in UTC),
  • end_dt - the end of the period the statistic is valid for (in UTC),
  • difference - the amount added or subtracted (when < 0) in this statistic period,
  • unit_of_measurement - the unit the statistic is recorded in.

Hi everyone,

I have recently created an add-on that uses the recorder/import_statistics route of the WebSocket API to import electric consumption data from French electric meters (AKA Linky).

Because these statistics are not tracked by a sensor, they are considered “external statistics” (there is a : instead of a . in the statistic_id).
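
For context, a recorder/import_statistics WebSocket message for such an external statistic looks roughly like this (a sketch only; the linky: ids and sample values are illustrative, and the exact message shape should be checked against the recorder WebSocket API):

// Rough sketch of a recorder/import_statistics WebSocket message for an
// external statistic. Note the ':' in statistic_id; 'source' matches the
// part before the ':'. Ids and numbers are illustrative.
const message = {
    id: 1,
    type: 'recorder/import_statistics',
    metadata: {
        statistic_id: 'linky:energy_consumption',
        source: 'linky',
        name: 'Linky consumption',
        unit_of_measurement: 'kWh',
        has_mean: false,
        has_sum: true,
    },
    stats: [
        { start: '2023-05-01T00:00:00+02:00', state: 1234.5, sum: 10.2 },
    ],
};
// Sent over an authenticated Home Assistant WebSocket connection.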

However, it looks like I cannot add a static price to my electric consumption (the radio buttons are disabled), and this seems to be by design according to this PR comment:

A price entity or a fixed price is only supported when the consumed gas is tracked by a sensor.

Where can I find more information about this? I cannot find any other related documentation, and I have no idea how to fix this little problem…

Thanks!

One of my energy sockets died and I have a new one which I’d like to install. I searched everywhere and saw a lot of the same questions but no solution. As I understand it, since HA 2023.4 there is no option anymore to rename the new entity to the old name and keep the energy history.

Is there a way to migrate the energy data from one entity to another? I’d just like to keep my energy history.

Why are you under the impression that renaming the entity no longer works? If I’m not mistaken, that is still the way to do it…

Yes, you are right! I was testing with an old device which I no longer needed in the Energy dashboard. When you remove the device, add another, and change the entity name to the old one, it keeps the energy history.
One thing to keep in mind: if you use a new device with kWh history, this history will be added to the Energy dashboard. So always use a new device without history, or delete the history first; I reset the history of that new device in the Zigbee2MQTT manager.


There’s still a scenario that simply renaming a sensor does not cover: when you need to split a source sensor into two (for PEAK and OFFPEAK). It’d be great if we could add a sensor to the Energy Dashboard for historical purposes up until a set timestamp, and then have new sensors provide the new data. I outlined this in more detail in this post.


And what about users that didn’t know about the renaming scheme and have both old and new sensors in the energy dashboard so that they don’t lose the information?
I used to have an SDM120 using Tasmota and after a few years upgraded to a Shelly EM.
How do users like me merge the information?

sensor.powcasath_energy_total (old sensor)
sensor.shellyem_34945473ec5d_channel_1_energy (new sensor)

You are telling me that if I rename the Shelly sensor to the old name it will work and merge?
Do I have to delete the old sensor? Won’t I lose my data?

I just created an integration to import historical data from CSV/TSV files via a service. Maybe it can help someone; see this post.

I have not yet tried it with data from the energy dashboard, but as the historical cards work, this should work as well.

Feedback welcome!

Hi, I know it’s an old post, but can you tell me where I can find documentation or instructions for the APIs async_import_statistics and async_add_external_statistics?

Hello Klaus,

I think your integration is exactly what I’m looking for! So thanks for creating it!
I haven’t been able to test it yet. I am currently setting up a test instance of Home Assistant, as I don’t want to experiment with the database of my running system. :-)
My goal is to import some energy and temperature data from my now decommissioned Domoticz system.

Based on the instructions of the integration, however, I already have two requests/comments. I think the “sum” column can, or even should, be reconstructed completely after importing the data, also for the already existing data. So you could leave it out of the CSV file, and after importing the state values the integration would then update the sum values of the entire statistics table.
At least that is what I understood from this post: Home Assistant SQlite - Change statistics data.
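
What I have in mind is roughly this (a sketch just to illustrate the idea, assuming the states are monotonic meter readings and a decrease means a meter reset):

// Sketch: rebuild the cumulative 'sum' column from successive 'state'
// (meter) readings. A decrease in state is treated as a meter reset.
function rebuildSums(rows) {
    let sum = 0;
    let prev = null;
    for (const row of rows) {
        if (prev !== null) {
            const delta = row.state - prev;
            sum += delta >= 0 ? delta : row.state; // after a reset, count from zero
        }
        row.sum = sum;
        prev = row.state;
    }
    return rows;
}

console.log(rebuildSums([
    { state: 100.0 }, { state: 101.5 }, { state: 0.3 }, { state: 1.0 },
]));
// sums: 0, 1.5, 1.8, 2.5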

My second request is to have an upload button in the integration. It is a minor thing, but it would feel much more polished if you didn’t first have to upload the CSV separately to the Home Assistant folder. I’m not sure how difficult that is, though.

Kind regards,

Bert