Import old energy readings for use in energy dashboard

Hi,

I'm following up on this, but I'm not sure how to adapt the code to match my Excel file.

Could anyone provide some hints on how to process this file so it can be imported into HA?

I have around 42,700+ rows, with hourly records from 01/01/2020 to 15/11/2022.

December just started, and my last imported gas consumption reading was from November 24th, so I have six days that are going to count towards December's consumption. That is very annoying, as it will make comparing November and December consumption with the previous year invalid.

My gas distributor did maintenance/an upgrade on its website, and while the web-scraping method still worked, the data on the site was not updated over that period. It was only updated today, December 1st.

I noticed that when I added my water meter to the energy dashboard (a function that was recently added), all historical data is available in the energy dashboard; it goes back to June:

So importing historical data was possible when adding a water meter to the dashboard.

It should be possible to add meter values to the database at a given date, not just for “now”. That would allow importing long-term historical data, or coping with the unavailability of data from the energy provider for whatever reason.

I just wanted to add my wish for making CSV import and export, specifically for energy history, available in the GUI. There are quite a few edge cases where someone may want to either start from scratch but retain energy history, or move energy history between different installs easily.

I know it’s doable via APIs, SQL, whatever, but adding CSV import and export via the GUI would be really nice. I’ve already had three experiences in 18 months of losing history and having to recover it, so for those less technically able it would also be quite reassuring to have the data tangibly accessible and recoverable.

On the import side it could have, for example, import (sum) and import (replace) modes, or mapping fields to existing or new datasets. It could get complex!
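To make the idea concrete, here is a minimal sketch of what those two modes could mean, assuming statistics rows keyed by their hourly `start` timestamp. `importStats`, the row shape, and the sample data are all hypothetical; nothing like this exists in HA today.

```javascript
// Hypothetical sketch of the suggested import modes:
// "sum" adds imported values onto existing rows, "replace" overwrites them.
// Rows are keyed by their hourly "start" timestamp.
function importStats(existing, imported, mode) {
  const byStart = new Map(existing.map((row) => [row.start, { ...row }]));
  for (const row of imported) {
    const current = byStart.get(row.start);
    if (!current || mode === "replace") {
      byStart.set(row.start, { ...row });           // new hour, or overwrite
    } else if (mode === "sum") {
      byStart.set(row.start, { start: row.start, sum: current.sum + row.sum });
    }
  }
  return [...byStart.values()].sort((a, b) => a.start.localeCompare(b.start));
}

// Made-up sample data
const existing = [{ start: "2022-11-01T00:00:00Z", sum: 10 }];
const imported = [
  { start: "2022-11-01T00:00:00Z", sum: 5 },
  { start: "2022-11-01T01:00:00Z", sum: 7 },
];

console.log(importStats(existing, imported, "sum"));     // sums become 15 and 7
console.log(importStats(existing, imported, "replace")); // sums become 5 and 7
```

Even this toy version shows why it could get complex: overlapping hours, gaps, and units all need decisions.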


I am interested in learning about the API that allows adding “history” at a given time in the past.

I should say “part-doable” via API. I’m not aware of any way to import history this way, but I was thinking of exporting data this way.

Key point was via the UI.

Hi, everyone.

I’m interested in this topic too and I have some code that you may find useful.

“Historical Sensors” is a helper module to build sensors with historical data. It has the necessary code to import data, but the UI for uploading and importing CSV files is still missing. Once the UI presentation is figured out, the rest should be trivial.


I’m unfortunately getting an error when loading this integration. A shame, since it really sounds like just what I need.

homeassistant.requirements.RequirementsNotFound: Requirements for delorian not found: ['homeassistant-historical-sensor==0.0.2.dev1']

Hi @bertoxxulous

The ‘delorian’ integration is just a test/demo for the module; there is no integration per se in my project.

The delorian integration depends on the homeassistant-historical-sensor module, so you have to install the module in Home Assistant’s venv by running `pipenv install -e .`, or whatever suits you.

Anyway, I have made some changes to make delorian pull homeassistant-historical-sensor from github.

I hope it works for you.

I just stumbled on Spook by Frenck.
The description of its service calls looks quite interesting.
I have not yet tried it though.

As interesting as this is, I’d LOVE to get a detailed writeup on how to use this websocket API call, so I can write a method to interact with it. I don’t see any documentation on what it expects, examples of usage, anything.

The developer tools show this for the service call.
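For anyone else hunting for the message shape: Home Assistant's WebSocket API uses a documented `call_service` command type, so a sketch of the payload for the Spook-provided `recorder.import_statistics` service might look like the following. The `service_data` fields mirror the Developer Tools form; `sensor.gas_meter` and all the values are just example assumptions, and authentication plus the actual socket handling are omitted.

```javascript
// Build the WebSocket "call_service" message for recorder.import_statistics.
// Only the message construction is shown; sending it over
// ws://<host>/api/websocket (after auth) is left out.
function buildImportMessage(id, metadata, stats) {
  return {
    id,                         // each websocket command needs a unique, increasing id
    type: "call_service",
    domain: "recorder",
    service: "import_statistics",
    service_data: { ...metadata, stats: stats },
  };
}

const message = buildImportMessage(
  1,
  {
    statistic_id: "sensor.gas_meter",   // example entity, not a real requirement
    source: "recorder",
    name: null,
    has_mean: false,
    has_sum: true,
    unit_of_measurement: "CCF",
  },
  [{ start: "2023-05-01T00:00:00+06:00", state: 517.76, sum: 0.94 }]
);

console.log(JSON.stringify(message, null, 2));
```

Treat this as a starting point for experimentation, not a spec; the service itself is undocumented.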


For that to appear, as you said, you need to install the Spook integration (GitHub - frenck/spook: Spook 👻 Not your homie).

Anyhow, I am not sure how that may work.

I think Spook is the only way to do this.
It has been a long-standing decision by the developers that importing historic data was not something they wanted to get into.
Frenck has therefore added it to the Spook integration instead, but I guess this is done via a direct SQL call and not through some API that will be available elsewhere.

So after a few days, I finally figured out why I wasn’t getting this service call… I forgot to actually ADD the Spook integration (I downloaded it in HACS, but forgot to do the integration piece). I can now see the screenshot provided above, and the notes in the “Statistics” field tell me what I need to know.

However, I still have questions about the Statistics ID and Source. I believe I can just use my energy sensor; is this correct? Will this overwrite my existing data (not the worst outcome), or create a new entity? Is the “Source” an arbitrary field?

The “Has a mean”, “Has a sum” and “Unit of measurement” fields are simple enough to ascertain, as is what to provide in “statistics”. I just don’t want to pollute my existing database with nonsensical data.


@fversteegen, @ahochsteger, @gijbelsy, @slyoldfox

I created a script that can import historical data from the Dutch energy provider Eneco with their Toon thermostat.

The script should be easily adaptable to other providers that use a different export format; the processing logic for changing the statistics stays the same.
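The adaptation step could be sketched like this, assuming a made-up "timestamp;kWh" CSV layout (the real Eneco/Toon export format will differ, and `csvToStats` is not from the actual script). The part that stays the same is the cumulative-sum bookkeeping.

```javascript
// Turn a provider export (assumed "timestamp;kWh" lines) into the hourly
// rows the statistics import expects. "sum" must be cumulative across rows,
// while "state" is the individual hourly reading.
function csvToStats(csv) {
  let runningSum = 0;
  return csv
    .trim()
    .split("\n")
    .map((line) => {
      const [start, kwh] = line.split(";");
      const hourly = parseFloat(kwh);
      runningSum += hourly;          // carry the running total forward
      return { start: start, state: hourly, sum: runningSum };
    });
}

const sample =
  "2022-01-01T00:00:00+01:00;0.25\n" +
  "2022-01-01T01:00:00+01:00;0.50";
console.log(csvToStats(sample));
// second row carries the cumulative sum 0.75
```

Swapping in a different provider then only means changing the line-parsing part, not the sum logic.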


Did you figure it out?

Could you provide some sample data that you feed into this Node.js script as an example? Does the stat just increment the sum for each timestamp?

I got the stats loaded using the websocket recorder API. However, the following morning HA showed the same cumulative amount as a return to the grid. Why is that? I have no sources feeding energy data to HA apart from what I sent.

It’s not really documented (hence the nature of the Spook tools), but here’s sample working data for the recorder.import_statistics service. It seems you may still need to do more data manipulation once you call this service, but hopefully this can get you started.

service: recorder.import_statistics
data:
  has_mean: false
  has_sum: true
  statistic_id: sensor.gas_meter
  source: recorder
  name: "NULL"
  unit_of_measurement: CCF
  stats:
    - start: "2023-05-01T00:00:00+06:00"
      state: 517.76
      sum: 0.94
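On the earlier question of whether the sum just increments per timestamp: yes, `sum` is cumulative across rows. A sketch with a second, made-up hour (the 0.34 consumption and readings are invented to illustrate the arithmetic):

```yaml
service: recorder.import_statistics
data:
  has_mean: false
  has_sum: true
  statistic_id: sensor.gas_meter
  source: recorder
  name: "NULL"
  unit_of_measurement: CCF
  stats:
    - start: "2023-05-01T00:00:00+06:00"
      state: 517.76   # meter reading at that hour
      sum: 0.94
    - start: "2023-05-01T01:00:00+06:00"
      state: 518.10   # reading one hour later
      sum: 1.28       # previous sum + that hour's consumption (0.34)
```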

So, i was finally able to get this to work!

I use Node-RED to import these statistics, using the following function:

var newmsg = { payload: {} }
var smmsg = { payload: {} }
var ps = msg.payload.energyData

var sum_add = 0

// Carry the running total over from the previous import, if one was passed in
if (msg.sum != 'unknown') {
    sum_add = msg.sum
}

// Turn one day's worth of 15-minute readings (4 per hour) into hourly
// cumulative-sum entries. Empty ('') and '-' values add nothing to the
// total but still carry the running sum forward.
function parseDT(rd) {
    var stats = [{ sum: sum_add }]

    var i = -1
    var hr = 0
    var m = 0
    rd.forEach(function (e) {
        i += 1
        m = i % 4                // position within the current hour
        if (i == 0) {
            hr = 0
            m = 0
        } else {
            if (m == 0) {        // every 4th reading starts a new hour
                hr += 1
            }
        }
        var val = e.split('-')[0]
        // Readings 8-15 get special-cased hour assignment depending on
        // whether the slot is empty (likely days with missing/extra slots,
        // e.g. around DST changes)
        if (i >= 8 && i <= 11) {
            if (val == '') {
                hr = 1
            } else {
                hr = 2
            }
        }
        if (i >= 12 && i <= 15) {
            if (val == '-') {
                hr = 3
            } else {
                hr = 2
            }
        }
        if (stats.length - 1 == hr) {
            // Still in the same hour: update its running total in place
            if (val != '' && val != '-') {
                sum_add += parseFloat(val)
            }
            stats[hr].sum = sum_add
        } else {
            // New hour: append a fresh entry carrying the running total
            if (val != '' && val != '-') {
                sum_add += parseFloat(val)
            }
            stats.push({ sum: sum_add })
        }
    })
    return stats
}

var nm = []    // consumption rows (record type "C")
var sm = []    // rows for the other record type (e.g. surplus/generation)

ps.forEach(function (e) {
    var dt = new Date(e.DT)
    var dh = parseDT(e.RD.split(','))
    var ldt = dt    // same Date object as dt, so last_reset follows start

    // Emit one entry per hour of the day onto the matching output
    if (e.RT == "C") {
        for (var i = 0; i <= 23; i++) {
            dt.setHours(i)
            nm.push({ start: dt.toISOString(), sum: dh[i].sum, last_reset: ldt.toISOString() })
        }
    } else {
        for (var i = 0; i <= 23; i++) {
            dt.setHours(i)
            sm.push({ start: dt.toISOString(), sum: dh[i].sum, last_reset: ldt.toISOString() })
        }
    }
})

newmsg.payload.stats = nm
smmsg.payload.stats = sm
// Two outputs: first wire gets consumption, second gets the other record type
return [newmsg, smmsg];

I get the data from SMT via the API as JSON, and I use JSONata to parse this into the Call Service node's recorder.import_statistics call:

{
   "stats": [$.payload.stats],
   "statistic_id": "sensor.energy_reading",
   "source": "recorder",
   "has_mean": false,
   "unit_of_measurement": "kWh",
   "name": "Energy Reading",
   "has_sum": true
}

This is all on my dev box, and I still haven't been able to determine how to get the state at a particular date/time, so that's my next call.

I will post an update in my GitHub repo with my flow as soon as I get this working in production.