How to keep long-term energy data when switching databases?

So, I’ve pretty much come to terms with losing all my history data (stored on my Synology NAS using MariaDB) once again when upgrading from 2022.5.5 to 2022.6 or 2022.7 (2022.6.x killed the Recorder/History for me).

Since the recorder has (at least for me) always been an incredibly buggy mess that needs to be deleted every couple of HA updates, I will probably just revert to the internal SQLite version and accept that I’ll never have a couple of years, or at least a couple of months, of data to work with … but at the very least I would like to somehow keep my energy data this time! (It’s really important to me to be able to easily compare energy consumption over the months/years.)

So, how can I export the energy data from MariaDB and feed it into the integrated database?

It’s so frustrating that I have to completely reset the recorder every couple of months because something breaks - I thought switching to MariaDB and running it on an external device would prevent this from happening, but obviously it doesn’t :frowning:

So in addition to rescuing my energy data now, what should I do in the future to keep as much data as possible? Like I said, I specifically chose MariaDB on a NAS so the data is secure and backed up - I don’t care about file size (right now it’s only 13 GB anyway, and Home Assistant already fails to migrate it after the update) … I just want the data to be safe and easily accessible through HA for as long as possible!

Could the problem be the age of the NAS (like I asked in the other thread … it’s a DS411)? Or what else could be an issue? I don’t know anything about databases, and other than “the database migration failed” I can’t find any more specific information about what actually went wrong … so I don’t even know where to start looking for a solution. The only thing I am certain of is that these regular database issues are the one thing that incredibly frustrates me about Home Assistant.

Alternatively (and I know I’m completely deviating from the original topic): is it possible to move the whole MariaDB database from my NAS to the device running HA (an Odroid N2+) and try to migrate it there, so any network or Synology related issues would be eliminated?

And btw, no, I really don’t want to mess around with InfluxDB - I never got it to work and integrate into HA properly, and I would like to not maintain multiple DBs :tired_face:

Never had issues with MariaDB myself, but I see in the other thread that you keep 1000 days of details?
Aren’t long-term statistics enough for your needs? Those are not purged even if you keep your history at the default 10 days.
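For reference, that retention is the recorder’s purge_keep_days option; long-term statistics survive the purge regardless. A minimal example, just spelling out the stock setting:

# configuration.yaml
recorder:
  # detailed states/events are purged after this many days (default: 10);
  # long-term statistics are kept independently of this setting
  purge_keep_days: 10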

You might google for “mysql to sqlite”, but I’m afraid there is a flaw in the way you use the HA db, so I’m not sure it will solve anything.

Like it or not, the proper way to keep long-term detailed values is to use InfluxDB, which is meant for that use case.
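And hooking it up is only a few lines of configuration. A minimal v1-style sketch - the host, database name and included entities are placeholders for your own setup:

# configuration.yaml
influxdb:
  host: 192.168.1.10        # placeholder: your InfluxDB server
  database: home_assistant  # placeholder: a pre-created InfluxDB database
  include:
    entity_globs:
      - sensor.energy_*     # only ship what you want to keep long-term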

There’s no doubt that I messed up somewhere - I just don’t understand where.

I configured MariaDB in March and haven’t changed anything substantial about my configuration since then (just added automations, played around with NodeRED flows and the like … I only changed the database BECAUSE I wanted to be able to securely store my history pretty much “forever” on an external device, so updates would not interfere with it).
The recorder (and everything else) worked perfectly since then, until the 2022.6 update came out and the database migration suddenly failed. I honestly don’t see what I could have messed up since then. I never got any errors or warnings suggesting that this would happen.

About the 1000-day purge interval: I’ve regularly heard people claim they never purge since switching to MariaDB on their NAS, because they (like me) have enough space not to care.

And yes - just for energy, keeping purge at 10 or 30 days would be enough - however, I would also like to be able to compare temperatures, device locations, and some other things over an extended period of time …

Also, why would the database even care how many events/values/whatever are stored in it … it’s just a couple of lines more. Google could probably tell me where I was on any given day in the last 20 years, where I had dinner, what battery level my phone was at, and what songs I listened to while driving to work … they’ve probably even stored how many times my phone’s front camera caught me picking my nose while browsing some website … so what else, except for storage, is a limiting factor for a database? (I really don’t know.)

The reason Google can store (and find!) so many things is that they use distributed databases designed for the purpose, spread across a mind-boggling number of storage locations. So Google doesn’t have one database, it has tons - but none of them has the complete picture. Also, you wouldn’t want the Google database to handle e.g. bank payments. It isn’t able to do that in a way that ensures your bank balance is always correct and that money isn’t lost or duplicated during transactions. It isn’t designed to be.

So as you’ve found out, the recorder isn’t designed for long-term storage, and trying to use it as such is causing you pain. You need to decide whether you want to get over your reluctance to have multiple databases, or live with the consequences of using a tool beyond its intended purpose.


I very much doubt Google runs its databases on MySQL on a Synology NAS :smiley:

For temperatures, you have long-term statistics. I’m pretty sure it’s enough to know that the hourly average was 22°C on 5 April 2020 between 15:00 and 16:00. If not → InfluxDB.
For device locations, I’m not sure how you use that data, but I personally send the locations to a dedicated app for that.

But yeah, back to the root issue: maybe just try to export the data (mysqldump), let HA recreate the DB from scratch, then re-import the data? Maybe a semi-failed upgrade left some inconsistencies in the DB…
Nah, that would just recreate the inconsistency/issue.

@Edwin_D makes the same point I was trying to make in the thread you linked in the OP. And maybe said it better than I did.

I’m only trying to be helpful, as a former DBA. The HA database is good at what it was designed for, and it’s being improved with every version lately. But it wasn’t designed for what you want to do. And frankly, HA isn’t the best place to do that anyway.

yeah ok - good points :sweat_smile:

Wait, what?! How, where, since when?! I never realised that! Where can I see what kinds of data are stored long term? Is there a way to add entities to long-term statistics that aren’t officially recorded?

Last month I started to write my device locations to a file, as suggested here: 2022.6.x killed the Recorder/History for me - #7 by CaptTom. It’s an okay solution for that - but I haven’t found a way to import the data into a map and show a timeline yet (like I can in HA).
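In case it helps anyone else, the file logging could look something like this - just my guess at a minimal setup, the notifier name, file name and person entity are mine, not from the linked post:

# configuration.yaml - append one CSV line per location update
notify:
  - platform: file
    name: location_log
    filename: locations.csv  # relative to the config directory, I believe

automation:
  - alias: Log my location to a file
    trigger:
      - platform: state
        entity_id: person.chris
    action:
      - service: notify.location_log
        data:
          message: >-
            {{ now().isoformat() }},{{ trigger.to_state.attributes.latitude }},{{ trigger.to_state.attributes.longitude }}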

You are - your suggestion about writing location data to a file works well - I’m just not sure how I’ll be able to nicely work with it yet - but at least it’s securely stored for now :wink:

I already tried that - didn’t work anyway…

Right now I’m trying to just purge the database down to 30 days and upgrade afterwards … but the purge command does not seem to do anything - there’s not even an entry anywhere in the logs that the purge has started:

service: recorder.purge
data:
  keep_days: 31
  repack: false

I’m using Nextcloud Maps. Not for everyone, but it fits my bill.


- id: '1565779591672'
  alias: Track Me
  trigger:
  - entity_id: person.chris
    platform: state
  action:
  - data_template:
      ACC: '{{ trigger.to_state.attributes.gps_accuracy }}'
      # fall back to 0 when the attribute is missing
      ALT: >-
        {% if state_attr('person.chris', 'altitude') == none %}
          0
        {% else %}
          {{ trigger.to_state.attributes.altitude }}
        {% endif %}
      BATT: >-
        {% if state_attr('person.chris', 'battery_level') == none %}
          0
        {% else %}
          {{ trigger.to_state.attributes.battery_level }}
        {% endif %}
      LAT: '{{ trigger.to_state.attributes.latitude }}'
      LON: '{{ trigger.to_state.attributes.longitude }}'
      TIMESTAMP: '{{ as_timestamp(now()) }}'
    service: rest_command.nc_map_publish_me

# in configuration.yaml (the command itself goes under the rest_command: key)
rest_command:
  nc_map_publish_me:
    url: https://mynextcloud.foo.bar/apps/maps/api/1.0/devices
    method: POST
    # battery and altitude were left empty in my original paste; BATT/ALT fill them in
    payload: user_agent=HA_Chris&lat={{LAT}}&lng={{LON}}&accuracy={{ACC}}&timestamp={{TIMESTAMP}}&battery={{BATT}}&altitude={{ALT}}
    content_type: application/x-www-form-urlencoded
    username: !secret nextcloud_user
    password: !secret nextcloud_pwd

I am in a similar situation: I want to keep the energy data for the long term, and I don’t know how to change the default database.

Well, that’s interesting … I always assumed the long-term statistics EXCLUSIVELY referred to the energy dashboard :woman_facepalming: This kind of changes my opinion of the recorder.

Also, nice map - I’ll look into this … but at first glance, it’s cloud only, right?!

Nextcloud is your own cloud on your own hardware (a Synology, for example), set up with your own software + hardware. It was forked from ownCloud.


Have you found ANY kind of information on how we might be able to do that? I actually still have an old home-assistant_v2.db file that I would like to extract the long-term statistics from and import into the current one xD
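I haven’t tried it, but I’d guess the extraction side would start with the sqlite3 CLI - and the resulting SQL dialect won’t match MariaDB 1:1, so the dump would need some editing before any import:

# dump just the statistics tables out of the old SQLite file
sqlite3 home-assistant_v2.db ".dump statistics statistics_meta" > old_stats.sql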

ah interesting - thanks for the clarification!

Long-term statistics are stored in specific tables (I think only the first two are relevant for long-term data, as described in the doc above).

[screenshot: the statistics tables]
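If you want to peek at what’s actually in there, something like this should work - a sketch against the 2022-era schema, so double-check the column names against your own DB:

-- long-term hourly statistics joined with their metadata
SELECT m.statistic_id, s.start, s.mean, s.sum
FROM statistics s
JOIN statistics_meta m ON m.id = s.metadata_id
LIMIT 20;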

what hardware do you use for the nextcloud?

On a cloud VPS, personally, but there is even an (outdated) HA add-on for it.

Ok I might have found a solution for saving the energy data… I’ll try to write it down now xD

First of all, I made the update to 2022.7.2 and ignored any database-related issues after the restart.
Then I created a new, empty database in phpMyAdmin called hass1 (the original was named hass) and edited the configuration to use this one - then restarted, obviously.
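For anyone following along: “edit the configuration” just means changing the recorder’s db_url and restarting. Roughly like this - user, password and NAS address are placeholders for your own setup:

# configuration.yaml - point the recorder at the new database
recorder:
  db_url: mysql://hass_user:hass_password@192.168.1.10:3306/hass1?charset=utf8mb4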

Next, in phpMyAdmin I created another one called hass2.

Now I exported the statistics, statistics_meta, statistics_runs and statistics_short_term tables from the original “hass” DB to my PC as SQL.
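The same export should also be doable from a shell with mysqldump instead of phpMyAdmin - untested on my side, and host/user are placeholders:

# dump only the four statistics tables from the original "hass" database
mysqldump -h 192.168.1.10 -u hass_user -p \
  hass statistics statistics_meta statistics_runs statistics_short_term \
  > statistics_backup.sql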

The next step was to import them into the hass2 database, while changing some default settings in the import tab:

Sorry, it’s German, but basically: untick partial import and “Fremdschlüsselüberprüfung” (that’s the foreign key check), and set SQL compatibility to MYSQL323.

The statistics_short_term import did not finish for me, because my login session timed out and I was kicked from the server, but it didn’t seem to hurt.
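If you’d rather do the import from the mysql command line than phpMyAdmin, I believe the foreign-key part of those settings corresponds to roughly this (a sketch, not what I actually ran):

-- in the mysql client, connected to the new hass2 database
SET FOREIGN_KEY_CHECKS = 0;
SOURCE statistics_backup.sql;
SET FOREIGN_KEY_CHECKS = 1;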

Next, define “hass2” as the database in configuration.yaml and restart again.

Done!

My energy dashboard seems to be there and complete … I will have to see if it stays this way.


disclaimer: I have no idea what I’m doing