I can do that once I clean it up a little more.
One thing I noticed is that the cost shows as 0 in HA, even though I have set a static cost. Is anyone else noticing this? Maybe it only calculates cost on real-time data.
Yes, sure, I can share my template. It took me a couple of days to clean up…
This script imports just one dataset to one new ID.
I had to import several CSVs, but found it more intuitive to create a sqbpro file for each CSV and, once everything was done and tested, merge the scripts by appending them one by one. Having the import jobs run one by one made it easier for me to locate problems and scale the whole process.
Maybe it helps someone
<?xml version="1.0" encoding="UTF-8"?><sqlb_project><db path="C:/Users/user/Downloads/DB.Browser.for.SQLite-3.12.2-win64/Umzug - Alte Werte csv/statistics_ready2import_08.05/HA Versuch 9.5. Nach Full Restore auf Testsystem/home-assistant_v2.db" readonly="0" foreign_keys="" case_sensitive_like="" temp_store="" wal_autocheckpoint="" synchronous=""/><attached/><window><main_tabs open="structure browser pragmas query" current="3"/></window><tab_structure><column_width id="0" width="406"/><column_width id="1" width="0"/><column_width id="2" width="100"/><column_width id="3" width="5365"/><column_width id="4" width="0"/></tab_structure><tab_browse><current_table name="0,0:"/><default_encoding codec=""/><browse_table_settings/></tab_browse><tab_sql><sql name="Import Toon data">/*
This is a stripped version of the great https://github.com/patrickvorgers/HA-Import-Toon-Data script.
It does NOT contain all possibly important information; for more, head over to the original script.
ALWAYS keep a backup; the best case is to try it on a different system before importing into your productive system.
1) Search and Replace:
import_12_to_418
This is the CSV to be imported. The same filename is used as the temporary name for the import table.
So, e.g., if you want to import history coming from ID 45, and the target has ID 319, name your CSV import_45_to_319.csv.
Then do a search and replace inside this sqbpro file and replace "import_12_to_418" with "import_45_to_319".
The first number indicates the old ID the data came from; the second number indicates the ID the data will be merged with.
You can use any name you want, but I found this convention makes it easy to keep an overview of different import jobs.
As the table names are individual, you can just glue multiple sqbpro files together by appending them to each other.
2) Search and Replace:
418
This is the ID that the imported data should be appended to/merged with.
If you want your data imported to 319, do a search and replace inside this sqbpro file and replace "418" with "319".
3) Follow all instructions from the original https://github.com/patrickvorgers/HA-Import-Toon-Data script.
The most important (but not all!) steps are:
- Disable the Home Assistant recorder while making the backup (Developer Tools/Services/Call Service: Recorder: Disable)
- Create a backup and download the Home Assistant data
- Start the Home Assistant SSH add-on
- Make a connection with WinSCP to Home Assistant
- Stop the Home Assistant Core (Developer Tools/Services/Call Service: Home Assistant Core Integration: Stop)
- Extract "home-assistant_v2.db" (from "backup.tar", extract "homeassistant.tar.gz" from the "data" folder)
- Start the "DB Browser for SQLite" process from above, inject the data, upload the .db and reboot Home Assistant ("reboot" command)
*/
/* Remove the temporary tables if they exist */
DROP TABLE IF EXISTS import_12_to_418_NEW;
/* Create temp tables that can hold the difference between the measurements and create a new sum */
CREATE TABLE "import_12_to_418_NEW" (
"ts" INTEGER,
"value" FLOAT,
"diff" FLOAT,
"old_sum" FLOAT,
"new_sum" FLOAT
);
/* Insert the hourly records from Toon - max 5 years */
INSERT INTO import_12_to_418_NEW (ts, value)
SELECT field1, round(field2 / 1, 3)
FROM import_12_to_418;
/* Remove any overlapping records from Toon which are already in Home Assistant */
DELETE FROM import_12_to_418_NEW
WHERE
ts >= (SELECT MIN(start_ts) FROM statistics WHERE metadata_id = 418); /* Change */
/* Insert the data from Home Assistant so that we can adjust the records with the new calculated sum */
INSERT INTO import_12_to_418_NEW (ts, value, old_sum)
SELECT start_ts, state, sum
FROM statistics
WHERE metadata_id = 418; /* Change */
/*
Calculate the difference from the previous record in the table
- For the Toon values calculate the diff from the previous record from the imported values (use value column / old_sum column is empty)
- For the Home Assistant values calculate the diff from the previous record from the existing sum column (use old_sum column / old_sum column is not empty)
*/
WITH CTE_DIFF_import_12_to_418_VALUE AS (
SELECT ts, round(value - (lag(value, 1, 0) OVER (ORDER BY ts)), 3) AS diff
FROM import_12_to_418_NEW
ORDER BY ts
)
UPDATE import_12_to_418_NEW
SET diff = CTE_DIFF_import_12_to_418_VALUE.diff
FROM CTE_DIFF_import_12_to_418_VALUE
WHERE
import_12_to_418_NEW.ts = CTE_DIFF_import_12_to_418_VALUE.ts AND
import_12_to_418_NEW.old_sum IS NULL;
WITH CTE_DIFF_import_12_to_418_SUM AS (
SELECT ts, old_sum - (lag(old_sum, 1, 0) OVER (ORDER BY ts)) AS diff
FROM import_12_to_418_NEW
ORDER BY ts
)
UPDATE import_12_to_418_NEW
SET diff = CTE_DIFF_import_12_to_418_SUM.diff
FROM CTE_DIFF_import_12_to_418_SUM
WHERE
import_12_to_418_NEW.ts = CTE_DIFF_import_12_to_418_SUM.ts AND
import_12_to_418_NEW.old_sum IS NOT NULL;
/* Clean up possibly wrong values:
- Remove the first record if no diff could be determined (Toon data)
- Diff is null => the point where the Toon data goes over into the Home Assistant data
- Diff < 0 => probably a new meter was installed (the measurement should be low)
- Diff > 1000 => incorrect value
First handle the first two cases, then correct to 0 when the value is incorrect
*/
DELETE FROM import_12_to_418_NEW
WHERE
ROWID IN (
SELECT ROWID FROM import_12_to_418_NEW
WHERE
old_sum IS NULL
ORDER BY ts
LIMIT 1
);
UPDATE import_12_to_418_NEW
SET diff = round(old_sum, 3)
WHERE (diff IS NULL);
UPDATE import_12_to_418_NEW
SET diff = round(value, 3)
WHERE (diff < 0.0) AND (value < 10);
UPDATE import_12_to_418_NEW
SET diff = 0
WHERE (diff < 0.0) OR (diff > 1000.0);
/* Calculate the new sum
It is the running sum of the diffs up to the record that is currently processed
*/
WITH CTE_SUM_import_12_to_418 AS (
SELECT ts, SUM(diff) OVER (ORDER BY ts) AS new_sum
FROM import_12_to_418_NEW
)
UPDATE import_12_to_418_NEW
SET new_sum = round(CTE_SUM_import_12_to_418.new_sum, 3)
FROM CTE_SUM_import_12_to_418
WHERE
import_12_to_418_NEW.ts = CTE_SUM_import_12_to_418.ts;
/* Copy the new information to the statistics table
id => primary key and automatically filled with ROWID
sum => calculated new_sum value
metadata_id => the fixed metadata id of this statistics (see top)
created_ts => set to the timestamp of the statistic
start_ts => timestamp of the statistic
The sum is updated in case the record is already in Home Assistant
"where true" is needed to remove parsing ambiguity
*/
INSERT INTO statistics (state, sum, metadata_id, created_ts, start_ts)
SELECT new_sum, new_sum, 418, ts, ts FROM import_12_to_418_NEW WHERE true /* Change */
ON CONFLICT DO UPDATE SET sum = excluded.sum;
/* Also update the short term statistics.
We calculate the delta with which the sum was changed and add that to the current measurements
*/
UPDATE statistics_short_term
SET sum = sum + (
SELECT (new_sum - sum) as correction_factor
FROM
statistics_short_term as SST, import_12_to_418_NEW AS SN
WHERE
SST.metadata_id = 418 AND /* Change */
SST.start_ts = SN.ts
ORDER BY state DESC
LIMIT 1
)
WHERE
metadata_id = 418; /* Change */
/* Remove the temporary tables */
DROP TABLE IF EXISTS import_12_to_418_NEW;
DROP TABLE IF EXISTS import_12_to_418;
DROP TABLE IF EXISTS gas_quantity_CurrentGasQuantity_10yrdays;</sql><current_tab id="0"/></tab_sql></sqlb_project>
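A small tip for finding the numeric IDs used above: they come from the statistics_meta table. A minimal lookup sketch, assuming the standard Home Assistant recorder schema (adjust the LIKE pattern to your sensor):
SELECT id, statistic_id, unit_of_measurement
FROM statistics_meta
WHERE statistic_id LIKE '%energy%'
ORDER BY id;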
Hello. Thank you for this great tool. It took me a couple of evenings, but all DSMR data from Domoticz is imported into HA. Data going back to 2020; I didn't want to lose it when migrating to HA!!
I have 2 questions:
Is there any way to fix this?
Thank you in advance!!
Good to read that it was helpful in the migration.
Question 1: New meter
This case is supported by the script, but your value just missed the cutoff used to detect it. In the script the cutoff is 10 kWh and you have 12 kWh. This can easily be corrected. I updated the script to use a higher cutoff for detecting this case; you could try the new version and see whether it works for you:
/* Old check which uses 10 as cutoff */
UPDATE NT_ORIG_NEW
SET diff = round(value, 3)
WHERE (diff < 0.0) AND (value < 10);
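And the updated check, with the cutoff raised to 25 (the threshold is a judgment call; it just has to be above a plausible first reading of a new meter):
/* New check which uses 25 as cutoff */
UPDATE NT_ORIG_NEW
SET diff = round(value, 3)
WHERE (diff < 0.0) AND (value < 25);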
Question 2: Solar
It depends on how the input data is structured. It would normally ignore the first record because it cannot determine a delta. Can you provide me with the data of the first day?
Thank you for your fast reply.
Response question 1:
Date 2022-06-01: counter T1 = 9015927, counter T2 = 6711605
New Meter
Date 2022-06-03: counter T1 = 12991, counter T2 = 5890
Is the SQL command below correct?
UPDATE NT_ORIG_NEW
SET diff = round(value, 3)
WHERE (diff < 0.0) AND (value < 10);
Response question 2:
I can retrieve the missing data from a separate solar management portal. I will add this data to the import. This will fix the issue.
I changed the SQL statement below
UPDATE NT_ORIG_NEW
SET diff = round(value, 3)
WHERE (diff < 0.0) AND (value < 10);
to
UPDATE NT_ORIG_NEW
SET diff = round(value, 3)
WHERE (diff < 0.0) AND (value < 25);
The check on the value is there because otherwise negative differences, which could be very large, would be added as normal usage. Now the value is only added when the new reading is small, and the large negative differences are just ignored/deleted.
As you can see, in the new SQL statement the cutoff is 25 instead of 10. Your usage was 12 and was therefore discarded by the old SQL statement and assumed to be incorrect.
Finally, I found the time to import my historical temperature, as well as humidity, based on Lasoul's Toon script.
In fact, it is way easier than a (total-)increasing statistic. Anyway, you'd have to adjust the fields and functions accordingly.
Like with my last altered script: make sure to read the original Toon script instructions in case of doubt, and always try it first on a dev machine - or be prepared to restore a backup and have HA down for some time.
For Temperature
Source *.csv:
EPOCH,Mean,Min,Max
1649764800,15.8,15.8,15.8
(Make sure to remove the Header from *.csv before importing!)
DB-Browser Script:
<?xml version="1.0" encoding="UTF-8"?><sqlb_project><db path="C:/Users/user/Downloads/DB.Browser.for.SQLite-3.12.2-win64/Umzug - Alte Werte csv/statistics_ready2import_08.05/HA Versuch 9.5. Nach Full Restore auf Testsystem/home-assistant_v2.db" readonly="0" foreign_keys="" case_sensitive_like="" temp_store="" wal_autocheckpoint="" synchronous=""/><attached/><window><main_tabs open="structure browser pragmas query" current="3"/></window><tab_structure><column_width id="0" width="406"/><column_width id="1" width="0"/><column_width id="2" width="100"/><column_width id="3" width="5365"/><column_width id="4" width="0"/></tab_structure><tab_browse><current_table name="0,0:"/><default_encoding codec=""/><browse_table_settings/></tab_browse><tab_sql><sql name="Import Toon data">/*
This is a stripped version of the great https://github.com/patrickvorgers/HA-Import-Toon-Data script.
It does NOT contain all possibly important information; for more, head over to the original script.
ALWAYS keep a backup; the best case is to try it on a different system before importing into your productive system.
1) Search and Replace:
klima_temperatur_innen_to_519
This is the CSV to be imported. The same filename is used as the temporary name for the import table.
So, e.g., if you want to import history coming from ID 45, and the target has ID 319, name your CSV import_45_to_319.csv.
Then do a search and replace inside this sqbpro file and replace "klima_temperatur_innen_to_519" with "import_45_to_319".
The first number indicates the old ID the data came from; the second number indicates the ID the data will be merged with.
You can use any name you want, but I found this convention makes it easy to keep an overview of different import jobs.
As the table names are individual, you can just glue multiple sqbpro files together by appending them to each other.
2) Search and Replace:
519
This is the ID that the imported data should be appended to/merged with.
If you want your data imported to 319, do a search and replace inside this sqbpro file and replace "519" with "319".
3) Follow all instructions from the original https://github.com/patrickvorgers/HA-Import-Toon-Data script.
The most important (but not all!) steps are:
- Disable the Home Assistant recorder while making the backup (Developer Tools/Services/Call Service: Recorder: Disable)
- Create a backup and download the Home Assistant data
- Start the Home Assistant SSH add-on
- Make a connection with WinSCP to Home Assistant
- Stop the Home Assistant Core (Developer Tools/Services/Call Service: Home Assistant Core Integration: Stop)
- Extract "home-assistant_v2.db" (from "backup.tar", extract "homeassistant.tar.gz" from the "data" folder)
- Start the "DB Browser for SQLite" process from above, inject the data, upload the .db and reboot Home Assistant ("reboot" command)
*/
/* Remove the temporary tables if they exist */
DROP TABLE IF EXISTS klima_temperatur_innen_to_519_NEW;
/* Create temp tables that can hold the difference between the measurements and create a new sum */
CREATE TABLE "klima_temperatur_innen_to_519_NEW" (
"ts" INTEGER,
"valuemean" FLOAT,
"valuemin" FLOAT,
"valuemax" FLOAT
);
/* Insert the hourly records from Toon - max 5 years */
INSERT INTO klima_temperatur_innen_to_519_NEW (ts, valuemean, valuemin, valuemax)
SELECT field1, round(field2 / 1, 3), round(field3 / 1, 3), round(field4 / 1, 3)
FROM klima_temperatur_innen_to_519;
/* Remove any overlapping records from Toon which are already in Home Assistant */
DELETE FROM klima_temperatur_innen_to_519_NEW
WHERE
ts >= (SELECT MIN(start_ts) FROM statistics WHERE metadata_id = 519);
/* Copy the imported mean/min/max values to the statistics table; update them in case the record is already in Home Assistant */
INSERT INTO statistics (mean, min, max, metadata_id, created_ts, start_ts)
SELECT valuemean, valuemin, valuemax, 519, ts, ts FROM klima_temperatur_innen_to_519_NEW WHERE true
ON CONFLICT DO UPDATE SET mean = excluded.mean, min = excluded.min, max = excluded.max;
/* Remove the temporary tables */
DROP TABLE IF EXISTS klima_temperatur_innen_to_519_NEW;
DROP TABLE IF EXISTS klima_temperatur_innen_to_519;
DROP TABLE IF EXISTS gas_quantity_CurrentGasQuantity_10yrdays;</sql><current_tab id="0"/></tab_sql></sqlb_project>
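After running the script, a quick sanity check confirms that the import landed on the right ID. A minimal sketch, assuming the standard statistics schema (replace 519 with your ID):
SELECT COUNT(*) AS rows_total,
       datetime(MIN(start_ts), 'unixepoch') AS first_record,
       datetime(MAX(start_ts), 'unixepoch') AS last_record
FROM statistics
WHERE metadata_id = 519;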
For Humidity
Source *.csv:
EPOCH,Mean,Min,Max
1649764800,71.1,70.8,71.4
(Make sure to remove the Header from *.csv before importing!)
DB-Browser Script:
<?xml version="1.0" encoding="UTF-8"?><sqlb_project><db path="C:/Users/user/Downloads/DB.Browser.for.SQLite-3.12.2-win64/Umzug - Alte Werte csv/statistics_ready2import_08.05/HA Versuch 9.5. Nach Full Restore auf Testsystem/home-assistant_v2.db" readonly="0" foreign_keys="" case_sensitive_like="" temp_store="" wal_autocheckpoint="" synchronous=""/><attached/><window><main_tabs open="structure browser pragmas query" current="3"/></window><tab_structure><column_width id="0" width="406"/><column_width id="1" width="0"/><column_width id="2" width="100"/><column_width id="3" width="5365"/><column_width id="4" width="0"/></tab_structure><tab_browse><current_table name="0,0:"/><default_encoding codec=""/><browse_table_settings/></tab_browse><tab_sql><sql name="Import Toon data">/*
This is a stripped version of the great https://github.com/patrickvorgers/HA-Import-Toon-Data script.
It does NOT contain all possibly important information; for more, head over to the original script.
ALWAYS keep a backup; the best case is to try it on a different system before importing into your productive system.
1) Search and Replace:
klima_feuchtigkeit_innen_to_522
This is the CSV to be imported. The same filename is used as the temporary name for the import table.
So, e.g., if you want to import history coming from ID 45, and the target has ID 319, name your CSV import_45_to_319.csv.
Then do a search and replace inside this sqbpro file and replace "klima_feuchtigkeit_innen_to_522" with "import_45_to_319".
The first number indicates the old ID the data came from; the second number indicates the ID the data will be merged with.
You can use any name you want, but I found this convention makes it easy to keep an overview of different import jobs.
As the table names are individual, you can just glue multiple sqbpro files together by appending them to each other.
2) Search and Replace:
522
This is the ID that the imported data should be appended to/merged with.
If you want your data imported to 319, do a search and replace inside this sqbpro file and replace "522" with "319".
3) Follow all instructions from the original https://github.com/patrickvorgers/HA-Import-Toon-Data script.
The most important (but not all!) steps are:
- Disable the Home Assistant recorder while making the backup (Developer Tools/Services/Call Service: Recorder: Disable)
- Create a backup and download the Home Assistant data
- Start the Home Assistant SSH add-on
- Make a connection with WinSCP to Home Assistant
- Stop the Home Assistant Core (Developer Tools/Services/Call Service: Home Assistant Core Integration: Stop)
- Extract "home-assistant_v2.db" (from "backup.tar", extract "homeassistant.tar.gz" from the "data" folder)
- Start the "DB Browser for SQLite" process from above, inject the data, upload the .db and reboot Home Assistant ("reboot" command)
*/
/* Remove the temporary tables if they exist */
DROP TABLE IF EXISTS klima_feuchtigkeit_innen_to_522_NEW;
/* Create temp tables that can hold the difference between the measurements and create a new sum */
CREATE TABLE "klima_feuchtigkeit_innen_to_522_NEW" (
"ts" INTEGER,
"valuemean" FLOAT,
"valuemin" FLOAT,
"valuemax" FLOAT
);
/* Insert the hourly records from Toon - max 5 years */
INSERT INTO klima_feuchtigkeit_innen_to_522_NEW (ts, valuemean, valuemin, valuemax)
SELECT field1, round(field2 / 1, 3), round(field3 / 1, 3), round(field4 / 1, 3)
FROM klima_feuchtigkeit_innen_to_522;
/* Remove any overlapping records from Toon which are already in Home Assistant */
DELETE FROM klima_feuchtigkeit_innen_to_522_NEW
WHERE
ts >= (SELECT MIN(start_ts) FROM statistics WHERE metadata_id = 522);
/* Copy the new information to the statistics table
id => primary key, automatically filled with ROWID
mean/min/max => the imported values
metadata_id => the fixed metadata id of this statistic (see top)
created_ts => set to the timestamp of the statistic
start_ts => timestamp of the statistic
The values are updated in case the record is already in Home Assistant
"where true" is needed to remove parsing ambiguity
*/
INSERT INTO statistics (mean, min, max, metadata_id, created_ts, start_ts)
SELECT valuemean, valuemin, valuemax, 522, ts, ts FROM klima_feuchtigkeit_innen_to_522_NEW WHERE true
ON CONFLICT DO UPDATE SET mean = excluded.mean, min = excluded.min, max = excluded.max;
/* Remove the temporary tables */
DROP TABLE IF EXISTS klima_feuchtigkeit_innen_to_522_NEW;
DROP TABLE IF EXISTS klima_feuchtigkeit_innen_to_522;
DROP TABLE IF EXISTS gas_quantity_CurrentGasQuantity_10yrdays;</sql><current_tab id="0"/></tab_sql></sqlb_project>
Thanks again @Lasoul for this great Script and Documentation, am really happy to have a Way to import my old historical Data!
First of all, sorry for my English.
And thanks to everyone for the instructions on loading historical electricity production data.
In my case, my database is MariaDB, and the SQL statements are a little different.
I'm from Spain and my electricity company is Iberdrola. I could download data from https://www.i-de.es/
and after that I had to work on this data to prepare it for loading into the table.
The solar production I downloaded from the solar inverter and from the Shelly meter.
This is the script that I used:
CREATE TABLE IF NOT EXISTS `import_iberdrola` (
`fecha` datetime DEFAULT NULL,
`value` float DEFAULT NULL,
`ts` int(11) DEFAULT NULL,
`diff` float DEFAULT NULL,
`old_sum` float DEFAULT NULL,
`new_sum` float DEFAULT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_general_ci;
/* Import the data manually into the fecha and value columns. The values have to be
in kWh and cumulative.
*/
UPDATE import_iberdrola
SET ts = UNIX_TIMESTAMP(fecha);
INSERT INTO import_iberdrola (ts, value, old_sum)
SELECT start_ts, state, sum
FROM statistics
WHERE metadata_id = 53;
UPDATE import_iberdrola a
INNER JOIN
( SELECT ts, round(VALUE - (lag(value, 1) OVER (ORDER BY ts)), 3) AS diff
FROM import_iberdrola b
ORDER BY ts
) AS CTE_DIFF_import_iberdrola_VALUE
ON a.ts = CTE_DIFF_import_iberdrola_VALUE.ts
SET a.diff = CTE_DIFF_import_iberdrola_VALUE.diff
WHERE a.old_sum IS NULL;
UPDATE import_iberdrola a
INNER JOIN
( SELECT ts, old_sum - (lag(old_sum, 1) OVER (ORDER BY ts)) AS diff
FROM import_iberdrola b
ORDER BY ts
) AS CTE_DIFF_import_iberdrola_VALUE
ON a.ts = CTE_DIFF_import_iberdrola_VALUE.ts
SET a.diff = CTE_DIFF_import_iberdrola_VALUE.diff
WHERE a.old_sum IS NOT NULL;
DELETE FROM import_iberdrola
WHERE diff IS NULL;
UPDATE import_iberdrola
SET diff = round(old_sum, 3)
WHERE (diff IS NULL);
UPDATE import_iberdrola SET diff = 0
WHERE (diff < 0.0) OR (diff > 1000.0);
UPDATE import_iberdrola a
INNER JOIN
( SELECT ts, SUM(diff) OVER (ORDER BY ts) AS new_sum
FROM import_iberdrola b
ORDER BY ts
) AS CTE_DIFF_import_iberdrola_VALUE
ON a.ts = CTE_DIFF_import_iberdrola_VALUE.ts
SET a.new_sum = round(CTE_DIFF_import_iberdrola_VALUE.new_sum, 3)
;
INSERT INTO statistics (state, sum, metadata_id, created_ts, start_ts)
SELECT new_sum, new_sum, 53, ts, ts FROM import_iberdrola WHERE true /* Change */
ON DUPLICATE KEY UPDATE sum = import_iberdrola.new_sum;
UPDATE statistics_short_term
SET sum = sum + (
SELECT (SN.new_sum - SST.sum) as correction_factor
FROM
statistics_short_term as SST, import_iberdrola AS SN
WHERE
SST.metadata_id = 53 AND /* Change */
SST.start_ts = SN.ts
ORDER BY state DESC
LIMIT 1
)
WHERE
metadata_id = 53;
I'll also include some other statements that I used to help me:
SELECT start_ts, FROM_UNIXTIME(start_ts) AS fecha_datetime
FROM statistics
WHERE metadata_id = 53 AND start_ts < 1698451200
ORDER BY start_ts DESC;
DELETE FROM statistics
WHERE start_ts < 1698451200 AND metadata_id = 53;
SELECT start_ts, FROM_UNIXTIME(start_ts) AS fecha_datetime
FROM statistics_short_term
WHERE metadata_id = 55
ORDER BY start_ts DESC;
SELECT ts, FROM_UNIXTIME(ts) AS fecha_datetime
FROM import_iberdrola
ORDER BY ts DESC;
/* Empty the import table when done */
DELETE FROM import_iberdrola;
I hope this can help someone else.
Thanks a lot!
Thanks for these scripts @JeMaGo !
If I understand correctly, after you created your import_iberdrola table, you manually imported the data from your provider, then ran the remaining sections of your script (starting by converting the dates to Unix format).
Do I need to delete anything from the STATISTICS table before using your scripts?
Can you provide an example file from your provider?
Thank you!!
Hi juskalalie, I use "HeidiSQL", and this tool has a utility to import CSV files.
These are the first lines of one of my CSVs:
FECHA-HORA;Total Consumo KW
2021/09/01 01:00;1722,5947
2021/09/01 02:00;1722,9077
2021/09/01 03:00;1723,2317
2021/09/01 04:00;1723,5907
2021/09/01 05:00;1723,8377
2021/09/01 06:00;1724,1707
2021/09/01 07:00;1724,4187
2021/09/01 08:00;1724,7107
2021/09/01 09:00;1725,0237
2021/09/01 10:00;1725,2967
2021/09/01 11:00;1725,5337
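If you prefer doing the whole import in SQL instead of HeidiSQL's CSV tool, something like this could also work. Just a sketch: the file name consumo.csv is hypothetical, LOCAL INFILE must be enabled on the server, and the format strings match the sample above (semicolon separator, comma as decimal mark):
LOAD DATA LOCAL INFILE 'consumo.csv'
INTO TABLE import_iberdrola
FIELDS TERMINATED BY ';'
IGNORE 1 LINES
(@fecha, @valor)
SET fecha = STR_TO_DATE(@fecha, '%Y/%m/%d %H:%i'),
    value = REPLACE(@valor, ',', '.'); /* comma decimal -> dot, implicitly cast to float */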
You must delete the existing lines (records) for this sensor; I use this statement:
DELETE FROM statistics
WHERE start_ts < 1698451200 AND metadata_id = 53;
metadata_id is the number of the sensor, and start_ts is the datetime before which records must be deleted.
I use this other statement to find the right start_ts:
SELECT start_ts, FROM_UNIXTIME(start_ts) AS fecha_datetime
FROM statistics
WHERE metadata_id = 53 AND start_ts < 1698451200
ORDER BY start_ts DESC;
I hope it is useful for you.
Bye
Hey there, I just stumbled across this thread - could it potentially be possible to use this to rewrite the historic data of an input_number entity as well?
It should work as long as statistics are stored for that particular entity. I checked my install, but for my helpers no statistics are stored, so it could be that you have to define it manually (as a template sensor) so that long-term statistics are stored.
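To check whether long-term statistics exist for a helper at all, you could look it up in statistics_meta. A sketch, assuming the standard recorder schema and that helper statistic IDs follow the entity ID:
SELECT id, statistic_id, has_mean, has_sum
FROM statistics_meta
WHERE statistic_id LIKE 'input_number.%';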
Nice work, @Lasoul. I have a question: I have around 7 years' worth of data feeds from OpenEnergyMonitor/EmonCMS. I kept all the high-res data, that is, instantaneous power and accumulated energy every 10 seconds. I can convert it to the required CSV format - but would that be usable as is? It's about 23M values per feed. Should I convert the feeds to an hourly CSV or let your script use them raw?
Performance-wise I would convert them to hourly intervals, because that is also the highest-resolution interval that Home Assistant uses.
Home Assistant queries all records in a specified time interval and iterates over them in code. If you add records at 10-second resolution, it results in a very large dataset that takes a lot of time to process.
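A rough sketch of that reduction in SQL, with a hypothetical raw_feed table holding epoch-second ts, instantaneous power and a cumulative energy counter: bucket by hour, average the power and keep the highest counter reading per bucket:
SELECT ts - (ts % 3600) AS hour_ts,   -- start of the hour bucket
       AVG(power)  AS mean_power,     -- hourly mean of the instantaneous power
       MAX(energy) AS energy_counter  -- cumulative counter: last/highest reading per hour
FROM raw_feed
GROUP BY hour_ts
ORDER BY hour_ts;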
Thanks, I think I'll convert the values to hourly averages for import. But I'll also keep pulling the high-resolution data into HA for short-term stats after the import, to be able to plot the daily hi-res power charts. Hourly doesn't really cut it. Now if HA had proper zoomable charts upstream, without having to fall back to externally integrated ones… But that's a different issue altogether.
This looks great! I have already created the import files in .csv. Now I'm waiting for the MariaDB support. Any outlook yet on when this will be supported?
I made a (small) start on it this week. Hopefully I can find some more free time in the upcoming period to get it done.
Update:
Today I finished rewriting the existing SQLite version into a compatible version that works with MariaDB, but it needs some more testing before release.
@fversteegen, are you also still interested?
Count me in!