Sorry for the late reply - somehow I didn't get notified about your post!
At the step
insert or ignore into statistics
select c.id, c.created, c.created_ts, m.new_id as metadata_id, c.start, c.start_ts, c.mean, c.min, c.max, c.last_reset, c.last_reset_ts, c.state, c.sum
from corrupt.statistics c
inner join (
select o.id as old_id
, n.id as new_id
, case when row_number() over (partition by o.id) > 1 or row_number() over (partition by n.id) > 1 then 0 else 1 end as test
from corrupt.statistics_meta o
inner join statistics_meta n on n.statistic_id = o.statistic_id
) m on m.old_id = c.metadata_id and m.test = 1 order by id;
I get the same error again:
Execution finished with errors.
Result: no such column: c.created_ts
At line 1:
insert or ignore into statistics
select c.id, c.created, c.created_ts, m.new_id as metadata_id, c.start, c.start_ts, c.mean, c.min, c.max, c.last_reset, c.last_reset_ts, c.state, c.sum
from corrupt.statistics c
inner join (....
Do you have an idea how to fix it? Do you need a sample or anything else?
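In case it helps narrow it down, this is roughly how I could check what the old table actually contains (just a quick sketch, assuming the old DB is attached as corrupt like in the query, and that it still has the recorder's schema_changes table):

-- list the columns that actually exist in the old statistics table
pragma corrupt.table_info(statistics);

-- show the last recorder schema version stored in the old DB
select schema_version from corrupt.schema_changes order by change_id desc limit 1;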
Maybe your DB schema version is older and not compatible. You may need to write your own script to make your old tables compatible first - there is a rough sketch at the end of this post.
Sorry, I can’t help with that.
You could also try letting HA open that old DB file; I think it may be able to update the schema for you.
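If you want to try it by hand anyway, here is a rough, untested sketch of what such a compatibility step might look like. It assumes the only problem is the *_ts columns (created_ts, start_ts, last_reset_ts) that newer schema versions added, and that the old columns still hold plain datetime strings:

-- add the timestamp columns the new schema expects (back up the file first)
alter table corrupt.statistics add column created_ts float;
alter table corrupt.statistics add column start_ts float;
alter table corrupt.statistics add column last_reset_ts float;

-- backfill them from the old datetime strings
-- (may need adjusting if the strings include timezone offsets)
update corrupt.statistics
set created_ts = strftime('%s', created),
    start_ts = strftime('%s', start),
    last_reset_ts = strftime('%s', last_reset);

I have not tested this, and if the old schema differs in other ways the insert may still need adjusting.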
Hey, this is super helpful - greatly appreciate you posting this.
I was able to import all of my historic energy data from my corrupt DB using this. One thing though: I have a gas sensor set up, and the values showing in my dashboard are now astronomically high for last year: "1,989,615 ft^3". Even if I divide the 1.9 million by 100 (ft^3 → CCF), the value still seems way too high.
My gas meter reads in CCF, so I guess some unit conversion went bad when I ran the query. Any idea where to start troubleshooting?
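Hard to say from here, but a place to start might be the unit stored with the imported statistics. Something like this (just a sketch, assuming your gas entity id contains "gas" - adjust the LIKE pattern to your sensor):

-- check which unit the metadata carries for the gas sensor
select id, statistic_id, unit_of_measurement
from statistics_meta
where statistic_id like '%gas%';

-- look at the imported long-term rows for that sensor to see where the sum jumps
select datetime(s.start_ts, 'unixepoch') as start, s.state, s.sum
from statistics s
join statistics_meta m on m.id = s.metadata_id
where m.statistic_id like '%gas%'
order by s.start_ts desc
limit 50;

If unit_of_measurement in statistics_meta doesn't match what the sensor reports today, or if the sum column jumps by a huge amount around the import, that would at least point at where the dashboard number comes from.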