True. But my posting is one year old. This was way before long term stats became a part of HA.
Of course. But before somebody deletes their database and loses their history… I would put a flag on this.
If you read the steps in my original instructions, nothing is deleted… Rather, the database is ‘dumped’ from SQLite to an SQL file, which is used to recreate the database in MariaDB, hence it is backed up during migration.
So from that point of view, the original information is still valid.
Regards.
I have an old .db file, and a week ago I moved to MariaDB.
Is it possible to import the old data without removing the current data?
My answer was to Jpsy, not you. Jpsy said in an old post, “Nothing to migrate, delete and boom”. Today the DB ALSO maintains the energy dashboard history.
I am now a bit lost: what SHOULD be the right way to move the .db to MariaDB, including the long-term statistics?
Here is a video on YouTube that should help you. Pretty simple now with Home Assistant OS.
I did this migration before HASS OS was around.
There are others available on YouTube; I just did a quick search there.
Getting MariaDB up and running is indeed the easy part. The challenge is to maintain the historical data (i.e., the actual migration of data from SQLite to MariaDB).
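In outline, the data migration itself boils down to something like this (the user and database names assume the setup in the detailed steps later in the thread):
# dump the old data, convert the SQL dialect, then import into MariaDB
sqlite3 home-assistant_v2.db .dump > hadump.sql
bash sqlite3-to-mysql hadump.sql > haimport.sql
mysql -u hass -p hass < haimport.sql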
Is it possible?
Thanks for this guide. It seems to work also for migrating long-term statistics (such as energy management); however, it seems that after the migration these values are no longer updated. I guess something similar with AUTO_INCREMENT should be done for the statistics, statistics_meta and statistics_runs tables (more information here)? Any idea on how to alter these tables to restore the functionality?
At the time I did the migration in the original post, the long-term statistics you mention were not part of HA yet. As you mentioned, you would probably need to do the same steps for the referenced tables; I would imagine they would be similar: turn off auto increment, make note of the current index values, copy the data, update the auto increment value, and turn auto increment back on.
phpMyAdmin will be your friend when figuring out the auto increment structure for the new tables.
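A sketch of what that pattern could look like for one of the statistics tables (repeat for statistics_meta, statistics_runs and statistics_short_term with their own id/run_id columns; replace N+1 with your actual max id + 1):
SELECT MAX(id) FROM statistics;
ALTER TABLE statistics MODIFY COLUMN id int(11) NOT NULL AUTO_INCREMENT, AUTO_INCREMENT=N+1;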
I’m looking for a bullet-proof, up-to-date (meaning the HA 2021.12 release) guide on how to migrate all data from SQLite to a MariaDB/MySQL database.
This one here is really the best (because: the only one!) I could find after searching the whole forum. Unfortunately it seems to be outdated, as stated in the previous comments. Maybe someone can point to an up-to-date, proven guide - or at least add the missing modification steps for the “newer” tables.
I wonder if InfluxDB would be a good option so you could set the number of measurements per entity to keep and avoid having to trim/purge/manage the database over time?
There are many more of us who would like to migrate to MySQL but don’t actually know how.
I’m running a MySQL DB for my webpage and have basic MySQL knowledge, but not nearly enough to do this migration myself, and these guides don’t really help with the newest HA updates.
I think I have just managed to migrate my SQLite DB to a MariaDB on another server (with over 1 million rows in the states table and 2.7 million in the events table!)… It appears to still have the entire available history in Home Assistant and is updating without errors in the logs. I had to do some additional steps on the database side beyond the original poster’s instructions, mainly to include the statistics tables and their foreign key constraints…
Details below:
Prepare the sqlite3 data (as per OP instructions, pasted below for ref):
# The first few steps here are typical install and configuration of mariadb.
sudo apt install mariadb-server mariadb-client
sudo apt install libmariadb-dev sqlite3
sudo mysql_secure_installation
I did the following two steps using phpMyAdmin:
Created a user hass with all permissions
Created a database hass
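For reference, the equivalent of those two steps from the mysql command line would be something like this (the password is a placeholder):
create database hass;
grant all privileges on hass.* to 'hass'@'localhost' identified by 'YOUR_PASSWORD';
flush privileges;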
Stop the hass server process on your server.
sqlite3 home-assistant_v2.db .dump > hadump.sql
git clone https://github.com/athlite/sqlite3-to-mysql
copied sqlite3-to-mysql to same directory as hadump.sql
bash sqlite3-to-mysql hadump.sql > haimport.sql
mysql -u hass -p -h localhost hass < haimport.sql
## the above mysql statement imported about 55K records into both states and events, 360 into recorder_runs, as well as 6 records into schema_changes for my Home Assistant with a 2-day history. Your mileage will vary.
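As a quick sanity check, you can compare row counts between the two databases after the import; something like:
# row counts in MariaDB (compare with "select count(*) from states;" etc. in sqlite3)
mysql -u hass -p -h localhost hass -e "SELECT COUNT(*) FROM states; SELECT COUNT(*) FROM events; SELECT COUNT(*) FROM recorder_runs;"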
“Fix” the data and update tables:
mysql hass
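# The dialect conversion replaces double quotes with backticks, which breaks the JSON in these two columns; restore them: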
update events set event_data = REPLACE(event_data, '`', '"');
update states set attributes = REPLACE(attributes, '`', '"');
# Remove Foreign Key Constraints
ALTER TABLE `states` DROP FOREIGN KEY `states_ibfk_1`;
ALTER TABLE `states` DROP FOREIGN KEY `states_ibfk_2`;
ALTER TABLE `statistics_short_term` DROP FOREIGN KEY `statistics_short_term_ibfk_1`;
ALTER TABLE `statistics` DROP FOREIGN KEY `statistics_ibfk_1`;
# Alter the primary key in each table to use AUTO_INCREMENT, starting from the last id + 1 (replace N+1 below with your actual max id + 1)
select max(run_id) from recorder_runs;
alter table recorder_runs modify column run_id INT NOT NULL AUTO_INCREMENT, AUTO_INCREMENT=N+1;
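# e.g. if SELECT MAX(run_id) returned 360 (as in the import note above), use AUTO_INCREMENT=361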
select max(event_id) from events;
alter table events modify column event_id int(11) NOT NULL AUTO_INCREMENT, AUTO_INCREMENT=N+1;
select max(state_id) from states;
alter table states modify column state_id int(11) NOT NULL AUTO_INCREMENT, AUTO_INCREMENT=N+1;
SELECT MAX(change_id) FROM schema_changes;
ALTER TABLE schema_changes MODIFY COLUMN change_id int(11) NOT NULL AUTO_INCREMENT, AUTO_INCREMENT=N+1;
SELECT MAX(id) FROM statistics;
ALTER TABLE statistics MODIFY COLUMN id int(11) NOT NULL AUTO_INCREMENT, AUTO_INCREMENT=N+1;
SELECT MAX(id) FROM statistics_meta;
ALTER TABLE statistics_meta MODIFY COLUMN id int(11) NOT NULL AUTO_INCREMENT, AUTO_INCREMENT=N+1;
SELECT MAX(run_id) FROM statistics_runs;
ALTER TABLE statistics_runs MODIFY COLUMN run_id int(11) NOT NULL AUTO_INCREMENT, AUTO_INCREMENT=N+1;
SELECT MAX(id) FROM statistics_short_term;
ALTER TABLE statistics_short_term MODIFY COLUMN id int(11) NOT NULL AUTO_INCREMENT, AUTO_INCREMENT=N+1;
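If you would rather not copy each max value by hand, the ALTER statement can be built dynamically with a prepared statement; a sketch for one table (untested on your schema; the same pattern applies to the rest):
SET @next = (SELECT MAX(event_id) + 1 FROM events);
SET @ddl = CONCAT('ALTER TABLE events MODIFY COLUMN event_id int(11) NOT NULL AUTO_INCREMENT, AUTO_INCREMENT=', @next);
PREPARE stmt FROM @ddl;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;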
# Add back the foreign key constraints
ALTER TABLE `states` ADD CONSTRAINT `states_ibfk_2` FOREIGN KEY (`event_id`) REFERENCES `events`(`event_id`) ON DELETE CASCADE ON UPDATE RESTRICT;
ALTER TABLE `states` ADD CONSTRAINT `states_ibfk_3` FOREIGN KEY (`old_state_id`) REFERENCES `states`(`state_id`) ON DELETE RESTRICT ON UPDATE RESTRICT;
ALTER TABLE `statistics` ADD CONSTRAINT `statistics_ibfk_1` FOREIGN KEY (`metadata_id`) REFERENCES `statistics_meta`(`id`) ON DELETE CASCADE ON UPDATE RESTRICT;
ALTER TABLE `statistics_short_term` ADD CONSTRAINT `statistics_short_term_ibfk_1` FOREIGN KEY (`metadata_id`) REFERENCES `statistics_meta`(`id`) ON DELETE CASCADE ON UPDATE RESTRICT;
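Finally, point Home Assistant’s recorder at the new database in configuration.yaml; a minimal sketch, assuming the hass user and database from the steps above:
recorder:
  db_url: mysql://hass:YOUR_PASSWORD@localhost:3306/hass?charset=utf8mb4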
Great! Those are the additional steps, right? Together with the above (outdated) guide it worked with HA 2021.XX.Y…?
Sorry - yes, I am currently running 2021.12.3 (and updated the instructions above with the OP’s migration steps for a full reference).
Slightly related: I also installed MariaDB on a Mac mini (running as a server) using this guide:
Or you can use their Docker container (Docker Hub).
NB: make sure you have created a user from the mysql command line as well, e.g.:
create database hass;
grant all privileges on hass.* to 'user'@'%' identified by 'password';
flush privileges;
An easy way to get phpMyAdmin running so you can connect to your MariaDB (if not running the add-on on a Raspberry Pi) is the phpmyadmin Docker image:
docker run --name myadmin -d -e PMA_ARBITRARY=1 -p 9086:80 phpmyadmin
Then you can connect to http://server-name:9086 to log in and view the DB from a UI.
I also set up InfluxDB, with the intention of using a shorter purge_keep_days to keep the DB at a reasonable size.
Again, this is on a server external to the Home Assistant instance running on a Raspberry Pi, in order to reduce the strain on its SD card. The add-ons for InfluxDB, MariaDB and phpMyAdmin are awesome though, if you have an SSD on your Pi.
Example config:
influxdb:
  api_version: 2
  ssl: false
  host: !secret influxdb_host
  port: 8086
  token: !secret influxdb_token
  organization: !secret influxdb_org
  bucket: homeassistant
  tags:
    source: HA
  tags_attributes:
    - friendly_name
  default_measurement: units
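On the earlier question about limiting what InfluxDB keeps: InfluxDB 2.x buckets carry a retention period you can set from the influx CLI, e.g. (the bucket ID is a placeholder):
influx bucket update -i <bucket-id> --retention 30d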
I went with Mac Homebrew, but Docker is probably a good alternative too.
Thanks, I’ll definitely give this a go. I’ll be standalone as well (as a VM) rather than Docker, as that’s how my setup is currently structured.