Migrating the Home Assistant database from SQLite to MariaDB

I gleaned most of the following information from the BurnsHA video about migrating Home Assistant from SQLite3 to MariaDB. I added a few steps that were not mentioned in his video but were required for my migration.
Many thanks to all who have tread this water previously.

I am using a Raspberry Pi 3 as the HA server running Hassbian, and I did the following steps to migrate from SQLite3 to MariaDB:

# The first few steps here are the typical installation and configuration of MariaDB.

sudo apt install mariadb-server mariadb-client

sudo apt install libmariadb-dev sqlite3

sudo mysql_secure_installation

I did the following 2 steps using phpMyAdmin:

Created a user hass with all permissions

Created a database hass

Stop the hass server process on your server.

sqlite3 home-assistant_v2.db .dump > hadump.sql
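If the sqlite3 CLI is not installed, a rough equivalent of the dump step can be sketched with Python's standard library sqlite3 module (file names match the step above; the output is not guaranteed to be byte-identical to the CLI dump):

```python
import sqlite3

# Hedged alternative sketch: produce a dump with Python's stdlib sqlite3
# module, in case the sqlite3 CLI is unavailable.
# "home-assistant_v2.db" is the default HA database file name.
con = sqlite3.connect("home-assistant_v2.db")
with open("hadump.sql", "w") as f:
    for line in con.iterdump():
        f.write(line + "\n")
con.close()
```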

git clone https://github.com/athlite/sqlite3-to-mysql

Copy the sqlite3-to-mysql script into the same directory as hadump.sql

bash sqlite3-to-mysql hadump.sql > haimport.sql

mysql -u hass -p -h localhost hass < haimport.sql

## the above mysql statement imported about 55K records into each of states and events, 360 into recorder_runs, and 6 records into schema_changes for my Home Assistant install with a 2-day history. Your mileage will vary.

mysql -u hass -p hass  

## The back-quotes need to be converted into double quotes in the events and states tables:

update events set event_data = REPLACE(event_data, '`', '"'); 
update states set attributes = REPLACE(attributes, '`', '"');

NOTE: the forum originally mangled the back-quote in the 2 updates above; the code block should now show it correctly.
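To see why this substitution matters, here is a small illustrative Python check (the sample string is made up to mirror the shape of recorder attributes, not real data): back-quoted JSON fails to parse until the back-quotes become double quotes.

```python
import json

# Illustrative only: the converted dump can leave JSON quoted with back-quotes,
# which json (and Home Assistant) cannot parse; the UPDATEs above fix that.
broken = '{`unit_of_measurement`: `W`, `friendly_name`: `Zignore`}'
fixed = broken.replace('`', '"')  # same substitution as the SQL REPLACE()
attrs = json.loads(fixed)
print(attrs["friendly_name"])  # the string parses as JSON again
```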

select max(run_id) from recorder_runs;

alter table recorder_runs modify column run_id INT NOT NULL AUTO_INCREMENT,
AUTO_INCREMENT=NNNNN;   ## this is 1 more than the max above from recorder_runs

alter table states drop foreign key states_ibfk_1;

select max(event_id) from events;

alter table events modify column event_id int(11) NOT NULL AUTO_INCREMENT,
AUTO_INCREMENT=NNNNN;   ### this is 1 more than the max above from events

select max(state_id) from states;

alter table states modify column state_id int(11) NOT NULL AUTO_INCREMENT,
AUTO_INCREMENT=NNNNN; ### this is 1 more than the max above from states
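The NNNNN pattern in the three ALTERs above is the same every time: take the SELECT MAX(...) result and add 1. A throwaway Python/sqlite3 sketch of the rule (table name and values are illustrative, not real recorder data):

```python
import sqlite3

# Demonstrate the "NNNNN = max(id) + 1" rule on a throwaway in-memory table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE recorder_runs (run_id INTEGER PRIMARY KEY)")
con.executemany("INSERT INTO recorder_runs VALUES (?)", [(1,), (2,), (360,)])
(max_id,) = con.execute("SELECT MAX(run_id) FROM recorder_runs").fetchone()
print(max_id + 1)  # value to plug in for AUTO_INCREMENT=NNNNN
```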

### I am not certain that the next step is required, but it was in SQLite and not in MariaDB.

alter table states add foreign key(event_id) references events (event_id);

#### Finally, modify configuration.yaml to include the following statements:
#### use the IP address of the server where MariaDB is installed, and the userid, password, and database you created above

recorder:
  db_url: mysql://hass:<password>@<server-ip>/hass
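As a side note, and purely as a sketch: if you would rather not keep the password in configuration.yaml, Home Assistant also supports moving values into secrets.yaml with the !secret tag (the key name mariadb_url here is my own invention):

```yaml
# configuration.yaml
recorder:
  db_url: !secret mariadb_url

# secrets.yaml (same config directory); the URL shape matches the example above
mariadb_url: mysql://hass:<password>@<server-ip>/hass
```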

Start the hass server.


Sorry to say that these instructions did not work for me :frowning:

Installed the official MariaDB addon, then stopped Homeassistant, created and copied the haimport.sql
in to the mariadb docker container then did the following inside it:

mysql -u hass -p -h localhost hass < haimport.sql (Took 6 hours for ~2GB)

Hass couldn’t read the data at this point, so I moved on to the steps below:

update events set event_data = REPLACE(event_data, '`', '"');
update states set attributes = REPLACE(attributes, '`', '"');

select max(run_id) from recorder_runs;

alter table recorder_runs modify column run_id INT NOT NULL AUTO_INCREMENT,
AUTO_INCREMENT=NNNNN; ## this is 1 more than the max above from recorder_runs

alter table states drop foreign key states_ibfk_1;

select max(event_id) from events;

alter table events modify column event_id int(11) NOT NULL AUTO_INCREMENT,
AUTO_INCREMENT=NNNNN; ### this is 1 more than the max above from events

select max(state_id) from states;

alter table states modify column state_id int(11) NOT NULL AUTO_INCREMENT,
AUTO_INCREMENT=NNNNN; ### this is 1 more than the max above from states

I am not certain that the next step is required, but it was in SQLite and not in MariaDB.

alter table states add foreign key(event_id) references events (event_id);

Then, homeassistant did load the history components, but not any of the data.

I am not sure what you mean by not loading any data. If it didn’t work for you, you should have some error or info messages indicating problems in the Home Assistant logs, either in homeassistant.log or in syslog.

Another thought: When you created the db, did you give the user hass all permissions on the db?

My version of home assistant is running on hassbian and a raspberry pi 3. As indicated, the db is also located on the same Rpi, hence the reference to localhost above. I am not using any docker, or hassio.

The above steps were all I needed to migrate.

I mean that when I looked at the history and chose a date that’s well-within my “sqlite time”, there were no entries.

I did chance across this though, stating:

SQLite databases do not support native dates. That’s why all the dates are saved in seconds since the UNIX epoch. Convert them manually using this site or in Python:

from datetime import datetime
datetime.fromtimestamp(1422830502)

Is it possible that this changed since your migration and that the SQLite dates need to be converted?

When I was looking at the raw data I had during conversion, the data that came out of the sqlite3 dump had sql style dates in it.

Note that I did use sqlite3 and not earlier versions.
NOTE 2: I also have history enabled.

Here is a sample record from the sqlite3 dump showing the right datetime format before running the conversion:

INSERT INTO "states" VALUES(3712076,'sensor','sensor.zha_04a87e7c_1_2820','0.0','{"unit_of_measurement": "W", "friendly_name": "Zignore"}',3884192,'2019-02-02 02:40:44.167261','2019-02-02 02:40:44.167261','2019-02-02 02:40:44.269919','9e7bf549f25f498a9aabd0b736105f3b',NULL);
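A quick way to confirm those are already SQL-style datetimes rather than epoch seconds is to parse one (the timestamp string is taken from the sample record above):

```python
from datetime import datetime

# The timestamp string comes from the sample dump record above; if it parses
# with this format, it is already a SQL-style datetime, not epoch seconds.
ts = "2019-02-02 02:40:44.167261"
parsed = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S.%f")
print(parsed.isoformat(sep=" "))
```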

Did you take a look at the raw data that you got when doing the dump from sqlite?

This script worked for me:

All history was migrated! :slight_smile:

Does this work when running home assistant in docker too?
Not sure how I’ll install the mysqlclient in the virtualenv in that case

I am not familiar with hassio / docker. I performed the steps at the top on hassbian.
Note that I am not having any performance problems having mariadb installed on the same raspberry pi 3 as home assistant is installed on.

Nice one jr3us. Worked great. I did have to tweak your back-quotes update; it looked like it wasn’t formed right, or was missing a quote around the back-quote, or something got lost in the way the forum converts quotes when you copy them. Thanks a bunch!!

JR

Edit: By the way is it normal to see this line in the logs every time I reboot?

2019-11-13 20:07:00 WARNING (Recorder) [homeassistant.components.recorder] Ended unfinished session (id=112 from 2019-11-14 01:58:10)

I noticed the quoting issue above regarding the missing back-quote in the update statement. I believe I have fixed it now by turning the block of code above into preformatted text, and the back-quotes are showing up correctly.

I don’t know about the warning you are referring to. I don’t recall having seen it before.

Regards

Will this also work when you have HA inside Docker? And what is the reason to change?
To look at the SQL with phpMyAdmin? Or are there more reasons to change to a MySQL db?

I migrated to MariaDB because of performance issues when using SQLite (the default db).
When going to the history and logbook side tabs, Hassbian took forever to load.

I don’t know how it would work to convert Docker/HA to use MariaDB, but I imagine finding where the SQLite db and configuration.yaml are in the file system would be the hard part.
Your mileage may vary.

Regards

Hi everyone,
I have tried the migration to MariaDB on hassbian using these instructions, but I keep getting the following error:
Error during connection setup: libmariadb.so.3: cannot open shared object file: No such file or directory (retrying in 3 seconds)

I have tried to install all the dependencies but still I keep getting this error. Anybody had a similar experience?
Thank you.

Hi all,
You are talking about migrating the history for states/events, right?
I went through this before and gave up because the recorder gets purged anyway periodically and it was not worth the hassle.
Did I miss something, or this is not the recorder data?

Edit: In terms of the actual data I log it to influxDB with a longer/infinite retention, so things like temperature etc etc I can access… but not in the logbook.
Also I used Postgres, but that’s just a preference… same principle applies.

What version of MariaDB are you using?

mysql Ver 15.1 Distrib 10.1.38-MariaDB, for debian-linux-gnueabihf (armv7l) using readline 5.2

Ok, that’s a start; the docs for the recorder integration may help here:

For MariaDB you may have to install a few dependencies. If you’re using MariaDB version 10.2, libmariadbclient-dev was renamed to libmariadb-dev . If you’re using MariaDB 10.3, the package libmariadb-dev-compat must also be installed. For MariaDB v10.0.34 only libmariadb-dev-compat is needed. Please install the correct packages based on your MariaDB version.

However, you are running 10.1.38, so I’m not 100% sure which dependencies are needed; you may need to experiment! Try libmariadb-dev-compat first and see if that helps.

I have tried all combinations, without success…