Migrating from MySQL (MariaDB) back to SQLite

Using sed on Windows (via MinGW, WSL, or whatever method) is much faster than Notepad++, especially if your SQL file is very big.

1 Like

Using sed could also be done from the HassOS SSH terminal, which makes the conversion platform-independent. It is already part of the HassOS system.
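
For anyone going that route, here is a minimal sketch of the sed call, assuming the dump file is named ha_statistics_sql_dump.sql and you want the same _binary 0x… → X'…' substitution as the Notepad++ regex shown further down in this thread (the output file name is just an example):

sed -E "s/_binary[[:space:]]0x([0-9A-Fa-f]*)/X'\1'/g" ha_statistics_sql_dump.sql > ha_statistics_sqlite.sql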

Hello, I really want to get this working. I love the fact that it only takes the long-term statistic values.
However, when trying the command below in a HAOS environment, I get an error that the folder mysql_host doesn’t exist. Am I doing something wrong, or will this just not work?

$ mysqldump --no-create-info --complete-insert --skip-extended-insert --compact -h <mysql_host> -u <user> -p <ha_db> statistics_meta statistics statistics_runs statistics_short_term > ha_statistics_sql_dump.sql

“<mysql_host>” is a placeholder for the hostname of your MySQL server.
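
If you run the dump from the Terminal & SSH add-on (assuming mysqldump is available there), the filled-in command might look something like this; core-mariadb is the usual hostname of the MariaDB add-on and homeassistant the usual database name and user, so adjust them to your setup:

mysqldump --no-create-info --complete-insert --skip-extended-insert --compact -h core-mariadb -u homeassistant -p homeassistant statistics_meta statistics statistics_runs statistics_short_term > ha_statistics_sql_dump.sql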

right, should have known… thanks for the reply!

9 posts were split to a new topic: What’s the best database for HA with a lot of entities?

For anybody else trying the great info by mneuron, you need to enable the port in the configuration of the MariaDB addon (enter 3306 on the left):

In addition, the search regular expression for Notepad++ should be

(_binary\s0x)(\w*)

and the replacement

X'\2'
1 Like

I used this method, and it worked great! I only migrated the tables for the (long-term) statistics:

  • statistics
  • statistics_meta
  • statistics_runs
  • statistics_short_term

Therefore, I only needed to execute steps 1-7 & 11-17.
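
As a quick sanity check after a table-only migration like this, you can count the imported rows in the new SQLite file (a hypothetical spot check, assuming the sqlite3 CLI is available and the database lives at /config/home-assistant_v2.db):

sqlite3 /config/home-assistant_v2.db "SELECT COUNT(*) FROM statistics;"
sqlite3 /config/home-assistant_v2.db "SELECT COUNT(*) FROM statistics_short_term;"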

Great work, thanks!

Have you noticed any performance changes since migrating back to SQLite?

Having migrated back recently, average processor use appears to have increased by about 5 percent, with more spikes (I am recording two months’ worth of data, not just long-term statistics), but the logbook view seems snappier.

2 Likes

I’ll need to get back to this once the full 10 days of states and events have been recorded. It has only been 2 days so far, so there is not much recorded yet.

1 Like

I made a change to my energy monitoring last week which resulted in losing the historical data. I didn’t realise until several days had passed.

I took it as an opportunity to do a cold migration back to SQLite (disabling the MariaDB add-on, updating configuration.yaml to remove the db_url from the recorder: section, and restarting).
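
For reference, the configuration.yaml change is just removing (or commenting out) the db_url line so the recorder falls back to the default SQLite database in /config; the connection string and the purge_keep_days value below are only illustrative:

recorder:
  # db_url: mysql://homeassistant:<password>@core-mariadb/homeassistant?charset=utf8mb4
  purge_keep_days: 10  # example retention setting; with db_url gone the default SQLite file is used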

After a restart the SQLite db was recreated and everything carried on as normal.

The performance improvements are noticeable. Every view on the front end loads instantly; navigation is so much more snappy now.

I still have a very small DB and obviously as the days go on it’ll grow, but I’m happy with the switch so far.

There is also a command line tool to export a MariaDB/MySQL database to a new SQLite3 database.
I’ve successfully used it today to migrate back to SQLite3; all data was migrated, including short- and long-term statistics.
I use Home Assistant Core installation.

The steps should be as follows for Home Assistant OS:

  1. Using the Terminal & SSH addon, connect and install Python, pip and mysql-to-sqlite3:
    apk add python3
    apk add py3-pip
    pip install mysql-to-sqlite3
  2. Stop Home Assistant.
  3. Migrate the database (a filled-in example follows this list):
    mysql2sqlite -f /config/home-assistant_v2.db -d <database_name> -u <mariadb_user> -p -V --use-buffered-cursors
    Replace <database_name> and <mariadb_user> with the database name and user used by Home Assistant.
  4. Comment out the db_url in your YAML configuration.
  5. Restart Home Assistant Core.
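
As an illustration of step 3 with common add-on defaults (not something I could test on HA OS), the command could look like this; core-mariadb is the usual hostname of the MariaDB add-on and homeassistant the usual database name and user:

mysql2sqlite -f /config/home-assistant_v2.db -d homeassistant -u homeassistant -h core-mariadb -p -V --use-buffered-cursors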

Really check what I wrote before executing (e.g. the paths), as I did not have the opportunity to test this with HA OS, but the main ideas are here.
All tables are migrated using this command, so if you use the same database for Home Assistant statistics and something else, that other data will be migrated too.
See the tool documentation for how to make a more selective migration if needed.
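
For example, to restrict the migration to the long-term statistics tables mentioned earlier in this thread, the tool has a table-selection option (-t / --mysql-tables, if I read its documentation correctly; check mysql2sqlite --help for the exact syntax in your version):

mysql2sqlite -f /config/home-assistant_v2.db -d homeassistant -u homeassistant -p -V --use-buffered-cursors -t statistics statistics_meta statistics_runs statistics_short_term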

10 Likes

Will all data be migrated, or only the long-term statistics?

All data will be migrated.
I’ve just edited my previous post to clarify that all data will be migrated.

1 Like

This worked great for me on Home Assistant OS (on a Home Assistant Blue). I ran it from my Ubuntu laptop, and it took just a few minutes to convert a multi-gigabyte MariaDB database.

For ease of cleanup, I installed it in a python virtual environment:

python3 -m venv venv
source venv/bin/activate
pip install mysql-to-sqlite3

I could then run it from my Ubuntu laptop against my HA host as follows (after enabling port 3306 in the MariaDB add-on settings as shown in the screenshot by @AleXSR700 above):

mysql2sqlite -f home-assistant_v2.db -h <home assistant host> -P 3306 -d homeassistant -u homeassistant -p -V --use-buffered-cursors

It took 7 minutes to create a 3 GB database in home-assistant_v2.db, which I then copied into the /config directory over SMB. I then commented out the db_url line that specified the MariaDB details in the recorder section of my configuration.yaml. I restarted Home Assistant and it picked up the new SQLite3 database, with all historical data intact.

Thanks for the great info!

7 Likes

Just curious about speed: after the migration, is HA running faster now?

I can’t tell, I’m afraid. Things were already fast with no noticeable delays (apart from very slow DB migrations on MariaDB during certain monthly upgrades).

1 Like

I haven’t noticed any performance changes since migrating back from MariaDB to SQLite. I also kept an eye on the memory/CPU usage in Proxmox, but no significant changes there.

1 Like

I don’t think it makes a difference. I have not noticed anything on my RPi4, so if there is a measurable difference, it is not noticeable in real life.
But it is one less add-on that needs supporting.

And for some reason, the SQLite version is almost 1 GB smaller than the MariaDB database was.