Anyone managed to get onto something here?
I’m soon planning to migrate, since I want more long-term statistics and I already have MySQL running on the server that runs HASS.
I have no idea who is able to sort out the remaining issues. I still don’t feel confident enough to migrate using this guide (still the best one, but not error-free unfortunately), because reliability is a must.
What about using DBeaver for migration - could it be useful?
Heard about it here (Energy Management in Home Assistant - #1156 by ChirpyTurnip) first, and saw a few others (like here: Home Assistant Add-on: PostgreSQL + TimescaleDB - #22 by Ecard) using this software for migrating to PostgreSQL.
Maybe it’s just another tool to achieve the same result as the CLI approach shown in earlier posts.
Seems like DataGrip (Download DataGrip: Cross-Platform IDE for Databases & SQL) works just fine for converting SQLite to MySQL.
I’ve set it up and am copying SQLite to MySQL, and HASS seems to work fine with it. Will test again once my second transfer is complete.
My first test was seemingly all working; now I’m converting the most up-to-date DB and will keep using MySQL for the following day to see how it goes.
Does the trial version do the trick completely, or does one need a license? In other words, are there any limits on functionality, database size, or number of records?
Great to hear! I did some further research and I think this information would guide us to a successful migration:
- Tool 1: Download DataGrip: Cross-Platform IDE for Databases & SQL
- Tool 2: SQLite to MySQL Conversion and sync. | DBConvert
- Knowledge 1: sqlite - Quick easy way to migrate SQLite3 to MySQL? - Stack Overflow
- Knowledge 2: https://www.reddit.com/r/Database/comments/axyf6e/migrating_from_sqlite_to_mysqlmariadb/
And it also seems to be possible to migrate back/vice versa (MySQL to SQLite).
Keep us posted on your results, @zagi988, especially whether the Energy Dashboard and the other things (noted as malfunctioning in earlier posts) are working fine.
I’m trying to install MySQL, but I always get this error:
sudo: apt: command not found
Results? Working?
I did an initial trial yesterday, and I found the import into MariaDB to be extremely slow. My DB was 484,491,264 bytes, the SQL dump 368,643,027 bytes (smaller!), and the compressed SQL for MariaDB 44,741,761 bytes (roughly a tenth).
FYI, I found another record of a conversion: Migrating Home Assistant from sqlite3 to MySQL 8.0 .
I do not want to have long downtime. I think we need to use the features offered by sqlite3 (see How To Use The SQLite Dump Command) and split the process into steps to limit downtime:
- Copy the DB to a workstation (i.e. a faster computer with easy access). Doing this while the system is live is doable - I’d prefer rsync, but I do not have it on my HAOS.
- Prepare the import on the workstation:
- Create an SQL script to create the SCHEMA (all the tables);
- Export the most recent data needed to continue regular functionality:
- Use “.mode insert” and appropriate SELECTs to create the inserts.
- Most recent data to be inserted first.
- Use an upsert (“REPLACE INTO” / “INSERT … ON DUPLICATE KEY UPDATE” in MySQL) rather than a plain “INSERT”.
- Export all the data (without the SCHEMA) to another file:
- Most recent data to be inserted first.
- Use an upsert rather than a plain “INSERT”.
- Optimize by grouping the inserts / using transactions.
- IGNORE FOREIGN KEYS.
- Set AUTO_INCREMENT counters to the biggest current value + a margin.
The margin allows HA to continue adding some data after the import.
- Perform a first import without bringing HA offline:
- Apply the schema;
- INSERT/UPDATE the most recent data.
- This way the target database will already have an image.
- One could test the database with a test HA instance on the workstation to see that everything is fine.
(In that case, it may be necessary to restart the entire import after testing, because HA will add data.)
- Stop HA.
- Do step 2 again, but skip step 3 and continue here.
- Import the small file with the most recent data;
- Update the HA configuration
- start HA;
- Import the big file while HA is online.
Most of the data will already be imported, so the user already has the previously imported history.
There may be a gap which corresponds to the delay between the first import and the final import.
The insert/update will ensure only the changes are applied.
Automate all of the above.
I started a script here: https://gist.github.com/38854d24863c1081154cf08d75e6535a. It does not include my proposed procedure above.
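A minimal sketch in Python of the newest-first, batched, upsert-style export described above, using only the stdlib sqlite3 module. Table and column names are placeholders for whatever your schema uses, REPLACE INTO stands in for the post’s “INSERT OR UPDATE”, and the repr()-based quoting is only illustrative - a real migration needs proper SQL escaping:

```python
import sqlite3

def dump_newest_first(con, table, order_col, batch=500):
    """Yield MySQL statements that re-create `table`, newest rows first.

    REPLACE INTO gives upsert semantics, so the dump can safely be applied
    on top of rows HA has written in the meantime; rows are grouped into
    transactions of `batch` statements to speed up the import.
    """
    cur = con.execute(f"SELECT * FROM {table} ORDER BY {order_col} DESC")
    cols = ", ".join(d[0] for d in cur.description)
    yield "SET FOREIGN_KEY_CHECKS=0;"
    n = 0
    for row in cur:
        if n % batch == 0:
            if n:
                yield "COMMIT;"
            yield "START TRANSACTION;"
        # repr() quoting is naive (no real SQL escaping) - illustration only
        vals = ", ".join("NULL" if v is None else repr(v) for v in row)
        yield f"REPLACE INTO {table} ({cols}) VALUES ({vals});"
        n += 1
    if n:
        yield "COMMIT;"
    yield "SET FOREIGN_KEY_CHECKS=1;"
```

Writing the big file this way means the final catch-up import only has to re-send the newest rows.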
For me it seems to be working. The Energy dashboard was OK, everything was OK, only the history was slooow.
MySQL is running on a server that keeps the database on HDDs that are not that fast, and either it was unable to process that much at a time or the hard drives were simply too slow.
So I switched back to SQLite until the SSD for that server arrives.
If history is still slow, the slow queries could be analysed to add indexes to speed them up.
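For example (a toy illustration using stdlib sqlite3; on MySQL/MariaDB you would read the slow query log, run EXPLAIN on the offending statement, and issue the same kind of CREATE INDEX - and note that HA’s real tables already ship with several indexes, so this only helps for queries that are genuinely unindexed):

```python
import sqlite3

# Check a query's access path, add an index, and confirm the planner uses it.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE states (state_id INTEGER PRIMARY KEY,"
            " entity_id TEXT, last_updated TEXT)")

def plan(sql):
    # EXPLAIN QUERY PLAN rows end with a human-readable detail string
    return " ".join(r[-1] for r in con.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM states WHERE entity_id = 'sensor.power'"
before = plan(query)  # without an index: a full table scan
con.execute("CREATE INDEX ix_states_entity_id ON states (entity_id)")
after = plan(query)   # with the index: an indexed lookup
```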
I have been trying to migrate using the steps at the top, but because they are based on an old database schema, and in 2022.4 things are a bit different, people will have issues.
I am writing notes as I fix the issues I find, but there are a few extra gotchas, FYI all.
Meanwhile there have been further changes to the database schema. According to the release notes of 2022.4 to 2022.6, e.g. attributes and events have been moved out of the states table into separate tables. All of that needs to be considered to get a stable, working MySQL database after the conversion.
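To picture what changed, here is a hedged mock of the post-2022.4 layout (toy tables only - the real schema has many more columns - but the states.attributes_id → state_attributes linkage is the part a conversion tool must carry over intact, foreign-key values included):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
-- Attributes de-duplicated into a shared table (per the release notes)
CREATE TABLE state_attributes (
    attributes_id INTEGER PRIMARY KEY,
    shared_attrs  TEXT
);
CREATE TABLE states (
    state_id      INTEGER PRIMARY KEY,
    entity_id     TEXT,
    state         TEXT,
    attributes_id INTEGER REFERENCES state_attributes(attributes_id)
);
INSERT INTO state_attributes VALUES (1, '{"unit_of_measurement": "W"}');
INSERT INTO states VALUES (10, 'sensor.power', '230', 1),
                          (11, 'sensor.power', '231', 1);
""")
# After 2022.4 a state's attributes only exist via this join, so both
# tables must survive the migration for history to render correctly.
rows = con.execute("""
    SELECT s.entity_id, s.state, a.shared_attrs
    FROM states s JOIN state_attributes a USING (attributes_id)
    ORDER BY s.state_id
""").fetchall()
```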
So yes, I second the question asked here: Migrating home assistant database from sqlite to mariadb - #88 by WeterPeter
Hi,
I have found a quite simple solution, if you already have a MySQL database running somewhere:
- Stop Home Assistant and take a backup.
- Take the file home-assistant_v2.db and convert it from SQLite to MySQL using sqlite3-to-mysql (sqlite3-to-mysql · PyPI) - it transfers the whole database to a MySQL database.
- Export the data from the MySQL server - I used MySQL Workbench to export the data to a file (with “Include Create Schema” set).
- I had to replace utf8mb4_unicode_ci with utf8mb4_general_ci in all files, because at first I got some errors during the import related to different database versions.
- Start a fresh MariaDB add-on in HASS with the port exposed to the outside world.
- Import the data into MariaDB - I used MySQL Workbench to import the data from the file.
- Enable the MariaDB URL in the recorder section of configuration.yaml (the secret holds a URL of the form mysql://user:password@host:3306/dbname?charset=utf8mb4):
recorder:
  db_url: !secret mariadb_url
In the end I did not have to think about foreign keys, and my long-term statistics work fine. I have not noticed any problems; however, I only finished my migration a few minutes ago.
Hi Mariusz, we are a few minutes further ahead in time. Did you notice any problems that you’d care to mention for those interested in migrating as well?
Also very interested in this, because… the SQLite database s**ks a lot in the meantime.
I had similar problems: a purge of the database led to a corrupted DB. With your excellent guide I was able to save all the long-term data and move to MariaDB. Thank you very much!
Hi - thanks for this. I tried to run sqlite3mysql and it threw an error saying:
You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ‘-assistant_v2’ at line 1
Any ideas on this? Wondering why I’m getting the error, but I assume you did not?
Thanks!
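A guess, not a confirmed fix: “near ‘-assistant_v2’” looks like an unquoted identifier containing a hyphen (home-assistant_v2), which MySQL only accepts when wrapped in backticks. The same quoting rule can be demonstrated with stdlib sqlite3 (which uses double quotes where MySQL uses backticks):

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Unquoted, the hyphen is parsed as a minus sign ("home" minus
# "assistant_v2"), giving a syntax error - the same class of failure
# as the MySQL message quoted above.
try:
    con.execute("CREATE TABLE home-assistant_v2 (x INTEGER)")
    unquoted_ok = True
except sqlite3.OperationalError:
    unquoted_ok = False

# Quoted, the name is legal.  (MySQL equivalent:
# CREATE TABLE `home-assistant_v2` (x INTEGER))
con.execute('CREATE TABLE "home-assistant_v2" (x INTEGER)')
```

If that is the cause, passing a target database name without a hyphen (or making sure the tool quotes it) should avoid the error.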
Hi, yesterday I gave this a try. I executed all the steps indicated in the message from @jr3us from last Jan 11th, but didn’t succeed because of this error:
(ERROR) components/recorder/run_history.py
Error executing query: (MySQLdb.IntegrityError) (1364, "Field 'run_id' doesn't have a default value") [SQL: INSERT INTO recorder_runs (start, end, closed_incorrect, created) VALUES (%s, %s, %s, %s)] [parameters: (datetime.datetime(2022, 8, 18, 19, 21, 18, 194665, tzinfo=datetime.timezone.utc), None, 0, datetime.datetime(2022, 8, 18, 19, 21, 18, 415689, tzinfo=datetime.timezone.utc))] (Background on this error at: https://sqlalche.me/e/14/gkpj)
21:21:18 – (ERROR) Recorder
I’ll take a look at this later today; I’m sending this notice in case anyone already knows what is happening and can help me.
Cheers