Migrating from MySQL (MariaDB) back to SQLite

TBH, I fail to understand the purpose of running MariaDB on the HA device itself.
It adds all the inconvenience and none of the benefits.

Also, as I understand it, the add-on doesn’t perform a migration.

I run HA behind an Apache proxy so I can access it over SSL; I’m not sure I see the link with the database being used.

The link is that the NGINX Proxy Manager addon needs to store its own data somewhere, and the author decided to do so in the same DB server as HA, which requires the use of MariaDB. It just happens to be a very convenient way to set up external access via SSL using certificates from DuckDNS, including their renewal. That’s what prompted me to use it, way back when.

And no, the integration does not perform a migration. This was less of an issue before the advent of long-term statistics, as everything else rolls off the database pretty quickly. Hence I never attempted a manual migration, and never had the related timestamp issues you referenced. But obviously if you were going MariaDB-external to MariaDB-addon, you could move the databases.

The only inconvenience is applying updates to the addon occasionally (I’m not a fan of auto-update).

I switched to MariaDB (via the MariaDB add-on) a long time ago. But now I want to switch back to SQLite without losing the data.

The first step is clear:
Create a new empty SQLite database first. Let HA do this for you by removing the recorder MariaDB settings from configuration.yaml. This is a very important step, as explained in the Home Assistant 2023.4 Release Party.

The second step is the migration of the (statistics and other) data.
What is the suggested way to do that?

I’d like to know this too.

Now it probably does not make sense to use anything other than SQLite, but a long time ago MariaDB was much faster on HA.

Because the optimisation by the HA team for SQLite was huge, I wanted to switch back from MariaDB to SQLite…
I finally succeeded in switching (back) from MariaDB to SQLite and keeping all states/events and statistics. (I’m on 2023.4.1; the database schema is 41.)
I used HeidiSQL, connected to the MariaDB database, and:

  1. Export all tables separately (with data: ‘replace existing data’; only the DATA, no table structure).

  2. Change configuration.yaml:
    recorder:
    # db_url    (now SQLite is the default DB)

  3. Delete the file ‘home-assistant_v2.db’ if it exists in the config folder.

  4. Using SSH: ha core stop.

  5. (SSH): ha core start. Now a new ‘home-assistant_v2.db’ is created. Directly after HA has started:

  6. ha core stop. Now open ‘home-assistant_v2.db’ in HeidiSQL.

  7. Clear all DATA rows from all the TABLES.

  8. In table EVENTS create a column named ‘created’ (without quotes) with datatype ‘datetime’; in table STATES create a column named ‘created’ with datatype ‘datetime’ and a column named ‘domain’ with datatype ‘varchar’ and length 64.

  9. In Notepad++ (which I used) edit the file states.sql: find & replace every _binary 0x018725fccf354f12f37776356bc7c8b5 (a MariaDB binary literal; this is just an example hex value) with a SQLite blob literal: X'018725fccf354f12f37776356bc7c8b5'.
    In Notepad++ use search mode = regular expression, put (_binary\s0x)(\w*) in the find field and X'\2' in the replace field, and choose Replace All. (A command-line alternative is sketched at the end of this post.)

  10. Repeat step 9 on the file events.sql.

  11. Import all previously exported and Notepad++-edited files in HeidiSQL with ‘File > Run SQL file’.

  12. If everything was successful (remember that you started with all data removed from all the tables), you will now see all the tables filled again with the same data as was contained in MariaDB.

  13. The columns created in step 8 can now be deleted (SQLite doesn’t use them), but I didn’t and everything went fine.

  14. Close HeidiSQL.

  15. Make sure the ‘home-assistant_v2.db’ in the config folder of Home Assistant is the newly filled one.

  16. In SSH: ha core start.

  17. All statistics/history/states are back and SQLite is the database again.

NB: If the exported files ‘states.sql’ or ‘events.sql’ are very big, Notepad++ may take a long time to find & replace.
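
For those who prefer the command line over Notepad++ for step 9, roughly the same find & replace can be done with sed (a sketch, reusing the file names from the steps above; spot-check a few converted lines before importing):

$ # Convert MariaDB binary literals (_binary 0x...) into SQLite blob literals (X'...')
$ sed -E "s/_binary[[:space:]]+0x([0-9A-Fa-f]*)/X'\1'/g" states.sql > states.converted.sql
$ sed -E "s/_binary[[:space:]]+0x([0-9A-Fa-f]*)/X'\1'/g" events.sql > events.converted.sql

On dumps of a gigabyte or more this is typically much faster than doing the replacement in an editor.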

Thank you very much! I was about to abandon all my history data because of a lot of errors using MariaDB after updating Home Assistant. But now I can enjoy my energy dashboard and look at past data after this migration.

The only thing I did not get to work was the conversion from MariaDB blob to SQLite blob. I received an error when I tried to import the converted .sql into ‘home-assistant_v2.db’ using HeidiSQL. Instead of replacing the existing blobs with X'\2', I replaced them with NULL, because the blobs were what gave me the error during import. This seems to work fine.

Don’t know if I lost any significant data by replacing the blobs with NULL, but the only thing I really cared about was the energy dashboard, and that works like it did. Looks like the history tab works with the older data too, so I am pleased with the result.

My states.sql file was a couple of million lines (1 GB) and Notepad++ had to work maybe 5 minutes to complete the replacement. It will say “not responding”, but just sit back and relax. I used the Task Manager in Windows to see when it was finished (CPU load drops to ≈ 0%).

Another take, CLI only

Assumptions:

  • You are only interested in migrating long-term statistics, and are not interested in recent states.
  • You are using a Debian derivative

Prerequisites:

  • mysqldump: sudo apt-get install default-mysql-client
  • sqlite3: sudo apt-get install sqlite3

Steps:

  1. Shut down HA
  2. Make a mysql export of the statistics tables:
$ mysqldump --no-create-info --complete-insert --skip-extended-insert --compact -h <mysql_host> -u <user> -p <ha_db> statistics_meta statistics statistics_runs statistics_short_term > ha_statistics_sql_dump.sql

NOTE: Be sure to include --complete-insert. It took me a while to understand why this was not working: the order of fields was different between my grandfathered MariaDB and a brand-new SQLite.

  3. Comment out the db_url of the recorder in configuration.yaml:
recorder:
  # db_url: !env_var DB_URL
  4. Remove or move any existing home-assistant_v2.db
  5. Start HA once so that an empty SQLite database is created
  6. Immediately after HA has started successfully, shut it down again
  7. Import the statistics into the newly created SQLite DB:
$ echo "BEGIN TRANSACTION;
DELETE from statistics_meta;
DELETE from statistics;
DELETE from statistics_runs;
DELETE from statistics_short_term;
$(cat ha_statistics_sql_dump.sql)
COMMIT;
" | sqlite3 home-assistant_v2.db
  8. Take a quick coffee break while waiting for the import to complete
  9. Restart HA

DONE!
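
As a quick optional sanity check (a sketch; host, user and database placeholders as in the mysqldump command above), compare row counts between the old MariaDB and the new SQLite file; they should match for each of the four migrated tables:

$ mysql -h <mysql_host> -u <user> -p <ha_db> -e "SELECT COUNT(*) FROM statistics;"
$ sqlite3 home-assistant_v2.db "SELECT COUNT(*) FROM statistics;"

Repeat for statistics_meta, statistics_runs and statistics_short_term.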

Rollback procedure:

  1. Uncomment the db_url of the recorder in configuration.yaml:
recorder:
  db_url: !env_var DB_URL
  2. Restart HA

I’ve run into issues with MariaDB following 2023.4, so I’m looking into options to migrate back.

This is probably a dumb question - but does this mean your steps will not work for those on HAOS? If so, how would these steps change to migrate on HAOS?

Your CLI method doesn’t export & convert the states and events tables!

All you need is mysqldump and sqlite3; those are available on most Unix-like machines (macOS, FreeBSD, Linux of any kind).

He did say:

  “You are only interested in migrating long-term statistics, and are not interested in recent states.”

Yes, he did say that. And I’m wondering why not. Is it technically not possible? The previous method using Notepad++ includes those tables.

That. I mention Debian because the apt-get command to install the tools is from Debian.

It just adds a level of complexity for something I don’t care about. I didn’t even try.
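
For anyone who does want recent history as well: in principle the same CLI approach extends to it (untested here, and only a sketch; on the 2023.4 schema the states and events rows reference several side tables, their binary context columns need the same _binary-to-X'...' conversion as in the HeidiSQL post, and any legacy columns like the ‘created’/‘domain’ ones mentioned in step 8 there would still have to be dealt with):

$ mysqldump --no-create-info --complete-insert --skip-extended-insert --compact \
    -h <mysql_host> -u <user> -p <ha_db> \
    states_meta state_attributes states event_types event_data events > ha_history_sql_dump.sql
$ sed -E "s/_binary[[:space:]]+0x([0-9A-Fa-f]*)/X'\1'/g" ha_history_sql_dump.sql > ha_history_sqlite.sql

After clearing the corresponding tables in the new home-assistant_v2.db (as is done for the statistics tables above), the converted file could be imported with sqlite3 in the same way.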

Thanks. But the implications are not very clear to me.

What will I lose if only the long-term statistics are migrated?

Let’s say I do the migration on April 15, 2023 at 13:00.

Which data will get migrated?

Let’s put it another way: if you didn’t tweak your recorder settings, you don’t have any detailed state history older than 10 days right now.

If you see anything older, it comes from long-term statistics.
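
To see this for yourself before migrating, you can ask the current database for its oldest rows (a sketch; placeholders as in the CLI post above, and the *_ts epoch columns assume the 2023.4 / schema 41 layout, so adjust for older schemas):

$ # Oldest detailed state row (purged after ~10 days by default)
$ mysql -h <mysql_host> -u <user> -p <ha_db> -e "SELECT FROM_UNIXTIME(MIN(last_updated_ts)) FROM states;"
$ # Oldest long-term statistics row (kept indefinitely)
$ mysql -h <mysql_host> -u <user> -p <ha_db> -e "SELECT FROM_UNIXTIME(MIN(start_ts)) FROM statistics;"

Anything older than the first timestamp only exists as long-term statistics, which is exactly what the statistics-only migration keeps.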

Interesting thread.

Like others, I moved to MariaDB a long time ago (and the performance improvements were massive at the time), but I’m also intrigued whether it’s worth moving back. For those who have moved back, do you have any quantitative data on the relative performance of SQLite vs MariaDB that would justify the move?

Personally speaking, MariaDB is working OK for me and I probably won’t consider moving unless MariaDB becomes unsupported, or there is a really significant performance uplift from switching.

(My hardware: RPi4 4GB with SD boot, and USB SSD for data. HA 2023.4.3)

I’m in the same boat. SQLite was causing me massive issues, especially if you ever needed to restore it. Has the restore issue been fixed? I mean, if you have to restore and lose the entire database anyway, then what is the point? So I moved to MariaDB and have been pretty happy until recently.

I’ve noticed recently that about 50% of the time, if I simply restart HA, it will complain that the recorder can’t be started, history is broken, and MariaDB is the problem; but if I reboot the entire Pi then it’s fine. I added a button on my dashboard to show me the DB size and it’s pretty big, about 5 GB. I run my Pi 100% from an SSD, and while it is sometimes slow starting, it’s mostly been OK. You might be thinking this is an insane DB size, but MariaDB allowed me to track stats far longer than SQLite ever did, so I kind of went crazy with it.
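
If you just want a rough size check without a dashboard card (a sketch; placeholders as in the CLI post above), MariaDB can report it directly:

$ mysql -h <mysql_host> -u <user> -p -e \
    "SELECT table_schema AS db, ROUND(SUM(data_length + index_length)/1024/1024/1024, 2) AS size_gb
     FROM information_schema.tables GROUP BY table_schema;"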

I think this is a great topic to bookmark in case I feel the need to go back.

I’ve just checked and mine is still only 0.5 GB (I keep retention quite tight still), so this is perhaps why I have fewer problems. I’ve not noticed any particular HA restart connectivity issues, or traced any to MariaDB / the recorder specifically.

Honestly I don’t have any issues outside of occasional slow start-ups or the failed recorder, but I am able to work around it. I’ll pare down my database to try to rein things in, but it isn’t causing any real issues right now.
