HASS Container to HASS OS: MariaDB migration

To import into the new HA instance, first select the homeassistant database in the left pane, then:

1. Delete the database by selecting `drop database`.

2. Create a new database and name it homeassistant.

3. Go to Import, select the file you just exported, disable partial import, and click the Import button.

Done. Please let us know if this worked for you, as it might help other people in the future.
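If you prefer the command line to the phpMyAdmin UI, the same steps can be done over SSH with the mysql client. This is only a sketch; the host, user, and password are examples, so substitute your own:

    # Drop and recreate the database, then import the dump.
    # Host, user and password are examples; substitute your own.
    mysql -h 192.168.0.100 -u homeassistant -p \
      -e "DROP DATABASE homeassistant; CREATE DATABASE homeassistant CHARACTER SET utf8mb4;"
    mysql -h 192.168.0.100 -u homeassistant -p homeassistant < homeassistant.sql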


Hi! Thank you for your quick response. The details of my domestic server are:

  • Raspberry Pi 4
  • Raspbian
  • Hass Docker
  • MariaDB 10.11.2

I’m using SSH to access the server and my personal computer is a Mac.

So, could the MariaDB version be a problem?

No problem. You just need to use phpMyAdmin to move the database from one instance to another. The SQL file will be saved on your Mac.
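If you ever want to skip phpMyAdmin for the export step, a rough mysqldump sketch run from your Mac looks like this; the container name mariadb and all credentials are examples that will differ on your system:

    # Dump the database from the old instance over SSH to a file on your Mac.
    # "mariadb" is an example container name; check yours with: docker ps
    ssh pi@192.168.0.100 \
      "docker exec mariadb mysqldump -u homeassistant -p'PASSWORD' homeassistant" \
      > homeassistant.sql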

Try the official phpMyAdmin Docker image.
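A minimal sketch of running it, assuming a MariaDB container named mariadb on a shared Docker network called mynet (both names are examples):

    # PMA_HOST tells phpMyAdmin which database server to connect to.
    docker run -d --name phpmyadmin \
      --network mynet \
      -e PMA_HOST=mariadb \
      -p 8080:80 \
      phpmyadmin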

Yes, I have the official phpMyAdmin docker image installed. This afternoon, I will try to follow your “manual” :smiley:
As soon as I have a result, I will write it here.

Ok, this is the result.
phpMyAdmin:

Add-on: phpMyAdmin
A web interface for the official MariaDB add-on

Add-on version: 0.8.3
You are running the latest version of this add-on.
System: Home Assistant OS 9.5 (aarch64 / raspberrypi4-64)
Home Assistant Core: 2023.3.2
Home Assistant Supervisor: 2023.03.1

Please, share the above information when looking for help
or support in, e.g., GitHub, forums or the Discord chat.

s6-rc: info: service base-addon-banner successfully started
s6-rc: info: service fix-attrs: starting
s6-rc: info: service base-addon-log-level: starting
s6-rc: info: service fix-attrs successfully started
s6-rc: info: service base-addon-log-level successfully started
s6-rc: info: service legacy-cont-init: starting
s6-rc: info: service legacy-cont-init successfully started
s6-rc: info: service init-phpmyadmin: starting
s6-rc: info: service init-nginx: starting
s6-rc: info: service init-nginx successfully started
s6-rc: info: service init-phpmyadmin successfully started
s6-rc: info: service php-fpm: starting
s6-rc: info: service php-fpm successfully started
s6-rc: info: service nginx: starting
s6-rc: info: service nginx successfully started
s6-rc: info: service legacy-services: starting
s6-rc: info: service legacy-services successfully started
[16:21:28] INFO: Starting PHP-FPM…
[16:21:29] INFO: Starting NGinx…
2023/03/13 16:28:46 [error] 249#249: *33 FastCGI sent in stderr: “PHP message: PHP Fatal error: Maximum execution time of 300 seconds exceeded in /var/www/phpmyadmin/vendor/phpmyadmin/sql-parser/src/Lexer.php on line 236” while reading response header from upstream, client: 172.30.32.2, server: a0d7b954-phpmyadmin, request: “POST /index.php?route=/import HTTP/1.1”, upstream: “fastcgi://127.0.0.1:9001”, host: “192.168.0.100:8123”

Invalid config

The following integrations and platforms could not be set up:

Please check your config and logs.

:frowning:

What is the size of the export file homeassistant.sql?

I found the solution to your issue; it is in the documentation:

Option: upload_limit

By default, the size limit for uploads (for operations such as imports) is set to 64MB. This can be increased with this option, for example, 100 would be 100MB.

Go to the add-on, click Configuration, and change upload_limit to the size of your homeassistant.sql file plus, say, 10 MB.
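To pick a sensible value, check the dump size first, e.g.:

    # Show the dump size in human-readable form; if it reports e.g. 160M,
    # set upload_limit to 200 (the option is interpreted as MB).
    ls -lh homeassistant.sql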

It also seems this is a performance issue: the request takes too long to execute.

You need to change a value in the config file to overcome this limit. However, the system might then take a very long time to respond (possibly days, or it may never succeed).

This is a tricky one to resolve, as which file needs changing depends on the implementation. You need to find the PHP and/or NGINX files that contain an identifier named mysql.connect_timeout and/or max_execution_time. You will see a value of 300, which you need to change to, e.g., 6000, or 0 for no limit.
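If you have shell access to the Docker host, something along these lines might help locate the setting; the container name below is only a guess based on the add-on slug visible in the log above:

    # Search the phpMyAdmin container for the relevant limits.
    # Find the real container name first with: docker ps | grep phpmyadmin
    docker exec addon_a0d7b954_phpmyadmin \
      grep -rn -e max_execution_time -e mysql.connect_timeout /etc /var/www/phpmyadmin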

That said, 300 seconds is already a long time for SQL operations (1,000,000 operations should take no more than a handful of seconds to complete), so either your system is extremely slow or your database contains hundreds of millions of records.

Attempt the import with Partial import enabled, then check how much data you are missing.

Another way is to split the SQL file into chunks: say, divide it into four roughly equal parts and import them one after the other, in sequence.
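A rough command-line sketch of that, assuming the dump has one statement per line (as phpMyAdmin exports usually do); note that a naive line-based split can cut a multi-line statement in half, which is why a dedicated splitter tool (one is mentioned later in the thread) is safer:

    # Split the dump on line boundaries, then import the pieces in order.
    # The line count is an example; adjust it to get roughly 4 chunks.
    split -l 100000 homeassistant.sql chunk_
    for f in chunk_*; do
      mysql -u homeassistant -p homeassistant < "$f"
    done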

These are all the tables and their relationships that you need to populate. You can split your SQL file into three chunks: linked tables on the left, linked tables on the right, and the unlinked tables (these are the smallest).

My SQL file is 160 MB. Is that too big a file for phpMyAdmin?
I'm only interested in preserving the data from the energy dashboard. Could I maybe delete all the other data from the DB?
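As a side note, if only the energy dashboard history matters: in recent HA versions the long-term statistics live in the statistics* tables of the recorder schema, so a dump limited to those tables might be enough. The table names below follow that schema; verify them against your own database with SHOW TABLES first:

    # Dump only the long-term statistics tables (energy dashboard data).
    # Confirm these table names against your own database before relying on this.
    mysqldump -u homeassistant -p homeassistant \
      statistics statistics_meta statistics_short_term statistics_runs \
      > energy_only.sql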

In case you missed my earlier message you need to do the following:

Go to add-ons and open phpMyAdmin. Then go to configuration and enable Show unused optional configuration options.

Type 200 under upload_limit and click SAVE.

Restart phpMyAdmin under Info. Your upload should not fail.

Yes, that's the first thing I did: increase the upload limit. :blush:
I'm going to try again, following your recommendations. Thanks.

Okay, I confirm that the problem is related to either the file size or the performance issue, as you suggested. I need more than 300 seconds to complete the task. I don’t think the split option will work since 90% of the total file size is for a single table. How can I increase the time limit beyond 300 seconds? Is it possible to locate and edit the PHP file as you suggested, in order to increase the max_execution_time in HASSOS?
I am going to switch my dedicated HASS hardware from a Raspberry Pi to an Intel Celeron NUC in approximately one month, so I think that I will probably be able to resolve this issue by following your instructions. However, for other users, it would be interesting to explore the solution further to find a way to resolve this on “low-performance” hardware such as a Raspberry Pi.

I need to dedicate a bit of time to investigating where these files are, so I need to set up a Docker environment where I can go inside the container. This will take a bit of time; I will try to do it this weekend.
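In the meantime, a possible starting point is to drop an override into the container's PHP conf.d directory; the container name and the path below are examples (the path varies with the PHP build, so check it with php --ini):

    # Raise PHP's execution limit inside the add-on container (0 = unlimited).
    # The conf.d path is an example; find yours with: docker exec <name> php --ini
    docker exec addon_a0d7b954_phpmyadmin \
      sh -c 'echo "max_execution_time = 0" > /usr/local/etc/php/conf.d/zz-limits.ini'
    # Restart the add-on (or php-fpm) afterwards for the change to take effect.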

You can split the large table into chunks and import them in sequence. However, unless you have to, the effort might not be worth the benefit, given that you are getting hardware that will address the issue very soon.

Ok, I used the split option like you suggested and it worked! :partying_face:
The program I used to split the SQL file exported by phpMyAdmin is SQLDumpSplitter3, which is available for Linux, Windows, and macOS. I split the database into 8 separate files, each 20 MB in size, but I recommend using a lower value, such as 16 MB.
At the end of each import, a confirmation message should be displayed indicating that the import operation was successful.

By the way, I apologize for my English. :grimacing:
I hope this has been useful to someone.

Glad it worked 🙂. A very useful utility I knew nothing about. Well done! I suggest you do an export from both systems and use a file-compare utility to make certain the datasets match, or that any differences make sense.
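Since raw dumps differ in comments and timestamps, a simpler sketch is to compare per-table row counts from both systems; OLD_HOST and NEW_HOST are placeholders for your two database hosts, and note that table_rows is only an estimate for InnoDB (use SELECT COUNT(*) per table for exact figures):

    # Run the same query against each instance, then diff the outputs.
    mysql -h OLD_HOST -u homeassistant -p -e "SELECT table_name, table_rows FROM information_schema.tables WHERE table_schema='homeassistant' ORDER BY table_name;" > old_counts.txt
    mysql -h NEW_HOST -u homeassistant -p -e "SELECT table_name, table_rows FROM information_schema.tables WHERE table_schema='homeassistant' ORDER BY table_name;" > new_counts.txt
    diff old_counts.txt new_counts.txt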

Great suggestion, but unfortunately it’s too late for me. :slight_smile:
At the moment, everything is working perfectly, even after migrating the MariaDB from the Hassio RPi to the Hassio NUC system by restoring either a complete backup or a partial backup of only the MariaDB add-on. Both options worked perfectly.

Thanks for this helpful post. I just converted from HAOS to HA Docker and needed to move my data over. This worked out great!

How did you manage to do this? Docker doesn't support add-ons.
I used HA in Docker on my Synology NAS, but because of missing features I'm switching to HA OS on Virtual Machine Manager. Everything works; the only issue is moving the historical energy data.

Every HA add-on is itself a Docker container. So if you are running HA in Docker, you just have to find the "normal" Docker container of the add-on you want and spin it up. I'm running containers for MariaDB, Grafana, InfluxDB, ESPHome, Zigbee2MQTT, Z-Wave JS UI, Mosquitto, etc. They all interface with the HA Docker container.
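For example, a stock MariaDB container can stand in for the MariaDB add-on along these lines, with HA's recorder db_url then pointing at it; every value below (password, volume path, version) is an example:

    # Stand-in for the MariaDB add-on. All values are examples.
    docker run -d --name mariadb \
      -e MARIADB_ROOT_PASSWORD=CHANGE_ME \
      -e MARIADB_DATABASE=homeassistant \
      -e MARIADB_USER=homeassistant \
      -e MARIADB_PASSWORD=CHANGE_ME \
      -v /srv/mariadb:/var/lib/mysql \
      -p 3306:3306 \
      mariadb:10.11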

Thanks for the reply. I think this is beyond my capabilities :grinning: