Yes, I have the official phpMyAdmin docker image installed. This afternoon, I will try to follow your "manual".
As soon as I have a result, I will write it here.
Add-on: phpMyAdmin
A web interface for the official MariaDB add-on
Add-on version: 0.8.3
You are running the latest version of this add-on.
System: Home Assistant OS 9.5 (aarch64 / raspberrypi4-64)
Home Assistant Core: 2023.3.2
Home Assistant Supervisor: 2023.03.1
Please, share the above information when looking for help
or support in, e.g., GitHub, forums or the Discord chat.
s6-rc: info: service base-addon-banner successfully started
s6-rc: info: service fix-attrs: starting
s6-rc: info: service base-addon-log-level: starting
s6-rc: info: service fix-attrs successfully started
s6-rc: info: service base-addon-log-level successfully started
s6-rc: info: service legacy-cont-init: starting
s6-rc: info: service legacy-cont-init successfully started
s6-rc: info: service init-phpmyadmin: starting
s6-rc: info: service init-nginx: starting
s6-rc: info: service init-nginx successfully started
s6-rc: info: service init-phpmyadmin successfully started
s6-rc: info: service php-fpm: starting
s6-rc: info: service php-fpm successfully started
s6-rc: info: service nginx: starting
s6-rc: info: service nginx successfully started
s6-rc: info: service legacy-services: starting
s6-rc: info: service legacy-services successfully started
[16:21:28] INFO: Starting PHP-FPM…
[16:21:29] INFO: Starting NGinx…
2023/03/13 16:28:46 [error] 249#249: *33 FastCGI sent in stderr: "PHP message: PHP Fatal error: Maximum execution time of 300 seconds exceeded in /var/www/phpmyadmin/vendor/phpmyadmin/sql-parser/src/Lexer.php on line 236" while reading response header from upstream, client: 172.30.32.2, server: a0d7b954-phpmyadmin, request: "POST /index.php?route=/import HTTP/1.1", upstream: "fastcgi://127.0.0.1:9001", host: "192.168.0.100:8123"
Invalid config
The following integrations and platforms could not be set up:
What is the size of the export file homeassistant.sql?
I found the solution for your issue; it is in the documentation.
Option: upload_limit
By default, the size limit for uploads (for operations such as imports) is set to 64MB. This can be increased with this option, for example, 100 would be 100MB.
Go to the add-on and click on Configuration; change upload_limit to whatever the size of your homeassistant.sql file is, plus, say, 10MB.
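For example, in the add-on's Configuration tab (YAML view) it could look like this; the exact value is just a sketch, pick your own file size plus some headroom:

```yaml
# phpMyAdmin add-on configuration (YAML view)
# upload_limit is in MB; the default is 64
upload_limit: 170
```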
It also seems this is a performance issue: the request takes too long to execute.
You need to change a value in the config file to overcome this limit. However, even then the system might take a very long time to respond (it could take days, or never succeed).
This is a tricky one to resolve, as which file needs changing depends on the implementation. You need to find the PHP and/or NGINX files that contain an identifier named mysql.connect_timeout and/or max_execution_time. You will see a value of 300, which you need to change to, e.g., 6000, or 0 for infinite.
That said, 300 seconds is already very long for SQL operations (e.g. 1,000,000 simple operations should take only a handful of seconds to complete), so either your system is extremely slow or your database consists of hundreds of millions of records.
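If you want to experiment with that, here is a rough sketch of how you might locate and raise the limit inside the add-on container. The container name and the php.ini path are assumptions; check them with `docker ps` and with the grep output on your own system:

```shell
# Open a shell inside the phpMyAdmin add-on container
# (name is an assumption -- list containers with `docker ps`)
docker exec -it addon_a0d7b954_phpmyadmin sh

# Find which files set the limits
grep -rn "max_execution_time" /etc/php* /var/www/phpmyadmin 2>/dev/null

# Raise the limit (0 means unlimited); adjust the path to what grep found
sed -i 's/^max_execution_time = .*/max_execution_time = 6000/' /etc/php81/php.ini
```

Note that php-fpm needs a restart to pick the value up, and that manual changes inside an add-on container are lost when the add-on is rebuilt or updated.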
Attempt the import by enabling Partial import. You then need to check how much data you are missing.
Another way is to split the SQL file into chunks: say, divide it into four roughly equal parts and import them one after the other in sequence.
These are all the tables and their relationships you need to populate. You could split your SQL file into 3 chunks: the linked tables on the left, the linked tables on the right, and the unlinked tables (these are the smallest).
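If you want to script the size-based split yourself, here is a minimal Python sketch. The function name, chunk size, and boundary rule are my own assumptions: it cuts only on lines that end with `;`, which matches typical phpMyAdmin exports but can mis-split exotic data where a value happens to end a line with a semicolon mid-statement.

```python
def split_sql_dump(text, max_bytes):
    """Split SQL text into chunks of roughly max_bytes each.
    A chunk only ends on a line that closes a statement (';'),
    so every chunk can be imported independently."""
    chunks, current, size = [], [], 0
    for line in text.splitlines(keepends=True):
        current.append(line)
        size += len(line.encode("utf-8"))
        # Flush once we are at/over the size budget and at a statement end
        if size >= max_bytes and line.rstrip().endswith(";"):
            chunks.append("".join(current))
            current, size = [], 0
    if current:
        chunks.append("".join(current))
    return chunks

# Tiny demo with a synthetic dump: each line is a 26-byte statement,
# so a 30-byte budget flushes after every second statement.
demo = "INSERT INTO t VALUES (1);\n" * 4
print(len(split_sql_dump(demo, 30)))  # -> 2
```

To use it on the real dump, read homeassistant.sql, call split_sql_dump with e.g. 20 * 1024 * 1024, write each chunk to its own file, and import the files in order.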
My SQL file is 160 MB. Is this too big a file for phpMyAdmin?
I'm only interested in preserving the data from the energy dashboard. Maybe I could delete all other data from the DB?
Okay, I confirm that the problem is related to either the file size or the performance issue, as you suggested. I need more than 300 seconds to complete the task. I don't think the split option will work since 90% of the total file size is for a single table. How can I increase the time limit beyond 300 seconds? Is it possible to locate and edit the PHP file as you suggested, in order to increase the max_execution_time in HASSOS?
I am going to switch my dedicated HASS hardware from a Raspberry Pi to an Intel Celeron NUC in approximately one month, so I think that I will probably be able to resolve this issue by following your instructions. However, for other users, it would be interesting to explore the solution further to find a way to resolve this on "low-performance" hardware such as a Raspberry Pi.
I need to dedicate a bit of time to investigate where these files are, so I need to setup a docker environment where I can go inside the container. This will take a bit of time. Will try to do this weekend.
You can split the large table into chunks and import the chunks in sequence, however, unless you have to, the effort might not be worth the benefit, given you are getting something which will address the issue very soon.
Ok, I used the split option like you suggested and it worked!
The program I used to split the SQL file exported by phpMyAdmin is SQLDumpSplitter3, which is available for Linux, Windows, and macOS. I split the database into 8 separate files, each with a file size of 20MB, but I recommend using a lower value, such as 16MB.
At the end of each import, a confirmation message should be displayed indicating that the import operation was successful.
By the way, I apologize for my English.
I hope this has been useful to someone.
Glad it worked! Very useful utility I knew nothing about. Well done! I suggest you do an export from both systems and use a file-compare utility to make certain the datasets match, or that where they do not match, the differences make sense.
Great suggestion, but unfortunately it's too late for me.
At the moment, everything is working perfectly, even after migrating the MariaDB from the Hassio RPi to the Hassio NUC system by restoring either a complete backup or a partial backup of only the MariaDB add-on. Both options worked perfectly.
How did you manage to do this? Because Docker doesn't support add-ons.
I used HA in Docker on my Synology NAS, but because of missing features I'm switching to HA OS on Virtual Machine Manager. Everything works; the only issue is moving the historical energy data.
Every HA add-on is itself a Docker container. So if you are running HA in Docker, you just have to find the "normal" Docker container of the add-on you want and spin it up. I'm running containers for MariaDB, Grafana, InfluxDB, ESPHome, zigbee2mqtt, zwavejsui, Mosquitto, etc. They all interface with the HA Docker container.
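For anyone searching later, a rough sketch of running the equivalent plain containers. The image tags, password, ports, and volume path below are illustrative assumptions; check each image's documentation for the exact options:

```shell
# Put both containers on one network so phpMyAdmin can reach MariaDB by name
docker network create ha

# Official MariaDB image in place of the add-on
docker run -d --name mariadb --network ha \
  -e MARIADB_ROOT_PASSWORD=changeme \
  -e MARIADB_DATABASE=homeassistant \
  -v /opt/mariadb:/var/lib/mysql \
  -p 3306:3306 \
  mariadb:10.11

# Official phpMyAdmin image pointed at the MariaDB container
docker run -d --name phpmyadmin --network ha \
  -e PMA_HOST=mariadb \
  -p 8080:80 \
  phpmyadmin
```

HA's recorder would then point at the MariaDB container via its db_url, something like mysql://user:pass@<host>:3306/homeassistant?charset=utf8mb4.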