I bet this is not normal. Data does not get compressed during migration, so in terms of storage size MySQL should end up roughly equal to SQLite.
Maybe not helpful, but: did you think about shrinking your database? It must be either a lot of data, perhaps from a huge number of devices (or very chatty ones), or a very large purge_keep_days setting (Recorder - Home Assistant).
I have followed all the migrations done here and I can't recall a single user having such a huge database. Not sure where the limitations are; maybe "size of database equals required amount of RAM" or something like that.
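If you haven't already, you can also trigger a purge with repack manually via the recorder.purge service (Developer Tools > Services); a rough sketch of the call, where keep_days is just an example value:

service: recorder.purge
data:
  keep_days: 30
  repack: true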
I don't know what is responsible for this increase; I don't have that many devices. Below I show my settings for the Recorder, but even with auto_purge and auto_repack the DB didn't shrink. I don't know if I'm missing something.
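For context, the relevant recorder block in configuration.yaml looks roughly like this (the values here are placeholders, not necessarily my exact settings):

recorder:
  auto_purge: true
  auto_repack: true
  purge_keep_days: 30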
The strange thing is that sqlite3mysql does not return any error, so I don't know what is going wrong and can't correct or work around it.
What I wanted was to move to MariaDB and keep records of most data for about 30 days, but keep some sensors for 1 year or even more. It would be great to be able to compare data or get annual minimum and maximum values for some sensors.
I didn't want to use InfluxDB; I preferred to have all the data in a single DB.
Is there any way to achieve this?
Given that your problem has to do with the growth of your SQLite database, and much less with exporting it, you might want to open a new topic in the setup section of the Home Assistant forum. The Configuration or Installation sub-forums may give you better results for your issue.
Tested one more time with DEBUG and got a "MemoryError":
sqlite3mysql --sqlite-file home-assistant_v2.db --mysql-database homeassistant --mysql-host localhost --mysql-port 3307 --mysql-user root --ignore-duplicate-keys -t states --debug
2023-01-10 16:55:11 INFO Transferring table states
Traceback (most recent call last):
File "c:\users\tiser\appdata\local\activestate\cache\6da97e55\lib\runpy.py", line 197, in _run_module_as_main
return _run_code(code, main_globals, None,
File "c:\users\tiser\appdata\local\activestate\cache\6da97e55\lib\runpy.py", line 87, in _run_code
exec(code, run_globals)
File "C:\Users\tiser\AppData\Roaming\Python\Python39\Scripts\sqlite3mysql.exe\__main__.py", line 7, in <module>
File "C:\Users\tiser\AppData\Roaming\Python\Python39\site-packages\click\core.py", line 1130, in __call__
return self.main(*args, **kwargs)
File "C:\Users\tiser\AppData\Roaming\Python\Python39\site-packages\click\core.py", line 1055, in main
rv = self.invoke(ctx)
File "C:\Users\tiser\AppData\Roaming\Python\Python39\site-packages\click\core.py", line 1404, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "C:\Users\tiser\AppData\Roaming\Python\Python39\site-packages\click\core.py", line 760, in invoke
return __callback(*args, **kwargs)
File "C:\Users\tiser\AppData\Roaming\Python\Python39\site-packages\sqlite3_to_mysql\cli.py", line 204, in cli
converter.transfer()
File "C:\Users\tiser\AppData\Roaming\Python\Python39\site-packages\sqlite3_to_mysql\transporter.py", line 772, in transfer
self._transfer_table_data(sql=sql, total_records=total_records)
File "C:\Users\tiser\AppData\Roaming\Python\Python39\site-packages\sqlite3_to_mysql\transporter.py", line 662, in _transfer_table_data
self._sqlite_cur.fetchall(),
MemoryError
I don't know what I can do to transfer the states table to MariaDB…
I used --chunk 1000 just in case
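For anyone hitting the MemoryError above: the chunk option just gets appended to the same kind of command line shown earlier, something along these lines (database name, credentials and port are placeholders):

sqlite3mysql --sqlite-file home-assistant_v2.db --mysql-database homeassistant --mysql-host localhost --mysql-port 3307 --mysql-user root --ignore-duplicate-keys -t states --chunk 1000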
My SQLite size was 7 GB (30 days of data).
I ran the conversion in a VirtualBox VM on a Windows 10 laptop with an i5-8350U and it took around 2.5 hours.
Afterwards I changed to purge_keep_days: 190.
It has been a month since then.
Initially the DB size was around 6.5 GB for about two weeks.
Then it jumped to 10 GB and has stayed at 10 GB for the last 10 days…
Hi guys. I've just found this thread as yesterday my SQLite db got corrupted, and upon investigation I found it was a whopping 24 GB. Luckily I have backups of my entire HASSOS VM (I use Proxmox + Proxmox Backup Server), so I was able to get back online and only lost a few hours of data.
I've got a quick question: to migrate to MariaDB from the existing SQLite with no data loss, can I not just:
1. Create a Home Assistant backup
2. Delete my HASS OS install and set up a new instance
3. Set up the new instance to use MariaDB instead of SQLite (see the db_url sketch below)
4. Import the backup - the data should automatically go into the MariaDB instance.
Right?
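On step 3, pointing the new install at MariaDB should just be a matter of the recorder db_url; a sketch, with hostname, user and password as placeholders for whatever your MariaDB add-on uses:

recorder:
  db_url: mysql://homeassistant:PASSWORD@core-mariadb/homeassistant?charset=utf8mb4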
Isn’t all this far easier than the migration instructions given in this thread?
Disclaimer - this could totally be off base as I'm not at all familiar with how all this works under the hood yet, but it seems wise to use the built-in tools to achieve the goal.
Is there a simple way to monitor the size of the SQLite DB and get some kind of notification if it goes over a certain size? I could write a small script, I suppose, but I'm wondering what others are doing to avoid these footguns.
Of course. For SQLite (the default database) you can simply use the File Size - Home Assistant integration. Just create a sensor and use it in an automation to notify yourself.
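A rough sketch of what that automation could look like (the sensor entity and notify service are placeholders for whatever your own setup uses; the File Size sensor reports the size in MB, if I remember correctly):

automation:
  - alias: "Recorder database size warning"
    trigger:
      - platform: numeric_state
        entity_id: sensor.home_assistant_v2_db  # your File Size sensor
        above: 5000  # roughly 5 GB, adjust to taste
    action:
      - service: notify.notify  # replace with your own notify service
        data:
          message: "home-assistant_v2.db has grown past 5 GB"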
Since converting to MariaDB last week I have noticed that the times in my log are not correct. Events regularly get reported in the future, and a lot of events appear to happen at the same time. I can't see anything in the log files. Any idea what the problem may be?