Migrating the Home Assistant database from SQLite to MariaDB… ONLY if you are VERY familiar with database administration

I do have one question regarding the new method.

Is the Home Assistant daemon stopped while the conversion is running? I haven’t seen that defined 100% clearly.

If it doesn’t matter, please add that information at the beginning of the instructions in the step I linked to. Or, if Home Assistant needs to be shut down, that should be one of the first steps of the migration.

Also make note at the beginning about memory requirements and workarounds if dealing with a large Home Assistant DB.

My commentary: It seems like leaving HA running while doing the conversion would leave some holes in the data once migrated.

Regards


Just use my method: duplicate your homeassistant_v2.db and run the conversion on the copy, so the copy stops receiving new records while Home Assistant keeps writing to the original. You can also do it while it is recording; it doesn’t change anything. I did it on 3 different systems, 3 different ways, all with the same result.

Yes, there will be holes in the data for the time the conversion takes; that’s normal, so you will lose some data during the conversion, depending on how big your database is.
For me it took more than 2 hours, so yes, 2 hours of no extra data… but because I duplicated the homeassistant_v2.db beforehand, I could manually import the data I missed during the conversion if I wanted to; I just didn’t care.

As for energy (for me at least: solar power, gas, electricity), it gets added anyway because it’s a sum entity, but regular sensor data is not recorded during that gap.

Hello there, I think I found one of your big problems. I had the same problem: my energy monitor was going crazy. I used the Python script and transferred the DB with a second Home Assistant in VirtualBox. When everything was done, I uploaded the DB with MySQL Workbench. I was going crazy because the states table was correct and the data was plausible, but the statistics had the same issue. I set up a new database without any tables and let Home Assistant create them, to understand what the difference could be. I noticed that the structure was wrong: the uploaded database used float as the data type, while the original (created by Home Assistant) used double (for example). I compared all table structures and corrected them, and now the newly incoming data looks much better :wink:
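For anyone who wants to correct this in place rather than re-importing: a minimal sketch of the fix, assuming the 2022-era recorder schema and that only the numeric statistics columns were affected. The table and column names below are assumptions taken from a database freshly created by Home Assistant; compare against your own structure first.

-- Sketch only: change the float columns produced by the migration script back
-- to DOUBLE, matching what Home Assistant creates itself.
-- Verify with SHOW CREATE TABLE before running; your schema version may differ.
ALTER TABLE statistics
  MODIFY mean DOUBLE,
  MODIFY `min` DOUBLE,
  MODIFY `max` DOUBLE,
  MODIFY state DOUBLE,
  MODIFY `sum` DOUBLE;

ALTER TABLE statistics_short_term
  MODIFY mean DOUBLE,
  MODIFY `min` DOUBLE,
  MODIFY `max` DOUBLE,
  MODIFY state DOUBLE,
  MODIFY `sum` DOUBLE;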


I am sorry for my bad English :wink:


AMAZING!!!

I checked the database which was created by the script, and you can see that the script was the problem. :smiley:


don’t use this structure :wink:


Is “the script” referring to this migration how-to, and if so, will it get updated?

Yes, I used this migration how-to. You can use it too, but before you change the recorder URL on your “live system” you’ll have to check the data types and correct them. I made some screenshots for you with my “correct” structures.
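If you want to compare the structures without screenshots, a query along these lines does the same job (a sketch; the schema name homeassistant is an assumption, replace it with your own database name):

SELECT TABLE_NAME, COLUMN_NAME, COLUMN_TYPE, IS_NULLABLE
FROM information_schema.COLUMNS
WHERE TABLE_SCHEMA = 'homeassistant'
  AND TABLE_NAME IN ('states', 'statistics', 'statistics_short_term')
ORDER BY TABLE_NAME, ORDINAL_POSITION;
-- Run the same query against a database that Home Assistant created itself
-- and compare the two result sets column by column.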

Yes, the prices mentioned are completely off. I don’t use them, so I didn’t notice.

Edit: Just read about the incorrect database structure. @eddiewebb, any suggestions as to what (if anything) we need to correct manually? It runs fine so far, but indeed only the prices are affected.

For the prices, just use a total-cost entity instead of a static price or an entity containing the price; then you won’t have to correct anything.
I did some new tests and everything worked as expected, so the problem has nothing to do with the float values.

It’s the last_reset handling that makes the costs reset when you track cost with a fixed price or an entity that contains a price. Instead of that, just start using the total-cost entity.

But many HA users don’t even use a cost entity at all, so you can just ignore my posts :stuck_out_tongue:

So basically you won’t have to correct anything if you don’t use a cost entity.

Regards

I did get an error this morning (haven’t checked whether I got more, but saw these after a restart):

Looks like these relate to the manual fixes mentioned above?

Unhandled database error while processing task CommitTask(): (MySQLdb.DataError) (1406, "Data too long for column 'event_type' at row 1") [SQL: INSERT INTO events (event_type, event_data, origin, origin_idx, time_fired, context_id, context_user_id, context_parent_id, data_id) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)] [parameters: ('recorder_5min_statistics_generated', None, None, 0, datetime.datetime(2022, 12, 13, 9, 30, 10, 480968, tzinfo=datetime.timezone.utc), '01GM5D3XNGB91RANM0XT1CGHRQ', None, None, None)] (Background on this error at: https://sqlalche.me/e/14/9h9h)
Unhandled database error while processing task CommitTask(): (MySQLdb.DataError) (1406, "Data too long for column 'event_type' at row 1") [SQL: INSERT INTO events (event_type, event_data, origin, origin_idx, time_fired, context_id, context_user_id, context_parent_id, data_id) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)] [parameters: ('recorder_5min_statistics_generated', None, None, 0, datetime.datetime(2022, 12, 13, 9, 35, 10, 427349, tzinfo=datetime.timezone.utc), '01GM5DD2JVKS40GFT2K9RQ38TE', None, None, None)] (Background on this error at: https://sqlalche.me/e/14/9h9h)
Unhandled database error while processing task CommitTask(): (MySQLdb.DataError) (1406, "Data too long for column 'event_type' at row 1") [SQL: INSERT INTO events (event_type, event_data, origin, origin_idx, time_fired, context_id, context_user_id, context_parent_id, data_id) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)] [parameters: ('recorder_5min_statistics_generated', None, None, 0, datetime.datetime(2022, 12, 13, 9, 40, 10, 465530, tzinfo=datetime.timezone.utc), '01GM5DP7K13H400BZ997D4AXE3', None, None, None)] (Background on this error at: https://sqlalche.me/e/14/9h9h)
Unhandled database error while processing task CommitTask(): (MySQLdb.DataError) (1406, "Data too long for column 'event_type' at row 1") [SQL: INSERT INTO events (event_type, event_data, origin, origin_idx, time_fired, context_id, context_user_id, context_parent_id, data_id) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)] [parameters: ('recorder_5min_statistics_generated', None, None, 0, datetime.datetime(2022, 12, 13, 9, 45, 10, 492551, tzinfo=datetime.timezone.utc), '01GM5DZCJWRZ033YA03ZYEAKFW', None, None, None)] (Background on this error at: https://sqlalche.me/e/14/9h9h)
Unhandled database error while processing task CommitTask(): (MySQLdb.DataError) (1406, "Data too long for column 'event_type' at row 1") [SQL: INSERT INTO events (event_type, event_data, origin, origin_idx, time_fired, context_id, context_user_id, context_parent_id, data_id) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)] [parameters: ('recorder_5min_statistics_generated', None, None, 0, datetime.datetime(2022, 12, 13, 9, 50, 10, 441131, tzinfo=datetime.timezone.utc), '01GM5E8HG9SPAQHDXGM0GNG8XY', None, None, None)] (Background on this error at: https://sqlalche.me/e/14/9h9h)

So I just changed the allowed length of the ‘event_type’ column from 32 to 64. That should resolve it, if I understand the error message properly.
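For reference, that change boils down to a single statement (a sketch; the 64-character length matches the value in the error above, but check your table’s existing character set and NULL settings with SHOW CREATE TABLE events before running it):

-- Widen event_type so event type names longer than 32 characters fit.
ALTER TABLE events MODIFY event_type VARCHAR(64);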

You sir, are a hero!

Worked flawlessly. Just a minor edit suggestion: I was searching for how to stop recording. Can you add the steps ‘Developer Tools’ > ‘Services’ > search for ‘Recorder: Disable’ > press ‘Call Service’?


Hi,

I have a 30 GB SQLite database and I’m trying to convert it to MariaDB on a 2 GB RPi 4. All attempts with “sqlite3mysql” end with “Killed” during “Transferring Table Events”.

Most likely it’s a lack of memory, since the database is large.

Is it possible to run this script on an external server or another PC?
My idea would be to copy the Sqlite DB to another machine, convert it to MariaDB and then import the new database into RPI.

Thank you and happy new year!

Yes, it’s possible.

Just like I mentioned before: create a copy, put it somewhere else, and run the script remotely. It works great.
I still have no issues.

Happy new year to you too :slight_smile:

You don’t even have to stop it at all…
Just right-click on the DB and press duplicate; see my post above.

For me it has been running for a month now without a single issue.
The only thing was the cost entity, but that’s a Home Assistant issue and has nothing to do with this awesome script :slight_smile:

Regards, and wishing you a happy new year already

A hint to think about:
I struggled with this for a long time. Looked here and there, saw all the hacks needed to get it (nearly) running afterwards.
Then I thought: what the heck, it’s December, only a few days left until 2023. Let’s start from scratch.
So I dumped my 900 MB of SQLite and started fresh on MariaDB, got everything up and running without any issues, and I’m ready for 2023. :wink:

Glad you put a :wink: at the end.
Not an option at all.
Meanwhile the recorder doesn’t just hold states and events and such, but also statistics (LTS, long-term statistics). Things like the energy dashboard rely on them.
Anyone who cares about that data won’t start from scratch, for sure :wink:

EDIT: never mind; found the answer a few posts above (extending the event_type field from 32 to 64).

Since I migrated to MariaDB I keep getting errors in the logs. Unfortunately I migrated about a month ago already, so reverting to the old DB isn’t feasible. Any tips on how to solve these errors?

Logger: homeassistant.components.recorder.core
Source: components/recorder/core.py:948 
Integration: Recorder (documentation, issues) 
First occurred: 24 December 2022 at 12:55:13 (1382 occurrences) 
Last logged: 08:00:23

Unhandled database error while processing task CommitTask(): (MySQLdb.DataError) (1406, "Data too long for column 'event_type' at row 1") [SQL: INSERT INTO events (event_type, event_data, origin, origin_idx, time_fired, context_id, context_user_id, context_parent_id, data_id) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)] [parameters: ('recorder_5min_statistics_generated', None, None, 0, datetime.datetime(2022, 12, 29, 6, 40, 22, 189696, tzinfo=datetime.timezone.utc), '01GNE9RG5DKP8M3ZCEJKYN9174', None, None, None)] (Background on this error at: https://sqlalche.me/e/14/9h9h)
Unhandled database error while processing task CommitTask(): (MySQLdb.DataError) (1406, "Data too long for column 'event_type' at row 1") [SQL: INSERT INTO events (event_type, event_data, origin, origin_idx, time_fired, context_id, context_user_id, context_parent_id, data_id) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)] [parameters: ('recorder_5min_statistics_generated', None, None, 0, datetime.datetime(2022, 12, 29, 6, 45, 20, 619130, tzinfo=datetime.timezone.utc), '01GNEA1KKBD3PGMJ984S8NY632', None, None, None)] (Background on this error at: https://sqlalche.me/e/14/9h9h)
Unhandled database error while processing task CommitTask(): (MySQLdb.DataError) (1406, "Data too long for column 'event_type' at row 1") [SQL: INSERT INTO events (event_type, event_data, origin, origin_idx, time_fired, context_id, context_user_id, context_parent_id, data_id) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)] [parameters: ('recorder_5min_statistics_generated', None, None, 0, datetime.datetime(2022, 12, 29, 6, 50, 20, 824210, tzinfo=datetime.timezone.utc), '01GNEAARRRNA9VT9VXKF7Z9QHE', None, None, None)] (Background on this error at: https://sqlalche.me/e/14/9h9h)
Unhandled database error while processing task CommitTask(): (MySQLdb.DataError) (1406, "Data too long for column 'event_type' at row 1") [SQL: INSERT INTO events (event_type, event_data, origin, origin_idx, time_fired, context_id, context_user_id, context_parent_id, data_id) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)] [parameters: ('recorder_5min_statistics_generated', None, None, 0, datetime.datetime(2022, 12, 29, 6, 55, 20, 749736, tzinfo=datetime.timezone.utc), '01GNEAKXND6QFYP0MDZRZ4GCN6', None, None, None)] (Background on this error at: https://sqlalche.me/e/14/9h9h)
Unhandled database error while processing task CommitTask(): (MySQLdb.DataError) (1406, "Data too long for column 'event_type' at row 1") [SQL: INSERT INTO events (event_type, event_data, origin, origin_idx, time_fired, context_id, context_user_id, context_parent_id, data_id) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)] [parameters: ('recorder_5min_statistics_generated', None, None, 0, datetime.datetime(2022, 12, 29, 7, 0, 22, 460460, tzinfo=datetime.timezone.utc), '01GNEAX49WN2NJ0TA39PDS8X0J', None, None, None)] (Background on this error at: https://sqlalche.me/e/14/9h9h)
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1900, in _execute_context
    self.dialect.do_execute(
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/default.py", line 736, in do_execute
    cursor.execute(statement, parameters)
  File "/usr/local/lib/python3.10/site-packages/MySQLdb/cursors.py", line 206, in execute
    res = self._query(query)
  File "/usr/local/lib/python3.10/site-packages/MySQLdb/cursors.py", line 319, in _query
    db.query(q)
  File "/usr/local/lib/python3.10/site-packages/MySQLdb/connections.py", line 254, in query
    _mysql.connection.query(self, query)
MySQLdb.DataError: (1406, "Data too long for column 'event_type' at row 1")

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/components/recorder/core.py", line 679, in _process_one_task_or_recover
    return task.run(self)
  File "/usr/src/homeassistant/homeassistant/components/recorder/tasks.py", line 265, in run
    instance._commit_event_session_or_retry()
  File "/usr/src/homeassistant/homeassistant/components/recorder/core.py", line 929, in _commit_event_session_or_retry
    self._commit_event_session()
  File "/usr/src/homeassistant/homeassistant/components/recorder/core.py", line 948, in _commit_event_session
    self.event_session.commit()
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 1451, in commit
    self._transaction.commit(_to_root=self.future)
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 829, in commit
    self._prepare_impl()
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 808, in _prepare_impl
    self.session.flush()
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 3444, in flush
    self._flush(objects)
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 3583, in _flush
    with util.safe_reraise():
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/util/langhelpers.py", line 70, in __exit__
    compat.raise_(
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/util/compat.py", line 210, in raise_
    raise exception
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 3544, in _flush
    flush_context.execute()
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/unitofwork.py", line 456, in execute
    rec.execute(self)
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/unitofwork.py", line 630, in execute
    util.preloaded.orm_persistence.save_obj(
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/persistence.py", line 245, in save_obj
    _emit_insert_statements(
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/persistence.py", line 1238, in _emit_insert_statements
    result = connection._execute_20(
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1705, in _execute_20
    return meth(self, args_10style, kwargs_10style, execution_options)
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/sql/elements.py", line 334, in _execute_on_connection
    return connection._execute_clauseelement(
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1572, in _execute_clauseelement
    ret = self._execute_context(
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1943, in _execute_context
    self._handle_dbapi_exception(
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 2124, in _handle_dbapi_exception
    util.raise_(
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/util/compat.py", line 210, in raise_
    raise exception
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1900, in _execute_context
    self.dialect.do_execute(
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/default.py", line 736, in do_execute
    cursor.execute(statement, parameters)
  File "/usr/local/lib/python3.10/site-packages/MySQLdb/cursors.py", line 206, in execute
    res = self._query(query)
  File "/usr/local/lib/python3.10/site-packages/MySQLdb/cursors.py", line 319, in _query
    db.query(q)
  File "/usr/local/lib/python3.10/site-packages/MySQLdb/connections.py", line 254, in query
    _mysql.connection.query(self, query)
sqlalchemy.exc.DataError: (MySQLdb.DataError) (1406, "Data too long for column 'event_type' at row 1")
[SQL: INSERT INTO events (event_type, event_data, origin, origin_idx, time_fired, context_id, context_user_id, context_parent_id, data_id) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)]
[parameters: ('recorder_5min_statistics_generated', None, None, 0, datetime.datetime(2022, 12, 24, 11, 55, 13, 146629, tzinfo=datetime.timezone.utc), '01GN1ZSDBTPXV8X8B7NKYD7RTV', None, None, None)]
(Background on this error at: https://sqlalche.me/e/14/9h9h)

I think one of your entities is sending data in a different style, or using some unusual characters in the data.
Do you know which entity is causing this?

I still have no errors to this day, and it’s exactly one month since I did the conversion.

It looks like an entity containing a too-long URL, or data with a lot of characters in it, for example from scraping something…
Some of the data I scrape and record can be too long to save in the DB, and then you get the same “too long” error you have (“Data too long for column ‘event_type’ at row 1”).
Or it could be a camera stream with a long URL and lots of characters that ended up in the recorder, or even a Google Maps URL for a location.

sqlalchemy.exc.DataError: (MySQLdb.DataError) (1406, "Data too long for column 'event_type' at row 1")
[SQL: INSERT INTO events (event_type, event_data, origin, origin_idx, time_fired, context_id, context_user_id, context_parent_id, data_id) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)]
[parameters: ('recorder_5min_statistics_generated', None, None, 0, datetime.datetime(2022, 12, 24, 11, 55, 13, 146629, tzinfo=datetime.timezone.utc), '01GN1ZSDBTPXV8X8B7NKYD7RTV', None, None, None)]
(Background on this error at: https://sqlalche.me/e/14/9h9h)

Worked perfect for me, thank you!

Hi,

I copied the DB to my PC, installed Python, … and apparently everything went OK without errors, but the original SQLite database is 30 GB while the converted MariaDB database is only 3 GB, and the states table is empty.

Is the reduced size and the empty states table normal?
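One way to check is to compare row counts between the converted database and the original file. A sketch for the MariaDB side (standard recorder table names assumed):

-- If these return 0 while the SQLite original holds millions of rows,
-- the data was not actually transferred, whatever the file size suggests.
SELECT COUNT(*) AS states_rows FROM states;
SELECT COUNT(*) AS events_rows FROM events;
SELECT COUNT(*) AS statistics_rows FROM statistics;
-- Run the equivalent counts against the SQLite file (e.g. in the sqlite3 shell)
-- and compare.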