Migrating from MySQL (MariaDB) back to SQLite

I’ve just checked and mine is still only 0.5 GB (I keep retention quite tight), so this is perhaps why I have fewer problems. I’ve not noticed any particular connectivity issues on HA restart, or traced any to MariaDB / the recorder specifically.

Honestly I don’t have any issues outside of occasional slow start-ups or the failed recorder, but I am able to work around it. I’ll pare down my database to try to rein things in, but it isn’t causing any real issues right now.


Right, I set purge_keep_days to 31 because I have some cards that display 1 month of data (energy consumption).

So, if I only migrate the long-term statistics, will I lose the last 31 days of data? But I will not lose the data older than 31 days, correct?

I’ve seen mentions of issues related to backup/restore when using SQLite.

To me, the primary advantage of migrating back to SQLite is the ability to back up the entire machine whenever I perform an upgrade.

So if that doesn’t actually work, I don’t see the point of moving from MariaDB.

A card will use statistics or live data, not both.
Assuming that card uses statistics, you will lose nothing, as statistics are built alongside live data.

If it uses live data, just switch to statistics. Especially for energy, there is zero benefit to using live data vs. data aggregated over 1 h. Doing so, you free yourself of the dependency on having 1 month of live data, reducing db size and improving performance.

Not sure why the anxiety, really. You can get back to MariaDB in a snap, by uncommenting the db line and restarting, losing nothing. So just try it :wink:
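For reference, the “db line” is the recorder’s db_url in configuration.yaml. A minimal sketch, assuming the MariaDB add-on’s default hostname and database name (substitute your own credentials):

```yaml
recorder:
  purge_keep_days: 31
  # Commented out: the recorder falls back to the default SQLite database.
  # Uncomment to switch back to MariaDB (hostname/db here are the add-on defaults).
  # db_url: mysql://homeassistant:YOUR_PASSWORD@core-mariadb/homeassistant?charset=utf8mb4
```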


Do you have any other specific guidance for doing this process on an HAOS instance?

I’ve got a headless NUC that I use for HA. So if I’m following correctly, I’ll need to physically connect the unit to a display and keyboard so I can turn off HA core and run this process, is that correct?

I’m not sure I can choose which data to use when creating some cards. I just created some sensors and a utility helper, that’s all. I don’t see any choice I can make.

Which card are you using to view your history?

ApexCharts, I use that for all my charts.

It does support statistics.
Try it, and if it fits the bill, you can reduce the number of days of live data.
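As a sketch, a series can point at long-term statistics instead of live data; the entity name below is hypothetical, and the exact option names should be checked against the apexcharts-card documentation:

```yaml
type: custom:apexcharts-card
graph_span: 31d
series:
  - entity: sensor.energy_consumption   # hypothetical entity id
    statistics:
      type: sum     # which aggregate to plot (mean/min/max/sum/state)
      period: hour  # long-term statistics bucket size
```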

I’m confused what the difference would be?

I thought “long term statistics” meant “state history going back a ways”

Unfortunately it also seemed to use Windows-only software, so that option isn’t available to everyone.


No. “Long-term statistics” are specific tables containing aggregated data (min/max/mean).
They are independent of the states table.
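To see this for yourself, you can peek at the aggregate tables directly. A minimal sketch, assuming the default SQLite database path on HAOS and the current recorder schema (column names may differ between HA versions):

```shell
# Path assumption: the recorder's default SQLite db on HAOS.
DB=/config/home-assistant_v2.db

# statistics holds the aggregates; statistics_meta maps metadata_id
# back to the entity's statistic_id.
sqlite3 "$DB" \
  "SELECT m.statistic_id, s.mean, s.min, s.max
   FROM statistics AS s
   JOIN statistics_meta AS m ON s.metadata_id = m.id
   LIMIT 5;" 2>/dev/null || echo "database not found at $DB"
```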

You mean “Linux-only”, I guess.

Since that’s what their recommended HassOS runs, it seems like it would be sensible, but Mac users would need something compatible too…

If you want to convert the events.sql and states.sql files, which contain the binary blobs,

e.g. _binary 0x018725fccf354f12f37776356bc7c8b5 -->  X'018725fccf354f12f37776356bc7c8b5'

On Linux & macOS (and Windows, if sed is available*) you can use:

sed -i.bak -E 's/(_binary\s0x)(\w*)/X'\''\2'\''/g' <name_of_sql_input_file>

-i.bak : edit the file in place, keeping a backup copy with a .bak extension
-E : enable extended regular expressions
s/ : search & replace
/g : replace all matches on each line, not just the first

*) via MinGW-w64, GnuWin32 or WSL
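To sanity-check the substitution before running it on a large dump, you can pipe a sample line through it (GNU sed; note that \s and \w are GNU extensions):

```shell
# Sample line mimicking a mysqldump blob literal (the hex value is made up)
printf 'VALUES (_binary 0x018725fccf354f12)\n' \
  | sed -E "s/(_binary\s0x)(\w*)/X'\2'/g"
# Expected: VALUES (X'018725fccf354f12')
```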


For Windows, do you think this would be better to use instead of Notepad++?

https://gnuwin32.sourceforge.net/packages/sed.htm

Using sed on Windows (via MinGW, WSL or whatever method) is much faster than Notepad++, especially if your SQL file is very big.


Using sed could also be done from the HassOS SSH terminal, which makes the conversion platform-independent. It’s already part of the HassOS system.

Hello, I really want to get this working. I love the fact that it only takes the long-term statistic values.
However, trying the command below in a HAOS environment, I get an error that the folder mysql_host doesn’t exist. Am I doing something wrong, or will this just not work?

$ mysqldump --no-create-info --complete-insert --skip-extended-insert --compact -h <mysql_host> -u <user> -p <ha_db> statistics_meta statistics statistics_runs statistics_short_term > ha_statistics_sql_dump.sql
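For what it’s worth, the <…> parts of that command are placeholders to replace with your own values; typed literally, the shell treats <mysql_host> as input redirection from a file named mysql_host, which matches the error you’re seeing. A sketch with the MariaDB add-on’s default hostname and database filled in (assumptions, adjust to yours):

```shell
# Hostname/user/db below are the MariaDB add-on defaults (assumption);
# -p with no value makes mysqldump prompt for the password interactively.
mysqldump --no-create-info --complete-insert --skip-extended-insert --compact \
  -h core-mariadb -u homeassistant -p homeassistant \
  statistics_meta statistics statistics_runs statistics_short_term \
  > ha_statistics_sql_dump.sql
```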