Anybody with experience moving off sqlite into a DB, i.e. MySQL/Maria?

Update: Okay, so: move to MySQL/MariaDB? I’m learning that InfluxDB is used for an entirely different purpose from what I’m actually looking for, which is moving off SQLite.


I’m looking into moving HA into InfluxDB. I’m actually new to InfluxDB, but decided on it since I can also use it for unifi-poller.

How’s your experience? Any catches I should watch out for? Any recommendations on what approaches I should take?

I’m running HA on an NUC via Docker. My plan is to run an independent InfluxDB instance (via Docker) I can share between HA and unifi-poller.

On the InfluxDB side: any recommendations on UI/management clients? I’m on macOS.

Thanks,

This should point you down the right path


That’s a great rundown from Franck. Thanks.

What are your views on using HA add-ons vs. stand-alone containers when you plan to use third parties within the same ecosystem?

I don’t suppose there’s an HA migration to move sqlite into InfluxDB?

There isn’t, because they serve two separate purposes. SQLite is used for the Home Assistant history and recorder integrations, whereas InfluxDB is an option to export data to for long-term storage and analysis. You can point the recorder to a different database for better performance (for example, the MariaDB add-on), but the recorder needs to use some type of relational database. InfluxDB is a time-series database that stores its data differently.

However, if you want to do more of your data analysis and visualization using InfluxDB and Grafana, you could set that up and then make the number of days you retain very low for the history component.
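If it helps, a minimal sketch of what that recorder setup might look like (the `db_url` host, user, and password are placeholders, and `purge_keep_days` is the retention knob mentioned above):

```yaml
# configuration.yaml (example values; adjust host/user/password to your setup)
recorder:
  # Point the recorder at MariaDB instead of the default SQLite file
  db_url: mysql://hauser:hapassword@192.168.1.10:3306/homeassistant?charset=utf8mb4
  # Keep only a few days of history locally; long-term data lives elsewhere
  purge_keep_days: 3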


Oooh. Well dang. Okay, then back to my original topic: moving to MySQL/MariaDB? I thought I could move the SQLite data to InfluxDB. Oops.

Sorry. That’s my newbness of InfluxDB. Still learning all this.

I run a separate MariaDB Docker container, configured the recorder: component to point to my MariaDB instance, and was done. I didn’t bother migrating data from SQLite to MariaDB (MySQL) because at the time I didn’t care about the history.

I did this years ago (3+ years?), but I don’t remember having to do anything special except configuring MariaDB with the correct user and granting the correct permissions on the new DB. When I first did this there were some serious speed improvements with MariaDB/MySQL over SQLite. Since the latest releases, the speed improvements (mostly with loading data into history and logbook) have about equalized.

The benefit of offloading the DB is reducing reads/writes on the SD card (if using an RPi), and the database will still be there even if you have to reinstall HA from scratch. Just configure the recorder: section properly and the new HA instance will connect back to it.
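For reference, the user and permission setup mentioned above usually amounts to something like the following (database name, user, and password here are just examples):

```sql
-- Run as the MariaDB root user; names and password are placeholders
CREATE DATABASE homeassistant CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;
CREATE USER 'hauser'@'%' IDENTIFIED BY 'hapassword';
GRANT ALL PRIVILEGES ON homeassistant.* TO 'hauser'@'%';
FLUSH PRIVILEGES;
```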


Is it normal to have a 30 GB file? Just want to check. It is quite large. Based on my inability to “connect” to it, it seems I might be hitting an upper limit on connections. I could never get connected because the file would lock every few seconds.

The file locking every few seconds might be the recorder writing to it. I don’t know exactly how SQLite handles locking files/DBs. It’s possible you’re storing data for a really long time, or you have so many extremely chatty sensors that they store a ton of state changes. My DB is ~4.5 GB with no issues. Others here have much larger DBs, but I don’t know if they use SQLite or something else. I don’t believe there is an upper limit on size, but the larger it is, the more it probably impacts performance.
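One thing that may be worth trying before any migration: HA’s recorder.purge service can shrink an oversized database in place. A sketch of a manual service call (the keep_days value is just an example):

```yaml
# Call the recorder.purge service (e.g. from Developer Tools > Services)
service: recorder.purge
data:
  keep_days: 7    # drop history older than 7 days
  repack: true    # reclaim disk space after purging
```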

So, I did the same thing as @squirtbrnr: I created a new instance and started fresh. I did find some tools that could import the data (DataGrip, drag and drop), but at 30 GB it was more work than really necessary. I don’t really need data more than a day out (for now).

I’ll move towards Grafana + InfluxDB for longer-term monitoring for select entities.

I was wondering what setup you ended up using, @guice.
- HA database with short-term recorder + history: SQLite/MySQL/…?
- Long-term database with recorder + history: InfluxDB?

I integrated InfluxDB and was looking to import the former history data from SQLite before finally purging and repacking the SQLite which grew a lot meanwhile.

In short, I started from scratch. There really wasn’t anything I needed to retain (to be honest).

I went with a MySQL backend for HA, and InfluxDB for long-term storage. However, one recommendation (or warning) for the HA-to-InfluxDB integration: disable everything by default and use an entity allow-list. HA is very spammy and sends a lot of junk that just doesn’t make sense (or even work) in InfluxDB. Use an allow-list approach and send over only the sensor data you want to retain long-term.
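A sketch of that allow-list approach in the InfluxDB integration config (the host and entity IDs are placeholders):

```yaml
# configuration.yaml — send only explicitly listed entities to InfluxDB
influxdb:
  host: 192.168.1.10
  database: home_assistant
  include:
    entities:
      - sensor.living_room_temperature
      - sensor.energy_usage
```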

Edit: Doh! I meant InfluxDB, not InnoDB. Sorry for the confusion.

Just to make sure: you mixed InnoDB (which is a storage type for MySQL/MariaDB) with InfluxDB, right?

Good hint though. I initially thought “well, it’s a more efficient way of storing data, just dump everything to InfluxDB and decide what to store and visualize later”.

I’m still trying to monitor the DB size in InfluxDB; I found:

SELECT sum("diskBytes") FROM "_internal".."tsm1_filestore" WHERE time >= now() - 6h GROUP BY time(30s), "database"

I believe it’s possible to delete data from InfluxDB, isn’t it?
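(For reference, InfluxDB 1.x does support deleting data via InfluxQL; the measurement name and time range below are just examples:)

```sql
-- Delete old points from a measurement (InfluxQL, InfluxDB 1.x)
DELETE FROM "sensor_data" WHERE time < now() - 90d
-- Or drop an entire measurement:
DROP MEASUREMENT "sensor_data"
```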

Crap, I’m sorry. I meant InfluxDB for long-term data. I did not mix or integrate databases. HA has an InfluxDB integration you can enable, separate from the recorder, which can be configured for MySQL. I know very little about Influx, to be honest.

I did not do any data migrations. I started everything from fresh empty data.

/shrug. I deleted the db and rebuilt it when I needed to fix data. :laughing:

PS: I’ve updated my comment to reflect InfluxDB, not InnoDB.