For HA on an RPi I use the MariaDB database on my Synology (DSM). A backup of the Synology, including MariaDB, is made every night, and during the backup there are brief interruptions between the RPi and the database on the Synology. The database sensor also receives no data for a short time.
The error messages then look like this:
Logger: homeassistant.components.recorder
Source: components/recorder/__init__.py:441
Integration: Recorder (documentation, issues)
First occurred: 11:57:44 (1 occurrences)
Last logged: 11:57:44
Error in database connectivity during keepalive: (MySQLdb._exceptions.OperationalError) (1927, 'Connection was killed') [SQL: SELECT 1] (Background on this error at: http://sqlalche.me/e/13/e3q8)
Logger: homeassistant.components.sql.sensor
Source: components/sql/sensor.py:146
Integration: sql (documentation, issues)
First occurred: 11:57:47 (1 occurrences)
Last logged: 11:57:47
Error executing query SELECT table_schema "database", Round(Sum(data_length + index_length) / 1024 / 1024, 2) "value" FROM information_schema.tables WHERE table_schema="homeassistant_rpi" GROUP BY table_schema LIMIT 1;: (MySQLdb._exceptions.OperationalError) (1927, 'Connection was killed') [SQL: SELECT table_schema "database", Round(Sum(data_length + index_length) / 1024 / 1024, 2) "value" FROM information_schema.tables WHERE table_schema="homeassistant_rpi" GROUP BY table_schema LIMIT 1;] (Background on this error at: http://sqlalche.me/e/13/e3q8)
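For reference, the sensor that runs this query is configured roughly like this (a sketch reconstructed from the log; the db_url host and credentials are placeholders):
sensor:
  - platform: sql
    # points at the MariaDB on the NAS; USER/PASSWORD/NAS_IP are placeholders
    db_url: mysql://USER:PASSWORD@NAS_IP/homeassistant_rpi?charset=utf8
    queries:
      - name: DB size
        # reports the size of the homeassistant_rpi schema in MB
        query: 'SELECT table_schema "database", Round(Sum(data_length + index_length) / 1024 / 1024, 2) "value" FROM information_schema.tables WHERE table_schema="homeassistant_rpi" GROUP BY table_schema LIMIT 1;'
        column: value
        unit_of_measurement: MB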
The 120 s allowed for the reconnect are sufficient. However, the first error message still appears whenever the connection fails briefly.
How can I prevent these error messages from occurring?
I could probably raise the severity threshold for the recorder’s logger so that these errors are no longer reported. Or does someone have a better idea?
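In the meantime, my plan is to raise the threshold for just those two loggers so only truly fatal messages get through (a minimal sketch; note that it also hides genuine recorder and SQL errors):
logger:
  default: warning
  logs:
    # suppress the keepalive/query errors shown above
    homeassistant.components.recorder: critical
    homeassistant.components.sql.sensor: critical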
@bungo63 Thank you for this guide! I set it up on my QNAP NAS using a very similar procedure. As @Stiltjack mentioned above, MariaDB now requires creating a strong password right away. I had no problems following your original post, modifying as necessary to accommodate QNAP and the newer MariaDB. Only afterwards did I see that users below had already documented the fixes I needed.
After restarting HA, I immediately lost my logger and logbook. Although IMO not well documented, creating a recorder: entry apparently overrides some settings of the default_config: entry in configuration.yaml. In my case this seemed to stop both the logger and the logbook. I added logger: to configuration.yaml, and it started logging again right away. However, since the logger’s default level is debug, the logs were much more detailed than before. I changed the entry to
logger:
  default: info
logbook:
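For completeness, the recorder entry that points HA at the NAS looks roughly like this in my case (a sketch; the host, database name, and password are placeholders):
recorder:
  # standard SQLAlchemy-style URL for an external MariaDB/MySQL server
  db_url: mysql://homeassistant:MY_PASSWORD@192.168.1.20:3306/homeassistant?charset=utf8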
The logger now seems to work fine, but the last logbook entry is from 30 minutes ago despite a recent restart that should have given me many new logbook entries.
I kept trying things as I wrote this, and after many HA restarts the logbook started logging again. I also refreshed the web page, which makes me wonder whether it was somehow a browser cache issue: after refreshing, I had logbook entries going back more than 45 minutes, meaning the logbook had been running, but not displaying properly, during that time. The logger, on the other hand, was definitely not logging, because the logfile timestamp stayed fixed at the time I first restarted after switching to the MariaDB setup. Whew!
Thanks, this saved me a lot of time!
I was able to get it working very quickly thanks to your explanations.
As other users have been reporting, my config is now much faster and I do not have to keep changing SD cards all the time!
Just installed MariaDB and phpMyAdmin on my Synology RS816 and connected my Home Assistant 2021.2.3 running on a Pi 4 with 4 GB.
Everything runs much smoother and faster now. Thank you for this tutorial.
I also changed some settings for authentication and to restrict access.
First of all, thanks for the steps on this. I discovered this shortly after setting up my HA instance on an RPi 4B last summer. As a solution architect at work, I can definitely appreciate the architectural advantages; as an HA user, I can definitely appreciate the flexibility of writing queries and using other tools to analyze everything going on in my home.
However, with the winter storm that just hit the U.S. and the major power outages in the Dallas area, I have a new perspective on running the state-logging database on a different server. I won’t get into the issues I ran into coordinating the shutdown of both a Synology NAS and an HA server off a single UPS (I’ll save that for a post in a NUT thread), but one related issue is that every time power came back and the NAS and RPi restarted, HA would come up fully before the NAS was up with MariaDB running. That meant the recorder was basically brain-dead at a time when I was desperate to capture every piece of sensor data possible: temperatures from my crawl space and water heater closet, water sensors near all plumbing on exterior walls, and so on. I could still see each value at a single point in time, but not over time. After the second or third extended outage, once I realized what was happening, I just pulled the plug on my NAS and HA and started them manually whenever power returned (after dealing with the really important stuff first, like making sure the smart thermostat was working so the house could heat up).
Based on another post, I am thinking about changing my setup so that the most recent day or so of recorder data is handled locally on the RPi, with deeper history stored on the NAS. However, a couple of concerns:
The only option I can find for that would be InfluxDB. It looks like running it on Synology requires Docker, but that’s not an option on my DS216play.
Even keeping a limited number of days in the SQLite DB on the RPi will still take a toll on the SD card. At best, changing the commit frequency appears to be the only way to limit that wear over the long haul (see the sketch after this list).
It would be great if there were a way to run multiple recorder database instances with MariaDB, with some things going to both and other things only to the deep-history instance on the NAS. So far, I don’t see any sign that this is possible.
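For the local instance, what I have in mind is along these lines (a sketch using the recorder’s documented purge_keep_days and commit_interval options; the numbers are just examples):
recorder:
  # keep only about a day of history locally
  purge_keep_days: 1
  # batch database writes every 30 s to reduce SD wear
  commit_interval: 30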
As for the local write issue, if I also moved the RPi to an SSD, that wouldn’t be such a concern over time.
I am looking at revamping my UPS configuration, possibly with the HA RPi as master and the NAS as slave, so there may be other options, but so far I haven’t seen anything in the NUT documentation that would let me delay HA start-up until MariaDB is up on the NAS. If I do find a way to do that, then perhaps my recorder/database concerns are no longer an issue.
I’m running Home Assistant on a Raspberry Pi 4 (2 GB) with a 32 GB SD card.
With the newest versions at this moment:
Core 2021.2.3
Supervisor 2021.02.11
I have multiple ‘backup cards’ just in case, but I have to say I’ve never had a problem (running for 9 months now); then again, I had also excluded a lot from the recorder. I’m pretty sure this guide bought all of my cards a lot of extra time.
Processor use is about the same, around 5% most of the time (again, my recorder recorded almost nothing…). But the difference in loading history is big for me, and everything seems to react quicker (switching between tabs and so on).
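For reference, my trimmed-down recorder looks something like this (a sketch; the excluded domains and entities are just examples, not my exact list):
recorder:
  exclude:
    # skip whole domains that produce noisy state changes
    domains:
      - automation
      - media_player
    # skip individual chatty entities
    entities:
      - sun.sun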
Now I have to look into Grafana on my Synology so I can get a little eye-candy.
I’m not using an RPi for HA, but I’ve made this change to run MariaDB on my Synology NAS anyway. Already things seem a little bit quicker now I’m handing off database duties to a box that isn’t doing a great deal.
Hi, I was following your advice, but was not able to find the option ‘Grant all privileges on database’ on the phpMyAdmin page. Was that an actual option, or what did you mean by that? Thanks in advance!
I’m not 100% sure, but I think the setting I was referring to is under “User accounts > Edit privileges (for the specific user, right side) > Database > Database-specific privileges > Check all (checkbox)”. Functionally, that’s the same as granting the user all privileges on that one database. I’m guessing the wording in the UI has changed in the last 3 years.
To be honest, I moved off of using a Raspberry Pi for HA and Synology for the DB a very long time ago (~2yrs).
I moved to a NUC with an SSD (everything runs on the NUC) and haven’t looked back. The performance is incredible, and there’s no need to set up workarounds to compensate for hardware that is underpowered for the task. I wish you the best of luck and Happy Automating!
Thanks man, I finally found it… I was trying to create a user outside of the database; once I selected the database, I was able to see the option.
First time I’ve heard of a NUC, though… a quick Google search gave me some details, hehe. But I’m curious: which model are you using? I could give it a try in the future.
I wasn’t aware of them at the time either. One of my colleagues had bought a NUC as an HTPC/secondary gaming machine and wasn’t happy with the graphics performance. He mentioned wanting to sell it, and so I wound up with a NUC8i5 with 8 GB RAM and an SSD. It is suuuuper over-powered for running HA, but that just means I’ll never have to worry about upgrading hardware.
I run the “Intel NUC” image of “Install Home Assistant Operating System”, have ~700 entities/sensors/etc., 22 addons installed (most of them running all the time), more than a few custom integrations from HACS, and the CPU reports as basically idle, 24/7.
@bungo63 Really good idea to keep the database off the RPi. I’ll give it a go, as I already use MariaDB on my Synology NAS for data logging; I just never considered plonking my HA database there too. BTW, I’m new to HA!
I would like to create nice reports too. Does anyone have comparative experience with Grafana and KoolReport? Setting things up will take some time, so it’s always good to have people’s feedback when picking the right package.
Regarding SD card failure (not really the main topic here): I have been using RPis for over 10 years and had corrupt cards every now and then. Since I moved to SanDisk Class 10 cards, I’ve never had a problem.
@edozat1 I didn’t realise you could run HA in a VM on a Synology NAS. I have used MariaDB on Synology for quite a few years now, and it has never let me down. I can’t help thinking that:
the MariaDB add-on in HA Supervised would be easier if you don’t have experience with phpMyAdmin on Synology;
the MariaDB package on Synology would be much more powerful, but with more of a learning curve, plus planning, maintenance, etc.
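If you go the add-on route, its configuration is quite small (a sketch of the add-on’s options as I remember them; the names and password are placeholders):
databases:
  - homeassistant
logins:
  - username: homeassistant
    password: CHANGE_ME
rights:
  # grant the login access to the database above
  - username: homeassistant
    database: homeassistant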
I installed MariaDB on my DS, created the tables, etc., and connected my HASSIO to it with no problems.
However, I have a huge disk load on my DS, 100% of the time. What could be wrong?
I tested this for 1-2 weeks, always with the same problem.
Once I switch HASSIO back to the internal MariaDB, there is no load on the DS at all.
I am planning on doing this. Does the very first post of this thread still work, or should I look around for something else?
Are there any issues apart from the power-on order? (That’s not an issue for me: both devices are VMs inside Proxmox with a virtual power-on order and delays, first the Synology, then HAOS 180 seconds later.)
I’ve been using this solution for about a year. While troubleshooting something else, I noticed the following warning in my log:
Version 5.5.68 of MariaDB is not supported; minimum supported version is 10.3.0.
Starting with Home Assistant 2022.2 this will prevent the recorder from starting.
Please upgrade your database software before then.
I don’t remember how it looked a year ago, but now my QNAP App Center offers me MariaDB 5 (1.0.1.235) and MariaDB 10 (1.0.1.235). Is it as simple as installing version 10 and copying the configuration over from version 5? Has anyone been through this on a NAS who can offer some advice? I’m thinking I will have to re-learn what I did a year ago and re-create it with the new MariaDB version.
I have now also switched to a MariaDB hosted on my Synology NAS. Logically, disk activity on the NAS has increased significantly. A NAS is designed for this, of course, but do you think constant writes have a strong impact on the lifespan of the hard drives?