I myself have no experience with PostgreSQL, nor with SQLite, InfluxDB or MariaDB for that matter, so those were all new to me when starting with HA. I do have some years of experience with Oracle and MySQL, and for that reason my goal was/is to go towards MariaDB, with a similar setup to the one you successfully maintain: external DB and Grafana. The DB should also serve my old internal webserver (old web UI for various document/finance/statistics handling), and MariaDB sounded the closest to my previous experience and to what I initially found as options besides InfluxDB for Grafana. Like you I have an i7 (MSI GE-60) with 2 mSATA drives and 1 SSD, so performance-wise I have no concerns about having the external DB (on one mSATA) serve both HA and Grafana (on the other mSATA) plus external uses… So your first post really caught my attention, and it sounds like an easy setup too… That said, I still haven't got as far as installing the external DB yet, I have just formatted it as exFAT on GPT like the other one (I will for sure look into PostgreSQL before making further decisions).
I'm pretty sure you should be able to do a very similar 'trigger' in your MariaDB. Most database engines support triggers on insert.
Whatever you find, two key tools I have found:
docker: great for updates, testing and rolling back
tableplus: a very user-friendly SQL query tool on many OS platforms
This is no longer a valid question - I've learnt today that DB cleanup is a HA configuration setting, so no manual job is needed…
(I was asking because in another project, where I use SQLite CRUD operations intensively, I noticed that DELETEs don't really release HDD space until a VACUUM command is executed, but that requires an exclusive lock, which might be dangerous for HA.) I know Postgres autovacuum does this job far better.
Thank you for sharing your DDLs and experience.
My only dilemma is which engine to choose: Postgres (which I know well), Influx (good motivation to learn something new, and Grafana seems to like it as well), or maybe MariaDB, which seems like a good compromise between the two. (My PC is a NUC, but only an i3.)
What I'm still not 100% sure about is the Recorder configuration.
The site below says:
The default, and recommended, database engine is SQLite, which does not require any configuration. The database is stored in your Home Assistant configuration directory ('/config/') and is named home-assistant_v2.db. To change the defaults for the recorder integration in your installation, add the following to your configuration.yaml file:
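As I understand it, the snippet that follows in the docs is just a recorder block with a db_url; a rough sketch (my own placeholder credentials, host and database name, not a tested config) would be:

recorder:
  # without db_url, Recorder keeps using the default SQLite file mentioned above
  db_url: postgresql://USER:PASSWORD@HOST/homeassistant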
So this suggests that re-pointing db_url to e.g. Postgres results in all data going there instead of SQLite.
But in the discussion here it was mentioned that configuring a new DB will not replace SQLite, but rather record a kind of copy of the records in an additional DB (in parallel to SQLite)…
PS: Apologies for the naive questions, but I'm still relatively new to HA…
Well, if you have a little time and enough storage on your machine, set up Docker and run both PostgreSQL and Influx in containers side by side. You can use PostgreSQL as your HA Recorder store, and also set up my history solution. Then add the Influx add-in to HA, pointed at the Docker container (link to Influx on Docker Hub). Make sure to check the right version of Influx that will work with HA.
Then add a Grafana Docker container (link at the bottom of the post). You can have Grafana query both your PostgreSQL and Influx databases.
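If it helps, a rough docker-compose sketch of that layout might look like this (image tags, ports, passwords and volume paths are my placeholders to adjust, and double-check the Influx image version against what the HA integration supports):

version: "3"
services:
  postgres:
    image: postgres:15              # any recent PostgreSQL image should do
    environment:
      POSTGRES_USER: ha
      POSTGRES_PASSWORD: change-me
      POSTGRES_DB: homeassistant
    ports:
      - "5432:5432"
    volumes:
      - ./postgres-data:/var/lib/postgresql/data
  influxdb:
    image: influxdb:1.8             # check this against the Influx version your HA integration expects
    ports:
      - "8086:8086"
    volumes:
      - ./influxdb-data:/var/lib/influxdb
  grafana:
    image: grafana/grafana:latest
    ports:
      - "3000:3000"
    volumes:
      - ./grafana-data:/var/lib/grafana

HA's Recorder db_url then points at the Postgres container, the Influx add-in at port 8086, and Grafana itself is reachable in the browser on port 3000.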
I have not changed my database engine for HA in years, so I am a bit rusty. However, I think all you have to do to set up HA with PostgreSQL is follow these instructions, link at the bottom of the post (PostgreSQL is at the very bottom of that link; make sure to correctly set the character set to 'utf8', that is a gotcha that has caught some folks).
Yes, this will create a virgin Recorder database, so your history that was in SQLite will be lost. The SQLite database file is still in your HA directory, but new data now goes to the PostgreSQL DB. If you change back to SQLite for Recorder, then the SQLite DB will once again be used for Recorder.
This is my configuration.yaml entry for PostgreSQL:
# database
recorder:
  purge_keep_days: 65
  db_url: postgresql://a:[email protected]/homeassistant
  auto_purge: true
lol, we have all been there; after about half a year I still consider myself "new"… But if I were you, with PostgreSQL experience, I would for sure go for that and spend your leftover time on learning HA and Grafana (though Grafana is most likely a short chapter for you… edit: though with lots of options/plug-ins)… With HA there is quite a lot to learn, and always new options, ideas etc… you won't get bored.
You are right - better to allocate some energy/time to HA!
I think I'll set up a test environment, play a bit and see how this works.
Many thanks to you both for the great help/discussion and patience!