Hi! I’m using Postgres as the database for the Home Assistant recorder, and I was poking around trying to make sure the database runs at peak performance, excluding unnecessary entities from the recorder, and so on. But when I got to analysing the queries that are actually running, I saw something surprising: there are INSERT queries that insert up to 1000 rows at a time.
I used the queries provided here to analyse what is going on.
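For reference, what I ran boils down to roughly this (a simplified sketch against pg_stat_statements, not the exact query from the link):

```sql
-- Most expensive statements by total execution time
SELECT queryid,
       calls,
       rows,
       total_exec_time,
       mean_exec_time,
       left(query, 80) AS query
FROM pg_stat_statements
ORDER BY total_exec_time DESC
LIMIT 20;
```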
This is the pg_stat_statements entry (as CSV) for one of the inserts that stood out:
userid,dbid,toplevel,queryid,query,plans,total_plan_time,min_plan_time,max_plan_time,mean_plan_time,stddev_plan_time,calls,total_exec_time,min_exec_time,max_exec_time,mean_exec_time,stddev_exec_time,rows,shared_blks_hit,shared_blks_read,shared_blks_dirtied,shared_blks_written,local_blks_hit,local_blks_read,local_blks_dirtied,local_blks_written,temp_blks_read,temp_blks_written,blk_read_time,blk_write_time,wal_records,wal_fpi,wal_bytes
10,16384,true,-2025199088121455629,"INSERT INTO events (event_type, event_data, origin, time_fired, created, context_id, context_user_id, context_parent_id) VALUES ($1, $2, $3, $4::timestamptz, $5::timestamptz, $6, $7, $8),($9, $10, $11, $12::timestamptz, $13::timestamptz, $14, $15, $16), ... ,($7993, $7994, $7995, $7996::timestamptz, $7997::timestamptz, $7998, $7999, $8000) RETURNING events.event_id",0,0,0,0,0,0,1,3853.1356250000003,3853.1356250000003,3853.1356250000003,3853.1356250000003,0,1000,13157,1115,1087,55,0,0,0,0,0,0,0,0,7079,1032,8600462
The query took 3853 ms and affected 1000 rows (!). I was wondering whether this is normal or something I should dig into further. I haven’t looked into the mechanism by which Home Assistant decides to batch inserts together, but I’m curious what needs to happen in my home for a single INSERT to need 1000 rows at once (to sanity-check the event rate, I’ve sketched a query after the list below). My network isn’t THAT big…
Sensors: 342
Lights: 12
Switches: 106
Binary sensors: 200
Automations: 54
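To get a feel for how many events are actually being recorded, this is roughly what I plan to run next; just a sketch, assuming the default recorder schema where events.time_fired is the event timestamp:

```sql
-- Events recorded per minute over the last hour
SELECT date_trunc('minute', time_fired) AS minute,
       count(*) AS events
FROM events
WHERE time_fired > now() - interval '1 hour'
GROUP BY 1
ORDER BY 1 DESC;
```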