This used to happen to me only after restarting HA too many times following a reboot, and a fresh reboot would clear it. Now, since 0.40, it happens after HA has been running for a while, and it has completely broken my setup. macOS just ships with an insanely low default limit here; we need to raise the number of files a process is allowed to have open.
You can see the current limits here:
$ ulimit -a
-t: cpu time (seconds) unlimited
-f: file size (blocks) unlimited
-d: data seg size (kbytes) unlimited
-s: stack size (kbytes) 8192
-c: core file size (blocks) 0
-v: address space (kbytes) unlimited
-l: locked-in-memory size (kbytes) unlimited
-u: processes 709
-n: file descriptors 256
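For what it’s worth, launchd also has its own per-process limit that it hands to everything it starts, and you can check it with launchctl (the numbers in the sample output are only illustrative and will vary by macOS version):

$ launchctl limit maxfiles
	maxfiles    256            unlimited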
After running:
ulimit -Sn 10000
it now shows:
-t: cpu time (seconds) unlimited
-f: file size (blocks) unlimited
-d: data seg size (kbytes) unlimited
-s: stack size (kbytes) 8192
-c: core file size (blocks) 0
-v: address space (kbytes) unlimited
-l: locked-in-memory size (kbytes) unlimited
-u: processes 709
-n: file descriptors 10000
But it doesn’t stick across a reboot. It goes back to 256.
Does anyone know the best practice for getting this to stick?
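The closest thing I’ve found so far is a launchd job that re-applies launchctl limit maxfiles at boot. Treat this as a sketch rather than the blessed method; the path, label, and 10000/10000 values below are just what I’ve seen suggested elsewhere:

$ sudo nano /Library/LaunchDaemons/limit.maxfiles.plist

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
  <dict>
    <!-- re-run "launchctl limit maxfiles <soft> <hard>" at every boot -->
    <key>Label</key>
    <string>limit.maxfiles</string>
    <key>ProgramArguments</key>
    <array>
      <string>launchctl</string>
      <string>limit</string>
      <string>maxfiles</string>
      <string>10000</string>
      <string>10000</string>
    </array>
    <key>RunAtLoad</key>
    <true/>
  </dict>
</plist>

$ sudo chown root:wheel /Library/LaunchDaemons/limit.maxfiles.plist
$ sudo launchctl load -w /Library/LaunchDaemons/limit.maxfiles.plist

Even with that, I assume I’d still need ulimit -Sn 10000 in the profile of whatever shell starts HA (e.g. ~/.bash_profile), since ulimit only raises the soft limit for that shell and its children.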