Dear community, I need some help… I rebuilt my entire Home Assistant setup a couple of weeks ago and am now running it in a Docker environment on Ubuntu (on a 5th gen i5). It was a bit of a learning curve, but I have it all working properly now and am really happy.
After everything was up I started playing around with Frigate and also got that up and running, with 24/7 recording on my NAS, defined zones for automations, a Coral for object detection and all of that… perfect!!
Now to my “issue”… I am using the Frigate integration in Home Assistant and display the camera feeds on my dashboard. I figured out that by default Frigate forwards the JSMPEG stream, which is really fast. So on my dashboard I use the Picture Elements card to display the camera; clicking it then opens the stream and the reaction is instant. All good so far. The only thing that does not work this way is audio; I understand the jsmpeg stream does not carry audio.
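For reference, the card is set up roughly along these lines (simplified; the entity name is just an example):

type: picture-elements
camera_image: camera.esszimmer
camera_view: auto
elements:
  - type: state-icon
    entity: camera.esszimmer
    tap_action:
      action: more-info
    style:
      top: 90%
      left: 10%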
The other thing that does not seem to work is opening the stream in a mobile browser on my iPhone. It does display the jsmpeg stream in the Companion app, but not when I access it through either Safari or Chrome. I think it may have to do with jsmpeg, but I am not sure, since it did seem to work just after setting it up. It just stops after a day or two.
So after reading a lot I started looking into go2rtc, and this is where my issue starts. I have defined the streams in my Frigate config.yml like so:
go2rtc:
  streams:
    esszimmer:
      - rtsp://USER:[email protected]:554/live/ch0
    esszimmer_sub:
      - rtsp://USER:[email protected]:554/live/ch1
With the result that both streams are visible in the go2rtc dashboard. When I then change the cameras in my config to use these streams like so:
cameras:
  esszimmer: # <------ Name the camera
    ffmpeg:
      inputs:
        # High Res
        - path: rtsp://192.168.xx.x:8554/esszimmer
          roles:
            - record
        # Low Res
        - path: rtsp://192.168.xx.x:8554/esszimmer_sub
I can see that the streams are being used in the dashboard, and in the Frigate UI I can now switch between which stream I want to use (MSE, WebRTC, MJPEG).
So this all seems to work as it should.
Now when I am in Home Assistant and use the same card setup, it also opens the MSE stream when I click the Picture Elements card, and I also get audio in this stream.
So, all good you would say, but… there is a huge delay from when I click the card until it actually starts streaming. Huge here is about 8 to 9 seconds. After that the stream is there and it works, and I can see in the go2rtc dashboard that it is using the stream, so I think the setup is OK. I just can’t figure out why there is this huge delay.
I then started experimenting with the WebRTC Camera integration, but am not having much luck there either. It does display the stream and all, but the delay is just annoying.
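The card config for that is basically the custom card pointing at the go2rtc restream, something like:

type: custom:webrtc-camera
url: rtsp://192.168.xx.x:8554/esszimmer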
After reading everything I was under the impression that the MSE or WebRTC streams should have the lowest latency, and that using the go2rtc restream should reduce the number of connections to the camera and speed things up. But in my case it seems to cause all this delay. As I said, the jsmpeg default from Frigate is instant. Also, selecting the MSE or WebRTC stream in the Frigate web UI seems to be fast. It is only when displaying it from Home Assistant that it is so slow.
I also tried adding a Generic Camera in Home Assistant using the go2rtc stream, and I get the same delay when opening the stream.
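(The stream source I gave the Generic Camera is the same go2rtc restream URL, rtsp://192.168.xx.x:8554/esszimmer.)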
As extra info: I did forward the relevant ports for my Frigate container like so:
ports:
  - "5000:5000"
  - "1935:1935" # RTMP feeds
  - "8554:8554"
  - "8555:8555/tcp" # WebRTC over tcp
  - "8555:8555/udp"
But nothing seems to help in improving this.
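For WebRTC specifically, I understand from the Frigate docs that a Dockerised setup also needs the host’s LAN IP listed as a WebRTC candidate in the go2rtc config. My go2rtc section above only contains the streams, so just for reference, that extra bit would look roughly like this (the IP is a placeholder for the Docker host):

go2rtc:
  webrtc:
    candidates:
      - 192.168.xx.x:8555
      - stun:8555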
Any ideas on this? I am a bit lost as to what I could try next. Again, it is not a serious issue, since the jsmpeg stream is really fast, but… I would just like the audio as well. And it is one of those things that could work, so that means for me… I NEED it to work… a bit of OCD here…