Frigate/go2rtc/WebRTC Camera and the combination of things… not getting what I want… ;)

Dear community, I need some help… I rebuilt my entire Home Assistant setup a couple of weeks ago and am now running it in a Docker environment on Ubuntu (on a 5th gen i5). It was a bit of a learning curve, but I got it all working properly now and am really happy.
After everything was up I started playing around with Frigate and got that up and running too, with 24/7 recording to my NAS, zones defined for automations, a Coral for object detection and all… perfect!!

Now to my “issue”… I use the Frigate integration in Home Assistant and display the camera feeds on my dashboard. I figured out that by default Frigate forwards the JSMPEG stream, which is really fast. So on my dashboard I use the Picture Elements card to display the camera; clicking it opens the stream and the reaction is instant. All good so far. The only thing that does not work this way is audio; as I understand it, the JSMPEG stream does not carry audio.
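For reference, my card setup looks roughly like this (a sketch; camera.esszimmer is a placeholder for the actual entity the Frigate integration creates):

```yaml
type: picture-elements
camera_image: camera.esszimmer   # still image from the Frigate integration
elements:
  - type: state-icon
    entity: camera.esszimmer
    tap_action:
      action: more-info          # tapping opens the pop-up with the live stream
    style:
      top: 90%
      left: 90%
```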

The other thing that does not seem to work is opening the stream in a mobile browser on my iPhone. The JSMPEG stream does display in the Companion app, but not when I access it through either Safari or Chrome. I think it may have to do with JSMPEG, but I am not sure, since it did seem to work just after setting it up; after a day or 2 it stops.

So after reading a lot I started looking into go2rtc and using that and this is where my issue starts. I have defined the streams in my Frigate config.yml file like so:

go2rtc:
  streams:
    esszimmer:
      - rtsp://USER:[email protected]:554/live/ch0
    esszimmer_sub:
      - rtsp://USER:[email protected]:554/live/ch1

With the result that both streams are visible in the go2rtc dashboard. When I then change my cameras in the config to use this stream like so:

cameras:
  esszimmer: # <------ Name the camera
    ffmpeg:
      inputs:
        #High Res
        - path: rtsp://192.168.xx.x:8554/esszimmer 
          roles:
            - record
        #Low Res    
        - path: rtsp://192.168.xx.x:8554/esszimmer_sub

I can see that the streams are being used in the dashboard, and in the Frigate UI I can now switch between which stream I want to use (MSE, WebRTC, MJPEG).

[screenshot: Frigate UI stream selector]

So this all seems to work as it should.

Now when I am in Home Assistant and use the same card setup, it also opens the MSE stream when I click the Picture Elements card, and I also get audio in this stream.

So, all good you would say, but… there is a huge delay between clicking the card and the stream actually starting. Huge here means about 8 to 9 seconds. After that the stream is there and it works; I can see in the go2rtc dashboard that it is using the stream, so I think the setup is OK. I just can’t figure out why there is this huge delay.

I then started experimenting with the WebRTC Camera integration, but am not having much luck there either. It does display the stream and all, but the delay is just annoying.

After reading everything I was under the impression that the MSE or WebRTC streams should have the lowest latency, and that using go2rtc would reduce the number of connections to the camera and speed things up. But in my case it seems to cause all this delay. As I said initially, the JSMPEG default from Frigate is instant. Selecting the MSE or WebRTC stream in the Frigate web UI also seems fast; only displaying it from Home Assistant is so slow.

I also tried adding a Generic Camera in Home Assistant using the go2rtc stream, but I get the same delay when opening the stream.

As extra info: I did forward the right ports to my Frigate container like so:

    ports:
      - "5000:5000"
      - "1935:1935" # RTMP feeds
      - "8554:8554"
      - "8555:8555/tcp" # WebRTC over tcp
      - "8555:8555/udp" 

But nothing seems to help in improving this.

Any ideas on this? I am a bit lost as to what I could try further. Again, not a serious issue, since the JSMPEG stream is really fast, but… I would just like the audio as well. And it is one of those things which could work, so that means for me… I NEED it to work… a bit of OCD here… :wink:

This is incorrect; the Frigate integration does not support the JSMPEG stream, only the RTSP stream or a JPEG snapshot that updates once a second.

You should really be using 127.0.0.1 for the IP address here, not your host's IP address.

You need to enable preload stream. Also, this is the actual RTSP stream, which by default uses HLS in Home Assistant and has a large delay. You should look into the WebRTC card.
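To illustrate the preload point (a sketch, with camera.esszimmer as a placeholder entity): on a standard picture-glance card, setting camera_view: live makes the card render the live stream immediately instead of a still image, which hides most of the HLS startup delay.

```yaml
type: picture-glance
title: Esszimmer
camera_image: camera.esszimmer
camera_view: live   # preload the live stream instead of showing a still image
entities: []
```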

Just to reiterate: the picture glance card does not support MSE, and it only supports WebRTC when you have the RTSPtoWebRTC integration installed.

So you are just using RTSP + HLS, which of course has a huge delay. I definitely recommend using v5 of the Frigate card or the WebRTC card instead.
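For illustration, a minimal v5 Frigate card setup might look like this (the entity name is a placeholder; live_provider: go2rtc tells the card to stream via go2rtc rather than the default HLS path):

```yaml
type: custom:frigate-card
cameras:
  - camera_entity: camera.esszimmer
    live_provider: go2rtc   # low-latency MSE/WebRTC via go2rtc instead of HLS
```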

Thanks for the reply. That already clarifies quite a bit…!

A few questions:

Where do I do that?

Trying to install this, but not successful. The things I did:

  • Open port 8555, both TCP and UDP, on my router as per the documentation
  • Forward the ports in the docker compose to my Frigate Docker
  • Use http://127.0.0.1:8555 as the server for RTSPtoWebRTC (also tried my host's IP address, but same effect)

But here… I cannot get a connection to the WebRTC server. What am I still misunderstanding?

OK, got it figured out… port 1984 needs to be forwarded to the Docker container as well, since that apparently is the port go2rtc uses. So forwarding that and using that port now gets my stream as fast as I want.
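For anyone following along, the resulting ports section of the Frigate service ends up looking roughly like this (a sketch combining the ports mentioned earlier in this thread with the 1984 fix):

```yaml
    ports:
      - "5000:5000"       # Frigate web UI
      - "1935:1935"       # RTMP feeds
      - "8554:8554"       # go2rtc RTSP restreams
      - "8555:8555/tcp"   # WebRTC over TCP
      - "8555:8555/udp"   # WebRTC over UDP
      - "1984:1984"       # go2rtc API (needed for RTSPtoWebRTC)
```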

I just needed to re-encode the audio for the stream so WebRTC supports it. But that was a simple fix as well.

So mission accomplished, thanks again!!

was hast du beim audio angegeben?, bei mir funktioniert alles perfekt, nur bekomm ich keinen sound? [What did you specify for the audio? Everything works perfectly for me, but I get no sound.]

can you share how you fixed the audio issue?
thank you

English please

I defined the go2rtc streams like this to re-encode the audio for WebRTC:

go2rtc:
  streams:
    achtertuin: # <- for RTSP streams
      - rtsp://USER:[email protected]:554/live/ch0 # <- stream which supports video & aac audio
      - "ffmpeg:achtertuin#audio=opus"
    achtertuin_sub: # <- for RTSP streams
      - rtsp://USER:[email protected]:554/live/ch1 # <- stream which supports video & aac audio
      - "ffmpeg:achtertuin_sub#audio=opus"

Adding the - "ffmpeg:achtertuin_sub#audio=opus" line re-encodes the audio stream to a format which is supported by WebRTC.

Thank you very much. Much appreciated

Now I am going absolutely crazy. I have 2 cameras, both the same (Reolink 410W). One works as it should, the other doesn’t. (I actually have more cameras, but only one transports the audio as it should, so I'm just focussing on these 2.) This is my Frigate config (reduced; I took all the record and detect settings out, since they're irrelevant):

go2rtc:
  streams:
    garten:
      - rtsp://USER:[email protected]:554/h264Preview_01_main
      - "ffmpeg:garten#audio=opus"    
    garten_sub:
      - rtsp://USER:[email protected]:554/h264Preview_01_sub
      - "ffmpeg:garten_sub#audio=opus"    
    eingang:
      - rtsp://USER:[email protected]:554/h264Preview_01_main
      - "ffmpeg:eingang#audio=opus"      
    eingang_sub:
      - rtsp://USER:[email protected]:554/h264Preview_01_sub
      - "ffmpeg:eingang_sub#audio=opus"

cameras:
  garten: 
    ffmpeg:
      inputs:
        #High Res
        - path: rtsp://127.0.0.1:8554/garten
          roles:
            - record
        #Low Res    
        - path: rtsp://127.0.0.1:8554/garten_sub
          roles:
            - detect

  eingang:
    ffmpeg:
      inputs:
        #High Res
        - path: rtsp://127.0.0.1:8554/eingang
          roles:
            - record
        #Low Res    
        - path: rtsp://127.0.0.1:8554/eingang_sub
          roles:
            - detect

Now in the Frigate web UI I can open the live streams of both cameras, use the WebRTC stream, and I get audio. Perfect…

In Home Assistant I have the Frigate integration installed, and the provided camera entities display using WebRTC and I can hear audio. By default it is the high-res stream though; I have not found a way to change that.
So I also installed the WebRTC Camera integration as well as RTSPtoWebRTC to get this all working, set up with the URL: rtsp://127.0.0.1:1984

I then added a generic camera with the following address: rtsp://127.0.0.1:8554/eingang_sub and one for the Garten: rtsp://127.0.0.1:8554/garten_sub

And here the fun starts, it works, it creates both the cameras, but when I open the stream only the “eingang” camera plays audio, the “garten” does not. This is the same for all my other cameras, only this one camera actually plays the audio in Homeassistant.

Go2RTC shows the following: [screenshot of the go2rtc dashboard stream list]
And also here, the WEBRTC stream for “eingang_sub” plays audio, the “garten_sub” does not.

I am lost…

Please help…what am I overlooking or missing…any ideas??

EDIT: It gets even more interesting. Straight after a reboot of everything, it works for a few minutes. After that the audio streams disappear.

There is no need to create generic camera entities. When you use the WebRTC card you should simply set it up with

type: custom:webrtc-camera
url: garten

for example and it will work

Yes, but I do not like that card. I prefer the Picture Elements card with a static image which refreshes regularly; clicking it then opens the pop-up with the stream. That is why I created the generic cameras, to have them specifically on the low-res stream.

But again… the audio is the main topic. It is very unreliable: sometimes everything works, then after leaving it running it all of a sudden starts ignoring audio… I really don’t understand it.

EDIT: Got it working… WOOOW!

  • Added forwarding of port 1984 in docker-compose
  • Installed the WebRTC custom component, then added the Integration

Hi guys, perhaps a ‘dumb’ question:

To get WebRTC to work, do I need to install HACS → Integrations → WebRTC Camera?
Do I need to add anything to my frigate.yml to get WebRTC working? The docs are not clear enough (at least not for me).

I run HA in Docker, with Frigate 0.12 also in Docker…

Everything that’s needed for WebRTC support is right here: Live View | Frigate

A more specific question will be easier to answer.
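For what it's worth, the part of those docs that is easy to miss boils down to forwarding port 8555 and telling go2rtc which addresses to advertise as WebRTC candidates. A sketch (192.168.1.10 is a placeholder for your Docker host's LAN IP):

```yaml
go2rtc:
  webrtc:
    candidates:
      - 192.168.1.10:8555   # host LAN IP, so clients on your network connect directly
      - stun:8555           # discover the external address for remote access
```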

Hi all,

I have a problem integrating my Amcrest AD410 doorbell.
This doorbell is already integrated via Scrypted, and the two-way audio works smoothly in HomeKit.
Now I want to integrate it into HA, like I did with my HikVision (integrated with two-way audio).

Of course, for this operation I’m using Frigate, installed in Docker and properly configured,
plus the Frigate card + Frigate integration in HA.

My problem is the following :

go2rtc:
  streams:
    sonnette: 
      - 'rtsp://admin:[email protected]/cam/realmonitor?channel=1&subtype=0#backchannel=0'
      - 'ffmpeg:sonnette#audio=opus'

cameras:
  sonnette:
    enabled: True
    ffmpeg:
      inputs:
        #- path: 'rtsp://admin:[email protected]/cam/realmonitor?channel=1&subtype=0#backchannel=0'
        - path: 'rtsp://admin:[email protected]/cam/realmonitor?channel=1&subtype=0'
          roles:
            - detect
            - record
            - rtmp
    best_image_timeout: 60
    ui:
      order: 0
      dashboard: True

When I add “#backchannel=0” → the doorbell can NOT ring.
When I remove “#backchannel=0” → the doorbell CAN ring.

In both cases the audio works, but not the two-way audio (meaning the mic never works, even though I see the mic button in the HA interface).

I read and tried some stuff from this thread and others, but I can’t get this stupid two-way audio working with this doorbell.

Thank you for your help.
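One pattern worth trying here (a sketch only, not tested on the AD410; admin:PASSWORD@CAMERA_IP is a placeholder): keep the backchannel on a dedicated go2rtc stream that only the card uses for talking, and feed Frigate a separate stream with the backchannel disabled, so that recording does not occupy the camera's single talk channel.

```yaml
go2rtc:
  streams:
    sonnette:          # stream for viewing/two-way audio; backchannel left enabled
      - 'rtsp://admin:PASSWORD@CAMERA_IP/cam/realmonitor?channel=1&subtype=0'
      - 'ffmpeg:sonnette#audio=opus'
    sonnette_record:   # stream for Frigate; backchannel disabled so it stays free
      - 'rtsp://admin:PASSWORD@CAMERA_IP/cam/realmonitor?channel=1&subtype=0#backchannel=0'

cameras:
  sonnette:
    ffmpeg:
      inputs:
        - path: rtsp://127.0.0.1:8554/sonnette_record
          roles:
            - detect
            - record
```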

Did you manage to resolve this?
I’m facing the same issue.

Nope.
For the moment I’ve given up.

Can I ask what settings you have on the AD410 and in Scrypted for two-way audio? I had this working for 2 days, 3 weeks ago, and then HomeKit through Scrypted stopped working, and no matter what combination I’ve tried since, I can’t get it working.

I can provide some screenshots: [screenshots of the AD410/Scrypted two-way audio settings]
Let me know how I can help you more.

That’s really helpful, thank you. If you get the chance, could you take a look at your settings in the Amcrest app? I’m interested in the device information for the firmware and the other settings like below: