Go2rtc project - help thread

Which documentation? Are you referring to the webrtc documentation? Because this is go2rtc. Different project. And RTSPtoWebRTC is specifically called out in the go2rtc documentation…

I think it’s difficult to produce an ‘end to end’ example because the use cases for this are so wide.

If you tell us where you’re stuck, maybe someone can help you. I have this working across various aspects of my HA now.

Thanks @calisro I appreciate your help!

I have cameras coming into Home Assistant via the Frigate plugin. I want to limit the delay from the cameras because with the default RTMP I get a 30+ second delay.

My goal is to be able to use go2rtc in the Frigate Lovelace card and view it externally through my Nabu Casa subscription.

Right now I have go2rtc installed and RTSPtoWebRTC, but I don’t know how to connect the pieces together and use them in the Frigate Lovelace card.

Also, similar to you, I have the AD410 that I want to get working through the Frigate card :slight_smile:

I don’t use Frigate. But you shouldn’t need to do anything for WebRTC to kick in. You should literally just need to install those components and forward your 8555 port per the docs, and it will make that existing camera stream WebRTC. You can check in the go2rtc UI and see the numbers/info link with the data when you open up the camera. That will tell you if it’s working. Also turn up your debugging logs.

You only really need to define the ‘streams’ if you want to customize them more.
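For example, a minimal go2rtc.yaml along those lines might look something like this. The camera URL and credentials are placeholders and the exact key names can differ between go2rtc versions, so treat it as a sketch and check the go2rtc docs:

streams:
  front_door:
    # plain RTSP pull from the camera; go2rtc handles the WebRTC side itself
    - rtsp://user:pass@192.168.1.10:554/stream1

webrtc:
  # the external port you forward per the docs (8555 mentioned above)
  listen: ":8555"

log:
  # more verbose logging, as suggested above (trace is even more detailed)
  level: debug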

This will work too BUT you probably won’t get 2-way working in that card. Right now I don’t think there is a Lovelace card that allows the 2-way. I currently use the ‘webrtc’ link in the go2rtc UI and send that link as a notification to our phones when motion is detected so we can open it and see/talk to the door. It technically just runs in a browser when we click it, but it works fine. Eventually I am sure someone will make a Lovelace card.

There’s no way to mute the mic though. Lol, so that’s made for some comical “open mic” fun when we forgot we’d opened the link. Oops

Hi,
can I use go2rtc to pull the Reolink camera at full resolution (4K, H265) and convert it to H264 so I can view it in Home Assistant? I tested it and I don’t get any error messages, but I get a grey image with some fragments.

I used the following command:

- ffmpeg:rtsp://user:pass@IP:554/h265Preview_main#video=h264

Thanks

You built the wrong link. Check docs

Oh, I removed the slashes after rtsp: (a typo), sorry.
Have I forgotten anything else? I checked the example in the docs, but I can’t find my mistake.

Thanks

Everything should work now

After a few seconds I receive a very laggy stream. Audio and video are very jerky. I also tried adding pcmu for the audio, but it made no difference.

Likely the hardware you’re doing transcoding on (running go2rtc) is making it jerky.

Is there a way to adjust (increase) the timeout for the ‘Timeout Handling WebRTC offer’ error? I have a couple of cameras that run through ffmpeg with go2rtc, and when they all load at once, sometimes some fail to display. This is likely due to hardware slowness, but rather than getting ‘Failed to start WebRTC’, I’d prefer to either a) increase the timeout or b) have it retry by itself.

Once it fails, I need to refresh the entire Lovelace view for it to retry…

It’s running as a Home Assistant plugin; the HA VM has two cores and 4 GB of RAM. When I start the stream, the CPU is at about 70% and RAM at 40%. Do you have suggestions for improvement?

Transcoding video is not a cheap operation. Hardware acceleration is a very complicated task across all the possible user setups. Native H265 support is very limited across all the possible user devices.

Some of these problems will be solved in future go2rtc updates.

@AlexxIT - my Dahua doorbell camera has never worked with any of your plugins. With go2rtc this is the error I’m getting. Any suggestions on how I can resolve it? I am able to see the RTSP stream totally fine in VLC with the same URL.

07:32:41.702 INF [rtsp] listen addr=:8554
07:32:41.705 INF [hass] load stream url=hass:Camera_2_h264
07:32:41.705 INF [hass] load stream url=hass:Door_Camera
07:32:41.705 INF [hass] load stream url=hass:192_168_0_69
07:32:41.705 INF [api] listen addr=:1984
07:32:41.705 INF [srtp] listen addr=:8443
2022/09/07 13:03:04 [INFO] mdns: Closing client {true false 0xc00012c770 <nil> 0xc00012c778 <nil> 1 0xc000100780}
07:33:50.180 INF [streams] create new stream url=rtsp://admin:[email protected]:554/cam/realmonitor?channel=1&subtype=0&unicast=true&proto=Onvif
07:33:53.017 WRN [api.webrtc] add consumer error="couldn't find the matching tracks"

@sachinss increase the log level to trace and come to the GitHub issues. Or you can find me on Telegram under the same nick.

@calisro the next beta.4 will have a really fast start for streams where the ffmpeg source is used only for audio transcoding. Something like this:

streams:
  dahua_opus:
    - rtsp://192.168.1.123/cam/realmonitor?channel=1&subtype=0
    - ffmpeg:rtsp://192.168.1.123/cam/realmonitor?channel=1&subtype=0#audio=opus

It’s not recommended to use #video=copy#audio=something if possible, because ffmpeg will take more than 3 seconds to start in this case. But the stream delay will be OK anyway (less than half a second, as usual). It’s better to split the stream into two sources (direct rtsp and ffmpeg).
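For contrast, this is roughly the single-source form being discouraged here (same placeholder camera as in the example above): it still works and still keeps the low stream delay, but opening the stream has to wait the 3+ seconds for ffmpeg itself to start.

streams:
  dahua_single_source:
    # one ffmpeg source doing video passthrough plus audio transcode;
    # playback can't begin until ffmpeg has started
    - ffmpeg:rtsp://192.168.1.123/cam/realmonitor?channel=1&subtype=0#video=copy#audio=opus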


Thanks for confirming. I am using this already for 4 streams.

  backyard_main:
    - hass:10_100_1_134
    - ffmpeg:rtsp://127.0.0.1:8554/hass%3A10_100_1_134#audio=opus

It works well except when loading many cameras (6) at once. Then I always get 1 or 2 that time out with ‘Timeout Handling WebRTC offer’. I’ll try beta.4 when it’s out, but I’d like to avoid cameras not loading. I’d rather wait a little longer for them to load.

I have go2rtc and RTSPtoWebRTC installed. I see my camera picked up in the RTSPtoWebRTC logs, but go2rtc doesn’t seem to see it. It seems like the only way for the stream to be WebRTC is if I manually add my RTSP link in go2rtc.yaml. Then I can click on the webrtc link in the go2rtc UI, but in my dashboard the stream still isn’t WebRTC.

Can you explain what you mean here? I don’t think there are logs specific to that integration itself, so I’m trying to understand. When you added ‘RTSPtoWebRTC’, did you add the API URL (probably http://127.0.0.1:1984/)? Did you remove the other ‘WebRTC Camera’ integration if you had it installed (it would interfere)?

That isn’t required but you could…

How are you checking that?

Anyway, you can define whatever streams you want in go2rtc too and then reference those as RTSP streams in Frigate or a generic camera using the restream’s RTSP address (rtsp://127.0.0.1:8554/[rtsp2rtc config name]).
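As a rough sketch of that (the stream name, camera name, and camera URL are just examples, and the Frigate keys shown assume a typical Frigate config layout), a stream defined in go2rtc.yaml:

streams:
  backyard_main:
    # placeholder camera; this is the stream go2rtc will restream on port 8554
    - rtsp://user:pass@192.168.1.10:554/stream1

could then be referenced in the Frigate config by its restream address:

cameras:
  backyard:
    ffmpeg:
      inputs:
        # restream served by go2rtc on its RTSP port
        - path: rtsp://127.0.0.1:8554/backyard_main
          roles:
            - detect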

But it should be zero config in that it converts them to webrtc.

Another way to check is to look at the attributes of your camera after/while you view it. You’ll see an attribute like this:

“frontend_stream_type: web_rtc”

I see this in the RTSPtoWebRTC logs:

[GIN] 2022/09/07 - 15:29:31 | 500 | 5.063519892s | 10.10.0.12 | POST "/stream"
[GIN] 2022/09/07 - 15:33:57 | 404 | 3.93µs | 10.10.0.12 | GET "/streams"
[GIN] 2022/09/07 - 15:33:57 | 200 | 160.84µs | 10.10.0.12 | GET "/static/"

I don’t see any configuration for RTSPtoWebRTC, so I’m not sure where to add the API URL.

I’ll check the attributes of my camera, but when I was using WebRTC before, the video had zero lag and was super smooth; currently it’s very delayed and choppy.

Remove it and re-add it. It should look like this. I had to add that when I initially installed it.

[screenshot: the RTSPtoWebRTC integration setup dialog with the go2rtc API URL filled in]
