Go2rtc project - help thread

Hi @AlexxIT - I thought it better to spawn a new thread here for your go2rtc project (as opposed to the now rather large WebRTC thread).

I had a fiddle today - it worked great - I think!?! (That's the problem, I'm not quite sure.)

Maybe you can clarify. Here's what I did - I'm in a Docker environment:

  1. Made my docker-compose (like your example), spun up the container… verified it was up on http://x.x.x.x:1984 (it was - I could browse to it). There was no camera, but on that simple web page I could add my rtsp://xxxxxxx cameras, click into them, and the cam feed would show in the browser.
  2. I tried adding the following to the go2rtc.yaml file, to see if I needed that for the cams to still be there after a reboot… but it didn't seem to work (after a reboot, when I came back to the :1984 web page, there were no cams). Not sure if this is a problem?
streams:
  - kph_sonoff: rtsp://xxxxxx:[email protected]:554/av_stream/ch0
  - kpk_hikvision: rtsp://xxxxxx:[email protected]:554/Streaming/channels/2
  3. In HASS, added the RTSPtoWebRTC integration and linked it to your API listener http://my.main.host.ip:1984
  4. When I look at my picture-glance cards, I think they might load faster… is that possible? (As I haven't done anything else, I'm a bit confused.) In the video linked on the RTSPtoWebRTC integration page - at timecode 11:45 - he says a way to test whether you are using WebRTC successfully is to look for the ‘preload stream’ checkbox when you click into your picture-glance card… well, when I do that, it's not there, so does that mean I'm working through your go2rtc container?

2. The problem is in the dashes.
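
(For reference, the same config from above with the list dashes removed - the stream names become plain keys under streams:, which matches the working config shown later in the thread:)

streams:
  kph_sonoff: rtsp://xxxxxx:[email protected]:554/av_stream/ch0
  kpk_hikvision: rtsp://xxxxxx:[email protected]:554/Streaming/channels/2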

4. If you select the card option “Camera view: auto”, Lovelace will show snapshots that reload every 10 seconds. The full-screen window will show the WebRTC stream.

If you select “Camera view: live”, both Lovelace and the full-screen window will show WebRTC.
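
(For illustration, a minimal picture-glance card using that option - the entity name here is just a placeholder:)

type: picture-glance
title: Sonoff cam
camera_image: camera.kph_sonoff    # placeholder camera entity
camera_view: live                  # 'auto' = refreshing snapshots, 'live' = live (WebRTC) stream
entities:
  - camera.kph_sonoff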

You can debug this on the add-on Web UI page. Check the info link for your camera.

Ahh yes, I see removing the list dashes from the cams sorts the reboot / cam persistence thing in your API server. Thanks!

So, given my streams: config was invalid and it was still working as WebRTC streams, am I right to assume there's no benefit in hard-coding my cams in the go2rtc YAML? (They are already in my HASS config, obviously.) I did see the “zero config” option in your FAQ, so I guess I was using that? Is there any downside to this?

As to the “info” link, are you talking about the “type”:“RTSP client producer” part? (BTW, quite often when I click that link it shows “null”, but after I start the cam feed with the WebRTC link and then click info, I get the full string… see below - scroll over to the right for the comments.)

streams:                                                                       #"info" link when browsing to http://10.1.3.50:1984
  kph_sonoff: rtsp://xxxxx:[email protected]:554/av_stream/ch0                #[{"media:0":"video, sendonly, 96 H264/90000","media:1":"audio, sendonly, 8 PCMA/8000","receive":122649,"remote_addr":"10.1.3.88:554","send":0,"track:0":"8   PCMA/8000, sinks=1","type":"RTSP client producer","url":"rtsp://10.1.3.88:554/av_stream/ch0/"},       {"remote_addr":"udp4 prflx 10.1.4.21:62743 related :0","send":122691,"type":"WebRTC server consumer","user_agent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/103.0.5060.134 Safari/537.36 OPR/89.0.4447.91"}]
  kpk_hikvision: rtsp://xxxxx:[email protected]:554/Streaming/channels/2      #[{"media:0":"video, sendonly, 96 H264/90000",                                         "receive":160744,"remote_addr":"10.1.3.90:554","send":0,"track:0":"96 H264/90000, sinks=1","type":"RTSP client producer","url":"rtsp://10.1.3.90:554/Streaming/channels/2/"},{"remote_addr":"udp4 prflx 10.1.4.21:55801 related :0","send":160938,"type":"WebRTC server consumer","user_agent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/103.0.5060.134 Safari/537.36 OPR/89.0.4447.91"}]

I think you might benefit from an extra FAQ question, “How do I know if it's working?”, as I think lots of people will pester you with this question!!

Anyone using go2rtc with the Frigate card? If so, could you share the config?

How does go2rtc work with the RTSPtoWebRTC and WebRTC integrations? A working example end to end, including Lovelace, might help illustrate it… Right now I've gotten all the cameras configured via the config file,

but I am not sure where to go from here… It's not clear to me how I get the RTSP URL from the go2rtc UI to put into the Lovelace card…

Appreciate any help you can provide

I just added the RTSP feed that go2rtc outputs as a ‘Generic Camera’ under Integrations in Home Assistant:
Example:
rtsp://192.168.0.22:8554/backdeck
rtsp://192.168.0.22:8554/backyard etc.
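
(For context, those 8554 URLs come from go2rtc's built-in RTSP server: every stream name defined in go2rtc.yaml is re-served on port 8554 under that name. A rough sketch, with made-up source URLs:)

streams:
  backdeck: rtsp://user:[email protected]/stream1    # hypothetical camera source
  backyard: rtsp://user:[email protected]/stream1
# go2rtc then restreams these as:
#   rtsp://192.168.0.22:8554/backdeck
#   rtsp://192.168.0.22:8554/backyard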

Look at my opening post, point 4 - timecode 11:45 in that linked video. Knowing if it's working is subtle! (Which is why I suggested AlexxIT could maybe benefit from updating the FAQ.) Also, use the native picture-glance card.

Sorry, I am not seeing any linked video in your posts

I’m pretty sure that ‘preload’ was removed for all cameras these days.

That isn’t really a good way to determine if it's working anymore. It's likely better to turn up logging and watch the connections while you're setting things up…

A really easy way to know if it's working is to open up the admin page at :1984 and look at the ‘online’ column while you're streaming your camera. If it's greater than zero, something is viewing that camera. And then under the ‘info’ link in that UI you can see specifics of what is being streamed, from where, etc.

I’m at the exact same point. My camera is added and when I click either MSE or WEBRTC on the API web page it shows the camera stream. So that part is working.

Obviously I could just add it as an RTSP camera, but would that take advantage of the WebRTC stream too? Or would it just be the same as using the RTSP stream directly? How does it know to stream over port 8555 instead of 8554?

And a last question: it says you can either add the RTSP stream from go2rtc as a Generic Camera, or the direct RTSP stream from the camera as a Generic Camera. Why on earth would you set up go2rtc and then add the native camera stream to HA? That wouldn't improve anything, right?

I have added the RTSPtoWebRTC integration and added the API URL, and it says it is connected successfully. But it doesn't create a camera, which I would expect. Or does it turn the existing native camera streams into WebRTC streams instead?
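
(For reference while reading the answers below, these are go2rtc's default listeners, matching the startup log further down the thread - 1984 is the API that RTSPtoWebRTC talks to, 8554 is the RTSP restream port, and 8555 is the WebRTC media port:)

api:
  listen: ":1984"    # web UI / API (what the RTSPtoWebRTC integration connects to)
rtsp:
  listen: ":8554"    # RTSP restreams
webrtc:
  listen: ":8555"    # WebRTC media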

If you’ve installed the RTSPtoWebRTC integration it works with zero config. It will use WebRTC, and you can see this in the :1984 interface. You'll see ‘online’ increase with the viewer count, and you can see the source/destination in the info dialogs.

It is replacing your stream with a WebRTC one (as long as you have RTSPtoWebRTC installed and functioning). You really do not even need to have your cameras added to go2rtc at all. All you need is it installed, and the generic camera streams will convert over. It's really pretty neat. I literally had to change nothing on my camera streams, and once I connect I can see the new URL added directly to the ‘1984’ console as soon as they are accessed.

Depending on what you want to do, you can customize your streams, and that is the reason you ‘can’ add cameras and ffmpeg sources and everything to the go2rtc config. But it isn't really required if you're simply WebRTC'ing the existing cameras as-is.

It changes the existing ones.

btw, I’d add this so you can actually see the connections in the go2rtc logs:

log:
  level: info  # default level
  api: debug
  rtsp: debug
  streams: debug
  webrtc: debug

How does one specify to start an MSE stream instead?

Thanks! I think everything works now. Just have to test outside connections. Mostly it was unclear to me that the RTSPtoWebRTC integration replaces the generic camera streams. I did, however, create my RTSP streams in the go2rtc.yaml file. Adding the logging information helps indeed! Again: thanks.

Anyone got Hardware Acceleration working? Specifically with NVENC.

Hey guys, my little input. I'm using Tapo cameras and they are linked to my HA instance through the Tapo Control custom integration (it creates a camera.xxxxx in HA). I also use the Frigate Lovelace card with these entities, with live_provider: ha.
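
(A rough sketch of what that card setup could look like - the entity name is a placeholder, and the syntax assumes a recent version of the custom Frigate card:)

type: custom:frigate-card
cameras:
  - camera_entity: camera.tapo_xxxxx   # placeholder entity from the Tapo integration
    live_provider: ha                  # use HA's own stream, which RTSPtoWebRTC upgrades to WebRTC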

I switched from my WebRTC RTSP server to this add-on, and it works fine when adding the go2rtc YAML file and specifying my streams. However, I didn't try adding the streams to my secrets.yaml and referencing that file from go2rtc.yaml; not sure if it will work, but it could be a feature :slight_smile:

I also tested this addon by removing the streams and apparently it works if you have cameras already present in HA.

So here is my hypothesis: the Frigate card tries to open the camera using the HA live provider. Since I added the RTSPtoWebRTC integration, HA sends the stream info to the go2rtc add-on for conversion, gets the WebRTC stream, and sends it back to the Frigate card.

@AlexxIT, what is your opinion? When checking the logs, my hypothesis seems to hold ^^:

06:38:12.553 INF [rtsp] listen addr=:8554
06:38:12.553 INF [api] listen addr=:1984
06:38:12.553 INF [webrtc] listen addr=:8555
06:38:12.553 INF [srtp] listen addr=:8443
06:39:31.167 INF [streams] create new stream url=rtsp://XXXX@XXXX/stream1
06:39:31.175 DBG [streams] probe producer url=rtsp://XXXX:XXXX@XXXX/stream1
06:39:31.341 DBG [streams] start producer url=rtsp://XXXX@XXXX/stream1

Also, another question.
I have the Tapo integration, which creates a camera in HA and opens a stream to the cameras (not sure if it is always open or only when I request to view the stream).
I also have go2rtc, but I'm not sure how it really works (see question/hypothesis above).
If I go with Frigate, I need to open yet another stream to the cameras.

What do you guys think I should do in order to avoid opening too many streams to my cameras? Remove the camera from the Tapo integration, add the stream directly in go2rtc, and use that link directly in the Frigate card and in Frigate?

Thanks for your input!

Just set this up for displaying my UniFi Doorbell camera feed on my wall-mounted tablet using Browser Mod; the feed loads in 1-2 seconds every time - really happy! Much improved over anything else I tried. Speed is important for my use case, as I want it to show the live feed from the doorbell when the button has been pressed. Thank you for your efforts in making this.

I have the same experience. The only hardship I currently have is that the doorbell feed, which loads in 2 seconds, doesn't have audio, as it's in AAC. It's easy to fix by using an ffmpeg source to transcode to Opus - it's very slick. The caveat is that it then loads in 7-8 seconds, which is still pretty good but too slow for a doorbell. Not a fault of go2rtc!

If I can figure out how to send an MSE stream instead (and MSE eventually works with AAC - I think I saw an issue on this), this will work out of the box without a transcode, I believe. I just can't figure out yet how to specify an MSE stream. :frowning:

How many sources are you using when transcoding audio? Try using the first source as RTSP, and the second as ffmpeg with only audio. Maybe ffmpeg will start faster if it ignores the video.

Yes I just discovered this config type a few moments ago:

streams:
  frontdoor_main:
    # first source: the camera's RTSP stream as-is
    - rtsp://admin:[email protected]:554/cam/realmonitor?channel=1&subtype=0&unicast=true
    # second source: ffmpeg transcoding only the audio to Opus
    - ffmpeg:rtsp://admin:[email protected]:554/cam/realmonitor?channel=1&subtype=0&unicast=true&proto=Onvif#audio=opus

I was hoping it would load the video stream without waiting on the audio stream, but I found the load times are similar to combining them like this:

streams:
  frontdoor_main:
    # single ffmpeg source: copy the video, transcode the audio to Opus
    - ffmpeg:rtsp://admin:[email protected]:554/cam/realmonitor?channel=1&subtype=0&unicast=true&proto=Onvif#audio=opus#video=copy

No. go2rtc will wait for all sources, because it has to answer the client about all codecs at one moment.
