Realtime camera streaming without any delay - WebRTC

@schleeh Hi, I’ve been following this suggestion also, but I am unclear about how to set up the config of the generic camera entity, based upon this. Are you able to show your generic camera settings config?

@AlexxIT Hi, are you able to suggest any way around my issue here by any chance? I’m hopeful there is a way of achieving it…

I’ve worked out how to do this without creating a generic camera entity. I removed the service call that creates a link with webrtc.create_link and downloaded a slightly different apk which includes the later WebView and also allows http rather than just https. I’m now using a local http URL via go2rtc instead, which has resolved the issue.
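For anyone wanting to try the same approach, here is a minimal sketch of what a go2rtc stream entry and the resulting local http URL might look like. The stream name, credentials, and RTSP path are placeholders, and this assumes go2rtc's default API port of 1984:

```yaml
# go2rtc.yaml — hypothetical stream entry
streams:
  front_door: rtsp://user:pass@192.168.1.10:554/stream1

# With the default API port, the stream is then viewable over plain http at:
#   http://<ha-ip>:1984/stream.html?src=front_door
```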


I have been using webrtc for a few weeks to view my 4 hikvision cameras in a dashboard. Like 2 days ago they stopped showing up (although periodically they would work, unsure why). They now just all display a red error circle with the error message “Failed to start WebRTC stream: Timeout talking to RTSPtoWebRTC server”.

I can sign in to the webrtc frontend URL and it shows my cameras as being configured, but I can no longer view them there either.

Anyone have any ideas? Google searches for that error don’t seem to provide any help.

I’m trying to make sure I “get” this thread, given its history. So, I install this repository and it gives me go2rtc and WebRTC Camera. I’m on an HA Blue (Hardkernel ODROID-N2+). I went through the config steps. I have the Custom:WebRTC card in Lovelace now. I can get to HA-IP:1984, but I can’t figure out how to use that. Is the idea to use it to figure out the right string for my cameras, then put those in go2rtc.yaml? I have a mixture of Kasa, Lorex, Amcrest, and Eufy (doorbell) cameras. Any pointers are much appreciated. Thanks
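For what it’s worth, a go2rtc.yaml for a multi-brand setup is just one named entry per camera. The entries below are placeholders: the RTSP paths differ per brand and model (the Dahua-style path shown is common on Lorex and Amcrest hardware, but must be verified, e.g. via the :1984 UI), and the IPs and credentials are made up:

```yaml
# go2rtc.yaml — hypothetical entries; verify each RTSP path for your model
streams:
  lorex_drive: rtsp://admin:pass@192.168.1.20:554/cam/realmonitor?channel=1&subtype=0
  amcrest_yard: rtsp://admin:pass@192.168.1.21:554/cam/realmonitor?channel=1&subtype=0
```

Each stream name can then be referenced from a custom:webrtc-camera card.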

Hi
Thanks for your code. I have a Mobotix T25 but:

  • locally I get the image, but if I add “audio” to the card’s “media” setting I no longer get anything (not even the video), even though I enabled “broadcast audio” in the Mobotix interface settings?

  • on remote access (smartphone) I get the camera feed in the card, but if I switch to another view of the dashboard and return to the view with the camera, the card is black and I have to restart the HA app to display the Mobotix image again. Do you have the mjpeg stream configuration for the camera?

Yes. The cameras must go into the yaml to view them.

Hi everyone, I have just set up some custom webrtc-camera cards, pulling in 4 x 1080p HIKVision cameras to HA, all on the one dashboard. Everything loads immediately with no lag… such a huge improvement over using standard camera entities via the Onvif integration. Big thanks to AlexxIT for the amazing work on this.

I have a question about viewing cameras on mobile data. Is there a way to automatically downsample the bitrate based on the connection speed to the HA server? Everything works lightning fast on wifi, but I really need 10 Mb/s minimum to have the 4 streams playing smoothly on mobile data, and in rural areas this might be a bit of a challenge.

I am using a very basic config in the go2rtc.yaml file, just enough to pull in the streams, nothing else.

There is no auto detection for remote connection. You need to use different Lovelace tabs with different streams.
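One way to sketch this: define both the main and sub stream of each camera in go2rtc.yaml, then point the cards on a mobile-oriented Lovelace tab at the sub streams. The paths below are placeholders; on Hikvision hardware channel 101 is typically the main stream and 102 the sub stream, but verify for your model:

```yaml
# go2rtc.yaml — hypothetical main/sub pair for one camera
streams:
  cam1_hd: rtsp://user:pass@192.168.1.10:554/Streaming/channels/101
  cam1_sd: rtsp://user:pass@192.168.1.10:554/Streaming/channels/102
```

The wifi dashboard tab would use cam1_hd, and the mobile tab cam1_sd.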

Thanks. Rather than having separate lovelace tabs with separate streams, I might look into optimising the main streams. At the moment, I’m guessing that no transcoding is being carried out. Is transcoding an option for go2rtc? If so, do you have any recommended formats or settings? I don’t mind if transcoding adds some latency, as long as the streams are reliable.

Alternatively I could reduce the bitrate of the streams on the HIKVision DVR. What would be the best option?

go2rtc supports transcoding with ready-to-use presets. It adds almost no latency.
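As a sketch of what the preset syntax looks like: the ffmpeg source accepts hash-separated options such as `#video=h264` and `#width=...` appended to the source URL (option names per go2rtc's ffmpeg-source documentation; credentials and path below are placeholders):

```yaml
# go2rtc.yaml — hypothetical entry using a built-in transcode preset
streams:
  camera_lq: ffmpeg:rtsp://user:pass@192.168.1.10:554/Streaming/channels/101#video=h264#width=1280
```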

Thanks. I went through the docs but unfortunately can’t figure it out. I’d like to transcode an rtsp source stream to a lower bitrate stream. This is what my config file looks like currently:

streams:  
  Camera: rtsp://username:[email protected]:554/Streaming/channels/101

Any pointers would be appreciated :slightly_smiling_face:

What documentation are you reading? Just looked at the docs. It has the word “transcoding” 28 times.

I think I got there in the end. My bit rate is now halved with the code below. Takes a little longer for the streams to load in lovelace but they are perfectly stable, which is what I was after. CPU usage on my N100 server is around 40% when transcoding the four streams. If you have any suggestions on how I could optimise further, please let me know.

streams:  
  Camera: ffmpeg:rtsp://username:[email protected]:554/Streaming/channels/101#video=mycodec

ffmpeg:
  mycodec: "-vf scale=1024:576 -codec:v libx264 -g:v 30 -preset:v superfast -tune:v zerolatency -profile:v main -level:v 4.1 -b:v 1M -an"

I recommend using the built-in transcoding presets. They also include hardware acceleration if your hardware supports it and is configured correctly.

Do you mean by adding # hardware or #hardware=vaapi? I tried this but I’m not seeing any reduction in cpu usage.

The config from my previous post is doing what I want it to do (i.e. nicely reducing bitrate), but it might be using a bit more cpu than I would like… I’m not sure if hw acceleration is working.

If you could provide an example config to try, that would be appreciated!
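In case it helps a future reader, a hedged sketch combining the built-in preset with the hardware flag (syntax as described in go2rtc's ffmpeg-source docs; verify against your version). Note that hardware acceleration also requires the drivers to be available to go2rtc, e.g. /dev/dri passed through if it runs in a container:

```yaml
# go2rtc.yaml — hypothetical entry; #hardware lets go2rtc pick an
# available acceleration method (e.g. VAAPI) if one is configured
streams:
  Camera: ffmpeg:rtsp://username:[email protected]:554/Streaming/channels/101#video=h264#width=1024#hardware
```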

Good morning everyone,
I have installed the webrtc plugin. Unfortunately, I always have a misfire on my Home Assistant screen on the Fire tablet (current generation, purchased new in 2024) or it takes some time when Fully Kiosk wakes up from “standby”. The original Reolink plugin is actually there immediately. But I want to use Webrtc as this streams live better. Reolink has an offset of approx. 5 to 10 seconds. What can I do? Also, why does the webrtc plugin show me a play/pause button on the screen? Can I deactivate this somehow? I don’t need that. Thank you!
Here is my config:

type: custom:webrtc-camera
url: rtsp://admin:password@192.xxx.xxx.xxx:554/h264Preview_01_sub
media: video
mode: webrtc
style: '.mode {display: none} .pictureinpicture {display: none}'
background: true
ptz:
  service: onvif.ptz
  data_left:
    entity_id: camera.kamera_haustuer_profile000_mainstream
    pan: LEFT
    speed: 0.5
    distance: 0.1
    move_mode: ContinuousMove
  data_right:
    entity_id: camera.kamera_haustuer_profile000_mainstream
    pan: RIGHT
    speed: 0.5
    distance: 0.1
    move_mode: ContinuousMove
  data_up:
    entity_id: camera.kamera_haustuer_profile000_mainstream
    tilt: UP
    speed: 0.5
    distance: 0.1
    move_mode: ContinuousMove
  data_down:
    entity_id: camera.kamera_haustuer_profile000_mainstream
    tilt: DOWN
    speed: 0.5
    distance: 0.1
    move_mode: ContinuousMove
  data_zoom_in:
    entity_id: camera.kamera_haustuer_profile000_mainstream
    zoom: ZOOM_IN
    speed: 0.5
    distance: 0.3
    move_mode: ContinuousMove
  data_zoom_out:
    entity_id: camera.kamera_haustuer_profile000_mainstream
    zoom: ZOOM_OUT
    speed: 0.5
    distance: 0.3
    move_mode: ContinuousMove

You will find the Pause / Play Button in the picture that I attached!

Thanks for your help in this case! Big THX

Is there a lite version of this that allows displaying a stream that’s already in WebRTC format?

It depends on where you get it. If it comes from the Nest integration, you can use the built-in Lovelace card.

In this case it’s being used with docker-wyze-bridge which doesn’t have its own card.

I’ve been using your WebRTC Camera in conjunction with it for some time, but recently updated docker-wyze-bridge and discovered that both of these are trying to run their own instance of go2rtc on the same port. I then noticed that docker-wyze-bridge can output WebRTC by itself now (I was previously just using the RTMP streams it exposes with WebRTC Camera).

This makes me think I only need the Lovelace card from WebRTC Camera without much, if any, of the backend behind it.
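As for the port clash itself: if running both go2rtc instances is unavoidable, one instance's API port can be moved in its config so the two don't collide (option name per go2rtc's documentation; verify for your version):

```yaml
# go2rtc.yaml — move the API (and the :1984 web UI) to another port
api:
  listen: ":1985"
```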

When opening a WebRTC stream provided by docker-wyze-bridge in the browser, it looks like it’s using a <video /> element with a little bit of JS (a WHEPClient about 150 lines long).