Realtime camera streaming without any delay - WebRTC

Hi James - not sure if you're still using these forums, but I figured I'd give it a shot. I also have a Nest Doorbell, and while I can use the code below to show the live stream, it appears to revert to a static image at some point. I don't know how much time passes before this happens, since I can't tell it's become static until I dance around in front of it and my wife confirms whether or not she can see me.

type: grid
square: false
columns: 1
cards:
  - show_state: true
    show_name: true
    camera_view: live
    type: picture-entity
    entity: camera.basement_doorbell
    camera_image: camera.basement_doorbell

I haven't been able to get the camera working with WebRTC, as it asks me for a URL for the camera and I'm not sure where to find this information. Can you give me a bit more info on how your parents' dashboard is set up in this regard?
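(For reference, the webrtc-camera custom card apparently also accepts a Home Assistant camera entity instead of a raw URL, which would sidestep finding the camera's stream address entirely. This is only a guess based on the card's options; the entity name is the one from the card config above:)

```yaml
type: custom:webrtc-camera
entity: camera.basement_doorbell
```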

This was the only way I could get my Foscam interior cameras to work with Home Assistant. I did end up having to go into the device configs in /.storage and change the RTSP port to the same as the IP port (port 88).

That said - the feed works perfectly fine in realtime when I'm on wifi, but shows the play symbol with a line through it when I'm on mobile data. It just never loads. The label in the top right corner shows it's falling back to MSE. My speed should be fine and I don't have any issues streaming other things over mobile data. It's also only the 720p sub stream.

This is one of my cards (sensitive info removed):

type: custom:webrtc-camera
url: rtsp://username:password@IP:port/videoSub
mode: webrtc,webrtc/tcp,mse,hls,mjpeg
media: video,audio
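One thing I'm going to try (just a guess on my part, not a confirmed fix) is limiting the card to WebRTC-over-TCP plus MSE, since UDP traffic is sometimes blocked on mobile networks:

```yaml
type: custom:webrtc-camera
url: rtsp://username:password@IP:port/videoSub
mode: webrtc/tcp,mse
```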

Good evening everyone. I have about 10 cameras, all connected to go2rtc via RTSP. How can I combine them all into a single "zero channel" view and get one RTSP link for it?

Don't know if this will help you. This is my setup in the HA add-on.

rtsp:
  listen: ":8554"
  default_query: mp4
streams:
  CAM1: rtsp://USERNAME:PASSWORD@cameraIP:8554/live0.264
  CAM2: rtsp://USERNAME:PASSWORD@cameraIP:8554/live1.264
  CAM3: rtsp://USERNAME:PASSWORD@cameraIP:8554/live0.264
(These are for Holowits cameras)

rtsp://GO2RTCserverIP:8554/CAM1?mp4
or
rtsp://GO2RTCserverIP:8554/CAM1?video=all&audio=all
or
rtsp://GO2RTCserverIP:8554/CAM1

The stream name you specify becomes the path of the RTSP link.

Hello everyone,
I can't make two-way audio work.
All I can do is hear the audio in my browser, but I can't send audio from my microphone to the camera. (My camera is an EZVIZ C6N, which supports two-way talk.)

Could anyone help?

Here is my go2rtc config section (inside frigate):

ffmpeg:
  hwaccel_args: preset-rpi-64-h264

go2rtc:
  streams:
    CameraEZVIZC6N:
      - rtsp://USER:[email protected]:554/live0
      - "ffmpeg:CameraEZVIZC6N#audio=opus"
      - "ffmpeg:CameraEZVIZC6N#audio=pcmu"
      - "ffmpeg:CameraEZVIZC6N#audio=pcma"
      - "ffmpeg:CameraEZVIZC6N#audio=aac"
  rtsp:
    listen: ":8554"
  webrtc:
    candidates:
      - 192.168.1.128:8555
      - stun:8555
  log:
    level: debug
    api: debug
    rtsp: debug
    streams: debug
    webrtc: debug
    mse: debug
    hass: debug
    homekit: debug

cameras:
  CameraEZVIZC6N:
    ffmpeg:
      inputs:
        - path: rtsp://127.0.0.1:8554/CameraEZVIZC6N  # rtsp://USER:[email protected]:554/H.264
          input_args: preset-rtsp-restream
          roles:
            - detect
            - record
    live:
      stream_name: CameraEZVIZC6N

Here is the go2rtc producer info:

Here is the go2rtc consumer info:

Here is the go2rtc stream in WebRTC mode, where I can hear the sound:

Here is the log when I try to call the go2rtc API to send audio to the camera:

 $ curl --location --request POST 'http://192.168.1.128:1984/api/streams?dst=CameraEZVIZC6N&src=ffmpeg:https://download.samplelib.com/mp3/sample-6s.mp3#audio=opus#input=file'
can't find consumer
 $ curl --location --request POST 'http://192.168.1.128:1984/api/streams?dst=CameraEZVIZC6N&src=ffmpeg:https://download.samplelib.com/mp3/sample-6s.mp3#audio=pcma#input=file'
can't find consumer
 $ curl --location --request POST 'http://192.168.1.128:1984/api/streams?dst=CameraEZVIZC6N&src=ffmpeg:https://download.samplelib.com/mp3/sample-6s.mp3#audio=pcmu#input=file'
can't find consumer
 $ curl --location --request POST 'http://192.168.1.128:1984/api/streams?dst=CameraEZVIZC6N&src=ffmpeg:https://download.samplelib.com/mp3/sample-6s.mp3#audio=aac#input=file'
can't find consumer

Here is the log of go2rtc:

I also tried this external link from go2rtc:

I tried to talk via this microphone, but it doesn't work:

I also tried to talk via the microphone on my own https link, but it doesn't work either.

(I have already opened port 8555 on my router.)

Can anyone help me please? I am struggling with the WebRTC stream taking a long time to load: it loads as MSE first, then after a few seconds switches to RTC.
I need to use the webrtc.create_link service to generate a link, which is used when displaying my Reolink Doorbell as a popup via PiPup on my Android TV.

  - service: webrtc.create_link
    data:
      link_id: "{{ link_id }}"
      entity: camera.front_doorbell_sub

The only way I can do this is to use the Reolink integration's camera entity for the sub stream. This stream is awful and takes around 10 seconds or so to actually load, before stuttering and freezing. Elsewhere in Home Assistant, I use the go2rtc links to display the camera stream over RTC, which works fine, although it still takes a couple of seconds to load.

I can’t see any way of using a go2rtc RTC stream in the above script code, as it HAS to use an entity, and go2rtc doesn’t create entities, so I have to use this awful stream source.

Can anyone suggest a way around this? It currently takes 5 or 6 seconds of black screen, then loads the MSE stream, which displays a static picture for around 4 or 5 seconds before switching to RTC, where it starts displaying fine. But as my popup only stays on screen for 30 seconds, I barely get any actual use from it.

Full script:

alias: Display Doorbell PIP Popup on TV
mode: single
variables:
  link_id: 0{% for _ in range(39) %}{{ range(10)|random }}{% endfor %}
sequence:
  - service: webrtc.create_link
    data:
      link_id: "{{ link_id }}"
      entity: camera.front_doorbell_sub # This comes from the Reolink integration
      open_limit: 1
      time_to_live: 120
  - service: rest_command.pipup_url_on_tv
    data:
      ip: 192.168.1.89
      duration: 25
      title: Front Door
      message: Someone is at the front door
      width: 640
      height: 480
      url: >-
        https://myhomeassistanturl/webrtc/embed?url={{
        link_id }}
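(As an aside, the link_id template above just builds a 40-character ID: a literal 0 followed by 39 random digits. A rough shell equivalent, purely for illustration, not used by the script:)

```shell
# Build "0" + 39 random digits, like the Jinja template in the script above
link_id="0$(tr -dc '0-9' </dev/urandom | head -c 39)"
echo "$link_id"
```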

EZVIZ cameras don't support any open two-way audio standard.

Good evening. Everything works fine for me with the Hikvision KV6113, thanks for the integration. There is an LED on the front panel of the Hikvision KV6113 that lights up when the microphone is turned on, if you use Hik-Connect. But if you communicate via a WebRTC card with two-way audio, the LED lights up and does not go out, i.e. the microphone stays on constantly until a reboot. Is there a command to mute the microphone? Thank you.

Add the ISAPI channel; on my Hikvision it looks like this:
go2rtc:
  streams:
    DoorBell:

And you will need HTTPS:// to have access to two-way audio (browsers only allow microphone access in a secure context).


You are my hero! After a lot of trying and failing I read your tip and boom I got it! Thank you so much for this input!

Hi is there any way to use this solution with Blink cameras? thx

@schleeh Hi, I’ve been following this suggestion also, but I am unclear about how to set up the config of the generic camera entity, based upon this. Are you able to show your generic camera settings config?

@AlexxIT Hi, are you able to suggest any way around my issue here by any chance? I’m hopeful there is a way of achieving it…

I’ve worked out how to do this without creating a generic camera entity. Removing the service call to create a link with webrtc.create_link and downloading a slightly different apk file which has the later webview as well as allowing http rather than just https. I’m now using a local http url instead via go2rtc which has resolved the issue.
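In case it helps others, the local go2rtc link I mean has this general shape (the IP, port, and stream name are placeholders; stream.html is the player page go2rtc serves on its API port):

```
http://HA-IP:1984/stream.html?src=front_doorbell_sub
```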


I have been using WebRTC for a few weeks to view my 4 Hikvision cameras in a dashboard. About 2 days ago they stopped showing up (although periodically they would still work; I'm unsure why). They now all display a red error circle with the error message "Failed to start WebRTC stream: Timeout talking to RTSPtoWebRTC server".

I can sign in to the webrtc frontend URL and it shows my cameras as being configured, but I can no longer view them there either.

Anyone have any ideas? Google searches for that error don’t seem to provide any help.

I'm trying to make sure I "get" this thread, given its history. So I install this repository and it gives me go2rtc and the WebRTC Camera. I'm on an HA Blue (Hardkernel Odroid N2+). I went through the config steps, and I now have the custom:webrtc-camera card in Lovelace. I can get to HA-IP:1984, but I can't figure out how to use that. Is the idea to use it to figure out the right string for each of my cameras, then put those in go2rtc.yaml? I have a mixture of Kasa, Lorex, Amcrest, and Eufy (doorbell) cameras. Any pointers are much appreciated. Thanks
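From the docs my best guess is entries along these lines in go2rtc.yaml (the names, IPs, credentials, and paths here are made up; the path shown is the common Dahua-style RTSP path that Amcrest and many Lorex models use, but I'd verify per model before relying on it):

```yaml
# go2rtc.yaml - hypothetical examples; adjust IPs, credentials, and paths per camera
streams:
  driveway_amcrest: rtsp://user:[email protected]:554/cam/realmonitor?channel=1&subtype=0
  garage_lorex: rtsp://user:[email protected]:554/cam/realmonitor?channel=1&subtype=0
```

The web UI on port 1984 can then be used to preview each stream before committing it to the file.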

Hi
Thanks for your code. I have a Mobotix T25, but:

  • locally I get the image, but if I add "audio" to the card's "media" settings I no longer get anything (not even the video), even though I enabled "broadcast audio" in the Mobotix interface settings;

  • on remote access (smartphone) I get the camera feed in the card, but if I select another view of the dashboard and return to the view with the camera, the card is black and I have to restart the HA application to display the Mobotix image again. Do you have the mjpeg stream configuration for the camera?

Yes. Cameras must go into the YAML to view them.

Hi everyone, I have just set up some custom webrtc-camera cards, pulling 4 x 1080p Hikvision cameras into HA, all on one dashboard. Everything loads immediately with no lag… such a huge improvement over using standard camera entities via the ONVIF integration. Big thanks to AlexxIT for the amazing work on this.

I have a question about viewing cameras on mobile data. Is there a way to automatically downsample the bitrate based on the connection speed to the HA server? Everything works lightning fast on wifi, but I really need 10 Mb/s minimum to have the 4 streams playing smoothly on mobile data, and in rural areas this might be a bit of a challenge.

I am using a very basic config in the go2rtc.yaml file, just enough to pull in the streams, nothing else.
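(For context, my streams are defined along these lines, with a main and a sub stream per camera so a lower-bitrate version is available. The names and credentials are illustrative; Hikvision usually exposes the sub stream on channel 102, but check your units:)

```yaml
streams:
  cam1_main: rtsp://user:[email protected]:554/Streaming/Channels/101
  cam1_sub: rtsp://user:[email protected]:554/Streaming/Channels/102
```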

There is no auto-detection for remote connections. You need to use different Lovelace tabs with different streams.