Realtime camera streaming without any delay - WebRTC

I have an iPad from 2017 and the iOS version is 16.4.1. When I view the stream on the iPad, it shows as MSE. The go2rtc version is 1.5.0 and it is running as the Home Assistant add-on. The WebRTC card version is 3.1.0.

Is there any other information that could help you debug? Any way for me to do more debugging, considering it's the iPad and not a full browser that would let me run dev tools?

I've added the WebRTC Camera card to my dashboard, but the output is showing MSE. Should it be WebRTC? Or can I select between HLS/MSE/WebRTC by adding something to my card code?

Adding ‘mse: false’ fixed that for me on my iPad.

Put that line under your stream URL.
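
For reference, a minimal sketch of the whole card config with that option (the stream URL is a placeholder for your own camera):

type: custom:webrtc-camera
url: rtsp://user:password@camera-ip/stream
mse: false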

I have 4 Reolink cameras: two RLC-520s and two RLC-520As. The 520s are perfect and can run for days without any time delay. However, the two 520As will be about 30 seconds behind real time a couple of hours after the server starts. Any thoughts as to why these cameras would behave differently?

That’s sorted it. Thanks.

Hi, I am new to Home Assistant and started playing with the WebRTC Camera integration (using the built-in go2rtc).
I can view the WebRTC video stream, but I do not have audio.
Before bringing up the question here, I searched for my problem but could not figure out how to solve it. I found info that the missing audio is most likely due to not using the OPUS codec … but I did not find a way to make sure that the OPUS codec is being used.

Home Assistant is running in Docker, in host mode.
I have a Reolink E1 Zoom. The Reolink integration is installed and provides the camera entity. Displaying the live stream in a Picture Glance card gives video and audio, but terrible latency and stability.

This is what I added into the go2rtc.yaml file:

api:
  listen: 127.0.0.1:1984
rtsp:
  listen: 127.0.0.1:8554

streams:
  cam_01_sub: 
    - rtsp://<user>:<password>@<cameraIP>:554/h264Preview_01_sub 
    - ffmpeg:rtsp://<user>:<password>@<cameraIP>:554/h264Preview_01_sub&audio=opus

webrtc:
  candidates:
    - 127.0.0.1:8555
    - stun:8555

This is the config in my frigate-card:

type: custom:frigate-card
cameras:
  - camera_entity: camera.cam_01_sub
    live_provider: webrtc-card
    webrtc_card:
      entity: camera.cam_01_sub
      mse: false
      mode: webrtc
menu:
  style: hover
  buttons:
    image:
      enabled: true
view:
  default: image
image:
  mode: camera
  refresh_seconds: 2

The video is playing super smoothly using the frigate card, but I do not have audio. It looks like it is not picking up the Opus codec. What am I missing? How can I find out if the go2rtc.yaml is used at all?
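
One thing that stands out when comparing with the go2rtc documentation: ffmpeg source parameters are chained with '#' rather than '&'. A sketch of the streams entry with that syntax (user, password and camera IP are placeholders):

streams:
  cam_01_sub:
    - rtsp://<user>:<password>@<cameraIP>:554/h264Preview_01_sub
    - ffmpeg:rtsp://<user>:<password>@<cameraIP>:554/h264Preview_01_sub#audio=opus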

I have a Reolink E1 Outdoor and I have the same issue.
I tried WebRTC and go2rtc, always the same.
It works nicely if I restart HA, but then it starts lagging behind. If I try the stream in VLC there is no delay.
I used to run HA on some old laptop; now I have switched to an Intel N95 mini PC and there is still a problem.

The integration is working OK using MSE with 2 Reolink cameras on an IoT network, but I can't use the WebRTC mode; it doesn't load any stream. The IoT and LAN networks have access both ways, and I did try to forward UDP and TCP port 8555 from IoT to LAN without success. I've also tried RTSPtoWeb - WebRTC and RTSPtoWebRTC with the same result. I'm running HA in a VirtualBox VM on my server.
Has anyone had the same problem?
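
One thing that may be worth checking in a setup like this is the webrtc section of the go2rtc config: the candidates list has to contain an address that clients on the other network can actually reach. A sketch under that assumption (the LAN IP below is a made-up example for the HA VM):

webrtc:
  listen: ":8555"            # port go2rtc answers WebRTC connections on (TCP and UDP)
  candidates:
    - 192.168.1.50:8555      # example LAN IP of the HA VM, reachable from the IoT network
    - stun:8555              # let go2rtc discover its external address via STUN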

I’m trying to get audio on a real-time stream from my Wyze Cam v3 in Home Assistant Container. I’ve downloaded WebRTC through HACS and I’ve been able to set up a card on the Dashboard that has real-time video using:

type: custom:webrtc-camera
url: rtsp://username:password@ipaddress/live

but no matter what I try, I can’t get audio.

I'm currently desperately trying to add some kind of stream to go2rtc.yaml, like:

streams:
  camera:
    - rtsp://username:password@ipaddress/live
    - ffmpeg:camera#audio=opus

But no dice. I understand that WebRTC doesn’t support very many audio codecs, but it does support opus and I can’t find anyone anywhere who successfully added audio to their Wyze Cam stream.

Has anyone done this with a Wyze Cam successfully?

What audio codec does your camera have? And what technology do you see in the card's top-right corner?

I've read that the Wyze Cam v3 supports Opus and AAC (even though AAC can't be used in WebRTC). In the card's top-right corner I see MSE. I'm guessing I should use a Picture Glance card or something else. I'm using the WebRTC custom card.

Well, MSE was selected because of the AAC codec.
Can you hear audio via VLC?

Yes I can.

go2rtc has a web UI where you can check active stream info and the selected codecs.

OK, I gained access to the web UI. I see checkboxes for webrtc, mse, mp4, and mjpeg. And that's it.

Why is it that if I press the "stream" button in the go2rtc web UI it's almost instantaneous, but it takes forever to load in Lovelace?

My go2rtc config:

streams:
  camera.doorbell:
    - rtsp://admin:[email protected]:554/h264Preview_06_main
    - ffmpeg:rtsp://admin:[email protected]:554/h264Preview_06_main#audio=opus

My generic camera yaml:

  - platform: generic 
    name: Doorbell
    stream_source: rtsp://127.0.0.1:8554/camera.doorbell

My ui-lovelace.yaml:

title: Home
icon: 'mdi:home'
path: cameras_home
visible: false
badges: []
type: custom:grid-layout
layout:
  grid-template-columns: auto auto auto
  grid-template-rows: auto
  grid-template-areas: |
    "cam1 cam2 cam3"
    "cam4 cam5 cam6"
    "cam7 cam8 cam9"
  mediaquery:
    #phone
    "(max-width: 800px)":
      grid-template-columns: auto
      grid-template-rows: auto
      grid-template-areas: |
        "cam1"
        "cam2"
        "cam3"
        "cam4"
        "cam5"
        "cam6"
        "cam7"
        "cam8"
        "cam9"
cards:
  - type: picture-entity
    entity: camera.doorbell
    show_state: false
    show_name: true
    camera_view: live
    aspect_ratio: '16:9'
    name: Entrance
    style: "ha-card { height: 100%; }"
    view_layout:
      grid-area: cam1

Status at developer menu:

access_token: "token"
frontend_stream_type: web_rtc
entity_picture: /api/camera_proxy/camera.doorbell?token="token"
friendly_name: Doorbell
supported_features: 2

Also, I have a problem with Android devices not playing the stream. PC, Mac, iPhone all work fine.
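
If the extra delay comes from the picture-entity card going through Home Assistant's own HLS stream, one thing to try might be pointing the WebRTC custom card straight at the go2rtc stream for that grid cell. A sketch, assuming the stream is named camera.doorbell as in the go2rtc config above:

  - type: custom:webrtc-camera
    url: camera.doorbell
    view_layout:
      grid-area: cam1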

I have the following in my ptz section, but that also requires the ONVIF integration:

              ptz:
                service: onvif.ptz
                data_left:
                  entity_id: camera.camera_entrance_mainstream
                  pan: LEFT
                  speed: 0.5
                  distance: 0.5
                  move_mode: ContinuousMove
                data_right:
                  entity_id: camera.camera_entrance_mainstream
                  pan: RIGHT
                  speed: 0.5
                  distance: 0.5
                  move_mode: ContinuousMove
                data_up:
                  entity_id: camera.camera_entrance_mainstream
                  tilt: UP
                  speed: 0.5
                  distance: 0.5
                  move_mode: ContinuousMove
                data_down:
                  entity_id: camera.camera_entrance_mainstream
                  tilt: DOWN
                  speed: 0.5
                  distance: 0.5
                  move_mode: ContinuousMove

I understand that you may be tired of me, but for me this is very important. Please tell me, how do I receive video along with sound in Telegram? It used to work, but now it doesn't. What should I fix? Or is there no such possibility now, and never will be?

Previously, I received a video with sound in Telegram using this link:
http://localhost:1984/api/stream.mp4?src=DoorCam&video=h264&audio=pcmu&duration=15
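
For context, a sketch of how a link like that can be sent from Home Assistant with the telegram_bot.send_video service (assuming the Telegram bot integration is already set up; the chat ID and caption are placeholders):

service: telegram_bot.send_video
data:
  url: http://localhost:1984/api/stream.mp4?src=DoorCam&video=h264&audio=pcmu&duration=15
  caption: Doorbell clip
  target: 123456789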

Do you need help?