Realtime camera streaming without any delay - WebRTC

This error is not related to this component.

I'm trying to sync the volume so I can adjust it remotely when casting, but I have two challenges:

  1. I can't seem to unmute the card; it defaults to muted every time (after unmuting, setting the volume works fine).
theme: Backend-selected
title: Camera HomeFlex
path: camera-homeflex
badges:
cards:
  - type: custom:webrtc-camera
    url: >-
      ffmpeg:rtsp://XXXX:[email protected]/live0#video=copy#audio=opus
    media: video,audio
    mode: webrtc
    muted: false
    background: true
  - type: entities
    entities:
      - input_number.homeflex_volume
  - type: custom:content-card-volume
    entity: input_number.homeflex_volume
  2. My JavaScript code to find the volume element (custom:content-card-volume) is extremely dirty; is there a unique ID I could hook into? It works once unmuted, but this is not sustainable code:
try {
    // WebRTC layout: walk the nested shadow roots down to the card's <video> element
    var b = document.querySelector("body > home-assistant")
        .shadowRoot.querySelector("home-assistant-main")
        .shadowRoot.querySelector("ha-drawer > partial-panel-resolver > ha-panel-lovelace")
        .shadowRoot.querySelector("hui-root")
        .shadowRoot.querySelector("#view > hui-view > hui-masonry-view")
        .shadowRoot.querySelector("#columns > div:nth-child(1) > webrtc-camera")
        .shadowRoot.querySelector("ha-card > div.player > div > video");
    b.muted = false;
    b.volume = newVol;
} catch (error) {
    // Element not rendered yet (or the layout changed) - ignore and try again later
}

For anyone else looking for this:

style: '.header {display: none} .pictureinpicture {display: none}'

@RT1080 Thanks for posting that. I'm trying to do the same with multiple style instructions, but the .pictureinpicture style still isn't applying. In fact, only one of my intended styles is applying, and it's the .mode one. The .mode style isn't even the first in my list; it sits in the middle of the other styles I'm trying to set:

style: 'video {object-fit: fill} .pictureinpicture {display: none} .mode {display: none} .fullscreen {display: none}'

Any ideas what might be preventing the others from applying?

Below works for me:

style: >-
  video {object-fit: fill} .header {display: none} .pictureinpicture {display: none} .mode {display: none} .fullscreen {display: none}
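(For reference, here is roughly how that folded style block sits inside a full card config; the URL and stream path below are just placeholders, not taken from anyone's setup:)

type: custom:webrtc-camera
url: rtsp://USER:PASS@CAMERA-IP:554/stream1
media: video,audio
mode: webrtc
style: >-
  video {object-fit: fill} .header {display: none} .pictureinpicture {display: none} .mode {display: none} .fullscreen {display: none}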

Has your WebRTC actually been integrated into the latest version of Frigate?
Somehow it is not clear to me why the go2rtc log from Frigate shows the following (as an example):

15:59:30.580 WRN github.com/AlexxIT/go2rtc/cmd/streams/producer.go:132 > error="read tcp 10.1.1.41:59958->10.1.1.60:554: i/o timeout" url=rtsp://admin:[email protected]:554/live/ch0

I have not installed your WebRTC integration, though. Or rather, I used to have it, but I uninstalled it because I thought it was integrated into Frigate by default.
Older docs say you can install it or need it for WebRTC, while newer ones say it is no longer necessary.
It's getting more and more confusing.

The same goes for this config:

webrtc:
  candidates:
    - INTERNAL-HA-IP:8555
    - stun:8555

On the one hand, the docs say you should add this to frigate.yaml for things to work smoothly. On the other hand, they say this is no longer necessary with the latest Frigate version.

There is a Frigate forum/GitHub. You're better off discussing it there, but I don't think they have the latest versions.

Trying to get audio working from a couple of older cameras. The only codec options on the cameras are AAC, LPCM, G.711, and G.726. Am I correct that it's not possible to get audio working with WebRTC?

AAC audio works without using WebRTC, but the stream delay is horrible, maybe 10-15 seconds behind. Video-only with WebRTC works fantastically with almost no delay. I really want that audio, though.
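(Not a definitive answer, but the ffmpeg source syntax used earlier in this thread suggests go2rtc can transcode a camera's AAC or G.711 audio to Opus, which WebRTC can play. A minimal sketch; the credentials, IP and stream path are placeholders:)

type: custom:webrtc-camera
url: >-
  ffmpeg:rtsp://USER:PASS@CAMERA-IP:554/stream1#video=copy#audio=opus
media: video,audio
mode: webrtc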

Hi James - not sure if you are still using these forums, but I figured I'd give it a shot. I also have a Nest Doorbell, and while I can use the code below to show the live stream, it appears to revert to a static image at some point. I don't know how much time passes before this happens, because I can't tell it has become static until I try dancing around in front of it and my wife confirms whether she can see me or not.

type: grid
square: false
columns: 1
cards:
  - show_state: true
    show_name: true
    camera_view: live
    type: picture-entity
    entity: camera.basement_doorbell
    camera_image: camera.basement_doorbell

I haven't been able to get the camera to work with WebRTC, as the card asks me for a URL for the camera and I'm not sure where to find this information. Can you give me a bit more info on how your parents' dashboard is set up in this regard?
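(If I read the card's options correctly, it can also be pointed at an existing Home Assistant camera entity instead of a raw URL, which may be the easier route for cloud cameras like Nest. A sketch, reusing the entity name from the grid card above; unverified:)

type: custom:webrtc-camera
entity: camera.basement_doorbell
media: video,audio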

This was the only way I could get my Foscam interior cameras to work with Home Assistant. I did end up having to go into the device configs in /.storage and change the RTSP port to the same as the IP port (port 88).

That said, the feed works perfectly fine in real time if I'm on Wi-Fi, but it shows the play symbol with a line through it when I'm on mobile data. It just never loads, and it falls back to MSE in the top-right corner. My speed should be fine and I don't have any issues streaming other things over mobile data. It's also streaming the 720p sub stream.

This is one of my cards (sensitive info removed):

type: custom:webrtc-camera
url: rtsp://username:password@IP:port/videoSub
mode: webrtc,webrtc/tcp,mse,hls,mjpeg
media: video,audio
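(For anyone else hitting this: WebRTC from outside the local network generally needs port 8555 reachable and an external candidate advertised by go2rtc, otherwise the player falls back to MSE. A rough sketch of the go2rtc webrtc section; the addresses are placeholders and forwarding 8555 TCP/UDP on the router is assumed:)

webrtc:
  candidates:
    - 192.168.1.10:8555            # internal address (placeholder)
    - MY-PUBLIC-IP-OR-DDNS:8555    # external address, port 8555 forwarded (placeholder)
    - stun:8555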

Good evening everyone. Please tell me: I have about 10 cameras, all connected to go2rtc via RTSP. How can they all be connected to a zero channel, and how do I get an RTSP link?

Don't know if this will help you. This is my setup in the HA add-on.

rtsp:
  listen: ":8554"
  default_query: mp4
streams:
  CAM1: rtsp://USERNAME:PASSWORD@cameraIP:8554/live0.264
  CAM2: rtsp://USERNAME:PASSWORD@cameraIP:8554/live1.264
  CAM3: rtsp://USERNAME:PASSWORD@cameraIP:8554/live0.264
(These are for Holowits cameras)

rtsp://GO2RTCserverIP:8554/CAM1?mp4
or
rtsp://GO2RTCserverIP:8554/CAM1?video=all&audio=all
or
rtsp://GO2RTCserverIP:8554/CAM1

The RTSP link uses the stream name that you specify.

Hello everyone,
I can't make 2-way audio work. All I can do is hear the audio in my browser, but I can't send audio from my microphone to the camera. (My camera is an EZVIZ C6N, which supports 2-way talk.)

Could anyone help?

Here is my go2rtc config section (inside frigate):

ffmpeg:
  hwaccel_args: preset-rpi-64-h264

go2rtc:
  streams:
    CameraEZVIZC6N:
      - rtsp://USER:[email protected]:554/live0
      - "ffmpeg:CameraEZVIZC6N#audio=opus"
      - "ffmpeg:CameraEZVIZC6N#audio=pcmu"
      - "ffmpeg:CameraEZVIZC6N#audio=pcma"
      - "ffmpeg:CameraEZVIZC6N#audio=aac"
  rtsp:
    listen: ":8554"
  webrtc:
    candidates:
      - 192.168.1.128:8555
      - stun:8555
  log:
    level: debug
    api: debug
    rtsp: debug
    streams: debug
    webrtc: debug
    mse: debug
    hass: debug
    homekit: debug

cameras:
  CameraEZVIZC6N:
    ffmpeg:
      inputs:
        - path: rtsp://127.0.0.1:8554/CameraEZVIZC6N  # rtsp://USER:[email protected]:554/H.264
          input_args: preset-rtsp-restream
          roles:
            - detect
            - record
    live:
      stream_name: CameraEZVIZC6N

Here is the go2rtc producer info:

Here is go2rtc consumer info:

Here is the go2rtc stream in WebRTC mode; I can hear the sound:

Here is the log when I try to call the go2rtc API and send audio to the camera:

 $ curl --location --request POST 'http://192.168.1.128:1984/api/streams?dst=CameraEZVIZC6N&src=ffmpeg:https://download.samplelib.com/mp3/sample-6s.mp3#audio=opus#input=file'
can't find consumer
 $ curl --location --request POST 'http://192.168.1.128:1984/api/streams?dst=CameraEZVIZC6N&src=ffmpeg:https://download.samplelib.com/mp3/sample-6s.mp3#audio=pcma#input=file'
can't find consumer
 $ curl --location --request POST 'http://192.168.1.128:1984/api/streams?dst=CameraEZVIZC6N&src=ffmpeg:https://download.samplelib.com/mp3/sample-6s.mp3#audio=pcmu#input=file'
can't find consumer
 $ curl --location --request POST 'http://192.168.1.128:1984/api/streams?dst=CameraEZVIZC6N&src=ffmpeg:https://download.samplelib.com/mp3/sample-6s.mp3#audio=aac#input=file'
can't find consumer

Here is the log of go2rtc:

I also tried this external link from go2rtc:

I tried to talk via this microphone, but it did not work:

I also tried to talk via the microphone on my own HTTPS link, but it did not work.

(I have already opened port 8555 on my router.)

Can anyone help me please? I am struggling with the WebRTC stream taking a long time to load: it loads as MSE first, then switches to RTC after a few seconds.
I need to use the webrtc.create_link service to generate a link, which is then used to display my Reolink Doorbell as a popup via PiPup on my Android TV.

  - service: webrtc.create_link
    data:
      link_id: "{{ link_id }}"
      entity: camera.front_doorbell_sub

The only way I can do this is to use the Reolink integration's camera entity for the sub stream. This stream is awful: it takes around 10 seconds to actually load, then stutters and freezes. Elsewhere in Home Assistant I use the go2rtc links to display the camera stream as RTC, which works fine, although it still takes a couple of seconds to load.

I can't see any way of using a go2rtc RTC stream in the above script code, as it HAS to use an entity, and go2rtc doesn't create entities, so I have to use this awful stream source.

Can anyone suggest a way around this? It currently takes 5 or 6 seconds of black screen, then loads the MSE stream, which displays a static picture for around 4 or 5 seconds before switching to RTC, where it starts displaying fine. But as my popup only stays on screen for 30 seconds, I barely get any actual use out of it.

Full script:

alias: Display Doorbell PIP Popup on TV
mode: single
variables:
  link_id: 0{% for _ in range(39) %}{{ range(10)|random }}{% endfor %}
sequence:
  - service: webrtc.create_link
    data:
      link_id: "{{ link_id }}"
      entity: camera.front_doorbell_sub # This comes from the Reolink integration
      open_limit: 1
      time_to_live: 120
  - service: rest_command.pipup_url_on_tv
    data:
      ip: 192.168.1.89
      duration: 25
      title: Front Door
      message: Someone is at the front door
      width: 640
      height: 480
      url: >-
        https://myhomeassistanturl/webrtc/embed?url={{
        link_id }}
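(One idea, untested: if the Android TV can reach the go2rtc server directly, you might be able to skip webrtc.create_link entirely and point PiPup at go2rtc's own stream page, which can play the stream without going through a Home Assistant entity. The host, port and stream name below are placeholders, not the original poster's config:)

  - service: rest_command.pipup_url_on_tv
    data:
      ip: 192.168.1.89
      duration: 25
      title: Front Door
      message: Someone is at the front door
      width: 640
      height: 480
      url: http://GO2RTC-HOST:1984/stream.html?src=front_doorbell_sub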

EZVIZ cameras don't support any open two-way audio standard.

Good evening. Everything works fine for me with the Hikvision KV6113, thanks for the integration. There is an LED on the front panel of the Hikvision KV6113; it lights up when the microphone is turned on if you use Hik-Connect. But if you communicate via a WebRTC card with two-way audio, the LED lights up and does not go out, i.e. the microphone stays on until a reboot. Is there a command to mute the microphone? Thank you.

Add the ISAPI channel; on my Hikvision it looks like this:
go2rtc:
  streams:
    DoorBell:

And you will need HTTPS:// to have access to two-way audio (browsers only allow microphone access over HTTPS).
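(The stream sources got cut off above. For illustration only, a Hikvision two-way-audio stream in go2rtc usually combines an RTSP source with an ISAPI backchannel source; the credentials, IP and paths below are assumptions, not the original poster's config:)

go2rtc:
  streams:
    DoorBell:
      - rtsp://USER:PASS@DOORBELL-IP:554/Streaming/Channels/101   # video + audio (placeholder path)
      - isapi://USER:PASS@DOORBELL-IP:80/                         # two-way audio backchannel (placeholder)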


You are my hero! After a lot of trying and failing I read your tip and boom I got it! Thank you so much for this input!

Hi, is there any way to use this solution with Blink cameras? Thanks.