WebRTC support for Camera (stream) Components

It is awesome that we can stream video via a camera component now, but let’s face it, HLS sucks for livestreaming video (security cameras). HLS splits video into segments that start on keyframes, and the player pre-buffers roughly three segments before it starts playback. On my Unifi Protect cameras, that means a ~20 second delay when I load up a camera. If I manually jump ahead, I can get it down to ~5 seconds, but that is still a noticeable delay.

The issue was even mentioned in the original RFC for adding streaming capabilities to the camera component. It would be really great if we could get something new integrated into core that RTSP / custom integrations can use to livestream in real time instead of on a delay.

Some nice-to-haves would be:

  • The ability to use a playback method with lower latency than HLS, like Low Latency HLS or WebRTC.
  • Ability to transcode video and/or audio on the fly if necessary to ensure video and audio playback is supported by the HA frontend (Unifi Protect cameras output H.264 video and AAC audio, which is great for HLS, but the audio does not work for WebRTC); see the sketch after this list.
  • Stream pooling. Because transcoding is expensive, a stream connection should be reused when possible to reduce load on the server.
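To make the transcoding item concrete, here is a rough, hypothetical sketch of what an add-on or helper process could run: it copies the camera’s H.264 video untouched and transcodes the AAC audio to Opus, which WebRTC players can actually decode. The RTSP URLs, ports, and the idea of re-publishing to a separate WebRTC-capable media server are all assumptions for illustration, not anything HA ships today.

    # Hypothetical helper command (all URLs/ports are placeholders):
    # copy the H.264 video as-is, transcode the AAC audio to Opus for WebRTC,
    # and re-publish the result to a WebRTC-capable media server over RTSP.
    ffmpeg -rtsp_transport tcp -i rtsp://protect.local:7447/CAMERA_ID \
           -c:v copy -c:a libopus -b:a 64k \
           -f rtsp rtsp://localhost:8554/frontdoor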

I know that a very good point was made in the original RFC for streaming in the camera component:

We are a home automation platform, not a video management platform.

And so to that end, this might be better off as an official add-on instead of inside of core. If the add-on is installed, the camera component would simply be able to use it. Maybe that means we could find an existing video management solution to integrate as an add-on that core can use to offload all of the streaming.

Existing third-party integration that “kind of” works:

Though, I am not a huge fan of how it currently works. It downloads random executables from the Internet and runs them on the system, and it has a nasty CPU issue that can bring down your whole Home Assistant instance. Perhaps the correct “solution” here is to turn that integration into a web service add-on that starts up the streams and/or transcodes video/audio when necessary to feed them to the WebRTC player.

EDIT: beta LL-HLS support was added in 2021.10. I am going to leave the feature request open, but for WebRTC only, since LL-HLS still has a few seconds of latency.

+1

I agree. None of the current integrations seem to work reliably if at all.

I’ve proposed this to the WebRTC custom component author (https://github.com/AlexxIT/WebRTC/issues/138) since I have added native WebRTC streaming in the frontend for the new Nest cams.

I may try playing around with getting old Nest cams to use WebRTC in an add-on as a proof of concept.



A couple things:

  • Check out low latency HLS in the stream component (ll_hls: True); see the configuration sketch after this list. This is not being advertised broadly yet.
  • Use AlexxIT’s custom component for WebRTC with RTSP cameras.
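For anyone who wants to try the first option, a minimal configuration sketch, assuming the beta option name from the 2021.10 release (it may change or become the default in later releases):

    # configuration.yaml (sketch)
    stream:
      ll_hls: true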

I’m thinking about a proposal for WebRTC integration with existing core. My impression is that we might go with a component that talks to an add-on, similar to how MQTT or Z-Wave JS works, but there is no proposal yet.


Are you referring to low latency HLS? That’s great to hear. Perhaps we should spread the word and get more adoption.

I think some worries were about requiring http/2 proxies. Do you have one or is it working well without?


My experience was also that Low Latency HLS worked fine without a proxy. @uvjustin Is it worth advertising it more broadly to get more opt-in?

@AngellusMortis Thanks for the feedback, that’s encouraging to hear. @allenporter and I have been working on getting LL-HLS right for a while but you are the first user that I know of that has tried it and gotten it to work.
@allenporter Sure, it would be nice to get at least a few more users first so we can try to iron out any kinks. I posted previously in that long stream thread, and I tried to recruit some people on the #cameras channel on Discord, but I haven’t gotten any traction.
As for the HTTP/2 proxy, I think whether it’s necessary might depend on the player (hls.js and Android don’t seem to care, but iOS might be more strict) as well as the Lovelace view the camera is opened from - I think each stream might take up 3 HTTP connections simultaneously and the browser might only have 6 or 7 available.
This is a tangent, but one thing I was thinking of doing was reverting the parts to byterange parts and using a fetch wrapper in the service worker to make just one request per segment. That would further reduce the number of network requests and the need for an HTTP/2 proxy. There are a few issues with this approach though: 1) the main reason we gave up on byterange parts was that they didn’t seem to work on iOS, so I’m not sure this approach would work there; 2) I don’t think the byterange caching from Workbox will work for us, since it seems to need the whole response before adding it to the cache, so we’d have to roll our own implementation.

@AngellusMortis Can you remove LHLS from the title? It is a little misleading since we already have LL-HLS support.


Yes, there is a good chance this is related, and it has nothing to do with the power of your hardware.
I’m guessing you are using regular http/1.1. Are you using a reverse proxy for http, or are you just using HA directly?
The issue is that with HTTP/1.1, each request uses a separate TCP connection, and browsers limit the number of simultaneous connections to a given domain. The exact number varies by browser but for most browsers this limit is 6.
With LL-HLS, the browser can actually make a request before the next piece of data is ready, so the connection stays occupied for longer. A single LL-HLS stream might take up 2-4 simultaneous connections depending on how fast the transmissions are. That is fine for viewing one video, but you might have other connections loading in the background, such as still images from your cameras. If you have other tabs open, they will also share the same connection limit, so it is easy to hit the connection-limit bottleneck.
For an illustration of what I’m talking about, see the image in I tried all the camera platforms so you don't have to - #209 by pergola.fabio and my comment below.
An easy fix for all of this is to use a reverse proxy such as nginx and to enable the HTTP/2 feature. HTTP/2 multiplexes all of the HTTP traffic over the same TCP connection, so there is no connection limit issue.
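As a minimal sketch, assuming nginx with a TLS certificate already in place (the hostname, ports, and certificate paths below are placeholders), an HTTP/2-enabled server block in front of HA could look like this:

    # /etc/nginx/conf.d/homeassistant.conf (sketch)
    server {
        listen 443 ssl http2;
        server_name ha.example.com;

        ssl_certificate     /etc/letsencrypt/live/ha.example.com/fullchain.pem;
        ssl_certificate_key /etc/letsencrypt/live/ha.example.com/privkey.pem;

        location / {
            proxy_pass http://127.0.0.1:8123;
            proxy_http_version 1.1;

            # Needed for the Home Assistant websocket API.
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";
            proxy_set_header Host $host;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;
        }
    }

With something like this in place, the browser multiplexes all of the LL-HLS part requests over a single TCP connection, so the per-domain connection limit stops mattering.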


I understand; the information you provided was useful for ruling out other causes.
You can try using a reverse proxy - it should be straightforward to set up and should improve your HA frontend speed.


Hmm, I’m not sure we can add that into the HA documentation itself as there are many different reverse proxies and I don’t think HA recommends any particular one.
For HA OS users, I think there are one or two add-ons like this: GitHub - hassio-addons/addon-nginx-proxy-manager: Nginx Proxy Manager - Home Assistant Community Add-ons, and that should configure the proxy for you.
Otherwise, you can have a look at this community guide: Reverse proxy using NGINX - Community Guides - Home Assistant Community (home-assistant.io), which should have a good configuration for HA. I notice that it doesn’t have http2 enabled though - I’ll try to add it now.


I just tried using the settings in your link and they seem OK to me.
However, my setup is a little different, as I’m using nginx-quic. If you’re comfortable using Docker and building a Docker image, you can give it a try. QUIC and HTTP/3 should provide a performance benefit over even HTTP/2. I’m just a regular nginx user, not an expert, but if you’re interested I can make a post about my setup. Our messages have veered quite a bit off this feature request, though, so PM me if you’re interested.

I’ve shared my NGINX config here: NGINX with QUIC+HTTP/3