WebRTC support for Camera (stream) Components

Yeah, the LL-HLS really drops the latency. I used to use a proxy for HA (nginx), but it became too much of a PITA to manage (HA has very strict rules to make the proxy work). Now I just let HA OS manage the SSL cert via the Let's Encrypt add-on + Cloudflare DNS.

I do still use nginx to reverse-proxy anonymous public Internet traffic for webhooks/images from outside my home network (push notifications, etc.).

My experience was also that Low Latency HLS worked fine without a proxy. @uvjustin Is it worth advertising this more broadly to get more opt-in?

@AngellusMortis Thanks for the feedback, that's encouraging to hear. @allenporter and I have been working on getting LL-HLS right for a while, but you are the first user I know of who has tried it and gotten it to work.
@allenporter Sure, although it would be nice to get at least a few more users first so we can try to iron out any kinks. I posted previously in that long stream thread, and I tried to recruit some people on the #cameras channel on Discord, but I haven't gotten any traction.
As for the HTTP/2 proxy, I think whether it's necessary might depend on the player (hls.js and Android don't seem to care, but iOS might be stricter) as well as the Lovelace view the camera is opened from - I think each stream might take up 3 HTTP connections simultaneously, and the browser might only have 6 or 7 available. This is a tangent, but one thing I was thinking of doing was reverting the parts to byterange parts and using a fetch wrapper in the service worker to make just one request per segment. That would further reduce the number of network requests and the need for an HTTP/2 proxy. There are a couple of issues with this approach, though: 1) the main reason we gave up byterange parts was that they didn't seem to work on iOS, so I'm not sure this approach would work there either, and 2) I don't think the byterange caching from Workbox will work for us, since it seems to need the whole response before adding it to the cache, so we'd have to roll our own implementation.
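To make that concrete, here's a very rough sketch of what such a fetch wrapper could look like. Everything in it is illustrative: I'm assuming segment URLs end in .m4s, that the player requests each part with a closed Range header, and I've left out cache eviction, error handling, and the Workbox integration entirely.

```typescript
// sw.ts - hypothetical service-worker sketch: one network request per segment,
// with byterange part requests answered from that single in-flight response.
declare const self: ServiceWorkerGlobalScope;

interface SegmentState {
  chunks: Uint8Array[];   // bytes received so far
  received: number;
  done: boolean;
  waiters: { start: number; end: number; resolve: (r: Response) => void }[];
}

const segments = new Map<string, SegmentState>();

self.addEventListener('fetch', (event: FetchEvent) => {
  const range = event.request.headers.get('Range');
  if (!event.request.url.endsWith('.m4s') || !range) return; // normal handling
  const m = /bytes=(\d+)-(\d+)/.exec(range);
  if (!m) return;
  event.respondWith(partResponse(event.request.url, Number(m[1]), Number(m[2]) + 1));
});

function partResponse(url: string, start: number, end: number): Promise<Response> {
  let state = segments.get(url);
  if (!state) {
    state = { chunks: [], received: 0, done: false, waiters: [] };
    segments.set(url, state);
    void pump(url, state);               // the single request for this segment
  }
  const s = state;
  return new Promise((resolve) => {
    s.waiters.push({ start, end, resolve });
    flush(s);
  });
}

async function pump(url: string, state: SegmentState): Promise<void> {
  const reader = (await fetch(url)).body!.getReader();
  for (;;) {
    const result = await reader.read();
    if (result.done) break;
    state.chunks.push(result.value);
    state.received += result.value.byteLength;
    flush(state);                        // answer parts as their bytes arrive
  }
  state.done = true;
  flush(state);
}

function flush(state: SegmentState): void {
  state.waiters = state.waiters.filter((w) => {
    if (state.received < w.end && !state.done) return true; // not enough data yet
    const body = concat(state.chunks).slice(w.start, w.end);
    w.resolve(new Response(body, {
      status: 206,
      headers: { 'Content-Range': `bytes ${w.start}-${w.end - 1}/*` },
    }));
    return false;
  });
}

function concat(chunks: Uint8Array[]): Uint8Array {
  const out = new Uint8Array(chunks.reduce((n, c) => n + c.byteLength, 0));
  let off = 0;
  for (const c of chunks) { out.set(c, off); off += c.byteLength; }
  return out;
}
```

This is only meant to show the shape of the idea; the iOS question and the Workbox caching issue above would still need to be solved.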

@AngellusMortis Can you remove LHLS from the title? It is a little misleading since we already have LL-HLS support.

@uvjustin and @allenporter I have been getting a lot of “waiting for available socket…” errors from my browser when connecting to HA. I have been opening a lot of tabs since I have been making a bunch of changes. Is the LL-HLS stuff related? Is there a way to bump up the number of connections my HA can handle? I have decently powerful hardware for it (an i5 NUC).

Yes, there is a good chance this is related, and it has nothing to do with the power of your hardware.
I’m guessing you are using regular http/1.1. Are you using a reverse proxy for http, or are you just using HA directly?
The issue is that with HTTP/1.1, each request uses a separate TCP connection, and browsers limit the number of simultaneous connections to a given domain. The exact number varies by browser but for most browsers this limit is 6.
With LL-HLS, the browser can make a request before the next piece of data is ready, so each connection stays occupied for longer. A single LL-HLS stream might take up 2-4 simultaneous connections depending on how fast the transmissions are. That is fine for viewing one video, but you might have other connections loading in the background, such as still images from your cameras. If you have other tabs open, they also share the same connection limit, so it is easy to hit that bottleneck.
For an illustration of what I’m talking about, see the image in I tried all the camera platforms so you don't have to - #209 by pergola.fabio and my comment below.
An easy fix for all of this is to use a reverse proxy such as nginx and to enable the HTTP/2 feature. HTTP/2 multiplexes all of the HTTP traffic over the same TCP connection, so there is no connection limit issue.
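For reference, a minimal sketch of what that could look like (hostnames, certificate paths, and the HA address are placeholders, not an official configuration):

```nginx
server {
    listen 443 ssl http2;              # http2 is the important part here
    server_name ha.example.com;

    ssl_certificate     /etc/letsencrypt/live/ha.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/ha.example.com/privkey.pem;

    proxy_buffering off;

    location / {
        proxy_pass http://127.0.0.1:8123;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        # Needed for the HA websocket API
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```

Note that HA itself also needs to trust the proxy (use_x_forwarded_for and trusted_proxies under http: in configuration.yaml), otherwise it will reject the proxied requests.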

No reverse proxy. I was mentioning my hardware to say that if there was a connection limit imposed by HA itself, there is no reason my hardware could not handle more.

I understand; the information you provided was useful for ruling out other causes.
You can try using a reverse proxy - it should be straightforward to set up and should improve your HA frontend speed.

If using a reverse proxy with LL-HLS is recommended, I might suggest adding an “official” recommended configuration for Nginx to the docs. There are a ton of ways to set up a reverse proxy, and I remember the reason I got rid of Nginx before was that the default configuration I throw at services did not work very well for HA.

Hmm, I'm not sure we can add that to the HA documentation itself, as there are many different reverse proxies and I don't think HA recommends any particular one.
For HA OS users, I think there are one or two add-ons like this: GitHub - hassio-addons/addon-nginx-proxy-manager: Nginx Proxy Manager - Home Assistant Community Add-ons, which should configure the proxy for you.
Otherwise, you can have a look at this community guide: Reverse proxy using NGINX - Community Guides - Home Assistant Community (home-assistant.io), which should have a good configuration for HA. I notice that it doesn't have http2 enabled though - I'll try to add it now.

There is an official Nginx add-on. addons/nginx.conf at 8bf82d5765944c95b51186d06f57f613c9b0905c · home-assistant/addons · GitHub

It has http2 enabled, but you may still want to double-check the proxy_pass configuration. I originally did not want to use that add-on because it does not let you disable the original HTTP origin connection.

I just tried using the settings in your link and they seem ok for me.
However, my setup is a little different, as I'm using nginx-quic. If you're comfortable using Docker and building a Docker image, you can give it a try. QUIC and HTTP/3 should provide a performance benefit over even HTTP/2. I'm just a regular nginx user and not an expert, but I can make a post on my setup. Our messages have veered quite a bit off this feature request, though, so PM me if you're interested.

I’ve shared my NGINX config here: NGINX with QUIC+HTTP/3
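For anyone curious, the HTTP/3-specific parts look roughly like this (directive names are for a current nginx built with QUIC support; the older nginx-quic branch used slightly different syntax, and the linked post has my full configuration):

```nginx
server {
    listen 443 ssl http2;               # keep HTTP/2 for clients without HTTP/3
    listen 443 quic reuseport;          # HTTP/3 over QUIC (UDP 443)
    server_name ha.example.com;

    ssl_protocols TLSv1.3;              # QUIC requires TLS 1.3

    # Tell browsers they can switch to HTTP/3 on the same port
    add_header Alt-Svc 'h3=":443"; ma=86400';

    location / {
        proxy_pass http://127.0.0.1:8123;
        # same proxy headers as in the HTTP/2 example earlier in the thread
    }
}
```

Remember to open UDP 443 on your firewall as well, since QUIC runs over UDP.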

Has anyone had any luck using WebRTC with the new beta 2021.12.0b0 yet? Everything worked perfectly with all previous versions of HA, but now it gives me this error during setup:
This error originated from a custom integration.

Logger: homeassistant.setup
Source: custom_components/webrtc/utils.py:74
Integration: WebRTC Camera (documentation, issues)
First occurred: 7:19:16 PM (1 occurrences)
Last logged: 7:19:16 PM

Error during setup of component webrtc
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/setup.py", line 229, in _async_setup_component
    result = await task
  File "/config/custom_components/webrtc/__init__.py", line 75, in async_setup
    utils.register_static_path(hass.http.app, url_path, path)
  File "/config/custom_components/webrtc/utils.py", line 74, in register_static_path
    app["allow_cors"](route)
  File "/usr/local/lib/python3.9/site-packages/aiohttp/web_app.py", line 186, in __getitem__
    return self._state[key]
KeyError: 'allow_cors'

This is a feature request thread, not a support thread for the custom WebRTC component. I recommend going to that component's forum post and asking for help there.

Am I right in thinking that this is what the November HA release notes were referring to?

@allenporter blew our minds this release by adding initial support for WebRTC streams and cameras to Home Assistant.

But had a good reason, he added support for Nest Battery Cameras and Nest battery Doorbell Cameras to Home Assistant! Thanks @allenporter!

I have been following this “scene” for some time, and I wasn't sure if I had gotten myself confused or if the information available on the topic is indeed sparse and slightly misleading (e.g. the release notes).

PS: I guess, for the sake of clarity, my interest in this is based on using the UniFi Protect integration and the delays that come with it (I know they are not through any fault of the integration - and I am sure that the moment there is a better approach, core contributors like @AngellusMortis will try to get it natively implemented - hence my confusion about WebRTC).

Thank you all

PPS: Let me know if I can be of any help on the testing side of things.

I am not actually a core contributor on the stream stuff. That is @uvjustin and @allenporter. I am one of the devs for the UniFi Protect integration (which is not in core yet).

Realtime camera streaming without any delay - WebRTC - #360 by allenporter explains the difference between what exists now and what the custom component does today.

You should check out low latency HLS.
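If it helps, opting in is just a couple of lines in configuration.yaml (option names per the stream integration docs; availability depends on your HA version, and the values below are only examples):

```yaml
stream:
  ll_hls: true          # enable Low Latency HLS output
  part_duration: 0.75   # seconds per part
  segment_duration: 6   # seconds per segment
```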

Yep, I am aware. I was referring to core contributors to the UniFi Protect integration :slight_smile:

Thank you for providing the link to that comment. It gave me a better understanding of what work was actually done versus what I thought had happened.

I've had this on my to-do list for some time; life has just kept me busy recently.

Thank you both for your answers!