I tried all the camera platforms so you don't have to

Anyone tried this now that 2021.10 is out? I tried it yesterday by adding the options to my configuration.yaml, and it didn't really seem to be working. It still took a while to start up, and it seemed to have a lot of issues with hanging and buffering. I didn't do that much testing though, so it's definitely possible I did something wrong. I've been pretty interested in trying this, since I've been annoyed by WebRTC seemingly always stopping and buffering.

Not tested yet, something for next week

For best results, please make sure you have an HTTP/2 (or HTTP/3) reverse proxy set up.
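For example, with HAProxy (which comes up later in this thread), HTTP/2 is enabled by advertising `h2` via ALPN on the TLS bind. A minimal sketch; the certificate path and backend address below are illustrative assumptions, not taken from anyone's actual setup:

```haproxy
# Minimal HAProxy sketch: terminate TLS and offer HTTP/2 via ALPN.
# Cert path and backend address are illustrative assumptions.
frontend ha_https
    bind :443 ssl crt /etc/haproxy/certs/ha.pem alpn h2,http/1.1
    default_backend homeassistant

backend homeassistant
    server ha 127.0.0.1:8123
```

Without `alpn h2,http/1.1` on the `bind` line, HAProxy will negotiate plain HTTP/1.1 even when TLS is working fine.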
I’m not familiar with WebRTC, but if you’re having issues stopping and buffering with both that component and with LL-HLS, it’s possible the issue is actually with the camera connection. Are there any buffering issues when using stream without LL-HLS? Are you using a wired camera?

I am using it now, added those new options to stream:
But I still have a lag of about 7-8 seconds when I watch the live stream in Lovelace. Maybe it's better than before, not sure. I don't have any issues with it otherwise, but I don't know if this new stream option is supposed to improve the lag/delay?

I am using the Synology platform for cameras, not generic.

Edit: tested generic with an RTSP stream coming from Synology, but there's still a lag/delay of about 8-10 seconds, so no difference for me.

Interesting, 7-8 seconds seems too high. Are you using the more-info camera view (i.e. can you see a progress bar below the video)? If you're just using a "live" camera it may not be seeking to the correct position.
If you open developer tools in a browser, there's a network tab where you can see the network requests. To verify that LL-HLS is working, can you check that the m4s filenames look something like xxx.yy.m4s? e.g. segment/364.6.m4s. There should be many of these.
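As a rough illustration of that naming convention, a tiny script can classify the filenames you see in the Network tab. The patterns below are just the two formats described above (sequence.part.m4s for LL-HLS parts vs. sequence.m4s for regular segments), not an official Home Assistant contract:

```python
import re

# LL-HLS serves partial segments named "<sequence>.<part>.m4s" (e.g. "364.6.m4s"),
# while regular HLS segments are just "<sequence>.m4s". These patterns mirror
# what was observed in the browser's Network tab in this thread.
LL_HLS_PART = re.compile(r"^\d+\.\d+\.m4s$")
REGULAR_SEGMENT = re.compile(r"^\d+\.m4s$")

def classify(filename: str) -> str:
    """Label a segment filename taken from the browser's Network tab."""
    if LL_HLS_PART.match(filename):
        return "ll-hls-part"
    if REGULAR_SEGMENT.match(filename):
        return "regular-segment"
    return "other"

print(classify("364.6.m4s"))      # ll-hls-part
print(classify("364.m4s"))        # regular-segment
print(classify("playlist.m3u8"))  # other
```

If most segment requests come back as "ll-hls-part", the low-latency path is active; if they are all "regular-segment", you have fallen back to plain HLS.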

Yeah, they look like HLS streams.

Indeed, I use more-info, so it pops up. The stream starts instantly, like after 1-2 seconds, which is good (it was also like that before), but the video itself is about 10 seconds behind; if I wave, I see myself waving on camera after about 10 seconds :slight_smile:

That looks right except it looks like you are not using HTTP/2. If you look at your picture around 1700ms, all those bars are when your other cameras are updating their still images in the background, and that uses up all the HTTP/1.1 connections, causing congestion and slowing everything down.
You could try keeping each camera in its own view to get around that issue. Otherwise, HTTP/2 is pretty much a requirement for LL-HLS.

Yeah, that's possible; the still images are another method I use.
But if I open a stream and cast it to a Google Hub, I see the same behaviour, also a 10-second lag/delay, so it's not only in Chrome.

I’m guessing that most Chromecast devices don’t support LL-HLS yet, so they will just fall back to the regular HLS stream.
Anyway, LL-HLS should work on browser, Android, and iOS. If you really want to give it a try, please use an HTTP/2 reverse proxy.

If webRTC is stopping and buffering, then there is a problem with the network or the cameras don’t support RTSP. webRTC should buffer for no more than 30 seconds, while it tries to establish a direct connection to the camera.

On my network webRTC takes about 15 seconds on the Windows laptop to connect directly with the camera. The Chromebook takes 1 second. The Android phone takes a few seconds.

Ah, ok, that’s interesting info Justin. I run HAProxy on my firewall to handle SSL and routing to some different servers, and it looks like it’s only doing HTTP/1.1 currently. From some quick searching it looks like HTTP/2 is possible, I’ll just have to mess around with the configuration. Thanks for the tips.

Anyone have experience getting HTTP/2 working in HA with a reverse proxy? I ended up setting HAProxy to force HTTP/2 and verified with my website (which runs Apache) that it was using it. HA is somehow still sticking with HTTP/1.1 though, even though it’s the same proxy (I split the traffic to different servers depending on the subdomain requested).

As for WebRTC buffering, I’m not the only one with that issue based on the thread for it. Sometimes it works great, sometimes it starts stuttering and popping up the “play” icon over and over. There was a point where I was running 4 live feeds on 3 wall tablets and HA just fell apart from all the connections getting exhausted I assume, but now I just run one per tablet and it’s generally fine. My cameras are all Unifi running through Unifi Protect, and they work great for RTSP through VLC.

Sorry, I don’t have any experience with HAProxy.
The HA frontend uses service workers, and some browsers (Firefox, maybe Chrome?) show HTTP/1.1 when a service worker is used even though the real connection to the server is over HTTP/2.

Literally 20ish minutes after posting that… mine started buffering.
I haven’t updated the webRTC component recently so I know nothing has changed there.
I might look into the whole STUN/TURN server stuff and see if locally hosting that speeds up discovery.

I found a bug in stream that was causing problems with ExoPlayer and LL-HLS. Not sure if it was causing problems elsewhere. Have submitted a PR to fix it, and I have added PRs to the frontend and android repos to improve live performance in browsers and in app.
Hopefully these should all be in by the 2021.11.0 release.


Tested it out; I still have the 5-8 second delay, but I don't have the buffering issues anymore. Great work, thanks!!

I configured a reverse proxy and everything, confirmed that HTTP/2 is being used. But now, how do I make sure I’m really using LL-HLS after enabling it as suggested?

In your browser, open up Developer Tools → Network. If you are using LL-HLS, you should see a lot of requests in the format [xx].[yy].m4s, and the playlist requests should have extra query parameters like ?_HLS_msn=[xx]&_HLS_part=[yy]. Also, if you are using HTTP/2 or HTTP/3, the network overview graph at the top should only have one dashed line instead of multiple lines. See the attached screenshot. If you're not using LL-HLS, the requests will just be in the format [xx].m4s and will not have the extra query parameters.
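Those playlist query parameters can also be spotted programmatically, for example when grepping an exported HAR file or proxy log. A small sketch; the `_HLS_msn`/`_HLS_part` names come from the LL-HLS spec's blocking playlist reload, while the URL paths below are made up for illustration:

```python
from urllib.parse import parse_qs, urlparse

# Per the LL-HLS spec, clients reload the playlist with blocking-request
# query parameters (_HLS_msn, and usually _HLS_part). Plain HLS playlist
# requests carry neither, so their presence is a good LL-HLS indicator.
def is_ll_hls_playlist_request(url: str) -> bool:
    params = parse_qs(urlparse(url).query)
    return "_HLS_msn" in params

# Hypothetical request URLs, shaped like the ones seen in the Network tab:
print(is_ll_hls_playlist_request(
    "/api/hls/token/playlist.m3u8?_HLS_msn=364&_HLS_part=6"))  # True
print(is_ll_hls_playlist_request("/api/hls/token/playlist.m3u8"))  # False
```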


Thanks a lot for the detailed explanations. I think I’m indeed using LL-HLS then:

Yes, but it looks like you are using “live” view. I think “live” view is broken and breaks more with LL-HLS.
Look at the difference between your screenshot and mine. Yours doesn't load the playlists consistently. I think it's because the frontend card was designed for still images, so it still refreshes the assets every 10 seconds or so, which messes up streams that aren't still images. It should be a relatively easy fix, but I don't have a good frontend dev process set up. I'll look at it when I get some time.
Try using “auto” for now.