Thanks a lot for the detailed explanations. I think I’m indeed using LL-HLS then:
Yes, but it looks like you are using “live” view. I think “live” view is broken and breaks more with LL-HLS.
Look at the difference between your screenshot and mine. Yours doesn’t load the playlists consistently - I think it’s because the frontend card was designed for still images, so it still refreshes the assets every 10 seconds or so, which messes up non-still images. It should be a relatively easy fix, but I don’t have a good frontend dev process set up. I’ll look at it when I get some time.
Try using “auto” for now.
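For reference, switching a card back to the default mode is just a matter of setting camera_view. A minimal sketch (the entity name is a placeholder):

```yaml
# Lovelace picture-entity card. "auto" is the default mode: it shows
# still snapshots and only starts the live stream when the card is
# clicked. camera.front_door is a placeholder entity.
type: picture-entity
entity: camera.front_door
camera_view: auto   # instead of "live"
```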
Interesting. Good to know. Thank you!
FYI, both @allenporter and I have noticed that this doesn’t seem to be happening with “live” cards anymore. I haven’t tracked down what the culprit was or what has changed, but “live” cards seem to be working with LL-HLS and Nest WebRTC cameras.
Update: So I finally got around to trying LL-HLS in the newest versions of HA, and it does indeed produce an improvement. I got about 6-7 seconds of latency instead of the usual ~13 seconds I was getting before. This is a big step up… But I do still get occasional freezes in the feed when in preview, and some occasional buffering animation when in fullscreen. So for now I think I will stick with ffmpeg, which still gives me lower latency and rarely freezes. I’m paying a big price in CPU for this luxury though… For most people, LL-HLS streams using the stream: component will be the best solution. I’ve updated my original post to reflect this.
I didn’t see this in the release notes. Would you mind sharing a link to more info?
Oh interesting! I didn’t know about the options. I’m going to have to try to play with it a bit more and see what happens.
Hi everyone, good morning/afternoon/evening.
Amazing thread with lots of information in here. I’ll just try and add my two cents as well. If someone is experiencing occasional freezes when using HLS or LL-HLS, a proxy RTSP server could improve this a lot.
I’ve set up rtsp-simple-server with just a few commands and by editing a yaml file, and it works. CPU impact is minimal if no transcoding is used. It also has the advantage that you can pull a single stream from your camera, and then proxy as many streams as you like without any impact on total LAN traffic. For me the proxy eliminated the stuttering and freezing on my HA instance. I should also say that I am running the proxy and HA on the same machine.
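To give an idea of how little configuration this takes, here is a sketch of the yaml edit described above, based on rtsp-simple-server’s paths/source options (the camera URL and path name are placeholders):

```yaml
# rtsp-simple-server.yml - pull ONE connection from the camera and
# let any number of clients read the proxied stream instead.
# The camera URL and the "cam1" path name are placeholders.
paths:
  cam1:
    source: rtsp://192.168.1.100:554/stream
    sourceProtocol: tcp   # TCP tends to be more reliable over wi-fi
```

Clients (HA, VLC, etc.) would then connect to rtsp://&lt;proxy-host&gt;:8554/cam1 instead of hitting the camera directly.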
You can download it at GitHub - aler9/rtsp-simple-server: ready-to-use RTSP / RTMP / HLS server and proxy that allows to read, publish and proxy video and audio streams . The developer is very active.
Hope it’s useful.
Cheers.
Yes, this was also very useful for me in fixing streams that wouldn’t open at all because the camera had hit the limit on the number of streams it could serve. I use the live555 RTSP proxy and it works well too.
Take a look at this, people, if you are looking for lagless cameras!
Should i use go2rtc or Add on: RTSPtoWeb and RTSPtoWebRTC ?
Or should I not have camera entities at all and just use WebRTC Lovelace cards (does Frigate work with RTSP streams directly?)? It might be inefficient CPU-wise, but I’ve noticed that adding all my RTSP streams as camera entities also adds quite a bit of CPU load!
If you use Frigate then use their Lovelace Frigate Card
it also works for non-Frigate cameras. It works pretty great: it auto-detects streams and it even supports WebRTC as well. Like anything, having 10 live 4k cameras on a page showing live feeds gets pretty laggy on my pi4, but out of everything I had the best experience with their card. I haven’t tried go2rtc yet though.
I described the method I’m using to combine go2rtc with Frigate to achieve the lowest latency possible when streaming from HA, lowest resource consumption on the HA host (by performing less transcoding), audio support in both HLS and WebRTC and a single connection with the camera to reduce network usage (which makes my wi-fi cameras a lot more reliable).
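As a sketch of the single-connection idea described above, a go2rtc config could look something like this (the stream name and camera URL are placeholders):

```yaml
# go2rtc.yaml - go2rtc keeps a single RTSP connection to the camera
# and restreams it to any number of clients (HA, Frigate, browsers),
# so the camera and the wi-fi link only carry one stream.
# "front_yard" and the camera URL are placeholders.
streams:
  front_yard:
    - rtsp://user:pass@192.168.1.100:554/stream1
```

Frigate and HA would then both point at go2rtc’s restream (e.g. rtsp://&lt;go2rtc-host&gt;:8554/front_yard) rather than at the camera itself.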
It may be useful for others.
Note that Frigate 0.12.0 is gonna come with go2rtc embedded, but this has nothing to do with what I described above, as I’m using Frigate 0.11.
Hi,
I use camera streamer GitHub - ayufan/camera-streamer: High-performance low-latency camera streamer for Raspberry PI's to stream my Picamera.
It provides a WebRTC stream. Do you know a way to integrate this stream directly in HA and Lovelace?
Right now I use the webcam component. I’ve tried all the previous solutions but always hit an issue (picture, quality, lag…).
Based on the below from its docs, I think you’d be best off using the MJPEG camera integration, as it doesn’t appear that you have RTSP support.
HTTP web server
All streams are exposed over very simple HTTP server, providing different streams for different purposes:
http://<ip>:8080/ - index page
http://<ip>:8080/snapshot - provide JPEG snapshot (works well everywhere)
http://<ip>:8080/stream - provide M-JPEG stream (works well everywhere)
http://<ip>:8080/video - provide H264 in-browser muxed video output (using GitHub - samirkumardas/jmuxer: jMuxer - a simple javascript mp4 muxer that works in both browser and node environment., works only in Desktop Chrome)
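Hooking the /stream and /snapshot endpoints above into HA would be a matter of configuring the MJPEG camera integration, roughly like this (the host IP is a placeholder):

```yaml
# configuration.yaml - MJPEG camera pointed at camera-streamer's
# HTTP endpoints. 192.168.1.50 is a placeholder for the Pi's address.
camera:
  - platform: mjpeg
    name: picamera
    mjpeg_url: http://192.168.1.50:8080/stream
    still_image_url: http://192.168.1.50:8080/snapshot
```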
Thanks a lot for this thread - it’s been greatly helpful.
With so many open source video players available, it’s just crazy that two years after the opening of this thread, with so many people trying to address the same issues, there’s still no built-in way to get a zero-lag video stream. VLC or any other player shows zero delay on my RTSP stream, but even the best option in HA today still gives at least a 1-second lag.
Rather than opening a new thread I’m trying to ask help here, where people are basically trying to deal with my same issue.
Background:
I’m trying to build a front-end interface on my Motorhome Android radio. This is how it looks so far:
I have a 360° bird’s-eye view system streaming the vehicle’s top view over RTSP (the manufacturer says UDP is better, but I’m not seeing any particular difference even over TCP).
The HA instance is local, running on Hassio Pi4 with 8GB of RAM. The Android radio, where HA companion app runs, is local as well.
From my iPhone, even on LTE over a VPN to my Motorhome, any RTSP player streams with zero lag.
On the HA companion app there is no way to get below 5 seconds.
Here’s what I tested so far:
- Local camera integration pointing to the RTSP stream of the camera (with or without stream enabled)
- Proxy integration
- FFmpeg integration with the following code (this is where latency is most consistent, at around 5 seconds)
- Live555 RTSP Proxy add-on GitHub - alex-savin/hassio-addons-live555 with the generic camera - only slightly better than ffmpeg, perhaps by half a second, but I’m not sure
ffmpeg:

camera:
  - platform: ffmpeg
    name: "360View"
    input: -f rtsp -rtsp_transport udp -i rtsp://192.168.1.169:554/live
- WebRTC GitHub - AlexxIT/WebRTC: Home Assistant custom component for viewing almost any camera stream in real time using WebRTC and other technologies. - but unfortunately it only produces a black screen, with no errors logged
- Go2rtc GitHub - AlexxIT/go2rtc: Ultimate camera streaming application with support RTSP, RTMP, HTTP-FLV, WebRTC, MSE, HLS, MJPEG, HomeKit, FFmpeg, etc. - with the same result (black screen, no errors)
In the above screenshot, the ffmpeg camera is used inside picture-elements with the following code:
elements:
  - type: image
    entity: camera.360view
    camera_image: camera.360view
    camera_view: live
    style:
      left: 96.5%
      top: 96.5%
      height: 193%
      width: 193%
Anyway, the results are the same with a picture-entity card.
Now the question is: what do you suggest I test next to get better latency in my case?
Today I decided to give MotionEye a try.
While the streaming in the web UI is definitely better (probably around 2 seconds), when I embed the MotionEye camera into any kind of card in HA the latency is still 5 seconds…
Of course a local rtsp stream will have lower latency than a remote stream. 1 second lag is pretty good.
The main problem is the crappy situation of remote realtime streamed video in general. There are not many options on a technical level, and all of these options come with problems. (LL-)HLS is the baseline: it’s compatible with pretty much everything out there and doesn’t require any fancy firewall punching (it’s just regular HTTP traffic), but it’s very, very laggy. That’s what HA uses out of the box. In my opinion it’s unusable, due to the lag and the time it takes to open a stream.
On the other end of the spectrum there’s WebRTC, which has very low lag but is completely overengineered for the simple task of viewing IP cameras. It’s complex to set up technically, it’s finicky, it has a tendency to fail randomly, and it is very much dependent on firewall issues (mostly due to its p2p aspect). As far as I see it, it’s a dead end for IP camera streaming, because of the complexity and inherent unreliability.
And then there’s RTSP over WebSocket with MSE on the client side. Not sure if that has been attempted yet, at least in a simple way. The idea is to directly stream the (more or less) raw RTSP data over a websocket connection to the client. No firewall issues, no transcoding. A component inside HA could do this, maybe even over the existing HA websocket connection. A frontend JS decoder receives the RTSP data and decodes it with MSE, which I think is globally supported across all browsers now, except for iOS (duh).
Most commercial security streaming apps use some kind of proprietary protocol (both server and client side) to get realtime streaming.