I tried all the camera platforms so you don't have to

Check out Stream - Home Assistant

2 Likes

Oh interesting! I didn’t know about the options. I’m going to have to try to play with it a bit more and see what happens.

1 Like

Hi everyone good morning/afternoon/evening :smile:

Amazing thread with lots of information in here. I’ll just try to add my two cents as well. If someone is experiencing occasional freezes using HLS or LL-HLS, a proxy RTSP server could improve this a lot.
I set up rtsp-simple-server with just a few commands and by editing a YAML file, and it works. CPU impact is minimal if no transcoding is used. It also has the advantage that you can pull a single stream from your camera and then proxy as many streams as you like without any extra impact on total LAN traffic. For me the proxy eliminated the stuttering and freezing from my HA instance. I should also mention that I am running the proxy and HA on the same machine.
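
As a rough sketch, a minimal proxy entry in rtsp-simple-server’s config could look like this (the path name and camera URL are placeholders, so adapt them to your own setup):

# rtsp-simple-server.yml (minimal proxy sketch; path name and camera URL are placeholders)
paths:
  front_door:
    # Pull a single stream from the camera...
    source: rtsp://user:pass@192.168.1.10:554/stream1
    # ...optionally forcing TCP to the camera
    sourceProtocol: tcp

HA (or any other client) then connects to rtsp://<proxy-host>:8554/front_door, and the server fans that single camera connection out to as many clients as you need.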

You can download it at GitHub - aler9/rtsp-simple-server: ready-to-use RTSP / RTMP / HLS server and proxy that allows to read, publish and proxy video and audio streams. The developer is very active.

Hope it’s useful.
Cheers.

2 Likes

Yes, this was also very useful for me in fixing streams that wouldn’t open at all because I was hitting the limit on the number of streams the camera could serve. I use the live555 RTSP proxy and it works well too.

Take a look at this, people, if you are looking for lag-free cameras!

3 Likes

Should I use go2rtc, or the RTSPtoWeb and RTSPtoWebRTC add-on?

Or should I not have camera entities at all and just use WebRTC Lovelace cards (does Frigate work with RTSP streams directly?)? That might be inefficient CPU-wise, but I’ve noticed that adding all my RTSP streams as camera entities also adds quite a bit of CPU load!

If you use Frigate then use their Frigate Lovelace Card;
it also works for non-Frigate cameras. It works pretty well, auto-detects streams, and even supports WebRTC. As with anything, having 10 live 4K cameras showing a live feed on one page gets pretty laggy on my Pi 4, but out of everything I tried I had the best experience with their card. I haven’t tried go2rtc yet, though.

I described the method I’m using to combine go2rtc with Frigate to achieve the lowest possible latency when streaming from HA, the lowest resource consumption on the HA host (by doing less transcoding), audio support in both HLS and WebRTC, and a single connection to the camera to reduce network usage (which makes my Wi-Fi cameras a lot more reliable).
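
In rough terms it boils down to letting go2rtc own the single camera connection and having Frigate consume the restream. This is only a sketch of the idea rather than my exact files; the stream name, camera URL and host are placeholders:

# go2rtc.yaml: one connection to the camera, restreamed locally (placeholder URL)
streams:
  driveway:
    - rtsp://user:pass@192.168.1.20:554/h264

# Frigate config: Frigate pulls from go2rtc's local RTSP restream (default port 8554)
# instead of opening its own connection to the camera
cameras:
  driveway:
    ffmpeg:
      inputs:
        - path: rtsp://127.0.0.1:8554/driveway
          roles:
            - detect
            - record

HA can then consume the same go2rtc stream (e.g. via WebRTC) without opening a second connection to the camera.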

It may be useful for others.

Note that Frigate 0.12.0 is gonna come with go2rtc embedded, but this has nothing to do with what I described above, as I’m using Frigate 0.11.

2 Likes

Hi,

I use camera-streamer (GitHub - ayufan/camera-streamer: High-performance low-latency camera streamer for Raspberry PI's) to stream my Pi camera.
It provides a WebRTC stream; do you know a way to integrate this stream directly into HA and Lovelace?

Right now I use the webcam component; I’ve tried all the previous solutions but always run into an issue (picture, quality, lag…).

1 Like

Based on the below from its docs, I think you’d be best off using the MJPEG camera platform, as it doesn’t appear that you have RTSP support.

HTTP web server

All streams are exposed over very simple HTTP server, providing different streams for different purposes:
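
For example, a minimal MJPEG camera entry along these lines might work (host, port and path are placeholders; check camera-streamer’s docs for the actual HTTP stream URL it exposes):

camera:
  - platform: mjpeg
    name: "Pi Camera"
    # Placeholder URL: point this at camera-streamer's MJPEG HTTP endpoint
    mjpeg_url: http://192.168.1.30:8080/stream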

Thanks a lot for this thread - it’s been a great help.

With so many open-source video players out there, it’s just crazy that, two years after the opening of this thread and with so many people trying to address the same issues, there’s still no built-in way to get a zero-lag video stream. VLC or any other player shows zero delay on my RTSP stream, but the best option in HA today still gives at least a one-second lag.

Rather than opening a new thread, I’m asking for help here, where people are basically dealing with the same issue as me.

Background:
I’m trying to build a front-end interface on my Motorhome Android radio. This is how it looks so far:

I have a 360° bird’s-eye view system streaming the vehicle’s top view over RTSP (the manufacturer says UDP is better, but I’m not seeing a particular difference even over TCP).
The HA instance is local, running Hass.io on a Pi 4 with 8GB of RAM. The Android radio, where the HA companion app runs, is local as well.

From my iPhone, even on LTE over a VPN to my motorhome, any RTSP player streams with zero lag.

On the HA companion app there is no way to get below 5 seconds.

Here’s what I tested so far:

  • Local camera integration pointing to the RTSP stream of the camera (with or without stream enabled)
  • Proxy integration
  • ffmpeg integration with the code below (this is where latency is most consistent, around 5 seconds)
  • Live555 RTSP Proxy add-on (GitHub - alex-savin/hassio-addons-live555) with the generic camera - only slightly better than ffmpeg, perhaps half a second, but not sure

ffmpeg:

camera:
  - platform: ffmpeg
    name: "360View"
    input: -f rtsp -rtsp_transport udp -i rtsp://192.168.1.169:554/live

In the above screenshot the ffmpeg camera is used inside a picture-elements card with the following code:

elements:
  - type: image
    entity: camera.360view
    camera_image: camera.360view
    camera_view: live
    style:
      left: 96.5%
      top: 96.5%
      height: 193%
      width: 193%

Anyway, the results are the same with a picture-entity card.

Now the question is: what would you suggest I test next to get better lag/latency in my case?
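
One variation still on my own list (untested, just a sketch using standard ffmpeg low-latency input options placed before -i) would be:

camera:
  - platform: ffmpeg
    name: "360View"
    # -fflags nobuffer and -flags low_delay are standard ffmpeg options that reduce input buffering
    input: -fflags nobuffer -flags low_delay -f rtsp -rtsp_transport udp -i rtsp://192.168.1.169:554/live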

Today I decided to give MotionEye a try.

While the streaming in its web UI is definitely better (probably around 2 seconds), when I embed the MotionEye camera into any kind of card in HA the latency is still 5 seconds…

Of course a local RTSP stream will have lower latency than a remote stream. A 1-second lag is pretty good.

The main problem is the crappy situation of remote realtime streamed video in general. There are not many options on a technical level and all of these options come with problems. (LL-)HLS is the base line, it’s compatible with pretty much everything out there, doesn’t require any fancy firewall punching (it’s just regular HTTP traffic) but it’s very very laggy. That’s what HA uses out of the box. In my opinion it’s unusable, due to the lag and the time it takes to open a stream.

At the other end of the spectrum there’s WebRTC, which has very low lag but is completely overengineered for the simple task of viewing IP cameras. It’s complex to set up, it’s finicky, has a tendency to fail randomly, and is very much dependent on firewall issues (mostly due to its p2p aspect). As far as I can see, it’s a dead end for IP camera streaming because of the complexity and inherent unreliability.

And then there’s RTSP over WebSocket with MSE on the client side. I’m not sure if that has been attempted yet, at least in a simple way. The idea is to stream the (more or less) raw RTSP data directly over a WebSocket connection to the client. No firewall issues, no transcoding. A component inside HA could do this, maybe even over the existing HA WebSocket connection. A frontend JS decoder receives the RTSP data and feeds it to MSE, which I think is now supported across all browsers, except for iOS (duh).

Most commercial security streaming apps use some kind of proprietary protocol (both server and client side) to get realtime streaming.

2 Likes

So I found that in this specific case the issue is the picture-elements card. If I view the MotionEye camera in a standard picture card, latency is close to 1 second.

/edit

and I should have added that I don’t have a clue how to solve it…

Not only did I make some progress, I also solved my issue today.

MotionEye was the way to go. With it I’m able to get something very close to the original RTSP stream. I had to give up a little by dropping UDP in favour of TCP.

My original stream has less than 0.5s of latency; MotionEye over UDP is probably 1s, and MotionEye over TCP probably less than 2s. As the use case is a camera with a lot of motion (driving), UDP was too often only partially updating the image, leaving behind a ghosting effect. TCP is perfect, although a little further behind.

The last piece was using custom:hui-element inside picture-elements to embed the camera stream as a picture-entity card instead of an image element.
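
Roughly, the relevant part looks like this (the background image and positioning values are placeholders for illustration, not my exact dashboard):

type: picture-elements
image: /local/motorhome_background.png   # placeholder background image
elements:
  - type: custom:hui-element
    card_type: picture-entity
    entity: camera.360view
    camera_view: live
    show_name: false
    show_state: false
    style:
      left: 50%
      top: 50%
      width: 60%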

1 Like

@stepir do you mind sharing your config/dash?

I had endless connection problems using three Foscams together with Frigate and Foscam’s VMS, even when using different users for Frigate. Foscam’s software, in my eyes, is a bunch of …, but the hardware is pretty good.
I think I’ve found a solution now, after spending nights trying to get a satisfactory result.
I now use the LIVE555 Proxy Server as a dockerized version to stream RTSP to Home Assistant / Frigate. The HASS add-on unfortunately doesn’t work for me.
Now latency is much better than with direct access to the cams’ RTSP server and the connection problems seem to have stopped. CPU usage decreased as well…

1 Like

I don’t see many people talking about it here, but nowadays go2rtc should be the go-to solution for this.
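
For anyone wondering what that looks like in practice, a bare-bones sketch is a go2rtc stream plus a WebRTC-capable card pointing at it. The stream name and camera URL below are placeholders, and the card assumes AlexxIT’s custom WebRTC Camera card is installed:

# go2rtc.yaml (placeholder stream name and camera URL)
streams:
  porch: rtsp://user:pass@192.168.1.50:554/stream1

# Lovelace card (assumes the custom WebRTC Camera card is installed)
type: custom:webrtc-camera
url: porch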

2 Likes

Another vote for MotionEye. Have this running on a separate server with 5 cameras.

The MotionEye integration provides an MJPEG camera to Home Assistant. In the motionEye UI → Video Streaming, set Motion Optimization to ON. This turns the streaming frame rate down to 1 fps when there is no motion in the image.
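
If you’d rather skip the integration, a manual MJPEG camera pointing at motionEye’s streaming port also works; the host below is a placeholder, and 8081 is motionEye’s default streaming port for the first camera:

camera:
  - platform: mjpeg
    name: "Driveway (motionEye)"
    # Placeholder host; 8081 is motionEye's default streaming port for camera 1
    mjpeg_url: http://192.168.1.40:8081/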

I can show all 5 of my cameras on the same page with around 0.25s latency, and it’s reliable. All the other methods I have tried have been worse…; anything that involves streaming (which is every other type of camera connection method) has been slow to start up, high latency, unreliable, and behaves differently on different client types.

As soon as any ‘imperfect’ network connection between HA and the client is encountered, streaming bites the dust and rarely recovers (and if it does recover, it just introduces massive latency). This is not the case with MJPEG cameras (in my experience).