I tried all the camera platforms so you don't have to

When I started with Home Assistant a couple of years ago, one of my main use cases was a nice dashboard that included my cameras. Unfortunately, this simple desire has been far from easy to achieve. In my quest for something reliable and real-time, I think I have now tried every option open to me, and I've concluded that none of them is really perfect. But I've learned enough that it's worth sharing, so that other people have a better idea of their options and which one might fit their use case. So, let's start.

To begin, I am running 4x 1080p Hikvision cameras configured with a 1080p H.264 main stream and a ~700x500 H.264 substream. The same streams work perfectly with my Synology and with Frigate object detection; I can't remember the stream ever dropping from those systems. Not the case with Home Assistant, unfortunately.

I have also tested the MJPEG stream from the substream. During my testing I switched everything over from TCP to UDP, which reduced lag by a couple of seconds, and tried to enable multicast, though I'm not sure it's working. I'm going to list the camera platforms I've tried, in order of worst to best for my use case (a few tablets around the house with both persistent small previews and temporary popups for when humans are detected), with ratings for various properties. You can see my config here

Below I list my notes and observations for each camera integration. After that, I try to draw some conclusions about which options work best for which use cases (if you want the TL;DR, you can skip to that).

MJPEG (with and without streaming component enabled):

MJPEG cameras are extremely close to real time, even with the streaming component turned on (because I'm pretty sure they don't use it). You can choose a live view, which gives a decent framerate (but which my tablets can't handle), or a "non-live" view, which gives one frame every 10 seconds or so. The downside is that only one client can consume the camera's MJPEG URL at a time. This even applies to Home Assistant endpoints, so if one session is viewing the camera, the other sessions cannot. That alone makes it unusable for my use case, as I have multiple tablets that need the stream simultaneously. I also noticed frequent cases where the streams would smear and show other visual artifacts.

Also, Synology forces you to choose H.264 or MJPEG for all the streams, so if I switch the substream on my camera to MJPEG (the main stream doesn't support it), I have to choose between using only the main stream or only the substream in Synology.
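For reference, a minimal MJPEG platform config looks roughly like this. The IP address, credentials, and stream path here are assumptions; substitute your camera's real MJPEG URL (on Hikvision, the substream must be switched to MJPEG encoding for this endpoint to work):

```yaml
# Hypothetical sketch: MJPEG camera platform in configuration.yaml.
# The IP and the Hikvision-style httppreview path are assumptions.
camera:
  - platform: mjpeg
    name: front_door_mjpeg
    mjpeg_url: http://192.168.1.60/Streaming/channels/102/httppreview
    username: !secret camera_user
    password: !secret camera_password
```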

MJPEG Camera Rankings:
Lag: 10/10 (this is the least laggy camera platform in hass, 1 sec or less)
Framerate 8/10 (decent, but not the best)
Initial Load time 8/10 (Pretty good, around 3 seconds)
Kindle Fire/ Fully Kiosk Tablet friendliness: 10/10 (can get high framerate 1080p previews for the one tablet that grabs the stream)
Reliability: 7/10 (if only one device ever accesses the stream, it's rock solid; add one more device, however, and reliability drops to a perfect 0. Unfortunately the cameras sometimes smear, maybe due to CPU load on the cameras).
Usability: 2/10 (only usable if I have only 1 hass dashboard open with the camera streams, with more the streams fail)
Server CPU: 10/10 (no noticeable effect on hass server cpu)
Camera CPU: 3/10 (the cameras don’t appear to like serving MJPEG streams and bog down)

FFMPEG Camera Platform without Stream Component Enabled:

I had high hopes that FFMPEG would give me the reliability of MJPEG-type streams by transcoding the H.264 streams that don't kill my cameras. Unfortunately, it brought the CPU on my 2011 Mac mini Ubuntu server to its knees. You need one FFMPEG instance for each camera streamed to each device, so 4 cameras on 3 devices is 12 instances, which gets pretty intense for an old CPU like mine. I have since migrated to a newer i5-based server, and this is what I currently use: despite using about half the CPU, it produces good reliability, low lag, and a relatively quick appearance upon load.
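The FFMPEG platform config itself is simple; the cost is all at display time, since each (camera, viewer) pair spawns its own transcode. A sketch, with an assumed Hikvision-style RTSP URL (channel 101 is the main stream, 102 the substream):

```yaml
# FFMPEG camera platform sketch; the RTSP URL is an assumption.
# One ffmpeg process is spawned per camera per viewing device.
camera:
  - platform: ffmpeg
    name: front_door
    input: rtsp://user:pass@192.168.1.60:554/Streaming/Channels/101
```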

FFMPEG Camera Component Rankings:
Lag: 7/10 (2-3 seconds, not bad.)
Framerate 7/10 (Probably 2-3 FPS IIRC)
Initial Load time 8/10 (Pretty good, around 3 seconds)
Kindle Fire/ Fully Kiosk Tablet friendliness: 8/10 (I used to have these at 10/10, but they are regularly causing me problems now, so I’m thinking the 8" tablets are not quite up to my 4 1080 streams anymore, not sure what changed. My 10" tablet still handles them nicely)
Reliability: 4/10 (A lot of dropouts and failures to load in Lovelace, even on a powerful CPU; also smeared the stream)
Usability: 4/10 (The CPU requirements and lack of reliability mean it’s not real practical)
Server CPU: 2/10 (the CPU killer)
Camera CPU: 10/10 (cameras don’t seem to mind unless you open more streams than they can handle which doesn’t appear related to the CPU)

FFMPEG Camera Platform with Stream Component Enabled:

Turning on the stream component with FFMPEG increases reliability and framerate a bit, at the expense of several additional seconds of lag.
FFMPEG Camera with Stream Component Rankings:
Lag: 4/10 (6-7 seconds)
Framerate 8/10 (Something near full 4 fps)
Initial Load time 8/10 (Pretty good, around 3 seconds)
Kindle Fire/ Fully Kiosk Tablet friendliness: 4/10 (Can’t use the live stream so stuck with the 1 frame every 10 seconds)
Reliability: 4/10 (A lot of dropouts and failures to load in Lovelace, even on a powerful CPU)
Usability: 3/10 (The lag, CPU requirements and lack of reliability mean it’s not real practical)
Server CPU: 2/10 (the CPU killer)
Camera CPU: 10/10 (cameras don’t seem to mind unless you open more streams than they can handle which doesn’t appear related to the CPU)

Generic Camera Platform Without Stream Component:

This works okay: pretty low lag, and you can choose between a one-frame-every-10-seconds preview or a "live" preview that gives maybe 1 frame every 1-2 seconds. The problem is that the streams tend to drop off the UI or fail to load, so unfortunately reliability isn't very good.
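A sketch of the generic platform config (the URLs are assumptions, using Hikvision-style snapshot and RTSP paths). Without the stream component, Home Assistant polls `still_image_url` and serves the result as a slow MJPEG preview:

```yaml
# Generic camera sketch; URLs and credentials are assumptions.
camera:
  - platform: generic
    name: front_door_generic
    still_image_url: http://192.168.1.60/ISAPI/Streaming/channels/101/picture
    stream_source: rtsp://user:pass@192.168.1.60:554/Streaming/Channels/101
    username: !secret camera_user
    password: !secret camera_password
```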

Generic Camera without Stream Component Rankings:
Lag: 8/10 (2-3 seconds of lag)
Framerate 5/10 ( 1 frame every 1-2 seconds for “live”. does the job, but not good)
Initial Load time 8/10 (Pretty good, around 3 seconds)
Kindle Fire/ Fully Kiosk Tablet friendliness: 3/10 (don’t display reliably on really any device)
Reliability: 3/10 (streams would often blank out or not load)
Usability: 3/10 (reliability kills it)
Server CPU: 10/10 (no noticeable effect on hass server cpu)
Camera CPU: 10/10 (cameras don’t mind serving h.264 streams)

Generic Camera Platform With Stream Component:

I was using my cameras this way for a long time, simply due to the reliability of the streams and the framerates. Only very occasionally would they fail to appear on my tablets, though on my desktop I would often have to click the … menu and refresh before they would update. Now that HLS-LL is implemented, I get about 6-7 seconds of lag, roughly half what I used to, which is probably acceptable for a lot of people given the massive benefit in low CPU usage this method brings. I am willing to pay a heavy CPU cost for the lower lag and slightly higher reliability of FFMPEG, but I'm sure I'm in the minority here.
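Enabling the stream component is a single top-level key in `configuration.yaml`; with it present, compatible camera platforms serve HLS instead of MJPEG proxying:

```yaml
# Enabling the stream integration switches compatible cameras to HLS.
# Removing this line reverts them to the snapshot/MJPEG behavior.
stream:
```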

Generic Camera with Stream Component Rankings:
Lag: 4/10 (6-7 seconds of lag)
Framerate 8/10 ( very good, my cameras only do 4 fps but I appeared to get all of that)
Initial Load time 8/10 (Pretty good, around 3 seconds)
Kindle Fire/ Fully Kiosk Tablet friendliness: 5/10 (fully kiosk on kindle wouldn’t allow the live stream for more than 1 camera, it would crash, so I had to run without live stream which worked quite reliably but was very slow)
Reliability: 8/10 (Live stream will occasionally freeze for ~10 seconds and the fullscreen will show the buffering animation)
Usability: 8/10 (work well if you don’t mind the lag)
Server CPU: 10/10 (no noticeable effect on hass server cpu)
Camera CPU: 10/10 (cameras don’t mind serving h.264 streams)

Synology Camera Platform with Stream Component:

This is a funny one: it only grabs the camera's substream from Synology and gives you one frame every 5 seconds or so. That limited its usefulness for me, because my substream is 4:3 aspect ratio, which doesn't look good next to the portrait-orientation cameras I stack on my dashboard. What I did use it for a while was the preview popups: despite the low framerate, it was more real-time and in a good aspect ratio for popups on my landscape-orientation tablets.

Synology Camera with Stream Component Rankings:
Lag: 7/10 (when it finally refreshes the frame, it only has 2-3 seconds of lag)
Framerate 3/10 ( 1 frame every 5 seconds, not so good)
Kindle Fire/ Fully Kiosk Tablet friendliness: 10/10 (I can’t remember it ever failing to display. Most reliable one out of all of them.)
Reliability: 10/10 (The only reliable camera platform in hass that I've found)
Usability: 8/10 (work well if you don’t mind the substream and the frame rate)
Server CPU: 10/10 (no noticeable effect on hass server cpu)
Camera CPU: 10/10 (cameras aren’t even serving the stream)

ONVIF Camera Platform with stream component:

This was one of the last ones I tested, because I didn't see any reason why it should be any different from the generic camera, but it was. First, it seems to be a bit more reliable. Second, it gives you the camera's sensors, like motion detection. One of these sensors is very useful and not available from the other camera/sensor platforms: the CPU-level sensor. Once I had this, I started to realize how much of a factor my camera CPU was in the performance of the whole camera display. Very useful.
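That CPU sensor can also feed automations. A hedged sketch (the entity name is an assumption; check what your ONVIF integration actually exposes) that warns when a camera's CPU climbs into the range where streams start misbehaving:

```yaml
# Hypothetical automation using an ONVIF-provided camera CPU sensor.
# The entity_id and the 90% threshold are assumptions.
automation:
  - alias: "Warn on high camera CPU"
    trigger:
      - platform: numeric_state
        entity_id: sensor.front_camera_cpu_usage
        above: 90
    action:
      - service: persistent_notification.create
        data:
          message: "Front camera CPU above 90%; streams may start dropping or smearing."
```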

ONVIF Camera Component with Stream Component Enabled Rankings:
Lag: 4/10 (6-7 seconds of lag)
Framerate 9/10 ( very good, my cameras only do 4 fps but I appeared to get all of that)
Initial Load time 8/10 (Pretty good, around 3 seconds)
Kindle Fire/ Fully Kiosk Tablet friendliness: 5/10 (fully kiosk on kindle wouldn’t allow the live stream for more than 1 camera, it would crash, so I had to run without live stream which worked quite reliably but was very slow)
Reliability: 8/10 (Live stream not super reliable, but non-live is quite reliable)
Usability: 8/10 (work well if you don’t mind the lag)
Server CPU: 10/10 (no noticeable effect on hass server cpu)
Camera CPU: 10/10 (cameras don’t mind serving h.264 streams)

ONVIF Camera Platform without stream component:

This is where the reliability of the platform really started to shine. I was able to get these 1080p streams to come up pretty reliably (though not perfectly) in lovelace using the live view of ~1fps even on tablets. This level of real time with pretty good reliability was not something I had found before.

I am still running these ONVIF cameras with the addition of a proxy camera (next rating) for the previews, but I use the full 4 FPS substream to get a near-real-time (1-second lag) popup at ~2-3 FPS on the tablets when people are detected outside. It's not 100% reliable, maybe 95%; the drop in reliability versus the Synology component is made up for by a roughly 10x higher framerate.
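The live versus non-live choice discussed throughout is made per card in Lovelace. A sketch (the entity name is an assumption): `camera_view: live` requests the real-time stream, while omitting it gives the slow still-image preview.

```yaml
# Lovelace picture-entity card sketch; entity name is an assumption.
type: picture-entity
entity: camera.front_door
camera_view: live
```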

ONVIF Camera Component without Stream Component Enabled Rankings:
Lag: 9/10 (~1 second of lag, very good)
Framerate 8/10 (getting 2-3 out of the 4 fps I think)
Initial Load time 8/10 (Pretty good, around 3 seconds)
Kindle Fire/ Fully Kiosk Tablet friendliness: 6/10 (I don’t recall if I had them working with live view on the tablet, but I don’t think they would at 1080p)
Reliability: 8/10 (Not perfect, but better than most)
Usability: 8/10 (Lack of lag and good framerate make them a good choice)
Server CPU: 2/10 (when running with live view, similar to FFMPEG cameras. Somehow running with the proxy camera made this go away)
Camera CPU: 10/10 (cameras don’t mind serving h.264 streams)

Proxy Camera Platform without stream component:

I was pretty excited when I found the proxy camera component, because the only stream I had with the 16:9 aspect ratio I needed was 1080p, and I knew that was causing problems on my frontend.

Proxy camera allowed me to divide the resolution by 6 in each direction which was still fine for the DPI of the displays I was using, so now I’m running 320x180 preview streams that are downsized by the proxy component. This means that they load up very quickly in the frontend and I can run them in live view on the tablets very nicely.

Unfortunately they do seem to load up the camera’s CPU a bit, but I seem to be able to just barely squeak by with this enabled and not totally kill the camera’s CPU.

What's magic about this is that it turns streams that would take 30% of my CPU to display 4 cameras into something that uses only about 8% for something like 12 camera streams. It doesn't use FFMPEG at all. So for people who want real-time streams and want more streams displayed than their CPU can handle, this can be a real lifesaver.
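A sketch of the proxy platform config (the source entity and sizes are assumptions; I'm mirroring the 320x180 downscale described above):

```yaml
# Camera proxy sketch: downscale an existing camera entity for cheap
# dashboard previews. Entity name and dimensions are assumptions.
camera:
  - platform: proxy
    entity_id: camera.front_door
    name: front_door_preview
    max_image_width: 320
    image_refresh_rate: 1  # refresh the cached image about once per second
    mode: resize
```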

Unfortunately, the same camera stream becomes less reliable after it goes through the camera proxy. It fails to display more often, and it quite often stops updating after the initial load. I usually have to reload Lovelace a few times to get all 4 proxy cameras updating consistently, whereas the ONVIF cameras they are based on are quite stable. On my tablets they will run for quite a while once they load, but in Firefox on the desktop they stop running after a few minutes and I have to refresh the page.

Recently I re-tried this component, and instead of the one frame every 2 seconds or so that I got last time, I now only get one frame every 7 seconds or so. I'm guessing this is either a new problem (all my cameras are acting up on 2021.4 compared to 2021.1), or it's because I'm now feeding it an FFMPEG camera (I think I was feeding it an ONVIF camera or something last time). Anyhow, I'm downgrading the framerate score from 7/10 to 3/10.

Proxy Camera Component without Stream Component Enabled Rankings:
Lag: 9/10 (~1-2 second of lag, very good)
Framerate 3/10 (Originally I asked it for one picture a second and it gave me one every 1-2 seconds in the Lovelace card. Now it refuses to average more than 1 frame every 7 seconds. Not sure why.)
Initial Load time 7/10 (Pretty good, around 3-4 seconds when they load correctly)
Kindle Fire/ Fully Kiosk Tablet friendliness: 9/10 (fully kiosk doesn’t choke on these and can be run at 1 frame every second or 2 which is fine for a small preview)
Reliability: 7/10 (Not perfect, but better than most. Sometimes drops or stops updating for a while)
Usability: 8/10 (Lack of lag and good framerate make them a good choice)
Server CPU: 8/10 (pretty minimal, at least how I use them… Something like extra 8%)
Camera CPU: 4/10 (added something like 30% additional load to the CPU, bringing them right to the limits of usability)

FFMPEG cameras without stream component and with Live555 RTSP proxy:

So when I had to migrate from Ubuntu to Debian to get onto a supported platform, I decided to take the opportunity to upgrade my CPU and see if I could go back to FFMPEG cameras. I moved to Home Assistant Supervised on Debian 10 on a Shuttle SZ270R8: Intel Core i5-7500T 35W, 8GB DDR4-2400, 2x Samsung 970 Evo Plus 256GB in RAID1, 3x WD Red 2TB in RAID5. Needless to say, it's quite a bit faster than the 2011 Mac mini I was running, and now I can stream my 4 cameras easily to 4-5 devices at a time (though I'm still averaging about 70% CPU usage, so the CPU upgrade didn't help quite as much as I was hoping). We will see as I add more devices; I might need another CPU or a GPU (which I purposely kept open as an option in my hardware).

This worked very nicely at first, with 4 1080p streams that display even on my Kindle dashboards at 2-3 FPS, and at what looks like the full 4 FPS on faster machines with live view turned on. But one of the first things I noticed was that I could hit the limit on the number of streams my cameras would let me open; on my 1080p Hikvisions, that number seems to be around 5. So I tried a solution I had been considering for a while: running an RTSP proxy server to cut down the direct connections to the cameras. I downloaded the Live555 docker image and got it up and going (I posted my docker compose here). This solved the problem, and now I can open virtually unlimited streams until my CPU dies. It adds about 600ms of lag, which isn't bad, and it also saves network bandwidth since it's running on the same server. It doesn't seem to produce much CPU load at all.
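To illustrate the shape of such a setup (this is NOT the author's exact compose file; the image name is a placeholder and the camera URLs are assumptions), an RTSP proxy sits between Home Assistant and the cameras, holding one upstream connection per camera and fanning it out to any number of clients:

```yaml
# docker-compose sketch of a Live555-style RTSP proxy. Image name is a
# placeholder; substitute whichever proxy image you actually run.
version: "3"
services:
  rtsp-proxy:
    image: live555-proxy-server  # placeholder image name
    network_mode: host
    restart: unless-stopped
    command: >
      rtsp://user:pass@192.168.1.60:554/Streaming/Channels/102
      rtsp://user:pass@192.168.1.61:554/Streaming/Channels/102
```

The FFMPEG cameras in Home Assistant then point at the proxy's re-served URLs (the Live555 proxy server prints these at startup, typically something like `rtsp://<server>:554/proxyStream`) instead of at the cameras directly.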

Overall this is the best solution I've tried so far. For a while it looked like it was going to be perfect: crystal-clear streams running at high frame rates with low lag (2-3 seconds). But then, every couple of days, I started noticing cameras vanishing for a while. This happens a lot more on my portrait-orientation cameras than on the landscape ones, and almost never on the front one for some reason, so maybe others will experience it less than I do. But for me, I lose 1-3 cameras every 1-2 days for 10-30 minutes. If I hit refresh a bunch of times, I can usually get them to come back to life.

The other issue I found is that after about an hour, many of the streams freeze up. Part of this was some power settings I needed to change in Fully Kiosk, but I now have the tablets refreshing automatically every 30 minutes, and the streams are always running perfectly after a refresh.

So, overall, not perfect, but still the best solution I’ve found, and I just love seeing everything up on my dashboard in near real-time. It was an expensive fix, but it improves the usefulness a lot. Now when I need to see what’s going on outside, I can pull it up quickly and know exactly what’s happening.

FFMPEG Camera without Stream Component and with Live555 RTSP Proxy Rankings:
Lag: 7/10 (2-3 seconds, not bad.)
Framerate 9/10 (On a good machine’s browser, seems like full framerate)
Initial Load time 9/10 (Quickest load time of the options, maybe 2 seconds)
Kindle Fire/ Fully Kiosk Tablet friendliness: 10/10 (I can’t remember it ever failing to display. Most reliable one out of all of them.)
Reliability: 8/10 (I think a lot of my problems with FFMPEG before were CPU related, but maybe the proxy server is fixing many of them too)
Usability: 4/10 (The CPU requirements and lack of reliability mean it’s not real practical for most people who aren’t obsessed with lag like me)
Server CPU: 2/10 (No change here. You will likely have to plan your server CPU specifically around this use case)
Camera CPU: 10/10 (cameras don’t seem to mind unless you open more streams than they can handle which doesn’t appear related to the CPU)

AlexxIT's WebRTC card

So I've tested this one a couple of times. It's still early days, but it's making good progress. This is, IMO, the most important camera type here because it offers the hope of cameras with both low lag and low CPU utilization, and it actually delivers on both of those promises. But so far it's not the right option for my use case. The initial opening of the stream is currently the slowest of any option here, and on my tablets' real-time display it is constantly losing the stream and reopening it. Also, if the card is halfway off the screen it stops running, which unfortunately is something my dashboard on the Lovelace tablet does.

So, take the numbers below with a grain of salt, as this is very early in development. I am also adding initial camera load-up times for all cameras from memory; I hope I can get them right.
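For context, the card is configured per Lovelace view; a minimal sketch (the RTSP URL is an assumption, and the card can also take an existing camera entity instead of a raw URL):

```yaml
# AlexxIT WebRTC card sketch; the RTSP URL is an assumption.
type: 'custom:webrtc-camera'
url: rtsp://user:pass@192.168.1.60:554/Streaming/Channels/102
```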

WebRTC Card Rankings:
Lag: 9/10 (~1 second, very good.)
Framerate 9/10 (Seems to run at more or less full framerate)
Initial Load time 3/10 (Maybe 10 seconds. This is the weak point right now)
Reliability: 5/10 (On a strong frontend it runs well, on my tablets it disconnects a lot)
Usability: 6/10 (For certain use cases this will be a very good option)
Server CPU: 9/10 (Honestly I didn’t notice any CPU usage)
Camera CPU: 8/10 (It is opening another rtsp stream to the camera, which my cameras don’t mind much. You can use with a proxy like Live555 to reduce these. )

Conclusion:

So currently I am running the FFMPEG cameras with the Live555 RTSP proxy. Not for everyone, as the CPU usage is quite high. For popups on my tablets, I use the RTSP substream to get a decent-framerate, low-latency stream, which is about the most those tablets can handle. This isn't as reliable as the ONVIF cameras, but I was afraid that component was putting load on the cameras, so I switched back to the RTSP stream.

Special mention: MQTT camera. I use this one for my snapshots of humans detected on our property. I've never had any problems with it, but I'm not really putting it through its paces; I only need 1 frame every 5 seconds or so.
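The MQTT camera just displays the last image published to a topic. A sketch (the topic follows Frigate's `<camera>/<object>/snapshot` convention, but the camera and object names here are assumptions):

```yaml
# MQTT camera sketch; topic segments are assumptions.
camera:
  - platform: mqtt
    name: last_person_snapshot
    topic: frigate/front_yard/person/snapshot
```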

EDIT: I updated the CPU score on the ONVIF cameras after trying them without the proxy camera. It turns out they use the same CPU as FFMPEG when you go to the streams directly; it was only the proxy camera that was saving my CPU. Somehow it avoids using FFMPEG at all (!)

Troubleshooting common problems:

It's probably worth mentioning some of the causes I think I've isolated for cameras not displaying correctly. Note that the reliability scores above reflect the cases where I don't think these causes were behind the failures.

1) Overloaded CPU (/GPU?) on your server: any time you take an H.264 stream and display it as MJPEG, one ffmpeg instance is created for each camera displayed on each frontend device. 4 cameras displayed on 3 devices? 12 instances. Each of these instances uses something like 8-10% of my CPU, so displaying more than 8 of these cameras was impossible on my old Core Duo Mac mini. I've never tried to see what's possible with a GPU; my understanding from a feature request is that the NUC hass.io image can use the Intel integrated GPU, so people running Intel NUCs with hass.io may be utilizing GPU resources for this. It would be good to check that too.

Solutions:

  1. Activate the streaming component so that your cameras display as HLS instead of MJPEG (though you will pay for this with lag, which in my case is about 10 seconds).
  2. Use the proxy camera component to re-serve the stream. This magically seems to use hardly any CPU, though it also makes the camera stream less reliable, so sometimes it still won't come up.
  3. Buy a better server :wink:

2) Limited number of streams available from your camera

Some cameras will only allow one stream of a certain type, or one stream at a time to a given type of device. Keep in mind that Home Assistant generally is not re-casting your streams: when you open a stream on the frontend, it opens that stream from the camera or DVR you've configured. That means you are likely to have many streams going to the camera and/or DVR at once if you have multiple displays (like I do with my tablets).

A couple of examples I've encountered:

  • My Hikvision cameras can only serve one MJPEG stream at a time, so the second device that tries to open it gets a blank card. They seem happy to serve many H.264 streams, though.

  • My Reolink camera only allows streaming via the app or via the stream the Reolink beta component uses, one at a time. So as soon as I open the Reolink app, all the Home Assistant streams vanish.

Solutions:

  1. Try using h.264 instead of MJPEG
  2. Account for everything that's using the camera, and try closing clients to see if that allows more clients to connect.

3) Overloaded CPU on your cameras

This is one I realized too late to assess properly, as I didn't have a sensor for it until I got the ONVIF cameras going. But basically, cameras have very limited CPU resources, and if you are running features like motion detection, line crossing, tamper detection, or image rotation in addition to certain apparently demanding stream types (MJPEG, or to a lesser extent the proxy camera component), you can overload the CPU. That said, mine still seem to work pretty well when nearly maxed out, but I'm sure it caused some of my problems.

Solutions:

  1. Avoid MJPEG cameras. For whatever reason they have high CPU load
  2. Avoid rotating your cameras. This causes me some additional load
  3. Avoid proxy camera. Adds some small load
  4. Don’t use camera motion detection features (you can do this on the synology or DVR for example)

4) Overloaded browser on the frontend

I see this sometimes on my Kindle Fire tablets when I try to display 4x 1080p streams with live view enabled, though instead of just not displaying the camera, Fully Kiosk Browser crashes outright, so it's a bit different. But your frontend devices do have limits. One other limit common across most desktop browsers (as far as I understand) is that they cannot open more than 6 connections to a domain. So if you are displaying 5 camera feeds, you will not be able to open a second browser window to your hass instance without having camera problems.

Solutions:

  1. Use a less demanding camera type (non-live view; MJPEG appears easier for the frontend to render than HLS from the stream component).
  2. Get a stronger frontend device. A modern iOS device seems to keep up with just about anything I've tried, compared to a low-powered Kindle Fire. My 2019 Kindle Fire 10s will display 4 MJPEG streams, but I can't get more than a couple to load consistently on my 2018 Kindle Fire 8s. YMMV.
  3. Don’t open multiple hass browser windows on the same computer

All that being said, there are many, many cases that I can’t explain. The cases I can’t explain are shown as a lower reliability score above.


I also made a month of what the heck post about why the cameras aren’t more reliable here if you’d like to vote for that.


To help with your lag issues: in your Hikvision cameras, set the I-frame interval to the same value as your frame rate. Lag with the stream component enabled should then be reduced significantly, but you may encounter buffering depending on your network capacity.

Yes, the I-frame interval has been set to one per 4 frames (so one per second) for all the tests listed. That's not my issue. Disabling the stream component reduced the lag by 10 seconds. The other thing that cut 1-2 seconds off the lag in all cases was moving from TCP to UDP for the streams. It also seemed to improve reliability a bit, but at the cost of increased smearing in the MJPEG streams in the frontend (though not when the streams are viewed in the Synology app).

The problem with the stream component is a lack of tuning for low latency HLS I believe.

Yes, there are two components. Without stream enabled, Home Assistant uses the snapshot URL and creates a 1 FPS MJPEG stream. Some integrations will pass through MJPEG streams from other sources if available natively. The stream component remuxes (not transcodes, like ffmpeg does) the RTSP stream to HLS. Low Latency HLS still doesn't have broad browser and device support and requires a newer HLS standard. 0.115 brings a lot of improvements to the stream component in Home Assistant, but it is still not perfect, as it is tuned to function on low-powered devices and maintain broad client compatibility.


You have two sections labelled: Generic Camera Platform Without Stream Component:

I think one is supposed to be ‘with’.

I currently use the generic platform with stream enabled and I get near full frame rate live streaming from my 6MP cameras

Fixed. Victim of copy paste, thanks.

I’m expecting stream component to someday become the one that is the best when the latency problem is solved.

Would love to see some more “high cpu utilization” and low latency HLS options in future versions for those of us on faster machines and who can experiment with different browsers on kiosks for example.

It would also be great if we could have both HLS and MJPEG cameras produced from the same stream. Each has advantages and disadvantages, and I'd like to use one type sometimes and the other at other times in the same hass instance.

I tried ONVIF with stream: and the frame rate was terrible in Lovelace; plus, I wasn't able to use the camera.snapshot service (it would take a zero-KB photo and produce an error). I'm keen to try them without stream: now to see if I get a better result (currently using generic with stream). Did you test camera snapshots at all?

First - really great and in depth analysis of options available!!!

But I'm not sure I understand this one. I have the same setup (not the same cameras though; I use D-Link ones) and I get real-time video, not 1 frame every 5 seconds as you do. Could it be because of the cameras?
Did you select live in “camera view”?

If that's the Synology platform you're talking about: since it serves the substream, and the substream is the wrong aspect ratio to look nice on my dashboard, I only used it for more-info popups on my tablets, and those are usually the same as setting live view to true. I suspect the difference is that I was using the stream component and you aren't?


Synology I guess Surveillance station

No I don’t have a need for snapshots currently. The only snapshots I need are sent via MQTT from frigate to show me who it found outside in the MQTT streams.

I can’t promise that my results will apply to everyone’s situation the same, but hopefully they at least give you some idea for the best things to try first.

I do have stream integration activated, but yes, don’t use it for this since it works great without it. I use camera here and if needed use that integration to stream to other devices from HA. (for ex. on event trigger push camera stream to TV or something).

Thanks for the clarification.

Interesting, I wonder what the difference is. Probably something on the Synology side. Are your streams H.264? Anyhow, the substream-only nature of it limits its usability, unfortunately.

@scstraus, really interesting writeup. I did my fair share of testing various camera integrations myself (I also have multiple Hikvisions, although most are 4k, so that makes the problem worse), but by far not as exhaustive as you did. Great read.

I agree, camera integration into HA is highly problematic and the lag introduced by the stream component is a major issue, going as far as making the entire camera integration useless. But you can’t really blame the component for that. The problem is really inherent to how HLS works. It just wasn’t designed as a realtime low latency protocol to begin with. Interesting find about low latency HLS though. That would really help in the future.

Another idea would be to forward the RTSP stream directly, instead of remuxing it to HLS first. But as far as I know, there’s no native browser support to decode that on the client side and you would need to install some third party decoding plugin (like that really sketchy exe the Hikvision web UI always prompts you to install when you try to view the stream). It may be workable to open a fullscreen VLC session when tapping the camera preview card, which would pull the forwarded RTSP from the HA websocket connection. That would probably bring the latency way down, but it might be awkward to integrate.

Another important thing is h.265. A lot of the higher end camera setups are heavily shifting towards it, as it really reduces bandwidth and storage requirements significantly. Unfortunately, as far as I know, there is no support for that on HA yet.

Edit: oh, and there's another thing: Fully Kiosk has a built-in fullscreen RTSP viewer. Could that be leveraged to pull an RTSP stream forwarded directly by HA, bypassing the stream component / HLS remuxing entirely?


Great writeup. I might finally look at moving my Zoneminder setup over to something natively using HA, in combination with blakeblackshear’s real time object detection.

I’m also using Synology to record 24x7 from the cameras. Have you looked at using substreams in HA while recording the main stream in Synology? I’ve been toying with ways to use the substream for analysis and then grab a high resolution video clip from the Synology recordings.

Yes, it’s h.264

Love the idea of full screen RTSP in fully kiosk, didn’t know about that!! Going to start investigating!


Yes, I've tried lots of combinations of substream and main stream between Frigate, Synology, and hass. TBH, none of them had a major effect, other than substreams coming up a bit quicker and using less CPU in the cases where I had to transcode them. With H.264, the camera doesn't seem to mind much which stream you take; it's only MJPEG from the camera that causes problems.

These days I just use whatever stream has the best resolution and aspect ratio for what I want to do with it.