I tried all the camera platforms so you don't have to

Anybody able to get the high-quality stream to show in Apple HomeKit? I am able to see the ‘main’ stream nicely in HA within Lovelace and through the Home Assistant iOS app. However, when the camera is sent to HomeKit, only the ‘sub’ stream plays.

What’s also interesting is that the camera shows up as ONVIF in HomeKit, even though I am using the generic camera platform.

@moto2000 forgive me if this is obvious, but have you checked for disabled entities? When I was using ONVIF with my cameras, it would install with only one feed enabled. FYI this wasn’t HomeKit, but just a thought because I missed it :slight_smile:

I find that camera images go missing once in a while… and the Lovelace UI seems to hang on some requests…

Looking in the Chrome dev console, I can see that requests end up cancelled, the reason being “stalled”.
http://192.168.1.252:8123/api/camera_proxy/camera.haveost?authSig=xxxx

I believe this happens because all 6 connections between client and server are in use serving the cam feeds.
From Chrome event/debug:
SOCKET_IN_USE [dt=66597+] [ --> source_dependency = 151784 (HTTP_STREAM_JOB)]

I’m currently using the generic camera platform with
still_image_url: http://192.168.1.202/ISAPI/Streaming/channels/301/picture

I have 7 cams…

Any recommendation on which camera platform/configuration to use instead, one that won’t end up locking my entire Lovelace UI?

I’m fine with updating only cam pics every 2 secs… no live stream needed.

Thanks.


Any of the MJPEG options (MJPEG, FFmpeg, proxy, etc.) seem to work reasonably well with many streams; otherwise, try the generic camera (or better yet, the ONVIF camera) with stream: enabled in your config.
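
As a concrete sketch of that last option (host, credentials, and channel paths are placeholders borrowed from earlier in this thread — adjust them to your own setup), a generic camera with stream: enabled looks roughly like this:

```yaml
# configuration.yaml - requires the stream integration to be enabled
stream:

camera:
  - platform: generic
    name: driveway
    # snapshot endpoint (Hikvision-style ISAPI path, as used earlier in the thread)
    still_image_url: http://192.168.1.202/ISAPI/Streaming/channels/301/picture
    # live feed that stream: will proxy to the frontend as HLS
    stream_source: rtsp://user:password@192.168.1.202:554/Streaming/Channels/301
```

With stream: enabled, the frontend pulls an HLS proxy of the RTSP feed instead of hammering the still-image endpoint, which keeps the browser connection count down.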

thanks scstraus…

I have an NVR with 7 cams… First of all, should each cam go through the NVR’s IP or through each camera’s separate IP (with the proxy, for example)?

Can you give an example on your config?

I found that using the ONVIF integration you can actually just link to the NVR and access all cameras; however, in my case I didn’t get the additional features that ONVIF provides from the cameras directly, such as motion detection signals. So I just use direct integrations to each camera.

There are lots of different ways of setting it up like sparkydave mentions, and if you read my original post, you know they all have advantages and disadvantages. I learned that through trial and error over a couple years of messing with it.

In your case, I don’t think that we know what the issue is yet. There are lots of reasons why 1 camera out of 7 doesn’t show up.

  • General unreliability of the generic camera platform without stream: (would likely be different cameras exhibiting the behavior at different times)
  • A limit on max number of streams from NVR (probably would be same camera most of the time)
  • One poorly configured camera (would be same camera every time)
  • A limit on max number of streams from camera
  • A limit on how many streams the client can handle
  • CPU usage on server
  • CPU usage on NVR
  • CPU usage on camera

Each of these has different solutions. So the best thing you can do is try some different ways of setting it up and see what your results are. Try the live555 proxy server (I give my config further up in the thread). Try connecting directly to the cameras. Try different camera platforms like ONVIF which is quite good in my experience. You will get different results and eventually you will be able to triangulate the issue.
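
For anyone who hasn’t used it: the live555 proxy server is a single binary that re-serves camera streams from one local RTSP endpoint. A minimal invocation looks something like this (URLs and credentials here are placeholders; my actual config is further up in the thread):

```shell
# Proxy two camera streams through one local RTSP server.
# Each source URL is re-served as rtsp://<this-host>:554/proxyStream-N (N = 1, 2, ...)
./live555ProxyServer -v \
  "rtsp://user:password@192.168.1.202:554/Streaming/Channels/101" \
  "rtsp://user:password@192.168.1.203:554/Streaming/Channels/101"
```

Point your HA camera entities at the proxyStream URLs instead of the cameras themselves, and only one connection per camera stays open no matter how many clients are watching.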

My cameras.yaml is here, but the cameras aren’t that interesting in their config; my comments there are probably the most useful thing, but my comments in the original post are far more fleshed out and up to date. Everything useful to be said about my config is already in the original post and the thread.

Has anyone played around with cameras in HomeKit? Some of my notes:

  • When you use the HLS stream from Blue Iris, it loads instantly in HomeKit but shows quite a bit of smearing. It also loads at full resolution, full quality, and a super high bitrate, and there’s no way to independently turn it down in Blue Iris (typical BI). There’s also about 7 to 8 seconds of lag.
  • When you use the RTMP stream directly from the camera, it takes 7 to 8 seconds to load in HomeKit, but there’s zero lag. And you can use the substream and adjust it to whatever quality you like from the camera.
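
For reference, a camera can be exported to HomeKit with the codec passed straight through rather than re-encoded; a sketch (the entity name is a placeholder, and check the HomeKit integration docs for the options your HA version supports):

```yaml
homekit:
  filter:
    include_entities:
      - camera.front_door
  entity_config:
    camera.front_door:
      video_codec: copy      # pass the H.264 stream through without re-encoding
      support_audio: false
      max_fps: 15
```

video_codec: copy avoids a transcode on the HA host, which matters on low-powered hardware.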

I wish I could get the RTMP-to-HLS conversion to happen all the time so it would load instantly. The Preload Stream option doesn’t seem to start the conversion in the background; it seems like it just loads the RTMP stream.

Does anyone know how to start that → HLS conversion when HA starts and keep it running so HomeKit can just access it without any starting delay?

I have 3 Xiaomi Dafang cameras (hacked firmware, RTSP) and 1 Foscam camera (ONVIF). I’ve tried multiple ways to integrate them into HA as suggested in this post, but I’m still unable to find the best option. My Lovelace glance card (live/auto) sometimes can’t show the image (greyed out), and the stream is badly delayed (minutes to hours) even with preload stream.

So far I have tried the generic camera with/without stream and ONVIF with/without stream. I have yet to try FFmpeg, as I think it’s kinda heavy for an RPi4.

I’m not sure whether it’s because the hardware (RPi4, 4 GB version) isn’t powerful enough to handle the streams, or because of HA.

With a Pi it can be tricky, because running without stream: is quite heavy on the slow bus and processor and you might bring the whole thing to its knees, but with stream: you will have a delay. You just have to test what’s possible. Try low-resolution, low-FPS streams if you want to turn stream: off.
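
As a concrete example of that, you can point the generic platform at a camera’s substream snapshot with no stream_source at all (Hikvision-style firmware usually exposes the second stream of channel 1 as 102; your camera’s paths will likely differ):

```yaml
camera:
  - platform: generic
    name: garden_lowres
    # substream snapshot only - no stream_source, so no HLS delay,
    # at the cost of polling still images instead of a live feed
    still_image_url: http://192.168.1.202/ISAPI/Streaming/channels/102/picture
```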

@scstraus have you tried the new Frigate release, particularly the feature @blakeblackshear added to rebroadcast the camera feeds as RTMP streams?

He primarily did this to reduce the number of connections to the cameras - so everything pipes through frigate.

The best way to get realtime in the browser will probably be to use jsmpeg in a custom card. It will take a bit more processing power to do the H.264-to-MPEG-1 conversion than RTMP, but it should get the latency down to almost nothing. This is something I have been considering for Frigate for a while.


No, despite being one of the first Frigate users, I have not yet had a chance to upgrade. Previous releases did not really encourage you to use the streams they created as full-time cameras… It looks like that might have changed, so it might be a candidate for replacing my RTSP proxy.

I would test the hell out of this :grinning:. What’s the latency like on RTMP now? Does it also proxy the stream so that it’s just keeping one connection open to the camera even if I have 5 devices streaming the camera from frigate in the frontend?

The latency for RTMP from frigate should be similar to pulling it directly from the camera. Frigate is taking the camera stream and immediately rebroadcasting over RTMP. FFmpeg makes one connection to the camera and can pipe the output to multiple locations. A single connection can be used for 24/7 recording, clips, detection, and rtmp. From there you can connect multiple clients to Frigate’s RTMP endpoint. Since it is using nginx’s rtmp module, I expect it can handle many simultaneous connections. All of that can be done with a single connection to the camera itself.
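
To sketch what consuming that looks like from HA (host, port, and camera name are placeholders; check the Frigate docs for your version):

```yaml
camera:
  - platform: generic
    name: front_yard_via_frigate
    # Frigate serves the most recent frame over its HTTP API
    still_image_url: http://frigate-host:5000/api/front_yard/latest.jpg
    # and rebroadcasts the camera feed via nginx's RTMP module
    stream_source: rtmp://frigate-host/live/front_yard
```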


This idea has been rattling around in my head for a while now: https://github.com/blakeblackshear/frigate/issues/338


I like that you had assigned yourself to do something like this a year ago :slight_smile:

Have you ever messed with cameras that depend on CloudEdge? I use a bunch of Zumimall wireless battery-powered IP cameras, and while they’re great on their own, I’d love to integrate them with HA if at all possible.

I’ve tried the new Frigate custom component with RTMP, but I also get ~10 sec of lag in HA… I’m using the stream: component as well.

EDIT: @scstraus maybe something that could help with the camera lag in general: Realtime camera streaming without any delay - RTSP2WebRTC


Yes, I don’t think there’s any way to avoid the delay when using the stream: component. It’s simply a byproduct of the protocol.

The WebRTC component looks extremely promising! I’m going to give it a try. Thank you for that!

@scstraus you can ask me any questions about WebRTC. Your research is very good. I also hate lag and have been looking for a solution for a very long time :slight_smile:
