I have a Wyze camera flashed with DaFang firmware. I have added the camera component to HA and I am able to get the still image. What do I need to do to see the live stream within HA? If I put the RTSP stream into VLC, it works.
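From the docs, it looks like I need the stream component plus a stream_source on the generic camera. A minimal sketch of what I'm trying (the IP, port, and paths are placeholders based on what I believe are the DaFang defaults; check what your build actually exposes):

```yaml
# configuration.yaml — IP, port, and paths are placeholders for a DaFang camera
stream:

camera:
  - platform: generic
    name: Wyze Front Door
    still_image_url: http://192.168.1.50/cgi-bin/currentpic.cgi
    stream_source: rtsp://192.168.1.50:8554/unicast
```

With that in place, clicking the still image in the Lovelace card should open an HLS stream from that source.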
I gave pushing the seek bar forward a try, but it hit 5 s for a few ticks and then drifted back to 15 s of delay in Chrome… iOS shows the video as live, and clicking the 15 s seek does nothing. This is with preload… If I don't have preload set, the stream is 30-45 s behind when it does load.
Using this at my front door creates some strange effects: sometimes the stream shows someone walking up the steps to deliver something when in reality they are already on their way back down (or gone), or it looks like no one is there yet.
Using the camera's own native app, or viewing via my Synology DS cam app, the feed is nearly real-time, with only a few frames / ~1 s of delay.
Very sorry, I did not realise that the stream component never successfully installed. I just went through dependency hell, having to install ffmpeg>=3.2 on Ubuntu 16.04 to be able to install the Python av library, which is a prerequisite for stream to work.
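For anyone else who hits this: the component itself is a single line in configuration.yaml, so a failed setup is easy to miss unless you turn up logging for it. A minimal sketch:

```yaml
# configuration.yaml
stream:

logger:
  logs:
    homeassistant.components.stream: debug
```

With debug logging on, a missing dependency shows up clearly in the log instead of the component just silently not loading.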
Now that stream is initialising, the camera stream is working fine, actually better than before, when the video was proxied. I'd say that the frame rate has vastly improved, maybe even getting close to the 15 fps that I configured for my Unifi G3 Flex camera.
I haven't made any changes to the generic camera configuration, nor to the Lovelace card configuration.
When I click on the still image in the card to open the full screen view, I get an entry like this in the HA log:
2019-04-10 13:14:31 INFO (MainThread) [homeassistant.components.stream] Started stream: rtsp://192.168.1.x:7447/0fv5...
I am using Chrome, and looking into the developer tools I can see many consecutive requests to /api/hls/... which are answered with video/mp2t responses. So, I'd say that video streaming is working now.
The delay is inevitable: HLS is not a real-time streaming protocol. All it does is break your feed up into little chunks and send those over HTTP for your browser to consume. The delay can be reduced by manipulating the I-frame interval on your cameras, if they have that option like my HikVisions do.
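To make that concrete, the playlist your browser keeps re-fetching looks roughly like this (segment names and durations are illustrative, not HA's actual values):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:5
#EXT-X-MEDIA-SEQUENCE:42
#EXTINF:4.8,
./segment/42.ts
#EXTINF:5.0,
./segment/43.ts
#EXTINF:4.9,
./segment/44.ts
```

The player buffers a few of these segments before it starts playing, so with ~5 s segments you end up roughly 10-15 s behind live, which matches what people are reporting.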
I guess where we are in a bit of disagreement is the overall usefulness given the delay. With the MJPEG stream, by the time I got a notification, opened the app on my phone, and loaded the camera stream, 99% of the time the subject of interest was gone.
It all depends on perspective. You are welcome to turn the component off and go back to the old method.
Here is what I'm seeing: notice the still image is current, but when I click on it to start the (preloaded) stream, it shows an old segment from 8:06 pm last night and then just spins.
(My Lab is 14.5+ and has a lot of accidents these days, so…)
The delay can be adjusted via the segment length and playlist size. Could we have these exposed as variables? The big win for me with this component is the ability to cast to my Apple TVs (see the service-call sketch below).
A 2-second segment size (I believe the smallest recommended) can result in a 6 s delayed stream, for example. I wasn't able to dig up what segment size and playlist size are currently being used.
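On the casting side, what I'm using is the camera.play_stream service that came with the stream component. A sketch with placeholder entity IDs:

```yaml
# Example service call (e.g. as an automation action) — entity IDs are placeholders
service: camera.play_stream
data:
  entity_id: camera.front_door
  media_player: media_player.living_room_apple_tv
```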
We are only reformatting the feed that comes from your camera. We are not transcoding, because that would take way too many resources. All settings need to be adjusted on the feed coming from the camera directly, which is what I said before.
The way the stream component works is that it cuts a segment at every keyframe (I-frame). We do that because of the way H.264 encoding works: each keyframe carries the full image, and subsequent frames are just the pixel difference from the keyframe.
Since we are not transcoding, we cannot create new keyframes, and therefore cannot adjust segment length. Some cameras let you adjust the interval at which they send keyframes; that is the only way to reduce the delay at this point in time.
So you are experiencing the exact same issue described in the linked bug. If you look at my latest comment, you can see that I was able to reproduce this just by disconnecting the camera while the stream is up.
I'm seeing the same thing. If I click on any of the videos, it starts to spin and then HA becomes unresponsive: "Unable to connect to Home Assistant."
I'm running HA in Docker on a Synology NAS with Amcrest cameras. I assume the container doesn't have enough resources to handle the load.
What are the minimum specs required for this to work as expected?
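In the meantime I'm going to try to rule the limits in or out by setting them explicitly on the container. A docker-compose sketch (values are illustrative; tune them for your NAS):

```yaml
# docker-compose.yml — resource limits are illustrative, not a recommendation
version: "2.4"
services:
  homeassistant:
    image: homeassistant/home-assistant:latest
    network_mode: host
    volumes:
      - ./config:/config
    mem_limit: 1g   # raise this if the container is being killed under load
    cpus: 1.5       # fractional CPU limits are supported in compose v2.2+
```

Watching `docker stats` while a stream spins up should show whether those limits are actually being hit.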
I have what is probably a dumb question: why would someone not want to load the latest version of JavaScript? In other words, why is this not just the default HA process?