Has anyone successfully integrated an Eminent camera yet?
I have an Eminent EM6225 camera and can’t get it to work with the generic component.
No still image and no live stream (it just keeps loading). The still image URL does work in a browser, and the RTSP URL works in VLC.
With the ffmpeg component I get a still image in the picture glance card, but still no stream…
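For reference, here is roughly what I have in configuration.yaml; the host, credentials, and URL paths below are placeholders, not necessarily the EM6225’s actual endpoints:

```yaml
camera:
  - platform: generic
    name: Eminent EM6225
    # Placeholder URLs - substitute the snapshot and RTSP paths your camera actually serves
    still_image_url: http://192.168.1.10/snapshot.jpg
    stream_source: rtsp://admin:password@192.168.1.10:554/stream1
```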
I’m seeing the same problem as @ManuelAbril, but with Amcrest cameras. Casting to the Google Home Hub or a Chromecast looks like it is about to start: you see the player controls, then it just goes to a blue cast icon and the stream never loads. In the web UI all of the streams and snapshots load fine. I see no errors in the logs; it’s as if the stream ends or is sent in the wrong format. Any help or suggestions would be appreciated.
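For what it’s worth, I’m starting the cast with the camera.play_stream service, roughly like this (the entity IDs are just examples from my setup):

```yaml
# Example script action; entity IDs are placeholders for my Amcrest camera and hub
- service: camera.play_stream
  data:
    entity_id: camera.amcrest_front_door
    media_player: media_player.living_room_hub
```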
OK, I have seen the delay explained as key frames, but after leaving HA running for over a week the delay grows to over a minute, which makes the video very confusing for events.
Just updated to the RTSP-enabled firmware from Wyze on a couple of Wyze Cam v2s. I haven’t found a still image URL, but streaming seems to work well with a 5-10 second delay.
I’ve started playing with the RTSP beta firmware from Wyze, and since I couldn’t find a still image URL, I’m using the ffmpeg camera component:
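Something along these lines; the address and credentials below are placeholders (I believe the beta firmware serves the stream at /live, but check your camera):

```yaml
# The top-level ffmpeg component is required for the ffmpeg camera platform
ffmpeg:

camera:
  - platform: ffmpeg
    name: Wyze Cam Living Room
    # Placeholder RTSP URL - substitute your camera's IP and the credentials set in the Wyze app
    input: rtsp://user:password@192.168.1.60/live
```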
It “works”, at least in my testing environment. I get still frames extracted every so often, and the streaming is also working. I’ve been able to grab snapshots as well as use the camera.record service to make a video clip.
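For the clip I called camera.record roughly like this; the filename path and durations are just what I happened to test with, and the path may need to be added to whitelist_external_dirs:

```yaml
# Example camera.record call used for a quick test clip; values are arbitrary
- service: camera.record
  data:
    entity_id: camera.wyze_cam_living_room
    filename: /config/www/wyze_test.mp4
    duration: 10   # seconds to record
    lookback: 5    # seconds of buffered video to prepend
```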
I think the camera firmware is still a little flaky. This all needs a bit more testing. I’m using Home Assistant 0.91.4 to test this setup. If it holds up, I’m anxious to hook up the TensorFlow integration and see how well this works for people detection in rooms.
Thanks @anasazi. I’m running a slightly higher resolution, but everything else is identical. Still only 1/2 fps on the frontend, while the ZoneMinder passthrough cameras look great at 15 fps. So weird.
Nice. Yeah, the new firmware definitely has some stability issues. I’ve actually taken my cams back out of HA for now. We’ll see if Wyze can improve the firmware in the next update.
Mine have been pretty stable. The cams run through Synology Surveillance Station and then into HA. The still image refreshes every 10 seconds or so, and I get the same results when sending them through motionEye.
If you read up the thread, it is apparently based on the key frames from the source video, but this can’t really be changed on many cameras.
I have, however, had some serious issues with the delay getting longer after HA has been running for a few days.
Oddly, while trying to quantify exactly how long the delay was, I turned on the timestamp overlay for my camera and suddenly it is a consistent 15-second delay. I really hate the overlay, but for the moment it seems to have fixed the extra-long-segment problem. I would love it to be less than 15 seconds, though.
I’ve managed to get this working, sort of. I’m accessing my cameras through my Synology. They all show up, and I can view them on my Windows machine and on my iPhone using the HA iOS app, but I can’t view the streams with the same HA iOS app on my iPad. Why would the iPad not work? It’s a new one and on the same iOS 12.2 as my iPhone. I’m seeing errors in the log too…
Error demuxing stream: No dts in packet
corrupted macroblock 24 36 (total_coeff=-1)
error while decoding MB 24 36
Hey, just wanted to share that I was able to set up my two UniFi cameras and a Raspberry Pi camera. Performance isn’t great since everything is running on a Raspberry Pi 3.
For the Raspberry Pi camera I used the Restreamer Docker image, then
Once 0.92 arrives, what are the steps to “enable” the stream when using the integration? I’ve read this thread and can’t figure it out, nor can I find anything in the Foscam, Axis, or Doorbird documentation pages regarding streams. First off, I assume stream: is required in configuration.yaml. Does anything else need to be specified under that component?
Do I need to explicitly call out stream_source: rtsp://address, or does the integration handle that? If I do call it out, I assume it actually goes under camera: - platform:?
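From what I’ve pieced together so far, I’m guessing the config looks something like the following, but please correct me if I’m wrong. My understanding is that the camera integrations supply their own stream source, while the generic platform needs stream_source set explicitly; the URLs below are placeholders:

```yaml
# Enable the stream component (no options required under it)
stream:

# Only needed for the generic platform; placeholder URLs
camera:
  - platform: generic
    name: Front Door
    still_image_url: http://192.168.1.20/snapshot.jpg
    stream_source: rtsp://user:password@192.168.1.20:554/stream1
```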