Stream Component: Confirmed Cameras and Configurations

Great to hear you’re working on the ONVIF integration. If you don’t mind, a small feature request:
I’m using ONVIF to communicate with my intercom. This has two problems:

  1. The lights around the camera are constantly on
  2. After a couple of days the camera gets stuck and I have to power cycle my intercom

What I’d like is the ability to configure this integration so that, instead of it constantly pinging my camera, it operates on demand (for either a still image or a stream).

Do you think that would be possible?

Thanks!

BTW I think both of these problems are related and are a recent regression, and I know there were a few changes to the integration recently.

http://10.10.1.48/webcapture.jpg?user=admin&password=&command=snap&channel=1

I’ll have to try the updated ONVIF integration with my Dahua cameras when it’s out. Just wanted to mention that the Amcrest camera integration works quite nicely with Dahua cameras. Their APIs seem to be virtually the same.

If it works with Amcrest, I’d use that because it also provides the motion sensors I believe. That’s something I’m working on for ONVIF, but might not make it into 0.110.

Yes, the Amcrest integration supports the motion sensors on Amcrest/Dahua cameras, and was recently updated to subscribe to any motion event immediately instead of polling.
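
For anyone who wants to try it, here’s a minimal configuration.yaml sketch of that setup; the host and credentials below are placeholders you’d swap for your own camera:

amcrest:
  - host: 192.168.1.20        # placeholder: your camera or NVR IP
    username: admin           # placeholder credentials
    password: YOUR_PASSWORD
    binary_sensors:
      - motion_detected       # exposes a motion binary sensor for the camera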

There is a known issue with switching motion detection on/off via the HA service on the latest Amcrest camera firmware (2.6), but I’m not aware that it also affects Dahua. So far, Amcrest hasn’t disclosed or updated their API doc to detail what they changed, so it can’t be easily fixed. Meanwhile, some users have installed previous firmware to get that important functionality back.

ONVIF with motion detection sensor capability might be a better solution than using older firmware.

Hi, I’m using the stream component to try to cast a camera from BlueIris to the Chromecast devices I have in my home. I’ve got the stream component enabled and the camera configured per the instructions at the top of this thread. The camera is a 4k Amcrest at 15fps (re-encoded to 1920 x 1080 by the BlueIris web server).
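
For anyone wanting to reproduce the setup: casting a camera stream to a Chromecast is usually triggered with the camera.play_stream service. A minimal sketch of the script action, with placeholder entity IDs, would be:

  - service: camera.play_stream
    data:
      entity_id: camera.blueiris_cam            # placeholder camera entity
      media_player: media_player.living_room    # placeholder Chromecast media player
      # format defaults to hls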

The issue I have is that when I cast the stream, it plays for 2-3 seconds (and the ‘live’ indicator lights up on the chromecast display) then pauses and seems to buffer (and the ‘live’ indicator goes out) for 2-3 seconds, then the cycle repeats itself, eventually pausing completely. I don’t see the issue when utilizing the above URL through QuickTime.

When I looked at the BlueIris status tool, it shows the streaming rate at ~0.4 fps when serving up through the link here to either QT or the Chromecast display, though the time seems to count up normally when I view the stream in QT (and I believe in Chromecast when it’s not buffering). I also have BI set up to serve this camera to Home Assistant for Lovelace as type mjpeg, and the framerate on the BI tool shows ~7.5 fps when that stream is active. When accessing BI via a browser window, it shows a solid 15 fps, so I don’t think it’s a BlueIris or network issue.

Thinking this was a performance issue with my HassOS / RPi4b installation, I tried an install of HA as a VM on my i7 machine, with the exact same results. I’m now thoroughly confused. Your help would be greatly appreciated. Thank you!


Could someone please share their config for a Wyze camera with the official RTSP firmware provided by Wyze (beta)? It works through VLC but I cannot get it to work in HA.

I added this to my configuration.yaml:

camera:
  - platform: generic
    name: mycamera
    still_image_url: https://www.apkmirror.com/wp-content/uploads/2019/04/5cc0af6bb22b6.png
    stream_source: rtsp://user:password@192.168.x.x/live

stream:

So if my camera is using MJPEG, I won’t be able to use stream to cast it to a Google display or add it to Google Assistant?

Hi, this still image URL doesn’t work for me.

That’s strange, I have the same camera. Are your channel and stream parameters correct?
I know it only works for one channel/stream and not for the other.

I’ve used:

http://192.168.1.10/webcapture.jpg?user=admin&password=&command=snap&channel=1

The browser saves a copy of the file, but it’s an empty file.

Maybe a dumb question, but did you fill in the username and password in the URL of your camera?

Because I get an empty file when those are not correct.

No, I’ve just used the URL that I posted. But for the streaming URL:

rtsp://192.168.1.10/user=admin_password=tlJwpbo6_channel=1_stream=0.sdp

I actually use the password (the default one: tlJwpbo6).

Anyway, if I use:

http://192.168.1.10/webcapture.jpg?user=admin&password=tlJwpbo6&command=snap&channel=1

it’s the same: an empty file, zero bytes.
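
For reference, once the snapshot URL does return data, the two URLs from this exchange would go into a generic camera roughly like this (the camera name is just a placeholder):

camera:
  - platform: generic
    name: front_camera
    still_image_url: http://192.168.1.10/webcapture.jpg?user=admin&password=tlJwpbo6&command=snap&channel=1
    stream_source: rtsp://192.168.1.10/user=admin_password=tlJwpbo6_channel=1_stream=0.sdp

stream: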

I’m running Dafang hacks on some Wyze cams, but the delay in Lovelace is about 15 seconds. Anyone know what I could do to cut that down?


Does anyone have a clue why Synology Surveillance Station is not giving me Share Stream URLs? The boxes where they are supposed to be are blank. If I use the desktop app, it actually gives me an “Operation failed” error.

I opened a ticket anyway.

This one works for me too.

I’m doing literally the same thing: a 4K Amcrest camera with BlueIris, and constant buffering. I’m trying to work on it. Hopefully one of us figures it out.

I spoke with @hunterjm (the developer of the stream component) about this on Discord. He thinks the main issue with Google Cast is the latency caused by having to route the stream externally in order to play it on the Chromecast device; that delay is enough to trigger the buffering. He said he reconfigured his network (I assume with a local DNS server) so that the stream is served locally. I don’t know yet how this works when using Nabu Casa (which I do) instead of, say, a separate service (like DuckDNS), which he uses.

BTW, I loved listening to his appearance on the podcast. I would highly recommend it as a background to begin to understand the streaming component.

Couldn’t find anyone sharing anything about the Yi cameras, so here’s what I’ve got working.

I put this in my configuration.yaml:

stream:
ffmpeg:
camera:
  - platform: ffmpeg
    name: Camera 1
    input: -rtsp_transport tcp -i rtsp://192.168.x.xx/ch0_0.h264 

Then I used picture entity cards in the front end and only filled in the required entity

cards:
  - aspect_ratio: 70%
    entity: camera.camera_1
    hold_action:
      action: more-info
    name: Camera Name
    type: picture-entity
type: vertical-stack

The RTSP link should be available from the camera’s admin panel, via its IP address and port, in accordance with the hack you’re using.