Stream Component: Confirmed Cameras and Configurations

I’m getting the same issue. I think it’s my VLAN too, but my HA is in the same subnet as my Google Nest, so I’m not even sure now. Did you open any special pass-through for your Nest devices? In my VLAN I did have to open ICMP to WAN for the Nest devices or they constantly reconnect. Other than that everything seems OK. Also, I logged into my IoT VLAN with a laptop and cast my laptop screen to the Nest, so I’m not sure why HA is having this issue.

I’m not sure if this has been covered, since there are 726 replies before me… but I have a Reolink system and noticed that in the original post the code reads like this:

- platform: generic
  still_image_url: "http://192.168.1.x/cgi-bin/api.cgi?cmd=Snap&channel=0&rs=wuuPhkmUCeI9WG7C&user=user&password=password"
  stream_source: "rtsp://user:[email protected]:554/h264Preview_01_main"

but in the quoted code each ampersand (&) is followed by the characters “amp;” (the HTML-escaped form), when in fact it should just be an ampersand. Another post I read had a YAML example from Reolink Tech Support, and their code showed only the ampersand symbol.
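To make the difference concrete, here is the escaped line next to the corrected one (same placeholder IP and credentials as in the quote above):

```yaml
# As HTML-escaped in the forum post (will NOT work — the camera sees "amp;channel"):
still_image_url: "http://192.168.1.x/cgi-bin/api.cgi?cmd=Snap&amp;channel=0&amp;rs=wuuPhkmUCeI9WG7C&amp;user=user&amp;password=password"
# What the camera actually expects (plain ampersands):
still_image_url: "http://192.168.1.x/cgi-bin/api.cgi?cmd=Snap&channel=0&rs=wuuPhkmUCeI9WG7C&user=user&password=password"
```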

It depends on how your VLANs are configured. If you have HA and the hub on the same VLAN and allow inter-VLAN traffic, you should be fine. If they’re both on your IoT VLAN then you’ve probably disabled inter-VLAN traffic, so you’d need to change that (or put them on your primary/trusted VLAN).

Thanks, I got it streaming now. However, it keeps rebuffering during the stream every 3 seconds, i.e. the camera is still streaming live, but every 3 seconds it goes to the background and the “Smart Home Camera…” screen pops up for about 1 second, and it just keeps cycling back and forth.

The entire screen goes black with just the “Smart Home Camera” wording after it streams for ~1 min or so.

BTW, assuming the above issue can be fixed: the “live” stream delay is just too long, around 25 seconds behind the real live feed. That isn’t very useful in a real situation. Is there a way to bring this delay closer to real time, maybe within 3 seconds?

Oh, one last thing: I couldn’t get the “cast” stream working in Node-RED, so I’m currently using a call-service “camera.play_stream” node. I’m not sure whether the “cast” node would have less delay; if someone can let me know, that would be great. At my end, using the camera.play_stream node, the video delay is about 25 seconds from live.

Mine has a delay as well, but checking the “preload stream” checkbox in the lovelace UI does help with delay. Mine erratically buffers as well – sometimes every 10s, but sometimes goes all night with no buffering. I simply expose my camera entities to Google Assistant and stream that way (and I don’t use Nabu Casa, I run my own DNS). I believe based on some prior messages here, it is suspected that the video feed leaves the home network (to the internet) just to come back to the Google Home, but I don’t know that anyone has proven that.

I think I kind of fixed the buffering issue by bringing the camera (a Dahua) into HA via the ffmpeg platform. It seems to buffer a little less than with the generic platform. I have Nabu Casa and still have a massive delay, so I don’t think it’s DNS. I think the delay is due to how the image is cast to the Chromecast via HA: since an RTSP stream can’t really be cast directly, there’s some kind of processing of the image.
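A minimal sketch of the ffmpeg-platform setup described above; the IP, credentials, and RTSP path are placeholders (Dahua cameras typically expose `/cam/realmonitor?channel=1&subtype=0` for the main stream, but check your model):

```yaml
camera:
  - platform: ffmpeg
    name: dahua_camera
    # Forcing TCP transport often helps with RTSP packet loss/artifacts;
    # URL below is a placeholder — substitute your own IP and credentials.
    input: -rtsp_transport tcp -i "rtsp://user:[email protected]:554/cam/realmonitor?channel=1&subtype=0"
```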

If you cast directly off blue iris UI3 webpage it pretty much a live feed it only about 3 second delay. I tried and it is not with in my network. I open the webpage offisite and cast onto a friends Nest Hub so the image is totally gone through the www and still have minimum delay. I compared the live time on inprint on the video to real time during viewing and it was only about 3 second off. Next thing I’ll may try is cast via CATT of the blue iris http UI3 webpage see if there any delay issue.

I believe that’s correct, from my research. I read that Google Cast devices are actually hard-coded to query Google DNS servers and, if those aren’t available, will fall back to local DNS servers.

It seems that the latency is significantly higher for folks who use Nabu Casa for external access than for those who use alternative methods. Folks who are using their own local DNS server seem to have the best performance, while Nabu Casa users have constant (every 1–3 seconds) rebuffering. I’m not smart enough on the network architecture to understand this yet, but any additional insight from anyone here would be greatly appreciated.

When I compare the view in homeassistant versus my synology surveillance station view, there is 10 seconds of extra delay on hass. They are both accessing the same rtsp stream. I have tried setting a keyframe every 4 frames (1x per second) but it didn’t help. I also use the “preload stream” option. I am accessing my hass instance from a fast computer directly on my local network via the local IP address.

If I comment out stream: in my configuration.yaml, the streams I get are near real time and approximately 1fps (which is fine but they don’t display correctly in lovelace on my ios and kindle fire devices). So it’s pretty clear this is due only to delay from the stream component itself.

Is there anything else that can be done to reduce this delay? It makes it pretty unusable for anything like video doorbell, realtime surveillance, etc… I would happily give up some CPU cycles to speed this up as I’m on an Intel box with some to spare. Again, I only want to output directly to a lovelace card, so this has nothing to do with casting, and the camera streams themselves are not the issue because they can be displayed quickly without the stream component enabled.
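For reference, the comparison above comes down to toggling a single line in configuration.yaml:

```yaml
# With this line present, camera feeds are proxied through the stream
# component (HLS output, which added roughly 10 s of delay in my case).
# Commenting it out falls back to the near-realtime ~1 fps refresh
# described above, at the cost of display glitches on some devices.
stream:
```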

I haven’t seen a config for motionEye working properly with the stream component.
After several hours I got it to work.
First I changed my motionEye camera settings to “Expert Settings --> Fast Network Camera”.
After a reset, I changed the “Video Streaming” option “Streaming Protocol” to RTSP.
Then I added this to my camera config in Home Assistant:

  - platform: ffmpeg
    input: -rtsp_transport tcp -i rtsp://user:[email protected]:554/h264
    name: garten_camera

After a restart the camera showed up and the streaming component worked!

For my case it would be very useful if I could have the stream component only apply to certain cameras in my config while having others not use it (to get rid of the delay). If you feel the same way, please vote for my feature request here

As described in more detail in this thread, different combinations of generic, ffmpeg, and streaming turned on or off create very different results in the frontend. The stream component often delays cameras by 10 seconds, but is usually easier to get displaying correctly in the frontend without the camera feed breaking.

But without the stream component, you can get a better realtime feed. It would be a big advantage if we could apply the stream component only to some feeds while other feeds could be left with the classic MJPG type presentation. In this way we could use stream for devices that don’t cope well with MJPG presentation but where the delay is acceptable, while also having faster feeds via MJPG for things that are time critical.
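The “classic MJPG type presentation” mentioned above corresponds to something like the mjpeg camera platform. A sketch, with a placeholder URL and credentials (the MJPEG path varies per camera model):

```yaml
camera:
  # Time-critical feed: bypasses the stream component entirely,
  # trading HLS buffering for a near-realtime MJPEG feed.
  - platform: mjpeg
    name: front_door_fast
    mjpeg_url: "http://192.168.1.x/mjpeg/stream.cgi"  # placeholder — check your camera's docs
    username: user
    password: password
```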

Hi all. As I have spent an awful lot of time testing almost every camera component in hass, I have decided to do a write up with ~10 different types of camera setups in hopes that it helps other people not to go through the same process. There are plusses and minuses of each combination, but none are perfect. You can see it here:

Dahua DVR:
Follow the DahuaWiki link and you will see it specifies that you must set your camera encoding to H.264.
Once I did that, my cameras started working in HA.

RTSP Link Generator:
You can use this RTSP URL Generator for IP Cameras and Recorders to help you create the RTSP link for your DVRs/cameras.

My Config in the configuration.yaml file:

  - platform: generic
    still_image_url: "http://user:[email protected]/cgi-bin/snapshot.cgi?channel=1"
    stream_source: "rtsp://user:[email protected]:554/cam/realmonitor?channel=1&subtype=1"
    name: Gate - Main Outside

Lovelace View:

  • Then in Lovelace I added a Picture Entity card.
  • Chose the camera I wanted, as per the name I specified in the config above.
  • Set the camera view to live.
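The steps above correspond to a card config like this (assuming Home Assistant slugifies the name “Gate - Main Outside” to the entity ID shown; adjust to match your actual entity):

```yaml
type: picture-entity
entity: camera.gate_main_outside
camera_view: live
```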

I tried to configure my NVR (Zosy K9608-W) with 8 channels.
The only way I succeeded was using:

- platform: generic
  name: Channel 0
  still_image_url: http://NVR_IP/cgi-bin/snapshot.cgi?chn=0&u=user&p=password
  stream_source: http://NVR_IP/cgi-bin/sp.cgi?chn=0&u=user&p=password

But, after configuring all the cams, in my Lovelace view, using different cards, I can’t view the video stream, only a picture that refreshes every 10 seconds.
I tried using picture-entity and picture-glance, and setting camera_view: live, but then the picture is a black square that never changes.
If I click on the picture, the video doesn’t work either. I can use VLC to play the stream_source and view it without any problem.

Can anyone help me?


Did you ever get this working?

The Panasonic SF-332 is also working:

  - platform: generic
    name: Panasonic SF-332
    stream_source: rtsp://user:[email protected]/MediaInput/h264

For those who can’t find the streaming and image URL, try this

I know that a lot of time has passed, but for me this is a current topic. I successfully stream images from 4 MP Hikvision cameras (generic platform) to devices with iOS and Android 5. The resolution is reduced automatically. Unfortunately, on a new tablet with Android 10, RTSP gives an image only if I reduce the camera’s resolution to the tablet’s max resolution, i.e. 1080p; otherwise it does not display the image (although the old Android 5 phone has no problem with that). I am looking for a solution, and I do not want to reduce the resolution of the cameras themselves, because that makes no sense, but maybe I can reduce the resolution of the stream itself. The substream is also out of the question, because its resolution is very low.

Check if your camera can do 3 streams; they usually need a box ticked to enable it, and the 3rd stream can do a higher resolution than the 2nd stream.
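On Hikvision firmware that supports a third stream, the RTSP paths typically follow a channel-numbering pattern (101 = main, 102 = sub, 103 = third). A sketch with placeholder IP and credentials; verify the exact paths against your model’s manual:

```yaml
- platform: generic
  name: hikvision_third_stream
  # Snapshot from the main stream; /ISAPI/... paths are typical for
  # recent Hikvision firmware but may differ on older models.
  still_image_url: "http://user:[email protected]/ISAPI/Streaming/channels/101/picture"
  # Third stream (channel 103): often allows a mid-range resolution
  # between the main stream and the low-res substream.
  stream_source: "rtsp://user:[email protected]:554/Streaming/Channels/103"
```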

- platform: generic
  still_image_url: http://user:[email protected]/video.cgi?msubmenu=jpg
  stream_source: rtsp://user:[email protected]/profile5/media.smp

Could you please suggest how to find the user for the SNH-V6414BN? In the Samsung application/website you can only specify a password for the camera, and none of the default usernames (admin/root/etc.) work for me :frowning:

PS: By trial and error I’ve figured out that user = “admin”. I don’t think it can be changed.
Only ffmpeg works for me, though:

  - platform: ffmpeg
    input: rtsp://admin:[email protected]/profile5/media.smp

If I remember correctly, the username is set on Samsung’s SmartCam site: . That page is pretty dated and I had to use Internet Explorer for it to work. I think I ended up configuring that camera as an ffmpeg camera, which worked better, but I no longer have the config that I used.