Stream Component: Confirmed Cameras and Configurations

Hi there!

Has anyone successfully integrated an Eminent camera yet?
I have an Eminent EM6225 camera and can’t get it to work with the generic component.
No still image and no live stream (it keeps loading). The still-image URL does work in a browser, and the RTSP URL does work in VLC.

With the ffmpeg component I get a still image in the picture glance card, but still no stream…

Anyone have any ideas?

I run Hassbian 0.91.3, by the way.

Hi!

The following setup worked for my D-Link DCS-935L camera.

camera:
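The body of that camera block didn’t survive here. For reference, a generic-camera entry for the DCS-935L typically looks like the sketch below; the still-image and RTSP paths are my assumptions and can vary by firmware, so verify them in a browser and in VLC first:

```yaml
camera:
  - platform: generic
    name: DCS-935L
    # Paths below are assumptions; check your camera's manual/firmware.
    still_image_url: http://<user>:<pass>@<ip>/image/jpeg.cgi
    stream_source: rtsp://<user>:<pass>@<ip>/live1.sdp
```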

Thank you!

I hacked my Xiaomi Dafang3, but I cannot stream it via Chromecast. Could you help me?
configuration.yaml
stream:

camera:
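The stream: and camera: blocks above came through empty. With the Dafang custom firmware, the RTSP server usually listens at port 8554 with path /unicast, so a plausible reconstruction is the following; treat the URL as an assumption and verify it in VLC first:

```yaml
stream:

camera:
  - platform: ffmpeg
    name: dafang3
    # Default RTSP path of the Dafang custom firmware (assumption; verify in VLC)
    input: rtsp://<camera-ip>:8554/unicast
```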

Then I turn on Google Cast and call the following service from Developer Tools → Services:

service: camera.play_stream
data:
  entity_id: camera.dafang3
  media_player: media_player.salon

It seems like it is about to work: the Google Cast icon turns blue and a blue vertical bar starts to move. However, I never see anything.

I’m seeing the same problem as @ManuelAbril, but with Amcrest cameras. Casting to the Google Home Hub or a Chromecast looks like it is about to start: the player controls appear, but then it just goes back to a blue cast icon and never loads the stream. In the web UI all of the streams and snapshots load fine. I see no errors in the logs; it is as if the stream ends, or the format sent is not right. Any help or suggestions would be appreciated.

stream:

amcrest:

  - host: 10.0.4.3
    username: ***
    password: ***
    name: "Driveway"
    resolution: low
    stream_source: rtsp

  - host: 10.0.4.5
    username: ***
    password: ***
    name: "Front Door"
    resolution: low
    stream_source: rtsp

  - host: 10.0.4.7
    username: ***
    password: ***
    name: "Garage"
    resolution: low
    stream_source: rtsp

  - host: 10.0.4.8
    username: ***
    password: ***
    name: "Back Door"
    resolution: low
    stream_source: rtsp

camera:
  - platform: amcrest
  
  - platform: generic
    name: Garage
    still_image_url: "http://usr:[email protected]/cgi-bin/snapshot.cgi"
    stream_source: "rtsp://usr:[email protected]:554/cam/realmonitor?channel=1&subtype=1"

https://photos.app.goo.gl/DZ3ZgXdUuQzSm6bm6

OK, I have seen the delay explained as a key-frame issue, but after leaving HA running for over a week the delay increases to over a minute, making the video very confusing for events.

Just updated to the RTSP-enabled firmware from Wyze on a couple of WyzeCam v2s. I haven’t found a still image URL, but streaming seems to work well with a 5–10 second delay.

I’ve started playing with the RTSP beta firmware from Wyze, and since I couldn’t find a still image URL, I’m using the ffmpeg camera component:

ffmpeg:
stream:
camera:
  - name: wyze
    platform: ffmpeg
    input: rtsp://username:[email protected]/live

It “works”, at least in my testing environment: I get still frames extracted every so often, and the streaming is also working. I’ve been able to grab snapshots as well as use the camera.record service to make a video clip.
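For anyone curious, the camera.record call I used looks roughly like this (the entity name and file path are from my setup; duration is optional):

```yaml
service: camera.record
data:
  entity_id: camera.wyze
  filename: /tmp/wyze_clip.mp4   # the target directory must be whitelisted in HA
  duration: 10                   # seconds to record
```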

I think the camera firmware is still a little flaky; this all needs a bit more testing. I’m using Home Assistant 0.91.4 to test this setup. If it holds up, I’m eager to connect the TensorFlow integration and see how well it works for person detection in rooms.


Thanks @anasazi. I have a slightly higher resolution, but everything else is identical. Still only 1/2 fps on the frontend, while the ZoneMinder passthrough cameras look great at 15 fps. So weird.

Nice. Yeah the new firmware definitely has some stability issues. I’ve actually taken my cams back out of HA for now. We’ll see if Wyze can improve the fw in the next update.

Any luck with the Digoo DG-M1Q or the Digoo DG-M1Z?

They appear in the card, but the stream does not start.

Another forum says that RTSP on these cameras works over UDP, not TCP. What would the URL or configuration be in that case? Currently I use

rtsp://admin:[email protected]:554/onvif1

which works in VLC but not in Home Assistant.
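One thing worth trying (this is a guess, not a confirmed fix) is to hand the stream to the ffmpeg camera platform and force UDP transport with an inline input option:

```yaml
camera:
  - platform: ffmpeg
    name: digoo
    # -rtsp_transport udp forces RTP over UDP; whether the ffmpeg platform
    # accepts inline input options like this may depend on your HA version.
    input: -rtsp_transport udp -i rtsp://admin:[email protected]:554/onvif1
```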


How do I add an ESP32 camera to HASS? I flashed the ESPHome camera firmware.
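With ESPHome the camera is exposed to Home Assistant automatically through the ESPHome integration; the device-side YAML just needs an esp32_camera: block. A sketch for the common AI-Thinker ESP32-CAM board (the pin assignments are that board’s defaults per the ESPHome docs; adjust for your board):

```yaml
# ESPHome device config (not Home Assistant's configuration.yaml)
esp32_camera:
  name: esp32cam
  external_clock:
    pin: GPIO0
    frequency: 20MHz
  i2c_pins:
    sda: GPIO26
    scl: GPIO27
  data_pins: [GPIO5, GPIO18, GPIO19, GPIO21, GPIO36, GPIO39, GPIO34, GPIO35]
  vsync_pin: GPIO25
  href_pin: GPIO23
  pixel_clock_pin: GPIO22
  power_down_pin: GPIO32
```

Once the device is adopted by the ESPHome integration, the camera entity appears in HA without any camera: platform entry.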

Mine have been pretty stable. The cams run through Synology Surveillance Station and then into HA. The still image refreshes every 10 seconds or so, and I get the same results when sending them through motionEye.

Is there any way to reduce the buffer time (decrease the latency)? The stream always lags around 20–30 seconds behind real time when I am streaming.

My HASS runs on a laptop: Ubuntu 18.04, i7-7700HQ, 16 GB RAM.

If you read up the thread, it is apparently based on the key frames of the source video; however, this can’t really be changed on many cameras.

I have, however, had some extreme issues with the delay getting longer once HA has been running for a few days.

Oddly, while trying to quantify exactly how long the delay was, I turned on the timestamp overlay for my camera, and suddenly it is a consistent 15-second delay. I really hate the overlay, but it seems, for the moment, to have fixed the extra-long-segment problem. I would love it to be less than 15 s, though.


I’ve managed to get this working, sort of. I’m accessing my cameras through my Synology. They all show up, and I can view them on my Windows machine and on my iPhone using the HA iOS app, but I can’t view the streams using the same HA iOS app on my iPad. Why would the iPad not work? It’s a new one, on the same iOS 12.2 as my iPhone. I’m also seeing errors in the log:

Error demuxing stream: No dts in packet
corrupted macroblock 24 36 (total_coeff=-1)
error while decoding MB 24 36

Knewmart (Amazon link, searching the FW version leads to Wanscam HW0049) works.

- platform: generic
  still_image_url: http://<user>:<pass>@<ip>/tmpfs/auto.jpg
  stream_source: rtsp://<user>:<pass>@<ip>/11

Does anyone know if it’s possible to hide (or move to another corner) the “Preload Stream” checkbox?
It covers the time display of the stream.


Hey, just wanted to share that I was able to set up my two UniFi cameras and a Raspberry Pi camera. Performance isn’t great, since everything is running on a Raspberry Pi 3.

For the Raspberry Pi camera I used the Restreamer Docker image, then:

- platform: generic
  name: remoteRPiCam
  still_image_url: !secret rpi_url
  stream_source: !secret rpi_url_stream
  verify_ssl: false


And in secrets.yaml:

rpi_url: http://192.168.1.249:8080/images/live.jpg
rpi_url_stream: http://192.168.1.249:8080/hls/live.stream.m3u8

Once 0.92 arrives, what are the steps to “enable” the stream when using the integration? I’ve read this thread and can’t figure it out, nor can I find anything in the Foscam, Axis, or DoorBird documentation pages regarding streams. First off, I assume stream: is required in configuration.yaml. Does anything else need to be specified under that component?

Do I need to explicitly call out stream_source: rtsp://address, or does the integration handle that? If I do call it out, I assume it goes under camera: → platform:?

Thanks!

If the integration supports the stream component, all you need to do is add stream: to your configuration.yaml.
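In other words, the minimal configuration is just:

```yaml
# configuration.yaml
stream:
```

Integrations that support streaming (and features like camera.play_stream and the preload option) pick it up automatically.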


Anyone got it working with a Foscam C1 over Wi-Fi?

Has anyone come across the EZVIZ cameras?
I’m trying to figure out how to make streaming work for an EZVIZ camera; finding a still_image_url is an issue for me.