Stream Component: Confirmed Cameras and Configurations

All the integrations I’ve used so far have example configurations showing how to use them, but none of the ones I’ve seen for pulling in an RTSP stream do. I’ve looked at Camera, Stream and FFmpeg, and they all just say to add stream: to the configuration; I don’t understand what to do with that. FFmpeg says to add ffmpeg: to configuration.yaml. I’m assuming I have to somehow create the RTSP entity underneath that, but I’m not sure how.

The docs for the generic camera platform cover this. Include this in your config:

camera:
  - platform: generic
    name: 'Whatever'
    still_image_url: http://a.b.c.d/foo.jpg
    stream_source: rtsp://x.y.z.t/bar

stream:

Ah use camera and stream together. That is the part I was missing. Thank you.

Sorry, I linked the wrong FFmpeg doc. This is the correct one:

camera:
  - platform: ffmpeg
    input: FFMPEG_SUPPORTED_INPUT

input: accepts rtsp or rtmp


I have the same camera and my config is below:

camera:
 - platform: ffmpeg
   input: rtsp://username:[email protected]/11
   name: Porch FFMPEG
   extra_arguments: -vf "transpose=2"

stream:
ffmpeg:

Hi, thanks for that. I get the same results as with my original config.

The strange thing is, though, that on my HA front-end I get the following:

[screenshot: blurry camera image in the HA front-end]

This updates roughly every 10 seconds, but all I get is this blur.

Any thoughts?

Try this:

rtsp://IP_ADDRESS:554/user=admin&password=whatever_you_have_defined&channel=1&stream=0.sdp
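That URL format would slot into an FFmpeg camera entry something like this (just a sketch; the IP, port, credentials, channel and stream number are placeholders to replace with your own):

camera:
  - platform: ffmpeg
    name: DVR Channel 1
    input: rtsp://IP_ADDRESS:554/user=admin&password=whatever_you_have_defined&channel=1&stream=0.sdp

stream: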

I am hoping someone gets a Swann camera hooked up some day. I have tried for a long time, and no matter what I try, it just never works.

I finally got it working. Thanks.

I never could get this to work. The streams work fine in HA; everything loads there. I am using Nabu Casa and shared the cameras, and they show up in the Google Home app. But when I tell it to show a camera, all I get is “Smart Home Camera” on the screen.

It is trying something local, because I use pfSense and at first I had my IoT VLAN blocked from talking to the LAN, which is where HA is. With that rule on, it would just spin. When I disabled that rule, it stopped spinning and went to a black screen with white text saying “Smart Home Camera”.

I don’t have the direct Google integration set up; I went the fast way with Nabu Casa for now.

Any ideas? Thinking about emailing them as well.

I figured this out. The Google devices must be ignoring my DNS resolver, so they couldn’t find my server. I removed all the certs and went back to an IP and it started working. I will dig in and figure out how to make it work properly.

Hello,

I’m getting the following error when streaming Amcrest cameras to a Roku TV:

Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/components/websocket_api/commands.py", line 134, in handle_call_service
    connection.context(msg),
  File "/usr/src/homeassistant/homeassistant/core.py", line 1230, in async_call
    await asyncio.shield(self._execute_service(handler, service_call))
  File "/usr/src/homeassistant/homeassistant/core.py", line 1253, in _execute_service
    await handler.func(service_call)
  File "/usr/src/homeassistant/homeassistant/helpers/entity_component.py", line 198, in handle_service
    self._platforms.values(), func, call, required_features
  File "/usr/src/homeassistant/homeassistant/helpers/service.py", line 402, in entity_service_call
    future.result()  # pop exception if have
  File "/usr/src/homeassistant/homeassistant/helpers/entity.py", line 590, in async_request_call
    await coro
  File "/usr/src/homeassistant/homeassistant/helpers/service.py", line 433, in _handle_entity_call
    await result
  File "/usr/src/homeassistant/homeassistant/components/camera/__init__.py", line 676, in async_handle_play_stream_service
    DOMAIN_MP, SERVICE_PLAY_MEDIA, data, blocking=True, context=service_call.context
  File "/usr/src/homeassistant/homeassistant/core.py", line 1230, in async_call
    await asyncio.shield(self._execute_service(handler, service_call))
  File "/usr/src/homeassistant/homeassistant/core.py", line 1253, in _execute_service
    await handler.func(service_call)
  File "/usr/src/homeassistant/homeassistant/helpers/entity_component.py", line 198, in handle_service
    self._platforms.values(), func, call, required_features
  File "/usr/src/homeassistant/homeassistant/helpers/service.py", line 402, in entity_service_call
    future.result()  # pop exception if have
  File "/usr/src/homeassistant/homeassistant/helpers/entity.py", line 590, in async_request_call
    await coro
  File "/usr/src/homeassistant/homeassistant/helpers/service.py", line 433, in _handle_entity_call
    await result
  File "/usr/src/homeassistant/homeassistant/components/media_player/__init__.py", line 600, in async_play_media
    ft.partial(self.play_media, media_type, media_id, **kwargs)
  File "/usr/local/lib/python3.7/concurrent/futures/thread.py", line 57, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/usr/src/homeassistant/homeassistant/components/media_player/__init__.py", line 595, in play_media
    raise NotImplementedError()
NotImplementedError
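
For context, the call that goes through this code path is camera.play_stream, roughly like this (a sketch; the entity IDs are placeholders for my actual camera and TV):

service: camera.play_stream
data:
  entity_id: camera.amcrest_front
  media_player: media_player.roku_tv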

Any assistance is highly appreciated.

Thanks

Hi @hunterjm

I was using this flawlessly for months… now it sends my CPU into orbit. I had problems with the CPU maxing out and just tracked it down to the stream component. Any idea why, or how I can get it back under control? There’s nothing in the logs.

  - platform: generic
    still_image_url: "http://192.168.0.208:8165/Streaming/channels/1/picture"
    stream_source: "rtsp://admin:[email protected]:554/Streaming/Channels/3"
    authentication: digest
    username: x
    password: x
    name: Front Door stream 

I’ve been having issues with streams dying and stream.record stalling (“Stream already recording to…”) and not working again until I restart HA, or until I untick “Preload stream” for the problem cameras and wait for the stream to die. Mostly “stream dying” means it keeps playing a couple-of-seconds loop whenever I click a picture glance card containing the problematic camera, or it doesn’t play at all.

The latest issues came with the Sonoff camera, which only supports RTSP. That could be a good thing, but it only works with the FFmpeg component (the generic camera requires a still picture URL, which this camera doesn’t have, to my knowledge). So errors happen.

The funny thing is that even though the stream itself is frozen or dead, the snapshots in Lovelace still update. I can see the ffmpeg command (yes, the same one as in the configuration) running with some extra options (grabbing only one frame) every 10 seconds.

Yesterday I found out about camera proxies. So I thought, what the heck, and tried that out. Now I have the same camera twice, like this:

  - platform: ffmpeg
    name: Sonoff
    input: -rtsp_transport tcp -i rtsp://xxxx:[email protected]:554/av_stream/ch0 
  - platform: proxy
    name: sonoff_proxy
    entity_id: camera.sonoff

And, to my surprise, the proxy camera is the most resilient one to date. The funny thing is that whenever I click the proxy camera’s picture glance card in Lovelace, it opens every time. OK, it takes a couple of seconds, but unlike clicking an ffmpeg camera it doesn’t generate a “Started stream” entry in the HA log, and the FFmpeg process is killed soon after I stop watching the camera. With the ffmpeg camera I don’t get an extra FFmpeg process, but there’s also no way to stop or restart it manually.
This way the proxy cam eats more CPU, but not all the time (only while I’m watching the camera). I tried many things like ZoneMinder, MotionEye etc., but they all ate CPU constantly (since their streams run all the time).

Also, in addition to having the live stream actually work, I wanted to send video snippets on movement through Telegram. I can still use the camera.record service from Node-RED, where my automations are; the camera stream is started only when recording and stopped immediately afterwards, so there’s less chance of it freezing. The funny thing is that even if the ffmpeg stream freezes, the proxy stream still works (although it cannot be recorded using camera.record).

The only other negative is that I can’t get 20-second snippets without any gap between them, like I could if the stream were preloaded (using the “lookback” option in camera.record); the gap is down to around 3 seconds this way. So basically, when a snippet is recorded on motion, there’s a 3-second gap before the start of the next snippet.
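
For reference, the record call from Node-RED looks roughly like this (a sketch; the filename path and duration are example values, not necessarily what I use):

# 20-second clip per motion event, saved under config/www
service: camera.record
data:
  entity_id: camera.sonoff
  filename: "/config/www/recordings/sonoff_{{ now().strftime('%Y%m%d_%H%M%S') }}.mp4"
  duration: 20

With a preloaded stream, adding the lookback option would also include a few seconds from before the call.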


You can use a file as a still image
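
Something along these lines, assuming the image sits in the config/www folder so HA serves it under /local/ (the filename, credentials and IP are placeholders):

camera:
  - platform: generic
    name: Sonoff generic
    # static file served from config/www
    still_image_url: http://YOUR_HA_IP:8123/local/sonoff_still.jpg
    stream_source: rtsp://USERNAME:PASSWORD@CAMERA_IP:554/av_stream/ch0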

Andy

I recently bought a Besder CCTV camera and I want to integrate it with Hass.io, but I can’t find any Besder camera in this confirmed cameras and configurations thread. Can anyone guide me on how to set it up? Thanks.

This worked for me and I can cast it to my TV:

camera:
  - platform: generic
    name: Colordriveway
    username: !secret cam_username
    password: !secret cam_password
    authentication: digest
    still_image_url: "http://192.168.0.192:80/ISAPI/Streaming/channels/101/picture"
    stream_source: "rtsp://username:password##@192.168.0.192:554/Streaming/Channels/101/"

I am trying to use a Hikvision camera in push notifications (camera stream):

camera:
  - platform: generic
    name: HikvisionOut
    still_image_url: "http://admin:[email protected]:9000/ISAPI/Streaming/channels/101/picture"

I get error:

2020-04-06 15:24:50 ERROR (MainThread) [homeassistant.components.generic.camera] Error getting new camera image: Cannot connect to host 192.168.1.61:8000 ssl:None [Connect call failed ('192.168.1.61', 8000)]

If I open the camera in Lovelace from the browser on my PC, it works. Maybe the camera image is not going through the Apple push server? I am using Nabu Casa.
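
For reference, the notification I’m sending looks roughly like this (a sketch; the notify service name and message are placeholders for my actual setup):

service: notify.mobile_app_my_iphone
data:
  message: "Motion at the gate"
  data:
    # attaches the camera stream/snapshot to the notification
    entity_id: camera.hikvisionout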

Hi, I have a TP-Link NC200 that works in VLC with this string: http://admin:[email protected]:8080/stream/getvideo.
In HA I can’t see the picture; has anyone already solved this kind of problem?

camera: 
  - platform: generic
    stream_source: http://admin:[email protected]:8080/stream/getvideo
    name: NC-200

Has anyone seen this before in the logs? I’m not sure which of my cameras is causing it.

H.264 bitstream malformed, no startcode found, use the video bitstream filter 'h264_mp4toannexb' to fix it ('-bsf:v h264_mp4toannexb' option with ffmpeg)