Built a Docker image to cast generic security camera streams to Chromecast devices

Hey,

Just started using Home Assistant a few days ago, and one of the first things I wanted to do was cast my generic IP camera feeds to my TVs, so I put something simple together. I’ve seen this feature advertised by high-end camera manufacturers (Arlo or Nest) but not for generic cameras.

I spent some time today building a Docker image that combines Nginx, FFmpeg and a bit of bash scripting to grab a live stream (e.g. RTSP from a generic IP security camera), convert it into an HLS stream that Chromecast devices accept, and host it with Nginx under the proper configuration.

Repo is here: https://github.com/gihad/streamer - Just sharing in case someone else wants to use this as well.
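The core of it is just repackaging RTSP into HLS segments that Nginx can serve. Roughly, it's an ffmpeg invocation shaped like the sketch below (simplified, not the exact script from the repo; the URL and output path are placeholders). The `delete_segments` flag keeps old segments from piling up on disk.

```shell
# Rough sketch of the kind of ffmpeg command the image wraps.
# All URLs and paths here are placeholders.
start_hls() {
  local rtsp_url="$1" out_dir="$2"
  ffmpeg -rtsp_transport tcp \
    -i "$rtsp_url" \
    -c:v copy -an \
    -f hls \
    -hls_time 4 \
    -hls_list_size 5 \
    -hls_flags delete_segments \
    "$out_dir/index.m3u8"
}

# Example (not run here):
#   start_hls "rtsp://user:[email protected]:554/stream" /var/www/stream
```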

There were a few “gotchas” in getting everything to the point where I could cast the security cameras’ live feeds to Chromecast, but it’s working with my Amcrest cameras now.

17 Likes

Awesome! I have some Amcrest cameras as well.

1 Like

So how do you automate this in Hass? Is it possible to stream to the TV when the doorbell is pushed?

4 Likes

Did you figure out how to use this?

Any updates on this? I would like to use an automation to play my generic (RTSP) cam on TV.

Works great - thanks for creating the Docker image!

1 Like

I have the Docker container up and running great. Can you share your Home Assistant configuration for sending the stream to the Chromecast?

Thanks!

I managed to achieve a similar result with Chromecast without converting the video, using an Android phone to control the cast device. It might be a useful alternative.

@gmurad Is it possible to use your system without Docker?

Also, when you convert it to an HLS file, wouldn’t storage be a problem, since you need to store the converted segments on your machine before you can cast them?

Starting backdoor stream
ffmpeg version 3.4 Copyright (c) 2000-2017 the FFmpeg developers
  built with gcc 6.4.0 (Alpine 6.4.0)
  configuration: --prefix=/usr --enable-avresample --enable-avfilter --enable-gnutls --enable-gpl --enable-libmp3lame --enable-librtmp --enable-libvorbis --enable-libvpx --enable-libxvid --enable-libx264 --enable-libx265 --enable-libtheora --enable-libv4l2 --enable-postproc --enable-pic --enable-pthreads --enable-shared --enable-libxcb --disable-stripping --disable-static --enable-vaapi --enable-vdpau --enable-libopus --disable-debug
  libavutil      55. 78.100 / 55. 78.100
  libavcodec     57.107.100 / 57.107.100
  libavformat    57. 83.100 / 57. 83.100
  libavdevice    57. 10.100 / 57. 10.100
  libavfilter     6.107.100 /  6.107.100
  libavresample   3.  7.  0 /  3.  7.  0
  libswscale      4.  8.100 /  4.  8.100
  libswresample   2.  9.100 /  2.  9.100
  libpostproc    54.  7.100 / 54.  7.100
Option rtsp_transport not found.

I get that error when trying to run the Docker container; it seems to be complaining about the -rtsp_transport ffmpeg option in the script. Not sure how to solve this one. It appears the stream isn’t being created because of this error, and I end up with a 404 where the stream should be.
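For anyone hitting this: I can’t be sure without seeing the script, but one common cause of that exact message is option ordering. `-rtsp_transport` is an *input* option, so it must appear before `-i`; placed after `-i`, ffmpeg parses it as an output option, which the HLS muxer doesn’t have, and reports “Option rtsp_transport not found.” A quick illustration (the URL is a placeholder):

```shell
CAM_URL="rtsp://user:[email protected]:554/cam/realmonitor"

# Wrong -- the flag comes after -i, so it is treated as an output
# option and ffmpeg reports "Option rtsp_transport not found":
#   ffmpeg -i "$CAM_URL" -rtsp_transport tcp /tmp/stream/index.m3u8

# Right -- input options go before -i:
CMD=(ffmpeg -rtsp_transport tcp -i "$CAM_URL" -c copy -f hls /tmp/stream/index.m3u8)
```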

Well, my issue is not really an issue, as this Docker image isn’t needed at all if your camera has an MJPEG stream.

alias: Back Porch Camera
sequence:
  - service: media_player.play_media
    data:
      entity_id: media_player.shield
      media_content_id: http://192.168.1.159:8081
      media_content_type: 'image/jpeg'

The media_content_type is very important: the Chromecast won’t play an MJPEG stream if it’s expecting a video, because of its strict format requirements, but it will happily stream an MJPEG feed as an image.
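A script like that can also be triggered from an automation, which answers the earlier doorbell question. For example (the doorbell sensor and script entity IDs here are hypothetical, adjust to your setup):

```yaml
# Hypothetical automation: cast the camera when a doorbell sensor fires.
automation:
  - alias: Cast back porch camera on doorbell
    trigger:
      - platform: state
        entity_id: binary_sensor.doorbell
        to: 'on'
    action:
      - service: script.turn_on
        entity_id: script.back_porch_camera
```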

3 Likes

Can you give a bit more information on how you accomplished this? Like, where did the code you posted go?

That code I posted is a script. You can execute it any way you like. I have it exposed to my Google Home devices so I can simply say “Activate Back Porch Camera” or “Start Back Porch Camera” and it will cast to the Nvidia Shield, which is our main video player in the living room. I’m using a Raspberry Pi 3 with a camera running motionEyeOS, but it outputs a standard MJPEG stream, so this method should work with any camera that outputs MJPEG.

What kind of frame rate do you get?

@roofuskit sorry mate, bit of a noob, but how did you expose it to Google Home? How does this let you cast the feed to a Chromecast? And how could I send it the video via the UI as well as by voice?

The camera is on Wi-Fi through a brick wall, so I have it capped at 15 FPS in software; as far as I can tell, that’s what I’m getting.

EDIT: I’ve moved the camera up to the ceiling and upped it to 20 FPS to test the stronger connection, and it still appears that I’m getting 100% of the frames.

I tried the Docker image, but there is 20 to 30 seconds of lag. Is there a way to reduce it?

Can you please provide more detail on how you got this to work? I have added a camera stream from Blue Iris, but I am not able to cast it via Google Home. I have tried other methods people said work, with no luck.

Thanks,

Hi, I have the conversion working great, and I can view the resulting m3u8 file in VLC to test, but I am having trouble casting it to my Chromecast.
I am using the media_player.play_media service, but which media_content_type should I use?

I have tried all of the following without success (the Chromecast loads the default media player, thinks for a while, and then falls back to a black screen with a blue cast logo on it):

  • video
  • channel
  • video/mp4
  • application/vnd.apple.mpegurl
  • video/mpegURL
  • application/x-mpegurl
  • image/jpg
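One possibility worth checking (an assumption on my part, since I can’t see your server config): the Chromecast’s receiver fetches HLS playlists and segments over HTTP, and the Cast SDK requires CORS headers for adaptive streams like HLS, so a playlist that plays fine in VLC can still fail on a Chromecast if the server never sends Access-Control-Allow-Origin. A minimal Nginx sketch (path is a placeholder):

```nginx
# Hypothetical location block for the HLS output directory.
# Without the CORS header, Chromecast playback tends to fail
# silently even though VLC plays the same URL fine.
location /stream/ {
    add_header Access-Control-Allow-Origin *;
    types {
        application/vnd.apple.mpegurl m3u8;
        video/mp2t ts;
    }
}
```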

I have the exact same issue.

script:
  babycamera:        
    sequence:
      - service: media_player.play_media
        data:
          entity_id: media_player.lounge_display
          media_content_id: rtsp://xxxxx:[email protected]:80/cam/realmonitor?channel=1&subtype=00&authbasic=[AUTH]
          media_content_type: 'image/jpeg'

It works fine in VLC, just not on the Chromecast (Google Home Hub).