FFmpeg, Raspi camera module, motion detection

Hello,

I am a bit confused about how all this works. I am running the Hassbian distro on an RPi2 and have a Camera Module. Per the documentation, I have downloaded ffmpeg, or more aptly avconv. So in my config.yaml file I have…

ffmpeg:
  ffmpeg_bin: /usr/bin/avconv

which is all well and good. I assume I am 1/3rd of the way there. My next challenge is understanding the camera portion…

camera:
  - platform: ffmpeg
    input: <WHAT GOES HERE?>
    name: Front Door

I see others in the forum have something along the lines of -rtsp_transport <things here and stuff>… but I’m not sure what the heck that means. :slight_smile: I’d like to understand, and would appreciate anyone who has the patience and time to help educate me. Simple Googling helps me understand it as a network control protocol… but how does that relate to my setup? How do I learn what to put here so I can optimize my setup?

Thank you,

  • SN

Hoping someone has experience here and can help me! Thanks.

Hey Supah,

What camera do you have? Depending on the camera brand/model, the FFmpeg URL will be different.

Local USB webcam set up as MJPEG (very similar to the FFmpeg setup):

platform: mjpeg
mjpeg_url: http://ipaddress:port/video.mjpg
name: Front Door
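For what it’s worth, when forum posts show an ffmpeg-platform camera with -rtsp_transport flags, the input is pointing at a network (IP) camera’s RTSP stream, something along these lines (the address, credentials, and path here are made up and vary by camera brand):

```yaml
camera:
  - platform: ffmpeg
    input: -rtsp_transport tcp -i rtsp://user:pass@192.168.1.50:554/stream1
    name: Front Door
```

RTSP only applies when a camera serves a network stream, so a Pi Camera Module attached directly to the Pi won’t have an RTSP URL unless something on the Pi is serving one.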

I’ve actually got the RPi Camera Module. It’s on the Pi that’s also currently running HASS (Hassbian, specifically). raspivid works fine, raspistill works fine; the camera itself is all good to go. I’m just not positive what the input would be, and I’ve been having trouble finding resources to learn from.

I followed the steps here:
https://www.element14.com/community/community/design-challenges/pi-iot/blog/2016/07/26/pi-control-hub-spoke-1-security-camera-setting-up-motion-to-stream-video

And then simply configured the camera like so:

camera:
  - platform: mjpeg
    mjpeg_url: http://$IP_OF_THE_RPI:8081
    name: XiaoCam

This also works when you run HASS on another machine.

Unfortunately, this did not work for me. I guess what I’m really looking for is a way to use raspivid, as I’m able to achieve very high framerates with it. I’m just not sure how to pipe it to ffmpeg while also using it in Home Assistant. Everyone on the forums here seems to be using external/IP cameras, but I have the camera module attached to the Pi. :slight_smile:

The link I gave also concerns a camera attached to the Pi (a Pi NoIR); it’s just that it would also work if the Pi cam is on a remote Pi.

You’re correct. While I was able to connect to the feed made by Motion on my Windows PC (and thus it should work fine for HASS), it transmits the video at too low a framerate for my liking. So I’m really hoping to stick with raspivid, as I’ve been seeing 60+ fps. I can do my own motion detection if it becomes a “want” later on, but as of now it’s not something I care about at all.
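On motion detection, in case it becomes a “want” later: at its crudest it is just frame differencing, comparing consecutive greyscale frames and counting how many pixels changed. A minimal pure-Python sketch (the thresholds are made-up starting points; tools like Motion or OpenCV do this far better):

```python
def motion_detected(prev, curr, pixel_thresh=25, count_thresh=500):
    """Return True when enough pixels differ between two greyscale frames.

    prev/curr: bytes objects of equal length, one intensity byte per pixel.
    pixel_thresh: per-pixel change (0-255) considered significant.
    count_thresh: how many changed pixels it takes to call it "motion".
    """
    changed = sum(1 for p, c in zip(prev, curr) if abs(c - p) > pixel_thresh)
    return changed > count_thresh

# Synthetic 100x100 frames: a 30x30 bright square appears in the second one
blank = bytes(100 * 100)
moved = bytearray(blank)
for row in range(10, 40):
    for col in range(10, 40):
        moved[row * 100 + col] = 200
print(motion_detected(blank, bytes(moved)))  # True (900 pixels changed)
print(motion_detected(blank, blank))         # False
```

In a real setup you would feed this pairs of downscaled frames from the camera; the thresholds need tuning per scene and lighting.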

For others who might be following my steps in the future: I’ve gotten closer to what I need. Using raspivid piped to ffmpeg to ffserver with the commands below allows me to establish a camera feed, but I’ve not been able to connect to or view it yet. ffserver is quite noisy, so I redirect its output to null, but you could certainly pipe it to a logfile instead.

sudo ffserver -f /etc/ffserver.conf > /dev/null 2>&1 &
# raspivid: -n no preview, -t 0 run forever, -vf/-hf flip the image, -o - raw H264 to stdout;
# ffmpeg copies it unchanged (-vcodec copy, -an no audio) into the feed name declared in ffserver.conf
sudo raspivid -n -w 960 -h 540 -fps 35 -t 0 -vf -hf -o - | ffmpeg -i - -vcodec copy -an http://0.0.0.0:8090/camera_module.ffm

…with my ffserver.conf below…

Port 8090
BindAddress 0.0.0.0
MaxClients 10
MaxBandwidth 50000

<Feed camera_module.ffm>
File /tmp/camera_module.ffm
FileMaxSize 16M
</Feed>

<Stream a.mpg>
Feed camera_module.ffm
Format mpeg
VideoSize 960x540
VideoFrameRate 35
VideoBitRate 2000
VideoQMin 1
VideoQMax 10
NoAudio
</Stream>

Connecting to the stream via http://<address_of_the_pi>:8090/a.mpg works, but it shows no video. :slight_smile:
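If you’d rather supervise the raspivid-to-ffmpeg pipe from Python instead of a shell one-liner, it can be built with subprocess. A rough sketch, assuming raspivid and ffmpeg are on the PATH and the output URL is whatever your server expects:

```python
import subprocess

def raspivid_cmd(width=960, height=540, fps=35):
    """raspivid argv: no preview (-n), run forever (-t 0), raw H264 to stdout (-o -)."""
    return ['raspivid', '-n', '-w', str(width), '-h', str(height),
            '-fps', str(fps), '-t', '0', '-o', '-']

def ffmpeg_cmd(output_url):
    """ffmpeg argv: read stdin (-i -), copy the video stream as-is, drop audio."""
    return ['ffmpeg', '-i', '-', '-vcodec', 'copy', '-an', output_url]

def start_pipeline(output_url):
    """Equivalent of `raspivid ... | ffmpeg ...` built from the pieces above."""
    cam = subprocess.Popen(raspivid_cmd(), stdout=subprocess.PIPE)
    enc = subprocess.Popen(ffmpeg_cmd(output_url), stdin=cam.stdout)
    cam.stdout.close()  # so raspivid sees a broken pipe if ffmpeg exits
    return cam, enc
```

On the Pi you would call, e.g., start_pipeline('http://0.0.0.0:8090/camera_module.ffm') and then wait on or poll the two returned processes.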

edit1// Additionally, it looks like there’s a similar conversation going on simultaneously here, with this post in particular being interesting.

@SupahNoob You get much further with this?

Yes and no. ffserver is a piece of crap and you’re wasting your time trying to get it to work properly. It’s apparently no longer actively developed, so I gave up on it.

I’m still working on getting something like livestreaming to work… but I think it will only work on a Pi3, and it can’t be the same machine that’s running HASS, because the streaming takes up a fair amount of resources when combined with the picamera/MMAL API. I couldn’t get better than a 7-10s delay, and right now that’s not really “good enough” for me to keep researching, so I moved on to functional replacements for livestreaming.
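For anyone reproducing this: the receiving end is nginx built with the nginx-rtmp-module, and a minimal sketch of its RTMP block looks something like the following. The application name live is what makes a rtmp://<server>/live/<name> URL valid; everything else shown is just the module’s basic directives:

```nginx
rtmp {
    server {
        listen 1935;              # default RTMP port

        application live {
            live on;              # accept live streams pushed by ffmpeg
            record off;           # don't write incoming streams to disk
        }
    }
}
```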

Relevant Python code below though!

import subprocess
from picamera import PiCamera

...

    def __init__(self, output, save_dir, node_name):
        self.camera = PiCamera()
        self.node_name = node_name
        self.nginx_server_ip = 'NGINX_HOST'  # placeholder: replace with your RTMP server's address

    def start_ffmpeg(self):
        """
        Publish the H264 camera feed to the NGINX RTMP server.
        """
        livestream_url = 'rtmp://{}/live/{}'.format(self.nginx_server_ip, self.node_name.lower())

        # Remux whatever arrives on stdin into an FLV stream, without re-encoding
        ffmpeg_proc = subprocess.Popen(
            [
                'ffmpeg',
                '-i', '-',           # read from stdin
                '-codec:v', 'copy',  # copy the H264 stream as-is
                '-f', 'flv',
                livestream_url,
            ], stdin=subprocess.PIPE, stdout=subprocess.DEVNULL, stderr=subprocess.STDOUT
        )

        print('Livestream URL: {}'.format(livestream_url))

        return ffmpeg_proc

    def livestream(self):
        # Feed raw H264 from the camera straight into ffmpeg's stdin
        self.ffmpeg_proc = self.start_ffmpeg()
        self.camera.start_recording(self.ffmpeg_proc.stdin,
                                    format='h264',
                                    resize=(1280, 720),
                                    splitter_port=3)