I am a bit confused about how all this works. I am running the Hassbian distro on an RPi2 and have a Camera Module. Per the documentation, I have installed ffmpeg, or more accurately avconv. So in my configuration.yaml file I have…
ffmpeg:
  ffmpeg_bin: /usr/bin/avconv
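A quick sanity check that the binary path above is actually right:

# Confirm avconv is where the config says it is, and that it runs
which avconv
/usr/bin/avconv -version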
which is all well and good. I assume I am 1/3rd of the way there. My next challenge is understanding the camera portion…
camera:
  - platform: ffmpeg
    input: <WHAT GOES HERE?>
    name: Front Door
I see others in the forum have something along the lines of -rtsp_transport <things here and stuff>… but I’m not sure what the heck that means. I’d like to understand, and would appreciate anyone who has the patience and time to help educate me. Simple Googling tells me it’s a network control protocol… but how does that relate to my setup? How do I learn what to put here so I can optimize my setup?
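For anyone else landing here: as I later figured out, RTSP is the protocol most network/IP cameras use to serve their video stream, and -rtsp_transport is just an ffmpeg option telling it to pull that stream over TCP instead of the default UDP (fewer dropped packets). For an IP camera the config would look something like this, where the address and stream path are placeholders for whatever your camera actually exposes:

camera:
  - platform: ffmpeg
    input: -rtsp_transport tcp -i rtsp://192.168.1.50:554/stream1
    name: Front Door

Since the Pi Camera Module isn’t an IP camera, there’s no RTSP URL to point at unless you run a streaming server on the Pi itself, which is why those examples didn’t translate directly to my setup.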
I’ve actually got the RPi Camera Module. It’s on the Pi that’s also currently running HASS (Hassbian, specifically). raspivid works fine, raspistill works fine. The camera itself is all good to go. I’m just not positive what the input would be, and I’ve been having trouble finding resources to learn from.
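One commonly suggested route, sketched here with the standard module and device names for the Pi camera, is to load the V4L2 driver so the Camera Module shows up as a regular /dev/video0 device that avconv/ffmpeg can read directly:

# Load the V4L2 driver for the Pi Camera Module
sudo modprobe bcm2835-v4l2
# Make it load on every boot
echo "bcm2835-v4l2" | sudo tee -a /etc/modules

The camera config’s input would then be something like -f v4l2 -i /dev/video0.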
Unfortunately, this did not work for me. I guess what I’m really looking for is a way to use raspivid, since I’m able to achieve very high framerates with it. I’m just not sure how to pipe it to ffmpeg while also using it in Home Assistant. Everyone on the forums here seems to be using external/IP cameras, but I have the Camera Module attached to the Pi.
You’re correct. While I was able to connect to the feed served by Motion from my Windows PC (so it should work fine for HASS), it transmits the video at too low a framerate for my liking. So I’m really hoping to stick with raspivid, as I’ve been seeing 60+ fps. I can do my own motion detection if that becomes a “want” later on, but as of now it’s not something I care about at all.
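For reference, the sort of raspivid invocation that gives me those numbers (the resolution here is a stand-in; hitting higher framerates generally means dropping the resolution):

# Record 5 seconds of raw H.264 at 60 fps to sanity-check the camera
raspivid -t 5000 -w 640 -h 480 -fps 60 -o test.h264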
For others who might be following my steps in the future: I’ve gotten closer to what I need. Using raspivid piped to ffmpeg into ffserver with the commands below lets me establish a camera feed, but I’ve not been able to connect to it or view it yet. ffserver is quite noisy, so I redirect its output to /dev/null, but you could certainly pipe it to a logfile instead.
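The commands were along these lines (reconstructing from memory, so the port, resolution, and config path are illustrative; if you’re on avconv, the equivalents are avconv/avserver):

# Minimal ffserver config: one feed for ffmpeg to push to, one MJPEG stream out
cat > /tmp/ffserver.conf <<'EOF'
HTTPPort 8090
<Feed feed1.ffm>
  File /tmp/feed1.ffm
  FileMaxSize 5M
  ACL allow 127.0.0.1
</Feed>
<Stream camera.mjpg>
  Feed feed1.ffm
  Format mpjpeg
  VideoFrameRate 30
  VideoSize 1280x720
  NoAudio
</Stream>
EOF

# Start ffserver, throwing away its noisy output (or > ffserver.log to keep it)
ffserver -f /tmp/ffserver.conf > /dev/null 2>&1 &

# Pipe raw H.264 from raspivid into ffmpeg, which pushes it to the feed
raspivid -o - -t 0 -n -w 1280 -h 720 -fps 30 \
  | ffmpeg -f h264 -i - http://localhost:8090/feed1.ffm

If the stream ever does come up, the HASS side should then just be input: http://127.0.0.1:8090/camera.mjpg in the camera config from earlier.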
Yes and no. ffserver is a piece of crap and you’re wasting your time trying to get it to work properly. It’s apparently no longer actively developed, so I gave up on it.
I’m still working on getting something like livestreaming to work… but I think it will only work on a Pi 3, and it can’t be the same machine that’s running HASS, because the streaming takes up a fair amount of resources when combined with the picamera/MMAL API. I couldn’t get better than a 7-10s delay, and right now that’s not really “good enough” for me to keep researching, so I moved on to functional replacements for livestreaming.