MotionEye Native Integration (New Custom Component)

I’ve whipped up a native integration for the MotionEye (camera) API. The main benefits are:

  • Automatic discovery of cameras.
  • Trigger camera actions with Home Assistant events.

So you can build things like the card I use to control my CNC machine, complete with pan/tilt controls.

Details, pictures, examples, etc. can be found here. The GitHub project has the README and installation instructions.

I’ve tested it with dozens of cameras on my CCTV system, but this is my first time sharing a component with others. Please be patient and let me know how it goes!

Hi, I’m inexperienced and I’m trying the custom component mainly because it seems much easier to use the pan and tilt commands. I wanted to ask: why do I see fluid streaming with the add-on, but only 1 frame every 10 seconds with the custom component? Thank you

Short answer: the MotionEye UI is built for streaming, whereas Home Assistant cameras (not just this component) are rendered as 10-second stills in the Picture Glance card.

Specifically, this component implements a type of MJPEG camera. Such cameras have a still_image_url (updated every 10 seconds) and an mjpeg_url (the stream). When you use a standard Picture Glance card for a camera, it uses a picture, i.e., the still_image_url. If you want to see the stream, you should be able to click on the card to view it in a popup. However, this will depend on how you configured the MotionEye add-on: the HA backend must have access to the streaming port on the MotionEye instance.
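
Alternatively, the card can embed the live view directly. A minimal Lovelace sketch, assuming Home Assistant can reach the streaming port (the entity name is just a placeholder):

    # Sketch only: camera.cnc_cam is a placeholder entity name.
    # camera_view: live shows a live view instead of 10-second stills.
    - type: picture-glance
      title: CNC Camera
      camera_image: camera.cnc_cam
      camera_view: live
      entities:
        - camera.cnc_cam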

TBH, I find it much easier to run official MotionEye installs rather than using the HA add-on, which is not actually supported by the MotionEye devs.

Incidentally, improving that popup is one thing on my wishlist for this component.

I have just installed this and the cameras get created. But how do I start the streaming? I have added the cameras to a Lovelace picture-entity card, e.g.:

    - type: picture-entity
      entity: camera.camera4
      camera_view: live

I just get this:
[screenshot: the camera card fails to load the stream]

You need to enable streaming on MotionEye if you want access to it from Home Assistant:

[screenshot: the camera’s Video Streaming settings in MotionEye, with streaming enabled on port 8081]

Note that the add-on also works with “authentication enabled.”

If that’s enabled correctly, then take a look at the camera entity in Home Assistant and check its mjpeg_url attribute. My guess would be that Home Assistant cannot communicate with that host/port combination. For example, if you see “http://192.168.0.100:8081” as the mjpeg_url, you could simply try cURLing that from within Home Assistant to ensure that you are able to reach the server.

Your screenshot suggests it is trying to connect to MotionEye but cannot. Note that streaming happens on a different port than the MotionEye UI, so you may need to ensure that the correct ports are open on your MotionEye instance if you deployed via Docker, etc. More details should be in the logs.

Yes, I run both HA and MotionEye in Docker on the same host. That’s got to be it, since I cannot access the mjpeg_url from anywhere.
I am just not sure where I need to change something.

What is the mjpeg_url you are seeing?

The fact that you’ve gotten this far means that Home Assistant can talk to the MotionEye API, just not to the streaming URL. Which suggests that the streaming port is simply not open on the Docker container, and it should just be a matter of running the container with that port open. For example, in the screenshot above, the streaming port is 8081. So you’d simply add -p 8081:8081 to the docker run command.
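
If you deployed with docker-compose instead of docker run, the equivalent is a ports entry. A minimal sketch, assuming the default ports (the image tag and volume paths are only examples; use whatever you already run):

    # Sketch only: adjust the image tag and paths to your own install.
    services:
      motioneye:
        image: ccrisan/motioneye:master-amd64
        ports:
          - "8765:8765"   # MotionEye web UI
          - "8081:8081"   # streaming port (must match the Streaming Port set in MotionEye)
        volumes:
          - ./motioneye:/etc/motioneye
          - ./media:/var/lib/motioneye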

Just to be clear: you should be testing connectivity from within Home Assistant (e.g., SSH into the Docker instance), not from your host machine. Docker runs a virtual subnet, so what you can reach from inside the Home Assistant container can be completely different from what you can reach on the host.

Thanks @zaneclaes
I put this in my docker run because 8081 is already in use by another container. So I guess I have to change that in the MotionEye config in HA?

    -p 8765:8765 -p 58081:8081 -p 58082:8082

The camera in HA shows:

    mjpeg_url: http://10.0.0.150:8081/

tl;dr that sounds right.

It looks like your home network is on the 10.0.0.0/24 subnet. Which is to say: 10.0.0.150 is the static IP of the device (an RPi?), and you can connect to that IP address from your host machine.

In that case, yeah, you should be able to:

  1. Change the streaming port setting in the screenshot above.
  2. Start the MotionEye docker container with that port open.
  3. Restart Home Assistant.

At that point, HA should see the new mjpeg_url and connect. If my subnet assumption above is right, you should also be able to open that same mjpeg_url in a web browser on your host machine and see the stream there, too (now that the Docker port is open).
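
Put together, and assuming you pick 58081 as the new streaming port in MotionEye, the idea is to publish that port one-to-one rather than remapping it, so the port in the mjpeg_url matches what is actually reachable. A rough compose-style sketch (image tag is just an example):

    # Sketch only: streaming port changed to 58081 in MotionEye and published 1:1,
    # so http://10.0.0.150:58081/ is reachable from both HA and the host.
    services:
      motioneye:
        image: ccrisan/motioneye:master-amd64
        ports:
          - "8765:8765"       # MotionEye web UI
          - "58081:58081"     # new streaming port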

You may wish to add authentication to the stream if you have concerns about having this port open.

Yes, thanks for your thorough feedback. It works now :laughing:

Edit: Quite pleased so far, at least compared to connecting directly to each camera with HA’s FFmpeg integration.

Glad to hear it! I’m planning to upstream it into Home Assistant sometime soon, now that it’s worked for a handful of different people in different situations :slight_smile:

Can you add an option to create a binary_sensor for motion detection? I am not even sure how to do that manually.

I’ve been thinking about this for a bit now. There are two options. One is for MotionEye to call a webhook. That’s pretty easy to DIY (e.g., MotionEye → Node-RED → HA). However, it doesn’t seem as simple to build into a component (it would need to create an HA API endpoint?).
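
For the DIY route, Home Assistant can also receive the webhook directly in an automation, without Node-RED. A rough sketch with made-up names, where MotionEye’s motion-notification web hook would point at http://<your-ha>:8123/api/webhook/motioneye_camera4_motion:

    # Sketch only: the webhook_id and input_boolean are placeholders.
    automation:
      - alias: "Camera 4 motion (MotionEye webhook)"
        trigger:
          - platform: webhook
            webhook_id: motioneye_camera4_motion
        action:
          - service: input_boolean.turn_on
            target:
              entity_id: input_boolean.camera4_motion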

The other is to watch the recording directory. You can actually do this easily in Home Assistant with the “folder” sensor. Any changes to that folder essentially constitute motion detection.
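
A rough sketch of that approach, assuming MotionEye writes its stills somewhere Home Assistant can read (the path is a placeholder and must also be listed under allowlist_external_dirs):

    # Sketch only: point the folder sensor at the camera's recording directory.
    homeassistant:
      allowlist_external_dirs:
        - /media/motioneye/camera4
    sensor:
      - platform: folder
        folder: /media/motioneye/camera4
        filter: "*.jpg"

The sensor’s state tracks the folder contents, so new stills change it, and that change can serve as a crude motion signal.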

I’ve implemented a prototype of a sensor that does something like that. I call it a Recorder, and it stitches together multiple images into an animated GIF. It’s specifically meant to work with MotionEye: when it sees images coming in (MotionEye capturing 1-second still images), it is in the active/recording state, and when the recording completes, the animated GIF is saved into a destination folder.

I’m thinking of baking this into the MotionEye component itself, because it’s somewhat tied to a MotionEye camera (file format, etc.).

Would this work for you?