Proof of concept: feeding Eufy's intermittent RTSP streams from wifi/battery cameras to Frigate

How responsive is this when watching the livestream? We could incorporate this idea into my eufy_security integration.

Let’s say you are watching the live stream through Frigate at second t and motion is triggered at t+1; how long does it take for the RTSP stream to reach Frigate?


Based on my observations, I’d say around 1 sec (take it with a grain of salt for now). It doesn’t take much to replicate, but I could try to record something tomorrow if you want.


Hey, no need to record to convince me; of course I trust you, but having a video would help me compare it with my existing solution.

If it all looks good, I can integrate it into eufy_security, or you can do it.


(As one of the Frigate contributors) one thing to keep in mind is that Frigate keeps a running background average for motion detection, so it knows where to look for objects. In Frigate 0.13, when there is a major change in the background (for example when a camera switches between color and IR mode), Frigate goes into calibration mode, where it waits until motion settles before looking for objects.

In this case, if by “blank frame” you mean a single solid color, then any time the camera sees motion and switches over to the real stream, Frigate will not run detection for at least a second or two, which may cause unexpected issues.

There is a similar project for aarlo cameras which sends the last frame in between motion events, so that Frigate keeps running against a background that looks like the actual camera frame.
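
For readers following along, here is a minimal sketch of that idle-frame trick (not the actual project’s code): loop the most recent snapshot as a low-frame-rate feed while the camera is idle, then kill it and hand the same path over to the live stream when motion starts. The restream target rtsp://localhost:8554/eufy and the last_frame.jpg filename are assumptions for illustration.

    import signal
    import subprocess

    # Sketch only: keep Frigate's background populated by looping the latest
    # snapshot while the camera is idle. The restream URL is an assumed target
    # (e.g. a local RTSP server Frigate reads from), not part of the original script.
    idle_cmd = [
        "ffmpeg", "-re", "-loop", "1", "-i", "last_frame.jpg",
        "-c:v", "libx264", "-tune", "stillimage", "-pix_fmt", "yuv420p", "-r", "5",
        "-f", "rtsp", "rtsp://localhost:8554/eufy",
    ]
    idle_proc = subprocess.Popen(idle_cmd)

    # ...when the camera actually starts streaming, stop the idle feed and start
    # a second ffmpeg that publishes the live camera stream to the same path.
    idle_proc.send_signal(signal.SIGTERM)

Because the idle feed shows the last real frame rather than a blank color, Frigate’s background average should stay close to the live picture and the calibration pause described above should be shorter.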

Nice idea @crzynik, I have access to the latest event image, so we can use it for the idle frames.

On the other hand, this might not be an issue for eufy cameras, given that they usually have multiple types of sensors to detect motion, cars, people, dogs, etc.

Thanks for sharing this @crzynik , that’s exactly the kind of feedback I was expecting.

I could definitely look in there and see how the repo owner sends the latest image with ffmpeg, or, as @anon63427907 said, we could also take it from the events.

I’ll record something so people can see how reactive it is, and see if I can make the changes to add the still image directly.

Here is the recording. I’d say it takes roughly 1 second from when the ffmpeg command is sent to seeing the real-time update in Frigate’s UI. When I switch the camera feed on and off in the app (not visible in the recording), it’s maybe up to 2 seconds (some of that is Eufy propagating the update from the app to the camera itself or to the homebase, I’m guessing).

Next step: send the camera’s latest image as the idle frame.

you might need to make that file viewable by anyone with the link

Oops sorry, permission updated.

I updated the original script to have ffmpeg both stream and take a screenshot.

Here: runner.rb · GitHub
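
For anyone who doesn’t want to open the gist, the idea is roughly the following (a sketch, not the exact runner.rb contents; the camera URL, restream path, and snapshot rate are placeholders): one ffmpeg process with two outputs, copying the camera stream through to the restream server while periodically overwriting a snapshot file.

    import subprocess

    # Sketch of the "stream + screenshot" idea: one ffmpeg process, two outputs.
    # CAMERA_IP and the restream target are placeholders, not the real script's values.
    cmd = [
        "ffmpeg", "-rtsp_transport", "tcp", "-i", "rtsp://CAMERA_IP/live0",
        # Output 1: pass the stream through untouched for Frigate to consume.
        "-c", "copy", "-f", "rtsp", "rtsp://localhost:8554/eufy",
        # Output 2: overwrite last_frame.jpg about once per second as the idle image.
        "-vf", "fps=1", "-update", "1", "last_frame.jpg",
    ]
    subprocess.run(cmd, check=True)

If the resolution ever needs to come down, a scale filter on the snapshot output (e.g. fps=1,scale=640:-1) would be one place to do it without touching the copied stream.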

I have yet to correctly set up Frigate with my coral, then I’ll be able to properly test this.

Based on what I read, I may want to lower the streaming resolution, and maybe apply other tweaks?

Once it’s tested, I guess it will be ready for packaging.

Thanks for sharing. Is it possible to have a recording of the moment a camera is triggered?

I am personally a bit worried about the case where the camera is being prepared for streaming but isn’t ready yet; ffmpeg might hang for a couple of seconds before it starts reading the stream or moves on to the next frame.

One more thing: what would the CPU consumption be if you ran this on a Pi or some other lightweight hardware?
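
On the hang concern, one mitigation worth noting (a sketch under assumed URLs, not the current script): put a socket timeout on the RTSP input and retry while the camera warms up. Depending on the ffmpeg version, the RTSP timeout option is -stimeout (older builds) or -timeout (newer), both in microseconds.

    import subprocess

    # Sketch: guard against ffmpeg hanging while the camera is still warming up.
    # CAMERA_IP and the restream target are placeholders.
    cmd = [
        "ffmpeg", "-rtsp_transport", "tcp", "-stimeout", "5000000",
        "-i", "rtsp://CAMERA_IP/live0",
        "-c", "copy", "-f", "rtsp", "rtsp://localhost:8554/eufy",
    ]
    for _ in range(5):
        # Retry a few times; a camera that is not ready yet makes ffmpeg exit nonzero
        # once the timeout fires instead of hanging indefinitely.
        if subprocess.run(cmd).returncode == 0:
            break

On CPU: with -c copy there is no transcoding, so the streaming part should be light even on a Pi; re-encoding the snapshot or idle frames is the heavier piece and would need to be measured.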

I could send the recording to you privately, as I don’t want to show the outside of my home or waste a lot of time moving the camera, etc. :smiley: How can I reach you?

CPU consumption

No idea, for now I’m running it on my 2016 MacBook Pro. I think someone’s going to have to try it on a Pi to see how it performs.

To improve reactivity, I could also explore having a continuous ffmpeg process and make the switch via a pipe. I think it’d be faster because it wouldn’t have the overhead of starting ffmpeg when the script detects the camera is live. Maybe I can try that next!
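
A rough sketch of that pipe approach (all names and paths are assumptions, and this is not what the script currently does): keep one long-lived ffmpeg reading JPEG frames from a named pipe and publishing a single continuous RTSP output, and have the controlling script write either the idle snapshot or decoded live frames into the pipe.

    import os
    import subprocess

    FIFO = "/tmp/eufy_frames"                  # assumed pipe path
    RESTREAM = "rtsp://localhost:8554/eufy"    # assumed restream target

    if not os.path.exists(FIFO):
        os.mkfifo(FIFO)

    # Long-lived encoder: reads JPEG frames from the fifo and keeps one continuous
    # RTSP output, so Frigate never sees the stream restart.
    encoder = subprocess.Popen([
        "ffmpeg", "-f", "image2pipe", "-framerate", "5", "-i", FIFO,
        "-c:v", "libx264", "-pix_fmt", "yuv420p", "-f", "rtsp", RESTREAM,
    ])

    # Keep one write handle open for the whole run so the encoder never sees EOF.
    # (Opening the fifo blocks until the encoder has opened the read end.)
    sink = open(FIFO, "wb")

    def feed_idle_frame():
        # While idle, push the latest snapshot into the pipe as a single JPEG frame.
        with open("last_frame.jpg", "rb") as img:
            sink.write(img.read())
            sink.flush()

    def feed_live():
        # While the camera is live, decode its RTSP stream into JPEG frames on the same pipe.
        subprocess.run([
            "ffmpeg", "-rtsp_transport", "tcp", "-i", "rtsp://CAMERA_IP/live0",
            "-f", "image2pipe", "-q:v", "5", "-",
        ], stdout=sink)

The trade-off is a constant re-encode instead of a stream copy, so this variant would need to be profiled on a Pi before packaging.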

Please email me, [email protected]

Piping is also a great idea.

Thanks

I’m still working on this. I had to reimplement part of the RTSP protocol (yay…) and something to receive RTP packets. However, I’m not receiving any data from the camera (after completing all the RTSP handshakes with DESCRIBE/SETUP/PLAY). Any thoughts?
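
One debugging angle (a generic RTSP sketch, not based on the code in question): when DESCRIBE/SETUP/PLAY all return 200 OK but no media arrives, a common culprit is RTP being sent over UDP to ports that a NAT or firewall silently drops. Requesting interleaved TCP in the SETUP keeps the media on the existing connection. The host, port, path, and trackID=0 control URL below are placeholders; the real control URL comes from the SDP returned by DESCRIBE.

    import socket

    HOST, PORT, PATH = "CAMERA_IP", 554, "/live0"   # placeholders
    url = f"rtsp://{HOST}:{PORT}{PATH}"

    sock = socket.create_connection((HOST, PORT), timeout=10)

    def request(lines):
        # Send one RTSP request and return the raw response text.
        sock.sendall(("\r\n".join(lines) + "\r\n\r\n").encode())
        return sock.recv(4096).decode(errors="replace")

    print(request([f"DESCRIBE {url} RTSP/1.0", "CSeq: 1", "Accept: application/sdp"]))

    setup = request([
        f"SETUP {url}/trackID=0 RTSP/1.0", "CSeq: 2",
        "Transport: RTP/AVP/TCP;unicast;interleaved=0-1",  # TCP interleaving, no UDP ports
    ])
    session = next(line.split()[1].split(";")[0] for line in setup.splitlines()
                   if line.lower().startswith("session"))

    print(request([f"PLAY {url} RTSP/1.0", "CSeq: 3", f"Session: {session}"]))

    # Interleaved RTP now arrives on this same socket, framed as 0x24 ('$'),
    # one channel byte, a two-byte length, then the RTP packet itself.
    print("first framed bytes:", sock.recv(4))

Some servers also stop sending if no keep-alive (OPTIONS or GET_PARAMETER) is issued within the session timeout returned in the SETUP response, so that is worth ruling out too.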

Did you check the custom integration, which would allow you to initiate an RTSP stream without an actual motion event?


I have not. The structure of the different code bases is still a bit confusing. Can you provide more details as to where I can find this code?

Thread: Eufy Security Integration
Integration: GitHub - fuatakgun/eufy_security: Home Assistant integration to manage Eufy Security devices as cameras, home base stations, doorbells, motion and contact sensors.

This custom integration can trigger the camera to generate an RTSP stream without any physical event occurring on the device (motion, sound, person, etc.).


How complete is this? I’d be very excited to get a package that I can install.

No progress on my side since November last year.