Local realtime person detection for RTSP cameras

I didn’t bother to build it at all. I just pulled the Docker image and went to town.

As is, the image doesn’t build on ARM architecture. You have to modify it according to the directions below. I was planning to update it once I got my RPi4, but it still hasn’t shipped. I can look at updating and testing it on my RPi3.

Thanks uid0!

That’s fixed it :slight_smile:

& Thanks Blake
Hope you get your RPi4 soon :slight_smile:

Will you be able to share your Dockerfile for a Raspberry Pi?

For v0.2.0-beta, the only changes I had to make were the ones uid0 mentioned above.

For the ffmpeg-subprocess branch, just comment out the line below:

 libva-drm2 libva2 i965-va-driver vainfo \
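For context, that line sits inside the Dockerfile’s apt-get install step. Commenting it out looks roughly like this (the surrounding package names here are placeholders, not the exact Dockerfile contents) — those libva/i965 packages are Intel VA-API drivers that aren’t available on ARM:

```dockerfile
# Hypothetical excerpt of the apt-get block; only the commented line is from the real file.
RUN apt-get update && apt-get install -y \
    ffmpeg \
    # libva-drm2 libva2 i965-va-driver vainfo \
    && rm -rf /var/lib/apt/lists/*
```

Docker strips full-line comments before processing line continuations, so commenting the line out this way keeps the RUN instruction valid.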

Hi can I ask you where I can find docker CPU version? Thank you.

I think development on the CPU-only version has stopped; however, I have been running it since its release with no issues. Try this: https://github.com/blakeblackshear/frigate/releases/tag/v0.0.1

2 Likes

Yes, but I can’t find frigate:0.0.1 on Docker Hub :confused: and I don’t know how to build a Docker image from the zip file, but I will try to figure it out. Thank you.

Extract the zip to your local system, open a terminal (if on Linux), and run the commands below:

  1. cd (your directory with frigate, example: /home/user/desktop/frigate)
  2. docker build -t frigate . (Don’t forget the period at the end)
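Putting those steps together, the whole flow looks something like this (the archive name and paths are just examples — use wherever you extracted it):

```shell
# Extract the v0.0.1 source archive, change into it, and build the image.
unzip frigate-0.0.1.zip -d ~/frigate   # example path
cd ~/frigate
docker build -t frigate .              # the trailing "." is the build context
```
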

Let me know how this works for you.

3 Likes

It works thank you a lot!

2 Likes

Hi

I’m running the ffmpeg branch on an RPi4 with 4 cameras (1 region per camera) @ 8 fps. It copes with this just fine; however, after a while (say 12+ hours), the queue seems to stop processing, resulting in a flood of ‘queue full’ messages and nothing being detected.

Last night I reduced my config down to 1 camera and the same thing occurred. It’s most likely a coincidence, but today it just happened to recover itself when I ssh’d in and started tailing the docker logs.

It had stopped working at 15:20 and recovered itself at 17:38:52. I’ve omitted 300k* ‘queue full’ messages but left the rest for context. It looks like it stopped working after the stream was restarted.

*Every time this has happened, there have been ~35 ‘queue full’ messages printed per second — coming from an 8 fps stream.

Please let me know if you need me to try anything or if I can give you any more information.

Thanks!

2019-07-31T22:49:34.846022559Z queue full. moving on
2019-08-01T06:54:16.047945605Z queue full. moving on
2019-08-01T15:20:40.333901222Z queue full. moving on
**2019-08-01T15:20:49.560783347Z last frame is more than 2 seconds old, restarting camera capture...**
**2019-08-01T15:20:49.561405279Z Killing the existing ffmpeg process...**
**2019-08-01T15:20:49.585887734Z Waiting for the capture thread to exit...**
**2019-08-01T15:20:49.866705849Z Creating a new ffmpeg process...**
**2019-08-01T15:20:49.866896975Z ffmpeg -hide_banner -loglevel panic -avoid_negative_ts make_zero -fflags nobuffer -flags low_delay -strict experimental -fflags +genpts -rtsp_transport tcp -stimeout 5000000 -use_wallclock_as_timestamps 1 -i rtsp://xxxx:[email protected]:554/h264Preview_01_sub -f rawvideo -pix_fmt rgb24 pipe:**
**2019-08-01T15:20:49.902358804Z Creating a new capture thread...**
**2019-08-01T15:20:49.903187028Z Starting a new capture thread...**
2019-08-01T15:21:01.974617729Z queue full. moving on
2019-08-01T15:21:01.994283635Z queue full. moving on
2019-08-01T15:21:02.012862266Z queue full. moving on

<snip>queue full messages</snip>

2019-08-01T17:38:43.007228978Z queue full. moving on
2019-08-01T17:38:43.045237823Z queue full. moving on
2019-08-01T17:38:43.076018758Z queue full. moving on
<snip>queue full messages</snip>
2019-08-01T17:38:44.884357874Z queue full. moving on
**2019-08-01T17:38:52.881136799Z last frame is more than 2 seconds old, restarting camera capture...**
**2019-08-01T17:38:52.881284055Z Killing the existing ffmpeg process...**
**2019-08-01T17:38:52.895203446Z ffmpeg didnt return a frame. something is wrong. exiting capture thread...**
**2019-08-01T17:38:52.895471533Z Waiting for the capture thread to exit...**
**2019-08-01T17:38:52.897379138Z Creating a new ffmpeg process...**
**2019-08-01T17:38:52.897508617Z ffmpeg -hide_banner -loglevel panic -avoid_negative_ts make_zero -fflags nobuffer -flags low_delay -strict experimental -fflags +genpts -rtsp_transport tcp -stimeout 5000000 -use_wallclock_as_timestamps 1 -i rtsp://xxxx:[email protected]:554/h264Preview_01_sub -f rawvideo -pix_fmt rgb24 pipe:**
**2019-08-01T17:38:52.959275136Z Creating a new capture thread...**
**2019-08-01T17:38:52.960130265Z Starting a new capture thread...**
2019-08-01T17:39:04.180209906Z queue full. moving on
2019-08-01T17:39:04.201811183Z queue full. moving on
2019-08-01T17:39:04.241866326Z queue full. moving on
<snip>queue full messages</snip>
2019-08-01T17:39:04.994314029Z queue full. moving on

<queue full messages stopped at this point, and everything is working again>

I’m not sure at what point you built the RPi image off that branch. I did make some changes that could have fixed your issue. Can you try rebuilding the image (with the same modifications you made before) from the v0.2.0 tag to make sure you have the latest changes?

1 Like

Thanks for the quick reply. I’ve checked out the v0.2.0 tag, but there are no differences. I think I last pulled from the ffmpeg branch at the weekend. The only thing I omitted from the Dockerfile was the i965-va-driver.

Not sure if this is worth mentioning, and I don’t think it’s related, but sometimes when I start it up the stream is corrupted. It’s not completely frozen; there is still movement behind it.

Edit: I re-read the thread and found your hint about the ffmpeg debug option. I’ll give this a go and see if there are any clues. Also, I didn’t change any of the ffmpeg flags; I’ll try to do some experimenting there too.

1 Like

Turn off stream encryption.

1 Like

I assume that’s an option on the camera? My cameras don’t have an option to disable it. The majority of the time I can view the stream just fine, and I haven’t had to explicitly decrypt it?

Cheers

I want to run frigate on a NUC (NUC8i5BEH, 16 GB RAM, 120 GB HDD), but I don’t have a Google Coral USB Accelerator. How can I do it?

You can run version 0.1, which uses the CPU, but be prepared for the fact that you will likely be sacrificing that entire machine for the purpose of doing this. At $75, the Coral is a steal for the power it gives you (more than that NUC’s entire CPU will provide).

I have a machine running zoneminder, and I bought a Google Coral USB device a while back with a plan to run some sort of object detection. I managed to get frigate up and running very easily over a weekend. It’s currently processing the output from 6 cameras (with a total of 19 regions), working off the secondary RTSP stream. It’s been rock solid so far - good work!

I’m seeing a lot of “queue full. moving on” messages though. Does this suggest I need to either drop the frame rate or reduce the number of regions? CPU usage seems to be about 65% of a core (35% python process plus 6-7% per ffmpeg) on an ancient Core i5-3470S. I tried to get ffmpeg to use hwaccel (qsv), but all I get is blank frames.

At the moment I have frigate configured to drop the processed frame rate down to 5fps (take_frame=5 with 25fps streams). I also use the 2nd camera stream elsewhere though (live remote viewing for example) and would rather not change the frame rate in the camera.
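For anyone following along, that setting lives in frigate’s camera config. The exact schema below is an assumption from memory rather than a copy of the real file, but the idea is:

```yaml
# take_frame: process 1 of every N frames from the stream
# (structure is illustrative; check the project's example config)
cameras:
  back_yard:
    take_frame: 5   # 25 fps stream -> ~5 fps processed
```
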

From my layman’s reading of video.py, it drops frames it gets from ffmpeg at the specified rate. Would it not be more efficient to get ffmpeg to drop the frame rate using a video filter?
https://trac.ffmpeg.org/wiki/ChangingFrameRate
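For comparison, the filter approach would look something like this — a hypothetical variant of the ffmpeg command frigate runs (the URL and rates are placeholders), using the fps video filter so ffmpeg itself drops the 25 fps input down to 5 fps before piping raw frames to Python:

```shell
# Sketch only: decode the RTSP stream, drop to 5 fps in ffmpeg,
# and emit raw RGB frames on stdout as frigate expects.
ffmpeg -rtsp_transport tcp -i rtsp://user:pass@camera-ip:554/stream \
  -vf fps=5 -f rawvideo -pix_fmt rgb24 pipe:
```
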

James

Try running benchmark.py. That will tell you what your inference times are on your machine. In general, yes, ongoing queue full messages mean that the processing queue for the Coral is getting full and it can’t keep up. I don’t think changing the framerate with ffmpeg will be more efficient: it still has to decode every frame, and frigate just grabs and throws away the extra frames with little overhead.

Cool, thanks. How does one go about running benchmark.py? :thinking: