Thanks! I’ll keep you posted.
I tried running the latest dev image last night (confirmed the hash against what was listed on Docker Hub). It looks like one of my cameras still stopped working overnight, but when I look at it, Frigate still seems to be detecting without a jammed-up queue. So that’s progress!
But the front_window ffmpeg stream stopped responding shortly after 2 am and never restarted.
The green light in the screenshot is when I pulled your latest dev image and started it up.
Here are the logs (in UTC). It was a pretty quiet night in the logs from the time it first started up:
2020-03-07T01:40:49.744759216Z On connect called
2020-03-07T01:40:49.896175893Z /arrow/cpp/src/plasma/store.cc:1226: Allowing the Plasma store to use up to 0.4GB of memory.
2020-03-07T01:40:49.896192501Z /arrow/cpp/src/plasma/store.cc:1253: Starting object store with directory /dev/shm and huge page support disabled
2020-03-07T01:40:50.758495797Z Starting detection process: 29
2020-03-07T01:40:50.760509429Z Camera_process started for front_window: 30
2020-03-07T01:40:50.762050807Z Starting process for front_window: 30
2020-03-07T01:40:50.762337847Z ffprobe -v panic -show_error -show_streams -of json "rtmp://192.168.10.2:1935/bcs/channel0_main.bcs?channel=0&stream=0&user=admin&password=fakepassword"
2020-03-07T01:40:50.762711458Z Camera_process started for kitchen: 31
2020-03-07T01:40:50.764457526Z Starting process for kitchen: 31
2020-03-07T01:40:50.764815766Z ffprobe -v panic -show_error -show_streams -of json "rtmp://192.168.10.1:1935/bcs/channel0_main.bcs?channel=0&stream=0&user=admin&password=fakepassword"
2020-03-07T01:40:50.777726862Z * Serving Flask app "detect_objects" (lazy loading)
2020-03-07T01:40:50.777746230Z * Environment: production
2020-03-07T01:40:50.777750469Z WARNING: This is a development server. Do not use it in a production deployment.
2020-03-07T01:40:50.777780693Z Use a production WSGI server instead.
2020-03-07T01:40:50.777794254Z * Debug mode: off
2020-03-07T01:40:56.140539822Z {'streams': [{'index': 0, 'codec_name': 'aac', 'codec_long_name': 'AAC (Advanced Audio Coding)', 'profile': 'LC', 'codec_type': 'audio', 'codec_time_base': '1/16000', 'codec_tag_string': '[0][0][0][0]', 'codec_tag': '0x0000', 'sample_fmt': 'fltp', 'sample_rate': '16000', 'channels': 1, 'channel_layout': 'mono', 'bits_per_sample': 0, 'r_frame_rate': '0/0', 'avg_frame_rate': '0/0', 'time_base': '1/1000', 'start_pts': 499272321, 'start_time': '499272.321000', 'disposition': {'default': 0, 'dub': 0, 'original': 0, 'comment': 0, 'lyrics': 0, 'karaoke': 0, 'forced': 0, 'hearing_impaired': 0, 'visual_impaired': 0, 'clean_effects': 0, 'attached_pic': 0, 'timed_thumbnails': 0}}, {'index': 1, 'codec_name': 'h264', 'codec_long_name': 'H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10', 'profile': 'High', 'codec_type': 'video', 'codec_time_base': '1/20', 'codec_tag_string': '[0][0][0][0]', 'codec_tag': '0x0000', 'width': 2560, 'height': 1440, 'coded_width': 2560, 'coded_height': 1440, 'has_b_frames': 0, 'sample_aspect_ratio': '0:1', 'display_aspect_ratio': '0:1', 'pix_fmt': 'yuv420p', 'level': 51, 'chroma_location': 'left', 'field_order': 'progressive', 'refs': 1, 'is_avc': 'true', 'nal_length_size': '4', 'r_frame_rate': '30/1', 'avg_frame_rate': '10/1', 'time_base': '1/1000', 'start_pts': 499272285, 'start_time': '499272.285000', 'bits_per_raw_sample': '8', 'disposition': {'default': 0, 'dub': 0, 'original': 0, 'comment': 0, 'lyrics': 0, 'karaoke': 0, 'forced': 0, 'hearing_impaired': 0, 'visual_impaired': 0, 'clean_effects': 0, 'attached_pic': 0, 'timed_thumbnails': 0}}]}
2020-03-07T01:40:56.148909376Z Creating ffmpeg process...
2020-03-07T01:40:56.148940531Z ffmpeg -hide_banner -loglevel panic -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 -hwaccel_output_format yuv420p -avoid_negative_ts make_zero -fflags nobuffer -flags low_delay -strict experimental -fflags +genpts+discardcorrupt -vsync drop -use_wallclock_as_timestamps 1 -i rtmp://192.168.10.1:1935/bcs/channel0_main.bcs?channel=0&stream=0&user=admin&password=fakepassword -f rawvideo -pix_fmt rgb24 pipe:
2020-03-07T01:40:56.620343048Z {'streams': [{'index': 0, 'codec_name': 'aac', 'codec_long_name': 'AAC (Advanced Audio Coding)', 'profile': 'LC', 'codec_type': 'audio', 'codec_time_base': '1/16000', 'codec_tag_string': '[0][0][0][0]', 'codec_tag': '0x0000', 'sample_fmt': 'fltp', 'sample_rate': '16000', 'channels': 1, 'channel_layout': 'mono', 'bits_per_sample': 0, 'r_frame_rate': '0/0', 'avg_frame_rate': '0/0', 'time_base': '1/1000', 'start_pts': 67265274, 'start_time': '67265.274000', 'disposition': {'default': 0, 'dub': 0, 'original': 0, 'comment': 0, 'lyrics': 0, 'karaoke': 0, 'forced': 0, 'hearing_impaired': 0, 'visual_impaired': 0, 'clean_effects': 0, 'attached_pic': 0, 'timed_thumbnails': 0}}, {'index': 1, 'codec_name': 'h264', 'codec_long_name': 'H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10', 'profile': 'High', 'codec_type': 'video', 'codec_time_base': '1/20', 'codec_tag_string': '[0][0][0][0]', 'codec_tag': '0x0000', 'width': 2560, 'height': 1920, 'coded_width': 2560, 'coded_height': 1920, 'has_b_frames': 0, 'sample_aspect_ratio': '0:1', 'display_aspect_ratio': '0:1', 'pix_fmt': 'yuv420p', 'level': 51, 'chroma_location': 'left', 'field_order': 'progressive', 'refs': 1, 'is_avc': 'true', 'nal_length_size': '4', 'r_frame_rate': '12/1', 'avg_frame_rate': '10/1', 'time_base': '1/1000', 'start_pts': 67265191, 'start_time': '67265.191000', 'bits_per_raw_sample': '8', 'disposition': {'default': 0, 'dub': 0, 'original': 0, 'comment': 0, 'lyrics': 0, 'karaoke': 0, 'forced': 0, 'hearing_impaired': 0, 'visual_impaired': 0, 'clean_effects': 0, 'attached_pic': 0, 'timed_thumbnails': 0}}]}
2020-03-07T01:40:56.634381155Z Creating ffmpeg process...
2020-03-07T01:40:56.634412103Z ffmpeg -hide_banner -loglevel panic -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 -hwaccel_output_format yuv420p -avoid_negative_ts make_zero -fflags nobuffer -flags low_delay -strict experimental -fflags +genpts+discardcorrupt -vsync drop -use_wallclock_as_timestamps 1 -i rtmp://192.168.10.2:1935/bcs/channel0_main.bcs?channel=0&stream=0&user=admin&password=fakepassword -f rawvideo -pix_fmt rgb24 pipe:
Lastly, here are the stats from the debug endpoint:
{
"coral": {
"detection_queue": 0,
"detection_start": 0,
"fps": 25.3,
"inference_speed": 9.07
},
"front_window": {
"detection_fps": 9.6,
"fps": 10,
"skipped_fps": 0
},
"kitchen": {
"detection_fps": 15.7,
"fps": 12.1,
"skipped_fps": 0
},
"plasma_store_rc": null,
"tracked_objects_queue": 0
}
Hi,
First things first - HUGE thanks for this great system!
I got the Coral USB stick yesterday, so I installed Frigate for the first time.
Everything seemed to work OK with 1 camera, so I added another 7 (8 1080p cameras in total).
I have a few issues.
- It seems that everything stopped working a while ago. I’m getting these messages repeating in the Docker log:
ffmpeg -hide_banner -loglevel panic -avoid_negative_ts make_zero -fflags nobuffer -flags low_delay -strict experimental -fflags +genpts+discardcorrupt -vsync drop -rtsp_transport tcp -stimeout 5000000 -use_wallclock_as_timestamps 1 -i rtsp://[email protected]:554/11 -f rawvideo -pix_fmt rgb24 pipe:
backshachen: ffmpeg_process exited unexpectedly with 0
Terminating the existing ffmpeg process...
Waiting for ffmpeg to exit gracefully...
Creating ffmpeg process...
ffmpeg -hide_banner -loglevel panic -avoid_negative_ts make_zero -fflags nobuffer -flags low_delay -strict experimental -fflags +genpts+discardcorrupt -vsync drop -rtsp_transport tcp -stimeout 5000000 -use_wallclock_as_timestamps 1 -i rtsp://[email protected]:554/11 -f rawvideo -pix_fmt rgb24 pipe:
kvishback: ffmpeg_process exited unexpectedly with 0
Terminating the existing ffmpeg process...
Waiting for ffmpeg to exit gracefully...
Creating ffmpeg process...
ffmpeg -hide_banner -loglevel panic -avoid_negative_ts make_zero -fflags nobuffer -flags low_delay -strict experimental -fflags +genpts+discardcorrupt -vsync drop -rtsp_transport tcp -stimeout 5000000 -use_wallclock_as_timestamps 1 -i rtsp://[email protected]:554/11 -f rawvideo -pix_fmt rgb24 pipe:
Here are the stats and the debug output:
coral: {
"detection_queue": 3761,
"detection_start": 0,
"fps": 0,
"inference_speed": 8.72
}
frontkvish: {
"detection_fps": 0,
"fps": 0.1,
"skipped_fps": 0
}
backkvish: {
"detection_fps": 0,
"fps": 0.1,
"skipped_fps": 0
}
kvishback: {
"detection_fps": 0,
"fps": 0.1,
"skipped_fps": 0
}
kvishfront: {
"detection_fps": 0,
"fps": 0.1,
"skipped_fps": 0
}
backshachen: {
"detection_fps": 0,
"fps": 0.1,
"skipped_fps": 0
}
shachenback: {
"detection_fps": 0,
"fps": 0.1,
"skipped_fps": 0
}
shachenfront: {
"detection_fps": 0,
"fps": 0.1,
"skipped_fps": 0
}
frontshachen: {
"detection_fps": 0,
"fps": 0.1,
"skipped_fps": 0
}
friendly_name: Frigate Debug
- The other issue is with false detections; some examples:
https://i.imgur.com/TJV2LkR.jpg
https://i.imgur.com/AWdSG3Y.jpg
https://i.imgur.com/rnY3zhc.jpg
https://i.imgur.com/Gvna0xo.jpg
https://i.imgur.com/sGVMZEC.jpg
- I have one camera where I want detection on the full image, and another detection on a smaller part of the image (to turn on a nearby light automatically). Do I need to set up 2 “cameras” for the same source to do this? One normally and one with a mask?
Thanks a lot again.
I’m not sure if this is it, so I’ll have to wait for it to die on its own to confirm, but it looks promising.
If I simply kill the ffmpeg process and let it restart, the parent is still running and the stream doesn’t continue.
Killing just this does nothing:
root 368 367 22 09:36 ? 00:00:05 ffmpeg -loglevel verbose -rtsp_transport tcp -i rtsp://username:[email protected]/live -f rawvideo -pix_fmt rgb24 pipe:
If I restart BOTH the ffmpeg AND the parent, it resumes.
UID PID PPID C STIME TTY TIME CMD
root 1 0 6 09:26 ? 00:00:38 python3.7 -u detect_objects.py
root 13 1 0 09:26 ? 00:00:04 /usr/local/bin/plasma_store -m 400000000 -s /tmp/plasma
root 19 1 0 09:26 ? 00:00:00 python3.7 -u detect_objects.py
root 21 1 47 09:26 ? 00:04:49 python3.7 -u detect_objects.py
root 25 21 74 09:26 ? 00:07:32 ffmpeg -loglevel verbose -rtsp_transport tcp -i rtsp://username:[email protected]:554/cam/realmonitor?channel=1&subtype=0 -f rawvideo -pix_fmt rgb
root 66 0 0 09:26 pts/0 00:00:00 /bin/bash
root 331 1 1 09:35 ? 00:00:01 [ffmpeg]
root 367 1 11 09:36 ? 00:00:03 python3.7 -u detect_objects.py
root 368 367 23 09:36 ? 00:00:06 ffmpeg -loglevel verbose -rtsp_transport tcp -i rtsp://username:[email protected]/live -f rawvideo -pix_fmt rgb24 pipe:
root 396 66 0 09:36 pts/0 00:00:00 ps -ef
The logs aren’t printing anything after 2am? And the video feed for front_window is frozen with an old timestamp?
This is probably the same issue some others are having. I can see your detection queue is full. I am not sure what the cause is yet, but it is awesome to see your coral hitting near 100fps. That would never have been possible in previous versions. You will probably benefit from some motion masks so coral can focus its attention on areas that matter.
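For context on masks: conceptually a mask is just a bitmap that zeroes out ignored regions before motion is scored, so masked areas never queue frames for the Coral. A minimal sketch of the idea in Python, with a hypothetical mask file name (not Frigate’s exact implementation):

```python
import cv2
import numpy as np

# Hypothetical mask image: black (0) pixels are ignored for motion,
# white (255) pixels are evaluated normally.
mask = cv2.imread("front_window-mask.png", cv2.IMREAD_GRAYSCALE)

def apply_motion_mask(motion_frame: np.ndarray) -> np.ndarray:
    # Zero out motion in the masked regions so trees, roads, etc.
    # never trigger object detection.
    return cv2.bitwise_and(motion_frame, motion_frame, mask=mask)
```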
Some of those false detections can probably be filtered out with masks or other heuristics like min size requirements for a person. Long term, you will probably need to train a custom layer on top of the base model to avoid detections for large dogs.
At the moment you will need to create 2 cameras with masks. When I implement this feature, you can switch to a single camera.
Just to clarify, killing 368 alone doesn’t work, but killing 368+367 does? Is that behavior consistent or does it sometimes work when killing just the ffmpeg process?
@Kyle I pushed up a new dev image with some more advanced debug capabilities. I added a pid value for each camera and the coral to /debug/stats. If a camera or the detection process gets stuck, you can go to /debug/print_stack?pid=<pid> and it will signal the external process to write its current stack trace to the logs. That should tell us where it is “stuck”.
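For anyone curious how that works: the web server can’t inspect another process’s stack directly, so the endpoint signals the target process, which dumps its own stack. A rough sketch of the mechanism in Python (not necessarily the exact implementation):

```python
import os
import signal
import traceback

def print_stack_handler(signum, frame):
    # Runs inside the (possibly stuck) process: write its current
    # Python stack to stdout, which ends up in the docker logs.
    print("".join(traceback.format_stack(frame)))

# Each camera/detection process registers the handler at startup:
signal.signal(signal.SIGUSR1, print_stack_handler)

# The /debug/print_stack?pid=<pid> endpoint then just signals it:
# os.kill(pid, signal.SIGUSR1)
```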
Correct.
I’ve tried 2 scenarios, and both times restarting the ffmpeg and parent process allowed the stream to continue.
1: Killed ffmpeg… when it restarted on its own, nothing happened until I restarted both.
2: Rebooted the camera, which caused the stream to die. I restarted both processes and the stream continued.
I’ll test this again once it dies on its own, but so far that seems to take care of it…
That’s correct. Thanks for pushing a new image with more debugging. I’ll keep you posted.
Can you give the latest dev image a try? I made a change that may help it recover. I am hoping to avoid restarting the parent process because it loses all context for motion detection and objects being tracked.
I ran into the detection_queue issue myself. Nothing I had implemented fixes the issue. It turns out that Python’s Queue class can end up in a deadlock in certain situations. I switched to SimpleQueue, which should fix it, but the downside is that it is no longer possible to check queue sizes.
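For anyone following along, the swap looks roughly like this (a sketch, not the exact Frigate code):

```python
import multiprocessing as mp

# Before: mp.Queue() pushes items through a background feeder thread,
# which can deadlock in rare situations after fork.
# detection_queue = mp.Queue()

# After: SimpleQueue is a plain locked pipe with no feeder thread.
detection_queue = mp.SimpleQueue()

detection_queue.put("frame_id")
frame_id = detection_queue.get()

# Tradeoff: SimpleQueue has no qsize(), so /debug/stats can no longer
# report how many items are waiting in detection_queue.
```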
Thanks. I’ve updated to blakeblackshear/frigate:dev @ 55199bd1f8538d2e9ed6e94d4dcd8efd1ffb51c450f53bec7f23bbfb647a931e. I’ll let you know what the stack trace is on the next freeze.
I’m on your latest dev build.
It died at 7pm after about an hour.
This time, I killed the ffmpeg process and it restarted and the video resumed.
Here are the latest logs:
Past duration 0.619987 too largerame=329923 fps= 20 q=-0.0 size=2886166404kB time=04:34:56.15 bitrate=1433272.3kbits/s dup=2339 drop=2187 speed= 1x
2020-03-08T03:42:41.004543378Z Last message repeated 3 times
Past duration 0.619987 too largerame=329931 fps= 20 q=-0.0 size=2886236388kB time=04:34:56.55 bitrate=1433272.3kbits/s dup=2339 drop=2187 speed= 1x
2020-03-08T03:42:41.518867458Z Last message repeated 3 times
Past duration 0.619987 too large=2886717528kB time=04:34:59.30 bitrate=1433272.3kbits/s dup=2339 drop=2187 speed= 1x =2339 drop=2187 speed= 1x
2020-03-08T03:42:43.828271714Z Last message repeated 1 times
2020-03-08T03:42:43.828454632Z Past duration 0.639992 too large
2020-03-08T03:42:44.155493545Z Last message repeated 2 times
Past duration 0.639992 too largerame=329997 fps= 20 q=-0.0 size=2886813756kB time=04:34:59.85 bitrate=1433272.3kbits/s dup=2339 drop=2187 speed= 1x
2020-03-08T03:42:44.648607074Z Last message repeated 1 times
frame=331403 fps= 20 q=-0.0 size=2899034712kB time=04:36:09.70 bitrate=1433272.3kbits/s dup=2339 drop=2187 speed= 1x =2339 drop=2187 speed= 1x
frame=332785 fps= 20 q=-0.0 size=2911203180kB time=04:37:18.75 bitrate=1433272.3kbits/s dup=2339 drop=2187 speed= 1x 1x
frame=334162 fps= 20 q=-0.0 size=2923249176kB time=04:38:28.10 bitrate=1433272.3kbits/s dup=2339 drop=2187 speed= 1x
frame=335556 fps= 20 q=-0.0 size=2935443888kB time=04:39:37.80 bitrate=1433272.3kbits/s dup=2339 drop=2187 speed= 1x
File "detect_objects.py", line 278, in e=04:40:01.70 bitrate=1433272.3kbits/s dup=2339 drop=2187 speed= 1x
2020-03-08T03:47:46.378191068Z main()
2020-03-08T03:47:46.378830795Z File "detect_objects.py", line 165, in main
2020-03-08T03:47:46.378998173Z camera_process['process'].start()
2020-03-08T03:47:46.379049256Z File "/usr/lib/python3.7/multiprocessing/process.py", line 112, in start
2020-03-08T03:47:46.379106889Z self._popen = self._Popen(self)
2020-03-08T03:47:46.379153554Z File "/usr/lib/python3.7/multiprocessing/context.py", line 223, in _Popen
2020-03-08T03:47:46.379222718Z return _default_context.get_context().Process._Popen(process_obj)
2020-03-08T03:47:46.379272735Z File "/usr/lib/python3.7/multiprocessing/context.py", line 277, in _Popen
2020-03-08T03:47:46.379328012Z return Popen(process_obj)
2020-03-08T03:47:46.379372409Z File "/usr/lib/python3.7/multiprocessing/popen_fork.py", line 20, in __init__
2020-03-08T03:47:46.379422611Z self._launch(process_obj)
2020-03-08T03:47:46.379467125Z File "/usr/lib/python3.7/multiprocessing/popen_fork.py", line 74, in _launch
2020-03-08T03:47:46.379517436Z code = process_obj._bootstrap()
2020-03-08T03:47:46.379562751Z File "/usr/lib/python3.7/multiprocessing/process.py", line 297, in _bootstrap
2020-03-08T03:47:46.379627804Z self.run()
2020-03-08T03:47:46.379671721Z File "/usr/lib/python3.7/multiprocessing/process.py", line 99, in run
2020-03-08T03:47:46.379721534Z self._target(*self._args, **self._kwargs)
2020-03-08T03:47:46.379769283Z File "/opt/frigate/frigate/video.py", line 201, in track_camera
2020-03-08T03:47:46.379820747Z frame_bytes = ffmpeg_process.stdout.read(frame_size)
frame=337433 fps= 20 q=-0.0 size=2951776404kB time=04:41:11.15 bitrate=1433272.3kbits/s dup=2339 drop=2187 speed= 1x =2339 drop=2187 speed= 1x
frame=338822 fps= 20 q=-0.0 size=2964014856kB time=04:42:20.60 bitrate=1433272.3kbits/s dup=2339 drop=2187 speed= 1x 1x
*** 2 dup!99 fps= 20 q=-0.0 size=2967312852kB time=04:42:39.95 bitrate=1433272.3kbits/s dup=2339 drop=2187 speed= 1x
2020-03-08T03:50:24.620382634Z *** dropping frame 339208 from stream 0 at ts 339205
2020-03-08T03:50:24.662056113Z *** dropping frame 339208 from stream 0 at ts 339206
Here are the logs from /debug/print_stack?pid=22:
File "detect_objects.py", line 278, in e=04:45:10.70 bitrate=1433272.3kbits/s dup=2351 drop=2199 speed= 1x =2351 drop=2199 speed= 1x
2020-03-08T03:52:55.300052733Z main()
2020-03-08T03:52:55.300197304Z File "detect_objects.py", line 165, in main
2020-03-08T03:52:55.300264538Z camera_process['process'].start()
2020-03-08T03:52:55.300319575Z File "/usr/lib/python3.7/multiprocessing/process.py", line 112, in start
2020-03-08T03:52:55.300382062Z self._popen = self._Popen(self)
2020-03-08T03:52:55.300443709Z File "/usr/lib/python3.7/multiprocessing/context.py", line 223, in _Popen
2020-03-08T03:52:55.300510379Z return _default_context.get_context().Process._Popen(process_obj)
2020-03-08T03:52:55.300585380Z File "/usr/lib/python3.7/multiprocessing/context.py", line 277, in _Popen
2020-03-08T03:52:55.300655701Z return Popen(process_obj)
2020-03-08T03:52:55.300719668Z File "/usr/lib/python3.7/multiprocessing/popen_fork.py", line 20, in __init__
2020-03-08T03:52:55.300784732Z self._launch(process_obj)
2020-03-08T03:52:55.300841109Z File "/usr/lib/python3.7/multiprocessing/popen_fork.py", line 74, in _launch
2020-03-08T03:52:55.300914346Z code = process_obj._bootstrap()
2020-03-08T03:52:55.300973306Z File "/usr/lib/python3.7/multiprocessing/process.py", line 297, in _bootstrap
2020-03-08T03:52:55.301039450Z self.run()
2020-03-08T03:52:55.301096635Z File "/usr/lib/python3.7/multiprocessing/process.py", line 99, in run
2020-03-08T03:52:55.301159627Z self._target(*self._args, **self._kwargs)
2020-03-08T03:52:55.301220066Z File "/opt/frigate/frigate/video.py", line 201, in track_camera
2020-03-08T03:52:55.301284070Z frame_bytes = ffmpeg_process.stdout.read(frame_size)
Hi,
1. Should I update to the dev version? Any input that I can help with?
2. What’s the best way to reduce the Coral FPS - with “take_frame” or “fps”? The cameras are set for 24fps output for the DVR system.
3. It’s not only large dogs; you can see a “person” bird here:
I also get false detections on tree parts.
How can I train the system to avoid such detections?
The dev version could help prevent it from getting stuck, but it could break for other reasons. The best way to reduce your FPS will be to use a substream directly on the camera with a lower FPS. If your cameras don’t support that, take_frame: 2 will drop the FPS to 12 by taking every other frame. The fps value should be updated to 12 to reflect the new expected framerate. take_frame reduces the frame rate; fps tells frigate how many FPS it needs to process to stay realtime.
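In other words, the capture loop just keeps every Nth frame. A rough sketch with hypothetical read_frame/process helpers:

```python
camera_fps = 24                       # what the camera actually sends
take_frame = 2                        # keep every 2nd frame
fps = camera_fps // take_frame        # config value: 12

frame_num = 0
while True:
    frame = read_frame()              # hypothetical: one frame from ffmpeg
    frame_num += 1
    if frame_num % take_frame != 0:
        continue                      # skipped: not processed for detection
    process(frame)                    # hypothetical: motion + object detection
```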
For model training, frigate doesn’t provide anything that helps in this regard, but you can look here: https://coral.ai/docs/edgetpu/models-intro/
Setting your minimum and maximum person size can help to get rid of birds and tree bits, as they are likely too small to be a person anywhere in frame.
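A size filter is just an area check on the bounding box; a minimal sketch with made-up thresholds:

```python
def passes_person_size_filter(box, min_area=5000, max_area=100000):
    # Reject "person" detections whose bounding box area is outside
    # the plausible range for this camera (thresholds are examples).
    x1, y1, x2, y2 = box
    area = (x2 - x1) * (y2 - y1)
    return min_area <= area <= max_area

# A bird or tree branch labeled "person" typically fails min_area:
print(passes_person_size_filter((100, 100, 140, 160)))  # False: area 2400
```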
Just pushed a new dev image that will kill the ffmpeg process if frigate is stuck trying to read from its output for over 10 seconds.
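Conceptually, that watchdog is a timer per camera that compares the time of the last successful read against a 10 second limit. A sketch under assumed attribute names (camera.last_frame and camera.ffmpeg are hypothetical):

```python
import threading
import time

def start_watchdog(camera, timeout=10.0):
    # `camera` is assumed to expose `last_frame` (monotonic time of the
    # last successful read) and `ffmpeg` (the subprocess handle).
    def run():
        while True:
            time.sleep(timeout)
            if time.monotonic() - camera.last_frame > timeout:
                # Kill the stuck ffmpeg; the existing respawn logic
                # creates a fresh process without restarting the
                # parent camera process.
                camera.ffmpeg.kill()
    threading.Thread(target=run, daemon=True).start()
```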
The ffmpeg process is being restarted, but the stream is not resuming.