Local realtime person detection for RTSP cameras

I didn’t, but I stopped, removed, and restarted the container, so the logs are gone.
I’ll try to catch it sooner.
How can I turn the tracked_objects_queue into an attribute?
Once I have that, I can create an alert and hopefully catch it earlier when the logs are more manageable.
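
Something like this is what I have in mind — just a sketch, assuming tracked_objects_queue is the mp.Queue created in detect_objects.py and that its depth gets bolted onto the existing /debug/stats endpoint (names here are illustrative):

from flask import Flask, jsonify
import multiprocessing as mp

app = Flask(__name__)
tracked_objects_queue = mp.Queue()  # in Frigate this is created in detect_objects.py

@app.route('/debug/stats')
def stats():
    # qsize() is only approximate, but good enough to alert on a growing backlog
    return jsonify({'tracked_objects_queue': tracked_objects_queue.qsize()})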

Yes, unfortunately. I adjusted everything back to the previous 0.4.0 FPS levels, and it lasts a couple of hours, but then it starts happening again. Here’s an excerpt from the log:

2020-02-24T05:02:20.278834475Z Last frame for front is more than 30 seconds old...
2020-02-24T05:02:20.279008104Z Waiting for process to exit gracefully...
2020-02-24T05:02:20.289082455Z Process for front is not alive. Starting again...
2020-02-24T05:02:20.293133472Z Camera_process started for front: 2143
2020-02-24T05:02:20.293259129Z Last frame for cars is more than 30 seconds old...
2020-02-24T05:02:20.293495885Z Waiting for process to exit gracefully...
2020-02-24T05:02:20.295294752Z Starting process for front: 2143
2020-02-24T05:02:20.295478390Z ffprobe -v panic -show_error -show_streams -of json "rtsp://<redacted>:554/Streaming/Channels/2/preview"
2020-02-24T05:02:20.311840327Z Process for cars is not alive. Starting again...
2020-02-24T05:02:20.316754250Z Camera_process started for cars: 2146
2020-02-24T05:02:20.318413900Z Starting process for cars: 2146
2020-02-24T05:02:20.318675695Z ffprobe -v panic -show_error -show_streams -of json "rtsp://<redacted>:554/Streaming/Channels/2/preview"
2020-02-24T05:02:26.763670438Z {'streams': [{'index': 0, 'codec_name': 'h264', 'codec_long_name': 'H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10', 'profile': 'Main', 'codec_type': 'video', 'codec_time_base': '1/8', 'codec_tag_string': '[0][0][0][0]', 'codec_tag': '0x0000', 'width': 640, 'height': 480, 'coded_width': 640, 'coded_height': 480, 'has_b_frames': 0, 'sample_aspect_ratio': '4:3', 'display_aspect_ratio': '16:9', 'pix_fmt': 'yuvj420p', 'level': 22, 'color_range': 'pc', 'color_space': 'bt709', 'color_transfer': 'bt709', 'color_primaries': 'bt709', 'chroma_location': 'left', 'field_order': 'progressive', 'refs': 1, 'is_avc': 'false', 'nal_length_size': '0', 'r_frame_rate': '25/1', 'avg_frame_rate': '4/1', 'time_base': '1/90000', 'start_pts': 132300, 'start_time': '1.470000', 'bits_per_raw_sample': '8', 'disposition': {'default': 0, 'dub': 0, 'original': 0, 'comment': 0, 'lyrics': 0, 'karaoke': 0, 'forced': 0, 'hearing_impaired': 0, 'visual_impaired': 0, 'clean_effects': 0, 'attached_pic': 0, 'timed_thumbnails': 0}}]}
2020-02-24T05:02:26.766983403Z ffmpeg -hide_banner -loglevel panic -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 -hwaccel_output_format yuv420p -avoid_negative_ts make_zero -fflags nobuffer -flags low_delay -strict experimental -fflags +genpts+discardcorrupt -vsync drop -rtsp_transport tcp -stimeout 10000000 -use_wallclock_as_timestamps 1 -i rtsp://<redacted>:554/Streaming/Channels/2/preview -vf mpdecimate -f rawvideo -pix_fmt rgb24 pipe:
2020-02-24T05:02:26.876470992Z {'streams': [{'index': 0, 'codec_name': 'h264', 'codec_long_name': 'H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10', 'profile': 'Main', 'codec_type': 'video', 'codec_time_base': '1/8', 'codec_tag_string': '[0][0][0][0]', 'codec_tag': '0x0000', 'width': 640, 'height': 480, 'coded_width': 640, 'coded_height': 480, 'has_b_frames': 0, 'sample_aspect_ratio': '4:3', 'display_aspect_ratio': '16:9', 'pix_fmt': 'yuvj420p', 'level': 22, 'color_range': 'pc', 'color_space': 'bt709', 'color_transfer': 'bt709', 'color_primaries': 'bt709', 'chroma_location': 'left', 'field_order': 'progressive', 'refs': 1, 'is_avc': 'false', 'nal_length_size': '0', 'r_frame_rate': '25/1', 'avg_frame_rate': '4/1', 'time_base': '1/90000', 'start_pts': 132300, 'start_time': '1.470000', 'bits_per_raw_sample': '8', 'disposition': {'default': 0, 'dub': 0, 'original': 0, 'comment': 0, 'lyrics': 0, 'karaoke': 0, 'forced': 0, 'hearing_impaired': 0, 'visual_impaired': 0, 'clean_effects': 0, 'attached_pic': 0, 'timed_thumbnails': 0}}]}
2020-02-24T05:02:26.878808796Z ffmpeg -hide_banner -loglevel panic -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 -hwaccel_output_format yuv420p -avoid_negative_ts make_zero -fflags nobuffer -flags low_delay -strict experimental -fflags +genpts+discardcorrupt -vsync drop -rtsp_transport tcp -stimeout 10000000 -use_wallclock_as_timestamps 1 -i rtsp://<redacted>:554/Streaming/Channels/2/preview -vf mpdecimate -f rawvideo -pix_fmt rgb24 pipe:
2020-02-24T05:02:33.393187519Z WARNING: Invalid RefPicListX[] entry!!! It is not included in DPB
2020-02-24T05:02:33.474966978Z WARNING: Invalid RefPicListX[] entry!!! It is not included in DPB
2020-02-24T05:02:50.347313806Z Last frame for cat_cam is more than 30 seconds old...
2020-02-24T05:02:50.347383006Z Waiting for process to exit gracefully...
2020-02-24T05:02:50.364062942Z Process for cat_cam is not alive. Starting again...
2020-02-24T05:02:50.367209317Z Camera_process started for cat_cam: 2170
2020-02-24T05:02:50.370395397Z Starting process for cat_cam: 2170
2020-02-24T05:02:50.370812460Z ffprobe -v panic -show_error -show_streams -of json "rtsp://<redacted>:554/onvif1"
2020-02-24T05:02:53.246981901Z {'streams': [{'index': 0, 'codec_name': 'h264', 'codec_long_name': 'H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10', 'profile': 'Baseline', 'codec_type': 'video', 'codec_time_base': '0/2', 'codec_tag_string': '[0][0][0][0]', 'codec_tag': '0x0000', 'width': 1280, 'height': 720, 'coded_width': 1280, 'coded_height': 720, 'has_b_frames': 0, 'sample_aspect_ratio': '0:1', 'display_aspect_ratio': '0:1', 'pix_fmt': 'yuv420p', 'level': 10, 'chroma_location': 'left', 'field_order': 'progressive', 'refs': 1, 'is_avc': 'false', 'nal_length_size': '0', 'r_frame_rate': '15/1', 'avg_frame_rate': '0/0', 'time_base': '1/90000', 'start_pts': 6030, 'start_time': '0.067000', 'bits_per_raw_sample': '8', 'disposition': {'default': 0, 'dub': 0, 'original': 0, 'comment': 0, 'lyrics': 0, 'karaoke': 0, 'forced': 0, 'hearing_impaired': 0, 'visual_impaired': 0, 'clean_effects': 0, 'attached_pic': 0, 'timed_thumbnails': 0}}, {'index': 1, 'codec_name': 'pcm_alaw', 'codec_long_name': 'PCM A-law / G.711 A-law', 'codec_type': 'audio', 'codec_time_base': '1/8000', 'codec_tag_string': '[0][0][0][0]', 'codec_tag': '0x0000', 'sample_fmt': 's16', 'sample_rate': '8000', 'channels': 1, 'bits_per_sample': 8, 'r_frame_rate': '0/0', 'avg_frame_rate': '0/0', 'time_base': '1/8000', 'start_pts': 0, 'start_time': '0.000000', 'bit_rate': '64000', 'disposition': {'default': 0, 'dub': 0, 'original': 0, 'comment': 0, 'lyrics': 0, 'karaoke': 0, 'forced': 0, 'hearing_impaired': 0, 'visual_impaired': 0, 'clean_effects': 0, 'attached_pic': 0, 'timed_thumbnails': 0}}]}
2020-02-24T05:02:53.249318802Z ffmpeg -hide_banner -loglevel panic -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 -hwaccel_output_format yuv420p -avoid_negative_ts make_zero -fflags nobuffer -flags low_delay -strict experimental -fflags +genpts+discardcorrupt -vsync drop -rtsp_transport udp -stimeout 10000000 -use_wallclock_as_timestamps 1 -i rtsp://172.16.16.192:554/onvif1 -vf mpdecimate -f rawvideo -pix_fmt rgb24 pipe:
2020-02-24T05:03:50.427565311Z Last frame for front is more than 30 seconds old...
2020-02-24T05:03:50.427643692Z Waiting for process to exit gracefully...
2020-02-24T05:03:50.433983196Z Process for front is not alive. Starting again...
2020-02-24T05:03:50.438162629Z Camera_process started for front: 2189
2020-02-24T05:03:50.439864889Z Starting process for front: 2189
2020-02-24T05:03:50.440163768Z ffprobe -v panic -show_error -show_streams -of json "rtsp://<redacted>:554/Streaming/Channels/2/preview"
2020-02-24T05:03:57.156244153Z {'streams': [{'index': 0, 'codec_name': 'h264', 'codec_long_name': 'H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10', 'profile': 'Main', 'codec_type': 'video', 'codec_time_base': '1/8', 'codec_tag_string': '[0][0][0][0]', 'codec_tag': '0x0000', 'width': 640, 'height': 480, 'coded_width': 640, 'coded_height': 480, 'has_b_frames': 0, 'sample_aspect_ratio': '4:3', 'display_aspect_ratio': '16:9', 'pix_fmt': 'yuvj420p', 'level': 22, 'color_range': 'pc', 'color_space': 'bt709', 'color_transfer': 'bt709', 'color_primaries': 'bt709', 'chroma_location': 'left', 'field_order': 'progressive', 'refs': 1, 'is_avc': 'false', 'nal_length_size': '0', 'r_frame_rate': '25/1', 'avg_frame_rate': '4/1', 'time_base': '1/90000', 'start_pts': 132300, 'start_time': '1.470000', 'bits_per_raw_sample': '8', 'disposition': {'default': 0, 'dub': 0, 'original': 0, 'comment': 0, 'lyrics': 0, 'karaoke': 0, 'forced': 0, 'hearing_impaired': 0, 'visual_impaired': 0, 'clean_effects': 0, 'attached_pic': 0, 'timed_thumbnails': 0}}]}
2020-02-24T05:03:57.157939769Z ffmpeg -hide_banner -loglevel panic -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 -hwaccel_output_format yuv420p -avoid_negative_ts make_zero -fflags nobuffer -flags low_delay -strict experimental -fflags +genpts+discardcorrupt -vsync drop -rtsp_transport tcp -stimeout 10000000 -use_wallclock_as_timestamps 1 -i rtsp://<redacted>:554/Streaming/Channels/2/preview -vf mpdecimate -f rawvideo -pix_fmt rgb24 pipe:
2020-02-24T05:04:03.796864199Z WARNING: Invalid RefPicListX[] entry!!! It is not included in DPB
2020-02-24T05:04:50.471373276Z Last frame for cars is more than 30 seconds old...
2020-02-24T05:04:50.471689713Z Waiting for process to exit gracefully...
2020-02-24T05:04:50.484523723Z Process for cars is not alive. Starting again...
2020-02-24T05:04:50.487463841Z Camera_process started for cars: 2204
2020-02-24T05:04:50.489995289Z Starting process for cars: 2204

I had a look at the logs around the time it stopped “recovering” but didn’t see anything obvious. I’ll give RC3 a go now.

Hi, what am I doing wrong?

/arrow/cpp/src/plasma/store.cc:1226: Allowing the Plasma store to use up to 0.4GB of memory.
/arrow/cpp/src/plasma/store.cc:1253: Starting object store with directory /dev/shm and huge page support disabled
On connect called
Traceback (most recent call last):
  File "detect_objects.py", line 246, in <module>
    main()
  File "detect_objects.py", line 148, in main
    'fps': mp.Value('d', float(config['fps'])),
KeyError: 'fps'
Starting detection process: 21

You don’t have fps defined in your camera config. It is a required value now.

cameras:
  principal:
    ffmpeg:
      ################
      # Source passed to ffmpeg after the -i parameter. Supports anything compatible with OpenCV and FFmpeg.
      # Environment variables that begin with 'FRIGATE_' may be referenced in {}
      ################
      input: rtsp://xxxx:[email protected]:554/h264Preview_01_main
      take_frame: 1
      fps: 5

@blakeblackshear, I have it like this.

It’s dying quicker now. I’ve deleted the container and restarted it and it still died after about 15 minutes.

Here are a few lines from the logs before it dies… it repeats this over and over until it just stops. (The formatting is weird since the logs were exported from the Synology Docker window… I can’t figure out how to extract the logs directly from the server.)

2020-02-24 22:55:07 stderr: frame= 595 fps= 19 q=-0.0 size= 5205060kB time=00:00:29.75 bitrate=1433272.3kbits/s speed=0.97x
/arrow/cpp/src/plasma/eviction_policy.cc:134: There is not enough space to create this object, so evicting 9 objects to free up 80628480 bytes. The number of bytes in use (before this eviction) is 391446528.

2020-02-24 22:55:08 stderr: frame= 606 fps= 19 q=-0.0 size= 5301288kB time=00:00:30.30 bitrate=1433272.3kbits/s speed=0.971x
/arrow/cpp/src/plasma/eviction_policy.cc:134: There is not enough space to create this object, so evicting 9 objects to free up 80628480 bytes. The number of bytes in use (before this eviction) is 391446528.

2020-02-24 22:55:08 stderr: /arrow/cpp/src/plasma/eviction_policy.cc:134: There is not enough space to create this object, so evicting 9 objects to free up 80628480 bytes. The number of bytes in use (before this eviction) is 391446528.

take_frame and fps are indented under ffmpeg. Pull them up a level.
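
For example, using the config above, the corrected indentation would be (take_frame and fps at the camera level, as siblings of ffmpeg):

cameras:
  principal:
    ffmpeg:
      input: rtsp://xxxx:[email protected]:554/h264Preview_01_main
    take_frame: 1
    fps: 5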

Those messages are a symptom of an error earlier in the logs. Still trying to figure out where it could be happening.

I think I’m having the same issue as @surge919. My captures on RC3 just seem to stop working; I’m happy to send you my logs if you want them.

That would be helpful. I am also running into some issues, but my queue never fills up.

OK, I’ll leave it running overnight and see if it reproduces. I’ll clear out the logs tonight so we know they’re fresh. Thanks.

I continue to get the errors on RC3.

The MJPEG endpoints continue to function, but nothing gets detected. The Coral inference speed stays more or less the same, but the FPS on the Coral drops to zero. The little white light on the Coral flashes constantly (not pulsing as it does when detecting).

Coral FPS: [screenshot]
I restarted it around 23:00 and it ran until 1:26 and then stopped. It crashed earlier in the day too.

There’s no Plasma RC value in the stats debug (“null”), and the queue on the Coral seems fine as well (it actually still seems to update):

Coral queue: [screenshot]

I had a look in the logs around the time it crashed and can’t really see anything different from the other times.

2020-02-24T23:26:16.088823171Z Last frame for cat_cam is more than 30 seconds old...
2020-02-24T23:26:16.089212981Z Waiting for process to exit gracefully...
2020-02-24T23:26:16.100738214Z /arrow/cpp/src/plasma/store.cc:738: Disconnecting client on fd 7
2020-02-24T23:26:16.105777191Z Process for cat_cam is not alive. Starting again...
2020-02-24T23:26:16.111637219Z Camera_process started for cat_cam: 379
2020-02-24T23:26:16.111847316Z Starting process for cat_cam: 379
2020-02-24T23:26:16.112083374Z ffprobe -v panic -show_error -show_streams -of json "rtsp://172.16.16.192:554/onvif1"
...

I did notice in the Docker log that one of the cameras seems to have connection issues, so I’ve removed that one for now and will continue monitoring the rest.

And you are sure there is motion that would trigger detections? The Coral FPS will drop to zero when there is no motion.

Yes, there should have been motion. I restarted again this morning (with only 2 cameras instead of 3); both cameras point at the street and a lot of trees/bushes, and I know my wife walked around there some time afterwards.

You can see that after I started it, it picked up a lot of motion and sent it to the Coral, but then at 9:47 it abruptly stopped and there has been “nothing” since.

[screenshot, 2020-02-25 14:15]

Cameras were definitely still alive (I removed the blue one from Frigate this morning):

[screenshot]

I think you and I are seeing the same problem:
[screenshot]

When my container gets in this state, I can see some defunct ffmpeg processes. I am trying to think of a better way to manage the subprocess.
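
Something along these lines might work — just a sketch of a poll-and-respawn pattern (ensure_ffmpeg is a hypothetical helper, not what Frigate does today):

import subprocess

def ensure_ffmpeg(proc, cmd):
    # hypothetical helper: poll() reaps a finished child so it doesn't
    # linger as <defunct>, and a fresh ffmpeg is spawned if it has exited
    if proc is None or proc.poll() is not None:
        if proc is not None:
            print('ffmpeg exited with code {}, restarting...'.format(proc.returncode))
        proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, bufsize=10**8)
    return proc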

Hello,

We are in the same boat :slight_smile:

Thanks for your amazing work :clap:

Did you restart the Docker container in between those spikes?

Thanks for confirming, Blake. I’ve set up an automation to warn me if the FPS drops to 0 for longer than 5 minutes; then I’ll look at the logs and at top to see if there are any stuck processes (anything to look for?)
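
For reference, the automation is along these lines (sensor.frigate_coral_fps is just an illustrative entity id for whatever sensor you’ve built from Frigate’s stats):

automation:
  - alias: Frigate Coral FPS stalled
    trigger:
      - platform: numeric_state
        entity_id: sensor.frigate_coral_fps  # illustrative entity id
        below: 1
        for: "00:05:00"
    action:
      - service: notify.notify
        data:
          message: Frigate Coral FPS has been 0 for 5 minutes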

On that topic, I installed “htop” to look at the processes and their threads (green). Anything you can glean from this? Should there be so many?

Yes, in between. It works after a Docker restart.

Thanks again @blakeblackshear for your continued work and for sharing this with the world! Does “Coral Inference” = 10.0ms mean it took 10 ms to detect an object? Is that correct? So the smaller the number, the better the performance?