Local realtime person detection for RTSP cameras

Thing is, I'm only using the custom integration; no other sensors are being created manually. Stopping the integration causes the errors to stop, etc.
My config is fairly basic at this point too

[edit] host reboot may have sorted it

@biggen1684
Thanks for the hints, I will try it first with a higher resolution. I have set max_disappeared to 75, so it needs motion within 15 seconds; this works a little better. The main problem is presence detection in the kitchen: my wife loves the automatic lights, but only when she comes in or leaves, not when the lights go out while she is in the kitchen preparing lunch :wink:
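For reference, the arithmetic behind that 15 seconds (a sketch; it assumes max_disappeared counts frames on a 5 fps detect stream, which matches the configs posted in this thread):

```python
# max_disappeared is counted in frames, so the real timeout
# depends on the detect stream's frame rate (5 fps assumed here).
def disappear_timeout_seconds(max_disappeared, fps=5):
    """Seconds an object can go undetected before it is dropped."""
    return max_disappeared / fps

print(disappear_timeout_seconds(75))  # 75 frames / 5 fps = 15.0 seconds
```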
@hasshoolio
Stupid question: what is the official way to make a feature request? Post it as an issue on GitHub (sorry, I am not very used to things like that), or here?

Last question: how can I check whether the hardware acceleration is working? I have an AMD 4350G with onboard graphics, and I set the environment variable as described in the docs along with the ffmpeg settings. But CPU usage and energy usage are nearly the same. Do I have to do something on the host system (Debian with a backport kernel) too?

EDIT: FYI, found it myself; it is working (without doing anything on the host system). At low resolutions (640x360) it actually uses slightly more CPU (3.3% vs. 4.0-4.5%). But at higher resolution (1920x1080) it drops from about 13.5% to 9%, so even if you're using a Coral, enabling hardware acceleration is a good choice.

Thanks in Advance

Mike

not stupid at all

That’s what I’ve seen people do in the past. I’m not sure if Blake or Paul have ever specified how they want feature requests, whether as a discussion or an issue on GitHub. It’s pretty straightforward; take a look around and you’ll see titles like [FR] or [feature request].

If you don’t specify CPU in the config, I think it will only use your Edge TPU, or error out and not start. See https://blakeblackshear.github.io/frigate/configuration/detectors
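For example, a minimal detectors section forcing CPU detection might look like this (a sketch; the detector name is arbitrary):

```yaml
# Sketch only: with no detectors section, Frigate assumes an Edge TPU.
# Declaring a cpu detector explicitly makes it fall back to CPU inference:
detectors:
  cpu1:
    type: cpu
```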

I’d do a feature request (or bug notification) on github. It’s much easier for the devs to track there.

It was a good idea. I tried adding the detectors: section, but since I’m using a USB Coral, it seems that leaving it out was okay (as evidenced by it working with the same config file when run from the command line). Anyhow, here’s the config and matching log. I can’t figure it out. The same config works with the command line posted in my previous comment.
Config:

mqtt:
  host: 10.10.10.20
  user: homeassistant 
  password: '{FRIGATE_MQTT_PASSWORD}'
objects:
  track:
    - person
    - cat
    - dog
    - bird
  filters:
    person:
      min_area: 4000
      min_score: 0.5
      threshold: 0.72
    cat:
      threshold: 0.72
    dog:
      threshold: 0.72
    bird:
      threshold: 0.72

detectors:
  coral:
    type: edgetpu
    device: usb

ffmpeg:
  hwaccel_args:
    - -hwaccel
    - vaapi
    - -hwaccel_device
    - /dev/dri/renderD128
    - -hwaccel_output_format
    - yuv420p

cameras:
  right:
    width: 1080
    height: 1920
    fps: 4
    ffmpeg:
      inputs:
        - path: rtsp://scstraus:{FRIGATE_RTSP_PASSWORD}@192.168.1.54:554/Streaming/channels/1
          roles:
            - detect
            - rtmp
            - clips
    objects:
      filters:
        person:
          min_area: 9000
        
    snapshots:
      enabled: True
      timestamp: False
      bounding_box: True
    clips:
      enabled: True
    motion:
      mask:
        - 961,218,933,0,1080,0,1080,1527,1004,1920,1080,1920,1080,948,994,392,623,129,505,376,383,672,140,1353,0,1866,0,0,687,0,678,30

  left:
    width: 1080
    height: 1920
    fps: 4
    ffmpeg:
      inputs:
        - path: rtsp://scstraus:{FRIGATE_RTSP_PASSWORD}@192.168.1.34:554/Streaming/Channels/101?transportmode=multicast&profile=Profile_1
          roles:
            - detect
            - rtmp
            - clips
    objects:
      filters:
        person:
          min_area: 14000
    snapshots:
      enabled: True
      timestamp: False
      bounding_box: True
    clips:
      enabled: True
    motion:
      mask:
        - 653,128,654,179,886,169,872,176,905,0,0,0,0,406,335,144
        - 1080,0,1080,205,952,178,945,101,987,0
        - 1080,952,1031,917,1047,817,1014,796,1059,768,1080,667
        - 813,368,948,368,951,292,808,298

  front:
    width: 1920
    height: 1080
    fps: 4
    ffmpeg:
      inputs:
        - path: rtsp://scstraus:{FRIGATE_RTSP_PASSWORD}@192.168.1.26:554/Streaming/Channels/101?transportmode=multicast&profile=Profile_1 
          roles:
            - detect
            - rtmp
            - clips
    snapshots:
      enabled: True
      timestamp: False
      bounding_box: True
    clips:
      enabled: True
    objects:
      track:
        - car
        - truck
        - person
        - cat
        - dog
        - bird
    zones:
    # Coordinates can be generated at https://www.image-map.net/

      on_property:
        coordinates: 552,0,581,60,823,172,869,207,1198,443,1231,424,1736,807,1810,679,1920,768,1917,1073,3,1080,0,0
      on_sidewalk:
        coordinates: 570,0,863,103,1206,315,1264,314,1919,802,1919,683,1318,254,923,61,721,0
      on_street:
        coordinates: 573,0,855,90,862,117,1191,303,1275,317,1919,783,1919,0
      at_gate:
        coordinates: 598,0,682,0,780,0,987,0,871,97
      inside_gate:
        coordinates: 621,380,821,212,801,71,661,37,504,119
      car_pulling_in:
        coordinates: 871,182,1177,408,1225,391,1222,501,636,835,415,531,615,391,574,277,852,101
    motion:
      mask:
        - 0,1080,0,0,62,0,792,1080


  back:
    width: 1920
    height: 1080
    fps: 4
    ffmpeg:
      inputs:
        - path: rtsp://scstraus:{FRIGATE_RTSP_PASSWORD}@192.168.1.23:554/Streaming/Channels/101?transportmode=multicast&profile=Profile_1 
          roles:
            - detect
            - rtmp
            - clips
    objects:
      filters:
        person:
          min_area: 9000
    snapshots:
      enabled: True
      timestamp: False
      bounding_box: True
    clips:
      enabled: True
    motion:
      mask:
        - 739,0,745,54,124,56,120,0

Log:

 * Starting nginx nginx
   ...done.
Starting migrations
peewee_migrate                 INFO    : Starting migrations
There is nothing to migrate
peewee_migrate                 INFO    : There is nothing to migrate
frigate.mqtt                   INFO    : MQTT connected
detector.coral                 INFO    : Starting detection process: 35
frigate.app                    INFO    : Camera processor started for right: 38
frigate.edgetpu                INFO    : Attempting to load TPU as usb
frigate.app                    INFO    : Camera processor started for left: 39
Process detector:coral:
frigate.edgetpu                INFO    : No EdgeTPU detected.
frigate.app                    INFO    : Camera processor started for front: 40
frigate.app                    INFO    : Camera processor started for back: 41
frigate.app                    INFO    : Capture process started for right: 42
frigate.app                    INFO    : Capture process started for left: 43
frigate.app                    INFO    : Capture process started for front: 46
frigate.app                    INFO    : Capture process started for back: 48
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/tflite_runtime/interpreter.py", line 152, in load_delegate
    delegate = Delegate(library, options)
  File "/usr/local/lib/python3.8/dist-packages/tflite_runtime/interpreter.py", line 111, in __init__
    raise ValueError(capture.message)
ValueError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
    self.run()
  File "/usr/lib/python3.8/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/opt/frigate/frigate/edgetpu.py", line 124, in run_detector
    object_detector = LocalObjectDetector(tf_device=tf_device, num_threads=num_threads)
  File "/opt/frigate/frigate/edgetpu.py", line 63, in __init__
    edge_tpu_delegate = load_delegate('libedgetpu.so.1.0', device_config)
  File "/usr/local/lib/python3.8/dist-packages/tflite_runtime/interpreter.py", line 154, in load_delegate
    raise ValueError('Failed to load delegate from {}\n{}'.format(
ValueError: Failed to load delegate from libedgetpu.so.1.0

frigate.watchdog               INFO    : Detection appears to have stopped. Exiting frigate...
frigate.app                    INFO    : Stopping...
frigate.record                 INFO    : Exiting recording maintenance...
frigate.object_processing      INFO    : Exiting object processor...
frigate.events                 INFO    : Exiting event processor...
frigate.events                 INFO    : Exiting event cleanup...
frigate.watchdog               INFO    : Exiting watchdog...
frigate.app                    INFO    : Stopping...
 * Starting nginx nginx
   ...done.
Starting migrations
peewee_migrate                 INFO    : Starting migrations
There is nothing to migrate
peewee_migrate                 INFO    : There is nothing to migrate
frigate.mqtt                   INFO    : MQTT connected
detector.coral                 INFO    : Starting detection process: 33
frigate.app                    INFO    : Camera processor started for right: 36
frigate.app                    INFO    : Camera processor started for left: 37
frigate.app                    INFO    : Camera processor started for front: 38
frigate.app                    INFO    : Camera processor started for back: 39
frigate.app                    INFO    : Capture process started for right: 40
frigate.app                    INFO    : Capture process started for left: 41
frigate.app                    INFO    : Capture process started for front: 42
frigate.app                    INFO    : Capture process started for back: 43
frigate.edgetpu                INFO    : Attempting to load TPU as usb
Process detector:coral:
frigate.edgetpu                INFO    : No EdgeTPU detected.
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/tflite_runtime/interpreter.py", line 152, in load_delegate
    delegate = Delegate(library, options)
  File "/usr/local/lib/python3.8/dist-packages/tflite_runtime/interpreter.py", line 111, in __init__
    raise ValueError(capture.message)
ValueError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
    self.run()
  File "/usr/lib/python3.8/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/opt/frigate/frigate/edgetpu.py", line 124, in run_detector
    object_detector = LocalObjectDetector(tf_device=tf_device, num_threads=num_threads)
  File "/opt/frigate/frigate/edgetpu.py", line 63, in __init__
    edge_tpu_delegate = load_delegate('libedgetpu.so.1.0', device_config)
  File "/usr/local/lib/python3.8/dist-packages/tflite_runtime/interpreter.py", line 154, in load_delegate
    raise ValueError('Failed to load delegate from {}\n{}'.format(
ValueError: Failed to load delegate from libedgetpu.so.1.0

frigate.watchdog               INFO    : Detection appears to have stopped. Exiting frigate...
frigate.app                    INFO    : Stopping...
frigate.record                 INFO    : Exiting recording maintenance...
frigate.object_processing      INFO    : Exiting object processor...
frigate.events                 INFO    : Exiting event processor...
frigate.events                 INFO    : Exiting event cleanup...
frigate.watchdog               INFO    : Exiting watchdog...

That is strange. It says “TPU found” when you run it from the command line with that long command?

In your docker run command, you are passing the usb bus as a volume. In the compose file, you are passing it as a device. That’s the difference.
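For anyone skimming later, if I'm reading the fix right, it amounts to making the compose file match the working docker run command (a sketch, not the exact files):

```yaml
# The working docker run command mounted the USB bus as a volume:
#   -v /dev/bus/usb:/dev/bus/usb
# Matching that in docker-compose means a volumes: entry, not devices:
volumes:
  - /dev/bus/usb:/dev/bus/usb
# (privileged: true is also commonly needed for USB Coral access)
```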


Thank you Blake. You are the best person in the world. It works now. That’s one of those things I would have looked at 100 times and just said “no, it’s exactly the same” and not thought about. Out of curiosity, why is it this way in the documentation? Is there something different about my environment, or should the documentation be updated?

It would be good to have a consistent approach in the docs. It is a bit of a moving target depending on which docker and docker-compose version you are using.


@blakeblackshear: I just wanted to say a BIG thank you for all the work you are investing in the development of Frigate. I used the “old” 0.7x version for quite some time and was very satisfied with that.
But the additions in 0.8x and the HA integration are a huge improvement. The switch from 0.7 to 0.8 was very smooth and the new web interface looks fantastic. Enabling/disabling of clips/snapshots is a very nice addition as well because I always wanted clips to be saved only if my alarm system is active.

The only bad thing :face_with_hand_over_mouth: most of my Node-RED automations that performed tasks like auto-discovery of Frigate-related sensors for HA, clip & snapshot handling, etc. are no longer needed, because your components already take care of all this.

So: thanks again for your great work!

I think I’ve fixed the corruption in the stream coming from my Reolink! Last night I updated the firmware on my PoE switch, as it was quite old, and I haven’t seen any events with artefacts since :grinning:

I’ve only had a few events of car detection and I think they were triggered by the bushes and shadows so I reduced the motion mask. Is there any way to get motion boxes on the snapshots?

BTW the copy function when setting the masks isn’t working for me in Chrome on Windows x64. Also, the mask areas seem to change randomly when adding new ones. If you need any more info let me know.

EDIT: Nope it’s back again :joy: :roll_eyes:

One more thing: any tips on improving detection of smaller objects like cats? I tried at 1920x1080 and got it working, but as soon as I switch down to 640x480 it stops.

Upping the resolution is the only way to improve detection of smaller/further objects. At least, that is what I’ve found. I run 1080 on all my detect streams now because of that.
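Rough numbers illustrate why (a sketch, assuming the object fills the same fraction of the frame at either resolution):

```python
def scaled_area(area, src, dst):
    """Scale a pixel area between detect resolutions, assuming the
    object occupies the same fraction of the frame in both."""
    sw, sh = src
    dw, dh = dst
    return round(area * (dw * dh) / (sw * sh))

# A cat covering 4000 px at 1920x1080 covers only ~593 px at 640x480,
# small enough to fall below a min_area filter or the model's limits.
print(scaled_area(4000, (1920, 1080), (640, 480)))
```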


How many cameras are people running on a Raspberry Pi 4?

I’ve been running two fine for a while now (both 1280x720) and just added a third earlier today (1920x1080). I ran into an issue with a green image for one of the cameras until I came across the gpu_mem=256 fix in /boot/config.txt. I’m wondering now how much that change will help me in the future. Was it just enough to get a third camera running? Or should I be able to add a few more yet?
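For reference, the fix mentioned is a single line in the Pi's boot config (assuming the stock /boot/config.txt location):

```ini
# /boot/config.txt on the Raspberry Pi 4:
# give the GPU more memory so the hardware decoder can handle more streams
gpu_mem=256
```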

Hi, please help, I don’t get it. I am walking around and red squares appear, but there are no events and no text above me with the “person” label.
Can somebody share their config for me to check my settings against?
config:

detectors:
  coral:
    type: cpu
mqtt:
  host: 192.168.5.6
  port: 1883
  topic_prefix: frigate
  client_id: frigate
  user: xxxxxx
  password: xxxxx
  stats_interval: 60

cameras:
  back:
    ffmpeg:
      inputs:
        - path: rtsp://xxxx:[email protected]:554/cam/realmonitor?channel=1&subtype=0&authbasic=aG9tZXNwRGUxc3RyYXRvMnNmZXJhMw==
          roles:
            - detect
    width: 1920
    height: 1080
    fps: 5

    best_image_timeout: 60

    detect:
      enabled: True
      max_disappeared: 25

    clips:
      enabled: False
      pre_capture: 5
      post_capture: 5
      objects:
        - person
      required_zones: []
      retain:
        default: 10
        objects:
          person: 15

    record:
      enabled: False
      retain_days: 30
    rtmp:
      # Required: Enable the live stream (default: True)
      enabled: False

    snapshots:
      enabled: False
      timestamp: False
      bounding_box: False
      crop: False
      height: 175
      required_zones: []
      retain:
        default: 10
        objects:
          person: 15

    mqtt:
      enabled: False

      timestamp: True
      bounding_box: True
      crop: True
      height: 270
      required_zones: []

    objects:
      track:
        - person
        - car

      filters:
        person:
          min_area: 1000
          max_area: 100000
          min_score: 0.5
          threshold: 0.7

Agreed, v0.8 has already exceeded every expectation I ever had for this project, and we aren’t even at v1 yet. Well done Blake, you are a coding monster.

If I knew enough to know the cases in which one or the other was appropriate, I would do it. Unfortunately, I only know what works in my case.

Silly question: is there a way to create an object of a generic “animal” type, or does that have to be something on the tflite side? The reason I ask: in addition to tracking people/vehicles on our driveway, it’s also a game trail, so we get all sorts of animals (deer, bear, bobcats, coyotes, etc.) coming up and down. Right now I rely on a generic motion sensor that takes a snapshot from the camera if something walks by. I would love to capture if a generic “animal” goes by, but I’m guessing that’s not really possible.
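As far as I know the default COCO model has no generic “animal” label, but one workaround sketch is to track the individual animal labels the model does know and treat any of them as “animal” on the automation side:

```yaml
# Sketch: animal-like labels that do exist in the default COCO model
objects:
  track:
    - person
    - car
    - cat
    - dog
    - bird
    - horse
    - sheep
    - cow
    - bear
```

Deer, bobcats, and coyotes aren’t COCO classes, so they would likely be mis-labelled as the closest match (often dog or horse) or missed entirely.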

Hi, as you are online, would you mind sharing your config? <3

Not sure how much help mine would be; I deployed mine into Kubernetes via basically this Helm chart, and I’m not sure how many others are doing that: https://github.com/blakeblackshear/blakeshome-charts/blob/9197203fc97ffe92e978a66fafc66a5d94184b9b/charts/frigate/values.yaml