Local realtime person detection for RTSP cameras

Hwaccel for ffmpeg is completely unrelated to inference times and separate from the job the Coral performs. You always want both if possible. The Coral does object detection; the hwaccel args for ffmpeg are for decoding the video stream.
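For reference, the two are configured in separate places in Frigate’s config. A minimal sketch only (the camera name and stream URL are placeholders, and the hwaccel args assume an Intel iGPU with VAAPI):

detectors:
  coral:
    type: edgetpu
    device: usb

cameras:
  example_cam:
    ffmpeg:
      # hwaccel only offloads H.264 decoding; it does not change inference times
      hwaccel_args:
        - -hwaccel
        - vaapi
        - -hwaccel_device
        - /dev/dri/renderD128
        - -hwaccel_output_format
        - yuv420p
      inputs:
        - path: rtsp://user:pass@camera-ip:554/stream
          roles:
            - detect
    width: 1280
    height: 720
    fps: 5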

Does this dog know how to cook? xD


Thank you for letting us know about the results!

I think I may have to give it a shot soon. Edit: That T430 might be exactly what I was looking for. Knowing that, I may pick one up as well.

I’m setting up Frigate on an old Zbox HD11 (NVIDIA ION) with a Coral USB, but I got this error when I brought the container up after pulling it.

I’m using docker-compose and this image:

image: blakeblackshear/frigate:stable-amd64

root@cams:~/docker-compose.d/frigate# docker-compose up


Starting frigate ... done
Attaching to frigate
frigate    |  * Starting nginx nginx
frigate    |    ...done.
frigate    | Fatal Python error: Illegal instruction
frigate    |
frigate    | Current thread 0x00007fcf4a990740 (most recent call first):
frigate    |   File "<frozen importlib._bootstrap>", line 219 in _call_with_frames_removed
frigate    |   File "<frozen importlib._bootstrap_external>", line 1101 in create_module
frigate    |   File "<frozen importlib._bootstrap>", line 556 in module_from_spec
frigate    |   File "<frozen importlib._bootstrap>", line 657 in _load_unlocked
frigate    |   File "<frozen importlib._bootstrap>", line 975 in _find_and_load_unlocked
frigate    |   File "<frozen importlib._bootstrap>", line 991 in _find_and_load
frigate    |   File "<frozen importlib._bootstrap>", line 219 in _call_with_frames_removed
frigate    |   File "<frozen importlib._bootstrap>", line 1042 in _handle_fromlist
frigate    |   File "/usr/local/lib/python3.8/dist-packages/tflite_runtime/interpreter.py", line 36 in <module>
frigate    |   File "<frozen importlib._bootstrap>", line 219 in _call_with_frames_removed
frigate    |   File "<frozen importlib._bootstrap_external>", line 783 in exec_module
frigate    |   File "<frozen importlib._bootstrap>", line 671 in _load_unlocked
frigate    |   File "<frozen importlib._bootstrap>", line 975 in _find_and_load_unlocked
frigate    |   File "<frozen importlib._bootstrap>", line 991 in _find_and_load
frigate    |   File "/opt/frigate/frigate/edgetpu.py", line 15 in <module>
frigate    |   File "<frozen importlib._bootstrap>", line 219 in _call_with_frames_removed
frigate    |   File "<frozen importlib._bootstrap_external>", line 783 in exec_module
frigate    |   File "<frozen importlib._bootstrap>", line 671 in _load_unlocked
frigate    |   File "<frozen importlib._bootstrap>", line 975 in _find_and_load_unlocked
frigate    |   File "<frozen importlib._bootstrap>", line 991 in _find_and_load
frigate    |   File "/opt/frigate/frigate/app.py", line 19 in <module>
frigate    |   File "<frozen importlib._bootstrap>", line 219 in _call_with_frames_removed
frigate    |   File "<frozen importlib._bootstrap_external>", line 783 in exec_module
frigate    |   File "<frozen importlib._bootstrap>", line 671 in _load_unlocked
frigate    |   File "<frozen importlib._bootstrap>", line 975 in _find_and_load_unlocked
frigate    |   File "<frozen importlib._bootstrap>", line 991 in _find_and_load
frigate    |   File "/opt/frigate/frigate/__main__.py", line 7 in <module>
frigate    |   File "/usr/lib/python3.8/runpy.py", line 87 in _run_code
frigate    |   File "/usr/lib/python3.8/runpy.py", line 194 in _run_module_as_main

Any ideas?

Thank You!

Your CPU needs to have the AVX instruction set for TensorFlow to work.

This is a “feature” of the iOS app. If you use the same URL as your internal or external URL as configured in the app, it assumes you want to open it in a web view inside the app itself. The way I got around it is by adding another subdomain for media.mydomain.com that is different than my external URL, but still points to Home Assistant.

I also created a pretty thorough blueprint you can use as well here: Frigate Mobile App Notifications

ooohhhh… thank you…
I guess I assumed that even a Commodore 64 with a USB Coral could work… :slight_smile:

thank you

Frigate is an amazing project. However, I still cannot get hardware acceleration to work with Frigate. Whenever I enable hardware acceleration, the camera output goes green and the log shows an error: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures

I have tried on an RPi 4 running HassOS and now on a VM running in VirtualBox on an Ubuntu 20.04 host. I am running 8 cameras: 4 Sercomm iCamera2 and 4 Wyze cams (with the RTSP firmware). I could not get hardware acceleration working on either machine with any of the cameras.

Is it just my cameras?
Has anyone had any success with hardware acceleration on a VirtualBox VM? I have enabled Nested VT-x/AMD-V in the VM. I have also installed VA-API on the host. When I run vainfo I get the following output:

libva info: VA-API version 1.7.0
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/iHD_drv_video.so
libva info: Found init function __vaDriverInit_1_7
libva error: /usr/lib/x86_64-linux-gnu/dri/iHD_drv_video.so init failed
libva info: va_openDriver() returns 1
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/i965_drv_video.so
libva info: Found init function __vaDriverInit_1_6
libva info: va_openDriver() returns 0
vainfo: VA-API version: 1.7 (libva 2.6.0)
vainfo: Driver version: Intel i965 driver for Intel(R) Ivybridge Desktop - 2.4.0
vainfo: Supported profile and entrypoints
      VAProfileMPEG2Simple            : VAEntrypointVLD
      VAProfileMPEG2Simple            : VAEntrypointEncSlice
      VAProfileMPEG2Main              : VAEntrypointVLD
      VAProfileMPEG2Main              : VAEntrypointEncSlice
      VAProfileH264ConstrainedBaseline: VAEntrypointVLD
      VAProfileH264ConstrainedBaseline: VAEntrypointEncSlice
      VAProfileH264Main               : VAEntrypointVLD
      VAProfileH264Main               : VAEntrypointEncSlice
      VAProfileH264High               : VAEntrypointVLD
      VAProfileH264High               : VAEntrypointEncSlice
      VAProfileH264StereoHigh         : VAEntrypointVLD
      VAProfileVC1Simple              : VAEntrypointVLD
      VAProfileVC1Main                : VAEntrypointVLD
      VAProfileVC1Advanced            : VAEntrypointVLD
      VAProfileNone                   : VAEntrypointVideoProc
      VAProfileJPEGBaseline           : VAEntrypointVLD

I am not sure if the error for the iHD driver is significant or not, because the i965 driver seems to be working.

I would appreciate any suggestions.

Same error when I enable hardware acceleration on my RPi 4.
Config:
RPi 4 (4 GB)
Debian Buster 10 + Docker Home Assistant + Frigate add-on

INFO : camera_entree: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures
frigate.video INFO : camera_entree: ffmpeg process is not running. exiting capture thread…
ffmpeg.camera_entree.detect ERROR : mmal: mmal_vc_shm_init: could not initialize vc shared memory service
ffmpeg.camera_entree.detect ERROR : mmal: mmal_vc_component_create: failed to initialise shm for ‘vc.ril.video_decode’ (7:EIO)
ffmpeg.camera_entree.detect ERROR : mmal: mmal_component_create_core: could not create component ‘vc.ril.video_decode’ (7)
ffmpeg.camera_entree.detect ERROR : Error while opening decoder for input stream #0:0 : Unknown error occurred
frigate.video INFO : camera_entree: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures
frigate.video

Blake shared the following in another post. You can access Reolink RTMP streams as follows (don’t forget to add the ffmpeg input params in the config file; I have added them below).
Main stream:
rtmp://192.168.1.xx/bcs/channel0_main.bcs?channel=0&stream=0&user=admin&password={FRIGATE_RTSP_PASSWORD}
Sub stream:
rtmp://192.168.1.xx/bcs/channel0_sub.bcs?channel=0&stream=1&user=admin&password={FRIGATE_RTSP_PASSWORD}

ffmpeg:
  input_args: #Use for RTMP
    - -avoid_negative_ts
    - make_zero
    - -fflags
    - nobuffer
    - -flags
    - low_delay
    - -strict
    - experimental
    - -fflags
    - +genpts+discardcorrupt
    - -rw_timeout
    - '5000000'
    - -use_wallclock_as_timestamps
    - '1'
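
Putting it together, a camera entry that uses the sub stream for detection could look roughly like this (a sketch only; the camera name, resolution, and fps are placeholders, and the global input_args above still apply):

cameras:
  reolink_front:
    ffmpeg:
      inputs:
        - path: rtmp://192.168.1.xx/bcs/channel0_sub.bcs?channel=0&stream=1&user=admin&password={FRIGATE_RTSP_PASSWORD}
          roles:
            - detect
    width: 640
    height: 480
    fps: 5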

Frigate works great! Thanks for putting this together.

Now I need to know how to train it to recognize different objects. In particular, I wish it were trained for “generic animal” rather than “dog”/“cat”/etc. My cameras don’t tend to see many giraffes or elephants, but they see lots of squirrels, raccoons, deer, and coyotes. It labels the coyote as a “dog” (which it is) and bobcats as “cat” (OK, I can live with that), but it normally ignores deer and the other critters I’d like to alert on.


Coral dual M.2 with an E->M adapter in a NUC 8 running HAOS 6 RC2.

As noted elsewhere, the adapter only gives you a single PCIe lane, so you only get access to a single TPU on the board. I’m getting a 20.95 ms inference speed compared to 100+ ms on the CPU.

Now the question is…can the other TPU be presented via USB off of the adapter headers to the internal USB headers on the NUC?
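
For what it’s worth, the one TPU that is visible over that single lane is usually referenced in the detectors section like this (a sketch based on Frigate’s edgetpu detector options; adjust if your device enumerates differently):

detectors:
  coral_pci:
    type: edgetpu
    device: pci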

I was never able to figure out how to ffprobe the stream, but I did find a log from someone else who managed to do it.

Hi guys,

I have a netatmo camera whose live feed apparently can be accessed by a browser connecting to:

http://CAMERA_IP/ACCESS_KEY/live/files/high/index.m3u8

Does someone know if this is supported by Frigate?
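
Frigate mostly passes the input path straight to ffmpeg, so an HLS URL like that should in principle work as an input, though expect extra latency. A rough, untested sketch (the camera name, resolution, and fps are placeholders):

cameras:
  netatmo:
    ffmpeg:
      inputs:
        - path: http://CAMERA_IP/ACCESS_KEY/live/files/high/index.m3u8
          roles:
            - detect
    width: 1280
    height: 720
    fps: 5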

I’m still getting this error multiple times in the log and I’m not sure how to solve it.
ffmpeg.drivewaycam.detect ERROR : [segment @ 0x56352b04c480] Non-monotonous DTS in output stream 0:0; previous: 75608497, current: 75608490; changing to 75608498. This may result in incorrect timestamps in the output file.
Any thoughts or advice please.

I have two cams working and the other one seems ok.

OK, after some tests with a desktop PC, I set up an RPi 3+ with a USB Coral running Frigate and 2 EZVIZ cams.
Everything is OK (a big, big thank you @blakeblackshear for this amazing work… well done) with an inference speed of about 100 ms (ranging from 80 ms to 200 ms), but I know about the USB 2 speed limit, and after a testing period I will buy an RPi 4 with USB 3.

I looked through the whole thread but didn’t find an answer on how to manually delete all videos and snapshots. Is it possible, or should I wait for the automatic jobs to do that?

Thank you

Hi @mr6880… trying to use your command_line sensor to pull the temp of my Coral, but no success, likely because I’m running HassOS as a VirtualBox VM on my Ubuntu host system. Do you (or anyone) know any workarounds to get command_line calls through to the host system?

First, big props to @blakeblackshear for creating this.

Then I have two questions:

  1. I have an inference speed of 130 ms. Will this get better with a Coral? And I use the automation from the docs for notifications, but the notification is slow. Does this come from the inference speed?
  2. The automation is firing multiple times within a second. I thought it would fire only once, as long as the object isn’t gone. Am I misunderstanding something?

My setup:
Raspi 4, 4GB
HomeAssistant 64-Bit
core-2021.5.0
Frigate installed via add-ons v1.13

frigate.yml

mqtt:
  host: xxx
  user: xxx
  password: xxx
cameras:
  hof:
    ffmpeg:
      hwaccel_args:
        - -c:v
        - h264_v4l2m2m
      inputs:
        - path: rtsp://user:[email protected]:554/h264Preview_01_main
          roles:
            - clips
        - path: rtsp://user:[email protected]:554/h264Preview_01_sub
          roles:
            - detect
    width: 640
    height: 352
    fps: 5
    objects:
        track:
          - person
          - car
          - dog
    motion:
      mask:
        - 304,0,253,187,201,207,207,352,0,352,0,0 
        - 640,352,525,352,586,0,640,0
      threshold: 50
      contour_area: 150
    zones:  
      strasse:
        coordinates: 312,82,576,88,585,0,332,0  
      einfahrt:
        coordinates: 194,352,524,352,574,86,318,85
    snapshots:
      enabled: true
      retain:
        default: 10
        objects:
          person: 15
    clips:
      enabled: true
      retain: 
        default: 7


detectors:
  cpu1:
    type: cpu
  cpu2:
    type: cpu

Automation

- alias: kamera_hof_benachrichtigung
  id: kamera_hof_benachrichtigung
  description: >-
    Notification when a person is detected in the driveway.
  trigger:
    platform: mqtt
    topic: frigate/events

  condition:
    - "{{ trigger.payload_json['after']['label'] == 'person' }}"
    - "{{ 'einfahrt' in trigger.payload_json['after']['entered_zones'] }}"

  action:
    - service: notify.mobile_app_suedpack_iphone
      data_template:
        message: "A {{trigger.payload_json['after']['label']}} has entered the yard."
        data:
          image: "https://l0s78v5e5n18jvi2khsnff0axlg80pnf.ui.nabu.casa/api/frigate/notifications/{{trigger.payload_json['after']['id']}}/thumbnail.jpg"
          tag: "{{trigger.payload_json['after']['id']}}"

    - service: notify.mobile_app_suedpack_iphone
      data_template:
        message: 'Motion was detected in the yard at {{now().strftime("%H:%M %d-%m-%y")}}'
        data:
          attachment:
            content-type: jpeg
          push:
            badge: 0
            sound:
              name: bewegung_hof
              critical: 1
              volume: 1.0
            category: camera
          entity_id: camera.garten_kamera_hof

The problem is that the last action part is fired multiple times. Why does this notification come through multiple times while the other one does not?
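
One thing worth checking: frigate/events publishes a message every time a tracked object is updated (type new, update, or end in recent Frigate versions), so each update re-triggers the automation and re-sends the second notification; the first one likely looks like a single notification because its tag makes the companion app replace the previous one rather than stack a new one. Filtering on the event type should limit it to one run per object. A sketch of the extra condition (assuming the payload includes the top-level type field):

  condition:
    - "{{ trigger.payload_json['type'] == 'new' }}"
    - "{{ trigger.payload_json['after']['label'] == 'person' }}"
    - "{{ 'einfahrt' in trigger.payload_json['after']['entered_zones'] }}"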

Using a Coral will make a big difference. This is my inference speed. :slight_smile:
[screenshot of inference speed]

Wonderful. Does this also affect the speed of the automation? My problem is not that the automation fires 120 ms too late. Sometimes it takes a second or more, and by then the person is out of sight.

Or maybe another question: where do I benefit from a fast inference speed?
Because the load on the Raspi is quite low with one camera.