Local realtime person detection for RTSP cameras

Yes, but you’ll still need to add the Frigate custom component for that. The custom component provides the integration with Home Assistant. The docker container is the equivalent of the add-on, i.e., the main instance of Frigate.

You would have to pass the PCI device through to the virtual machine. I haven’t tried it myself. It would probably work, but based on other posts here and on the instructions, using a Coral with a VM adds a lot of latency. Honestly, if you’ve ever set up another Docker container in Unraid, it’s almost as simple as using the add-on.

In theory it should work; adapters for the Mac mini are available, but it would have to be tested.
I assume it should also be possible to use a USB external enclosure for the M.2 B+M-key version in case the USB version is not available, and it might also be cheaper depending on the enclosure price.

Hello folks, I’m trying to run Frigate on a Synology DS415+.
It seems I have a few issues, but the first is that I can’t get Frigate to find the Coral TPU.
Can you have a look and tell me what I might be doing wrong?
@scstraus, it looks like you are running on a Synology as well; maybe you can spot my mistake easily =)

Logs:

2021-05-06 14:37:17 stdout * Starting nginx nginx
2021-05-06 14:37:17 stdout …done.
2021-05-06 14:37:20 stderr Starting migrations
2021-05-06 14:37:20 stderr peewee_migrate INFO : Starting migrations
2021-05-06 14:37:20 stderr There is nothing to migrate
2021-05-06 14:37:20 stderr peewee_migrate INFO : There is nothing to migrate
2021-05-06 14:37:20 stderr frigate.mqtt INFO : MQTT connected
2021-05-06 14:37:20 stderr detector.coral INFO : Starting detection process: 33
2021-05-06 14:37:20 stderr frigate.edgetpu INFO : Attempting to load TPU as usb
2021-05-06 14:37:20 stderr frigate.app INFO : Camera processor started for back: 36
2021-05-06 14:37:20 stderr frigate.app INFO : Capture process started for back: 37
2021-05-06 14:37:20 stderr frigate.edgetpu INFO : No EdgeTPU detected.
2021-05-06 14:37:20 stderr Process detector:coral:
2021-05-06 14:37:20 stderr Traceback (most recent call last):
2021-05-06 14:37:20 stderr File "/usr/local/lib/python3.8/dist-packages/tflite_runtime/interpreter.py", line 152, in load_delegate
2021-05-06 14:37:20 stderr delegate = Delegate(library, options)
2021-05-06 14:37:20 stderr File "/usr/local/lib/python3.8/dist-packages/tflite_runtime/interpreter.py", line 111, in __init__
2021-05-06 14:37:20 stderr raise ValueError(capture.message)
2021-05-06 14:37:20 stderr ValueError
2021-05-06 14:37:20 stderr
2021-05-06 14:37:20 stderr During handling of the above exception, another exception occurred:
2021-05-06 14:37:20 stderr
2021-05-06 14:37:20 stderr Traceback (most recent call last):
2021-05-06 14:37:20 stderr File "/usr/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
2021-05-06 14:37:20 stderr self.run()
2021-05-06 14:37:20 stderr File "/usr/lib/python3.8/multiprocessing/process.py", line 108, in run
2021-05-06 14:37:20 stderr self._target(*self._args, **self._kwargs)
2021-05-06 14:37:20 stderr File "/opt/frigate/frigate/edgetpu.py", line 124, in run_detector
2021-05-06 14:37:20 stderr object_detector = LocalObjectDetector(tf_device=tf_device, num_threads=num_threads)
2021-05-06 14:37:20 stderr File "/opt/frigate/frigate/edgetpu.py", line 63, in __init__
2021-05-06 14:37:20 stderr edge_tpu_delegate = load_delegate('libedgetpu.so.1.0', device_config)
2021-05-06 14:37:20 stderr File "/usr/local/lib/python3.8/dist-packages/tflite_runtime/interpreter.py", line 154, in load_delegate
2021-05-06 14:37:20 stderr raise ValueError('Failed to load delegate from {}\n{}'.format(
2021-05-06 14:37:20 stderr ValueError: Failed to load delegate from libedgetpu.so.1.0
2021-05-06 14:37:20 stderr
2021-05-06 14:37:20 stderr frigate.video INFO : back: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures
2021-05-06 14:37:20 stderr frigate.video INFO : back: ffmpeg process is not running. exiting capture thread…
2021-05-06 14:37:30 stderr ffmpeg.back.detect ERROR : Option rw_timeout not found.
2021-05-06 14:37:30 stderr frigate.video INFO : back: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures

Docker run:

docker run -d \
  --name frigate \
  --privileged \
  --shm-size=1g \
  --mount type=tmpfs,target=/tmp/cache,tmpfs-size=1000000000 \
  --device /dev/bus/usb:/dev/bus/usb \
  -v /volume1/docker/frigate/config.yml:/config/config.yml \
  -v /etc/localtime:/etc/localtime:ro \
  -v /volume1/docker/frigate/media:/media/frigate \
  -p 25000:5000 \
  -p 1935:1935 \
  -e FRIGATE_RTSP_PASSWORD=******** \
  blakeblackshear/frigate:stable-amd64

and config:

mqtt:
  host: ********
  port: 1883
  topic_prefix: frigate
  client_id: frigate
  user: ********
  password: ********
  stats_interval: 60
detectors:
  coral:
    type: edgetpu
    device: usb
cameras:
  back:
    ffmpeg:
      inputs:
        - path: rtsp://********:********@********:554/h264Preview_01_sub
          roles:
            - detect
            - rtmp
      input_args:
        - -avoid_negative_ts
        - make_zero
        - -fflags
        - nobuffer
        - -flags
        - low_delay
        - -strict
        - experimental
        - -fflags
        - +genpts+discardcorrupt
        - -rw_timeout
        - '5000000'
        - -use_wallclock_as_timestamps
        - '1'
    width: 640
    height: 480
    fps: 7

From lsusb I know the Coral is at 3-1:

ash-4.3# lsusb
|__usb1          1d6b:0002:0310 09  2.00  480MBit/s 0mA 1IF  (ehci_hcd 0000:00:16.0) hub
  |__1-1         8087:07db:0002 09  2.00  480MBit/s 0mA 1IF  ( ffffffd1ffffffb2ffffffdbffffffad) hub
    |__1-1.1     f400:f400:0100 00  2.00  480MBit/s 200mA 1IF  (Synology DiskStation 65004B7A2BB76D65)
|__usb2          1d6b:0002:0310 09  2.00  480MBit/s 0mA 1IF  (Linux 3.10.105 etxhci_hcd-170202 Etron xHCI Host Controller 0000:04:00.0) hub
  |__2-2         2109:2815:0704 09  2.10  480MBit/s 0mA 1IF  (VIA Labs, Inc.          USB2.0 Hub              ffffff94ffffffb4ffffff94ffffffb0) hub
    |__2-2.1     0451:16a8:0009 02  2.00   12MBit/s 50mA 2IFs (Texas Instruments TI CC2531 USB CDC __0X00124B0014D9AA5F)
    |__2-2.2     0a12:0001:8891 e0  2.00   12MBit/s 100mA 2IFs ( ffffff84ffffffb5fffffff4ffffffd4)
    |__2-2.4     0658:0200:0000 02  2.00   12MBit/s 100mA 2IFs ( ffffffd1ffffffb2ffffffdbffffffa0)
|__usb3          1d6b:0003:0310 09  3.00 5000MBit/s 0mA 1IF  (Linux 3.10.105 etxhci_hcd-170202 Etron xHCI Host Controller 0000:04:00.0) hub
  |__3-1         1a6e:089a:0100 00  3.10 5000MBit/s 896mA 1IF  ( ffffffd1ffffffb2ffffffdbffffffa1)
  |__3-2         2109:0815:0704 09  3.20 5000MBit/s 0mA 1IF  (VIA Labs, Inc.          USB3.0 Hub              ffffff94ffffffb4ffffff94ffffffb4) hub
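A quick sanity check, in case it helps (the 1a6e:089a device at 3-1 should be the Coral; it re-enumerates as 18d1:9302 "Google Inc." once the Edge TPU runtime has initialized it; container name taken from the docker run command above):

```shell
# The Coral USB Accelerator enumerates as 1a6e:089a before the Edge TPU
# runtime first initializes it, and re-enumerates as 18d1:9302 (Google Inc.)
# afterwards, so either ID means the stick is visible on the host.
CORAL_IDS='1a6e:089a|18d1:9302'
lsusb 2>/dev/null | grep -iE "$CORAL_IDS" || echo "Coral not visible on the host"

# Confirm the USB bus was actually passed through into the container.
docker exec frigate ls -lR /dev/bus/usb \
  || echo "could not list USB devices inside the container"
```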

Thanks for the help in advance!

Hello,
I’ve been trying to follow this thread since it started. I finally got a PC to run Frigate on, but it looks like a bad time to buy a Coral. I’m probably going to try to set it up before I get the Coral.

My question is about which Coral to buy…
The computer I have to run Frigate is a Dell OptiPlex 9020M. The “M” is for micro case size.
It has an M2 B&M slot, but it is tucked in between the motherboard and the SATA drive.
My concern is that there would be little airflow for cooling, and there doesn’t appear to be much heat sinking on the M.2 devices.
Would I be better off with the USB device?

Thanks,
-Mike

@MikeSherman I have a Dell OptiPlex 9020M, and I tried the Coral A+E M.2 card in the WiFi slot; it worked fine under Unraid. I did that as a test before moving it to the WiFi slot in my more powerful OptiPlex 7060.
No issues with heat as far as I can tell.

The A+E card was cheaper than USB and is actually available right now.

Both OptiPlex machines have an SSD in the B+M M.2 slot.


You can also get Home Assistant to keep an eye on the Coral PCI temps. Mine hit 57.8C today under heavy load, inside a case with a room temp of about 30C. They begin to throttle performance at 85C and shut down at crazy temps (105C or so).

I feel that the USB Corals actually get hotter than the PCI Corals, despite having a heatsink and running in restricted performance mode. The case can certainly feel toasty to the touch.

So you’ll probably be fine with the M.2 Coral… perhaps better off.

That’s really good information! I didn’t know an A+E card would work. I’ll order one.

Thanks!
-Mike


I have recently migrated to Portainer on my Synology using this guide and couldn’t be happier. It works much better than the built-in Docker management (which won’t really let you run Frigate properly). If you want to do that, I can share my docker compose. Otherwise you have to run it from the command line. I think the last example I have from when I did that was v5, but maybe I’ve done it with v7 too.
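For anyone who wants the compose route, the docker run command posted earlier in the thread translates roughly to this (an untested sketch, not an actual shared file; the long tmpfs syntax needs compose file format 3.6+):

```yaml
version: "3.6"
services:
  frigate:
    container_name: frigate
    image: blakeblackshear/frigate:stable-amd64
    privileged: true
    shm_size: "1g"
    restart: unless-stopped
    devices:
      - /dev/bus/usb:/dev/bus/usb
    volumes:
      - /volume1/docker/frigate/config.yml:/config/config.yml
      - /etc/localtime:/etc/localtime:ro
      - /volume1/docker/frigate/media:/media/frigate
      # long syntax, equivalent to --mount type=tmpfs,tmpfs-size=1000000000
      - type: tmpfs
        target: /tmp/cache
        tmpfs:
          size: 1000000000
    ports:
      - "25000:5000"
      - "1935:1935"
    environment:
      - FRIGATE_RTSP_PASSWORD=********
```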

For those interested, I found confirmation here that the M.2 version works in a NUC, though I assume the B+M-key version was used:
https://www.reddit.com/r/GoogleCoral/comments/l6qw4t/any_one_succeeded_in_using_coral_m2_tpu_with_usb/

Here is a video about the A+E-slot version:

but there’s no proof that it really works, and one guy is complaining that it doesn’t work for him…

Please share your experience with the M.2 A+E-slot version in a NUC, if you have any.
Thanks.

Thanks @scstraus. So far I have indeed tried the command line, but was not successful. I already have Portainer running, so I can try both if you can share your docker compose and docker run. Cheers!

Can frigate config files be split?

I am using MQTT and an automation to send detected-object images to Telegram. It took a while, but I finally got it to work by sending the image at http://192.168.1.32:5000/api/events/{{trigger.payload_json['after']['id']}}/thumbnail.jpg (with 192.168.1.32 being my HA install).

The image quality is very low. I assume it comes from the detect feed. Is there any way to get it from the higher-resolution clips feed? Also, is there any way to send the entire frame with the object circled and labeled, rather than the object cut out?

Something close to the “best image” shown right below the video clip in the GUI after clicking on the event.

try best image

/api/<camera_name>/<object_name>/best.jpg

The best snapshot for any object type. It is a full resolution image by default.

Example parameters:

* `h=300`: resizes the image to 300 pixels tall
* `crop=1`: crops the image to the region of the detection rather than returning the entire image

or maybe set the snapshot to the high-res feed and figure out which photo API endpoint pulls from that feed. I think best.jpg does, but I’m not sure.
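For a quick one-off test from a terminal (host, port, and camera name copied from earlier posts; adjust to your setup):

```shell
# Latest full-resolution "best" snapshot of a person on the doorBell camera,
# resized to 300 px tall; -f makes curl fail on an HTTP error instead of
# saving an error page as best.jpg, -m 5 gives up after 5 seconds.
FRIGATE="http://192.168.1.32:5000"
curl -m 5 -f -o best.jpg "$FRIGATE/api/doorBell/person/best.jpg?h=300" \
  || echo "request failed (is Frigate reachable at $FRIGATE?)"
```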

Is the best image only available after the end of the event?

Sorry, is snapshot a role that can be defined for cameras? I’ve only seen detect, clips, rtmp, and record in the guide https://blakeblackshear.github.io/frigate/configuration/cameras

My original use was to get the following small image: http://192.168.1.32:5000/api/events/1620408962.264504-j0nokf/thumbnail.jpg
Testing your method for best with the following address yielded a fully black image that is not usable: http://192.168.1.32:5000/api/doorBell/1620408962.264504-j0nokf/best.jpg

No, it was my mistake.

That endpoint takes the object name, not the event ID, so it should be something like http://192.168.1.32:5000/api/doorBell/person/best.jpg
I realize you need it for a specific event, though. Maybe try the below instead:

EDIT
http://192.168.1.32:5000/api/events/1620408962.264504-j0nokf/snapshot.jpg
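For reference, a Home Assistant automation using that per-event snapshot URL might look roughly like this (a sketch assuming the Telegram bot integration is already set up; the `frigate/events` topic and before/after payload fields are from the Frigate docs, the host is from the posts above):

```yaml
automation:
  - alias: Send Frigate person snapshot to Telegram
    trigger:
      - platform: mqtt
        topic: frigate/events
    condition:
      - condition: template
        value_template: "{{ trigger.payload_json['after']['label'] == 'person' }}"
    action:
      - service: telegram_bot.send_photo
        data:
          url: "http://192.168.1.32:5000/api/events/{{ trigger.payload_json['after']['id'] }}/snapshot.jpg"
          caption: "Person detected on {{ trigger.payload_json['after']['camera'] }}"
```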

I’m running Frigate on a small and slow NUC, but I have no Coral yet. Could somebody help with a HW acceleration config for this PC, to squeeze the max out of it?

CPU: Intel Celeron N4000 (2) @ 2.600GHz
GPU: Intel GeminiLake [UHD Graphics 600]


Try mine:

detectors:
  cpu1:
    type: cpu
  cpu2:
    type: cpu

ffmpeg:
  # Optional: global ffmpeg args (default: shown below)
  global_args: -hide_banner -loglevel warning
  # Optional: global hwaccel args (default: shown below)
  # NOTE: See hardware acceleration docs for your specific device
  hwaccel_args:
    - -hwaccel
    - qsv
    - -qsv_device
    - /dev/dri/renderD128
  # Optional: global input args (default: shown below)
  input_args: -avoid_negative_ts make_zero -fflags +genpts+discardcorrupt -rtsp_transport tcp -stimeout 5000000 -use_wallclock_as_timestamps 1
  # Optional: global output args
  output_args:
    # Optional: output args for detect streams (default: shown below)
    detect: -f rawvideo -pix_fmt yuv420p
    # Optional: output args for record streams (default: shown below)
    record: -f segment -segment_time 60 -segment_format mp4 -reset_timestamps 1 -strftime 1 -c copy -an
    # Optional: output args for clips streams (default: shown below)
    clips: -f segment -segment_time 10 -segment_format mp4 -reset_timestamps 1 -strftime 1 -c copy -an
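If qsv gives you trouble inside the container, Frigate’s hardware acceleration docs also list plain VAAPI args for Intel iGPUs like the UHD 600. A sketch, assuming the render node is /dev/dri/renderD128 and it is passed through to the container:

```yaml
ffmpeg:
  hwaccel_args:
    - -hwaccel
    - vaapi
    - -hwaccel_device
    - /dev/dri/renderD128
    - -hwaccel_output_format
    - yuv420p
```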

Thank you ukro, it worked, but I’m getting 600 ms inference per CPU and on one camera only.
There must be something wrong, because the N4000 is a better CPU than the RPi 4 has, and on the RPi 4 I get 200 ms on 3 cameras using just one CPU detector.

How did you get that info from terminal?

I have no idea what inference times are or what they’re for; I have 11 cameras, all working fine :open_mouth: without any Coral.