Local realtime person detection for RTSP cameras

I’ve got two PCIe Corals working happily. Make sure both Corals are showing in your PCI device list:

lspci | grep -i coral
02:00.0 System peripheral: Global Unichip Corp. Coral Edge TPU
03:00.0 System peripheral: Global Unichip Corp. Coral Edge TPU

Then in the Frigate config:

detectors:
  coral1:
    type: edgetpu
    device: pci:0
  coral2:
    type: edgetpu
    device: pci:1

And for the Docker layer, I used the config straight from the Frigate documentation and it seemed to work fine.
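
For reference, this is roughly what my docker-compose looks like for the two PCIe Corals. It’s only a sketch based on the documented example: the /dev/apex device paths are what the gasket driver creates on my system, and the image tag and host paths are placeholders to adjust for yours.

version: "3.6"
services:
  frigate:
    container_name: frigate
    image: blakeblackshear/frigate:stable-amd64
    restart: unless-stopped
    shm_size: "128mb"
    devices:
      # one entry per PCIe Edge TPU exposed by the gasket/apex driver
      - /dev/apex_0:/dev/apex_0
      - /dev/apex_1:/dev/apex_1
    volumes:
      - /etc/localtime:/etc/localtime:ro
      - ./config.yml:/config/config.yml:ro
      - ./media:/media/frigate
    ports:
      - "5000:5000"
      - "1935:1935"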

Looks like I may be out of luck getting examples of custom models. Another way I thought of handling “Squirrelcam” is to simply record all motion events. That’s what I’d been doing with Arlo cameras prior to the move to IP.

EDIT: I see that there is an open feature request for recording clips on any motion.

I totally get that it’s the opposite of the intention of Frigate, but is it possible to record clips of all motion events on a single camera without using object detection? There are other native and HA ways to do that, of course, but I like the idea of centralizing all cameras through Frigate, and I love the Media Browser integration.

I’m ready, baby :smiley: I’ll PM you :heart:

I don’t think there is a way to record on motion without object detection since that is the whole point of Frigate to begin with. Frigate may not be what you want to use if this is your goal.

Thanks for the response. Yep, as mentioned, I understand that it’s not the goal of Frigate. It’s a potential means to an end for a single camera in a multi-camera setup which otherwise uses object detection. My goal is to centralize the cameras that use object detection together with one camera that would not use object detection, just motion.

I may try object detection on a variety of object types with very low confidence levels as a way of picking up (most?) motion.
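
If I go that route, I’d start with a per-camera block along these lines. This is only a sketch: the camera name is made up and the score values are just guesses to tune from.

cameras:
  squirrelcam:              # hypothetical camera name
    objects:
      track:
        - person
        - cat
        - dog
        - bird
      filters:
        # scores kept low on purpose, so almost any detection fires
        person:
          min_score: 0.2
          threshold: 0.3
        cat:
          min_score: 0.2
          threshold: 0.3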

Glad to hear it works! I thought I’d tried that config, but I’ll have another go at it when my second PCI Coral’s connected back up. Thanks for your help :grin:

Has anyone managed to get hwaccel working for Hikvision cameras on a Pi 4?
The recommended parameters work fine for the h264 stream from a yi-hack cam, but I’m getting a green screen and errors on the Hikvision stream. I tried playing with the camera params, changing fps, variable/constant bitrate - no success. If I remove hwaccel, it works fine.

      hwaccel_args:
        - -c:v
        - h264_mmal
frigate    | frigate.video                  INFO    : yard: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures
frigate    | frigate.video                  INFO    : yard: ffmpeg process is not running. exiting capture thread...
frigate    | ffmpeg.yard.detect             ERROR   : mmal: mmal_vc_port_enable: failed to enable port vc.ril.video_decode:in:0(H264): ENOMEM
frigate    | ffmpeg.yard.detect             ERROR   : mmal: mmal_port_enable: failed to enable port vc.ril.video_decode:in:0(H264)(0x1b46720) (ENOMEM)
frigate    | ffmpeg.yard.detect             ERROR   : mmal: mmal_port_disable: port vc.ril.video_decode:in:0(H264)(0x1b46720) is not enabled
frigate    | ffmpeg.yard.detect             ERROR   : mmal: mmal_port_disable: port vc.ril.video_decode:out:0(I420)(0x1b46a40) is not enabled
frigate    | ffmpeg.yard.detect             ERROR   : Error while opening decoder for input stream #0:0 : Unknown error occurred

Any idea how to get the video clip working? I tried this but got a 404: https://ha.domain.org/api/frigate/notifications/{{trigger.payload_json["after"]["id"]}}/clip.mp4

And is there any way to specify which camera it gets the snapshot/video clip from?

You need the camera name in the path between the ID and the clip.

Something like this?
https://ha.domain.org/api/frigate/notifications/{{trigger.payload_json["after"]["id"]}}/gate/clip.mp4 ?

Does anyone know how to install this on a Jetson Nano? I tried to follow the documentation but no luck.
I pulled the image blakeblackshear/frigate:stable-amd64nvidia with Docker, created a config.yaml file, and ran this in the terminal:

docker run -d \
  --name frigate \
  --restart=unless-stopped \
  --device /dev/bus/usb:/dev/bus/usb \
  --device /dev/dri/renderD128 \
  -v <path_to_directory_for_media>:/media/frigate \
  -v <path_to_config_file>:/config/config.yml:ro \
  -v /etc/localtime:/etc/localtime:ro \
  -e FRIGATE_RTSP_PASSWORD='password' \
  -p 5000:5000 \
  -p 1935:1935 \
  blakeblackshear/frigate:stable-amd64nvidia

But I couldn’t get it to run. Any help would be appreciated!

Btw, would the performance be better than a Raspberry Pi 4? I am currently running on a Pi 4, but performance is really bad with 2 cameras.

Frigate is working fine in Docker on my Dell R710, thank you very much, but unfortunately that thing eats electricity!

So I have tried the exact same setup on an older, smaller PC to run 24/7, an HP ProLiant NL36, but I am having a major issue!
AMD Athlon™ II Neo N36L Dual-Core @ 1300 MHz
Unraid setup
Home Assistant VM
Trying either Docker or the Home Assistant add-on, I am getting the message below:

The NL36 is old, and I found this issue that I ‘think’ is related, but I could do with some help please:
Fatal Python error: Illegal instruction · Issue #695 · blakeblackshear/frigate · GitHub. I don’t think the fix has been implemented in Frigate yet. If it is this issue, is there any way round it?

I do have a PCIe and also a USB Coral plugged in, and I have tried setting the detectors as coral, PCI, and USB.
I have also tried removing the hardware config input arguments.

Any suggestions appreciated, please!

 * Starting nginx nginx
   ...done.
Fatal Python error: Illegal instruction
Current thread 0x00007f11fdbc6740 (most recent call first):
  File "<frozen importlib._bootstrap>", line 219 in _call_with_frames_removed
  File "<frozen importlib._bootstrap_external>", line 1101 in create_module
  File "<frozen importlib._bootstrap>", line 556 in module_from_spec
  File "<frozen importlib._bootstrap>", line 657 in _load_unlocked
  File "<frozen importlib._bootstrap>", line 975 in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 991 in _find_and_load
  File "<frozen importlib._bootstrap>", line 219 in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1042 in _handle_fromlist
  File "/usr/local/lib/python3.8/dist-packages/tflite_runtime/interpreter.py", line 36 in <module>
  File "<frozen importlib._bootstrap>", line 219 in _call_with_frames_removed
  File "<frozen importlib._bootstrap_external>", line 783 in exec_module
  File "<frozen importlib._bootstrap>", line 671 in _load_unlocked
  File "<frozen importlib._bootstrap>", line 975 in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 991 in _find_and_load
  File "/opt/frigate/frigate/edgetpu.py", line 15 in <module>
  File "<frozen importlib._bootstrap>", line 219 in _call_with_frames_removed
  File "<frozen importlib._bootstrap_external>", line 783 in exec_module
  File "<frozen importlib._bootstrap>", line 671 in _load_unlocked
  File "<frozen importlib._bootstrap>", line 975 in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 991 in _find_and_load
  File "/opt/frigate/frigate/app.py", line 19 in <module>
  File "<frozen importlib._bootstrap>", line 219 in _call_with_frames_removed
  File "<frozen importlib._bootstrap_external>", line 783 in exec_module
  File "<frozen importlib._bootstrap>", line 671 in _load_unlocked
  File "<frozen importlib._bootstrap>", line 975 in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 991 in _find_and_load
  File "/opt/frigate/frigate/__main__.py", line 7 in <module>
  File "/usr/lib/python3.8/runpy.py", line 87 in _run_code
  File "/usr/lib/python3.8/runpy.py", line 194 in _run_module_as_main

I have dogs and cats configured for detection - mostly cats are detected as dogs, and in some cases the other way round, so you may as well configure both. In one case, with a small dog inside a car, I was puzzled by the detection of the car as a cat - then I saw the dog in it.
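
For what it’s worth, the relevant part of my config is simply both labels under track (just a fragment; camera name is an example and the rest of the camera config is omitted):

cameras:
  drive:                   # example camera name
    objects:
      track:
        - dog
        - cat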

The Jetson Nano is an ARM CPU, so images for amd64 won’t work. You need to start with the base aarch64 images. Then find out what hardware acceleration you can use, if any, with ffmpeg and the Jetson Nano GPU. I just googled it and it seems NVIDIA released an ffmpeg binary with hw acceleration; if you can swap the ffmpeg binary in the image and add any dependencies, you may have some luck. decoder - Is ffmpeg support GPU acceleration on Jetson platform? - Stack Overflow
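
One low-effort way to try that swap, assuming the accelerated build is reasonably self-contained: bind-mount it over the ffmpeg in the image via docker-compose. The paths below are assumptions - check where the binary actually lives on your host and inside the container (for example by running which ffmpeg in a shell inside the container) before relying on this:

services:
  frigate:
    volumes:
      # host path to the hw-accelerated ffmpeg build (assumed location)
      # container path is hypothetical - verify it matches the image first
      - /usr/local/bin/ffmpeg:/usr/bin/ffmpeg:ro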

Hi @blakeblackshear, what an amazing piece of software you designed. I tried for a long time to make it work and have finally had it running for a week now - amazing!

For some reason it didn’t work at all until I specifically added the CPU as a detector; the documentation says that in the absence of a TPU it falls back to the CPU, but it crashed instead. On top of that, my Intel GPU was on /dev/dri/renderD129. I’ve learned a lot with this installation!

Also, there are still some unclear parts in the documentation; I can’t understand or find an answer to these questions anywhere:

  • If I use a low bitrate/resolution input for detection and the hi-res one for clips and RTMP, which height/width do I have to specify in frigate.yml? Only the detection one? (And so for the masks and zones too? I have a good idea of the answer, but precision is important!)

  • Every night my backyard camera detects a cat as a person. What are the units for min_area? Pixels? I can’t find a sweet spot, and I wish we could use the UI to draw the approximate min/max size of a person/cat/car on the viewfinder to copy, as with masks and zones :)) (There’s a rough sketch of what I’m trying right after this list.)
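
For reference, here is roughly the block I’ve been experimenting with for that second question. The numbers are guesses, and I’m assuming width/height describe the low-res detection stream and that min_area/max_area are bounding-box areas in pixels:

cameras:
  backyard:                 # hypothetical camera name
    width: 640              # resolution of the low-res detect input (my assumption)
    height: 360
    fps: 5
    objects:
      track:
        - person
        - cat
      filters:
        person:
          min_area: 5000    # guessed value, units assumed to be pixels
          max_area: 100000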

Thanks a lot again - waiting for my Google Coral PCIe to play with it some more!

Thanks for the help! I managed to get it running on Docker now and successfully installed the ffmpeg binary with hw acceleration. The only thing is, I am not sure how to pass the hardware acceleration arguments into the config.yml file. It seems to use h264_nvmpi. I tried this:

ffmpeg:
  hwaccel_args:
    - -c:v
    - h264_nvmpi

But it didn’t work :frowning:

@blakeblackshear is this how it should be: https://ha.domain.org/api/frigate/notifications/{{trigger.payload_json["after"]["id"]}}/gate/clip.mp4 ? It didn’t work. Not sure how to get the video clip working in the first place; the snapshot is working fine though.
I have tried this as well: https://ha.domain.org/api/frigate/notifications/{{trigger.payload_json["after"]["id"]}}/{{trigger.payload_json["before"]["camera"]}}/clip.mp4
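
For context, this is the full notification action I’m testing the URL in. It’s only a sketch of my setup: the notify target is a placeholder, I’m assuming the companion app accepts a video attachment key, and I’m still not sure the URL format is right.

automation:
  - alias: Frigate clip notification
    trigger:
      - platform: mqtt
        topic: frigate/events
    condition:
      - condition: template
        value_template: "{{ trigger.payload_json['type'] == 'end' }}"
    action:
      - service: notify.mobile_app_myphone   # placeholder notify target
        data:
          message: >-
            {{ trigger.payload_json['after']['label'] }} detected on
            {{ trigger.payload_json['after']['camera'] }}
          data:
            # clip via the Frigate media proxy URL I'm trying above
            video: >-
              https://ha.domain.org/api/frigate/notifications/{{ trigger.payload_json['after']['id'] }}/{{ trigger.payload_json['after']['camera'] }}/clip.mp4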

The next thing you need is your accelerated ffmpeg binary inside the Docker container/image. I am still learning about Docker, so I can’t help much with this. You will need any additional libraries in there too, and also, if there are files in /dev to be exposed, those as well. I think this would be a worthwhile exercise, because the Jetson Nano could turn out to be a good platform for Frigate if this works out - cheap enough and low power consumption.

@technikhaus, I have the same setup as you (RPi4 with an SMB network mount for clips), and I got the same error, “database is locked”. Can you tell me how you added the nolock option to your mount?

I mounted my network folder with an entry in /etc/fstab like this:

//192.168.1.5/shares/frigate /media/networkshare cifs username=user1,password=pass1,uid=1000,gid=1000 0

Wondering how to add the nolock option to that line.

Thanks!

Hi, there is an option in the documentation to make the database local, in the frigate.yml config:

database:
  path: /media/frigate.db

Would you mind elaborating and explaining how you managed to make the SMB mount for the clips?
Please - I am not able to do that, and I have been trying many times over a few hours.