Local realtime person detection for RTSP cameras

I’ve faced similar problems. I’ve been working a lot lately, so I haven’t had time to look into it.

What’s your CPU load like? I got a tip to try h264_qsv instead of libx264 (if you are on Quick Sync enabled hardware). I haven’t had time to dive deeper into that either. If you have any success, please shout out.
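
If anyone wants to experiment with that, here is a minimal sketch of the decode side in the Frigate config. This is an assumption on my part: it presumes Quick Sync capable Intel hardware, that /dev/dri is available to the container, and that the argument names match your Frigate version.

ffmpeg:
  hwaccel_args:
    # hardware-accelerated H.264 decode via Intel Quick Sync (assumes /dev/dri is passed through)
    - -c:v
    - h264_qsv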

Yes. I run frigate on a NUC but have HA installed on Proxmox. All you do is point the HA Frigate integration to the installation where frigate is running.

Anyone have any idea if there are any FFMPEG flags I can apply to the clips to make the videos smoother? Mine stutter even though the detection and rtmp feeds don’t. It’s like they’re missing a frame or two when someone walks, so they stick and then teleport forward.

The strange thing is, I have always used the RTSP stream for clips, but since changing my detect stream to RTSP, the clips have developed this stutter.

Yes, that’s what I’ve been doing lately… But when running the Frigate Docker container on another server, it does not show up in NabuCasa. Frigate is not accessible remotely.

Hi guys,
I’m trying to use the add-on, running Home Assistant OS on an RPi 4.

I have Hikvision coax cameras connected to an NVR. I can currently access my feed via:

http://user:[email protected]:64998/ISAPI/Streaming/channels/301/picture

I added the following

mqtt:
  host: 10.0.0.9
cameras:
  back:
    ffmpeg:
      inputs:
        - path: rstp://user:[email protected]:64998/ISAPI/Streaming/channels/301/picture
          roles:
            - detect
            - rtmp
    width: 704
    height: 576
    fps: 5

but I’m getting the following error

frigate.mqtt                   INFO    : MQTT connected
detector.coral                 INFO    : Starting detection process: 33
frigate.app                    INFO    : Camera processor started for back: 36
frigate.app                    INFO    : Capture process started for back: 37
frigate.edgetpu                INFO    : Attempting to load TPU as usb
Process detector:coral:
frigate.edgetpu                INFO    : No EdgeTPU detected.
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/tflite_runtime/interpreter.py", line 152, in load_delegate
    delegate = Delegate(library, options)
  File "/usr/local/lib/python3.8/dist-packages/tflite_runtime/interpreter.py", line 111, in __init__
    raise ValueError(capture.message)
ValueError
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/usr/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
    self.run()
  File "/usr/lib/python3.8/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/opt/frigate/frigate/edgetpu.py", line 124, in run_detector
    object_detector = LocalObjectDetector(tf_device=tf_device, num_threads=num_threads)
  File "/opt/frigate/frigate/edgetpu.py", line 63, in __init__
    edge_tpu_delegate = load_delegate('libedgetpu.so.1.0', device_config)
  File "/usr/local/lib/python3.8/dist-packages/tflite_runtime/interpreter.py", line 154, in load_delegate
    raise ValueError('Failed to load delegate from {}\n{}'.format(
ValueError: Failed to load delegate from libedgetpu.so.1.0
frigate.video                  INFO    : back: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures

Could someone please tell me if I’m doing something wrong?

You need to specify a CPU detector if you aren’t using a Coral device. It defaults to looking for one.
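
A minimal sketch of that section of the config, assuming the standard Frigate detector syntax:

detectors:
  # use CPU-based detection when no Coral / EdgeTPU is attached
  cpu1:
    type: cpu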

Thank you, I missed that in the docs.
Could I please bother you again?
I have the web UI running, but the image displayed is blank green. Am I missing something else?

Open an issue on github and provide the requested info in the template.

It looks like this is more a problem with my Hikvision cameras, as I can’t get a clear image when feeding the stream to VLC.

So I’ve been using Frigate for a week now, and it’s great! Kudos @blakeblackshear
I’ve decommissioned ZoneMinder and am no longer looking at Shinobi.

I’m using an M.2 Coral Edge TPU running on Unraid, and inference speed is about 6.1 ms.

The last thing I needed was to ignore previously detected objects.
Previously, any motion in my driveway (tree, shadow, etc.) would alert me to my car, which has been there all day. To avoid dumbing down the motion detection (since I’m already using Node-RED), I just compare the before bounding box with the after bounding box, allowing a 5% tolerance.

My first written-from-scratch JavaScript in a function node :slight_smile:

  1. The flow is: MQTT in node. Topic is frigate/events
  2. JSON node
  3. function node - code below
  4. switch node - if ‘same’, or if ‘new’
  5. Notify me

// percentage change in each bounding box coordinate between the
// "before" and "after" snapshots of the event
var Apct = (msg.payload.after.box[0] - msg.payload.before.box[0]) / msg.payload.before.box[0];
var Bpct = (msg.payload.after.box[1] - msg.payload.before.box[1]) / msg.payload.before.box[1];
var Cpct = (msg.payload.after.box[2] - msg.payload.before.box[2]) / msg.payload.before.box[2];
var Dpct = (msg.payload.after.box[3] - msg.payload.before.box[3]) / msg.payload.before.box[3];

// flag the event as "new" if any coordinate moved by 5% or more in absolute terms
if (Math.abs(Apct) >= 0.05 || Math.abs(Bpct) >= 0.05 || Math.abs(Cpct) >= 0.05 || Math.abs(Dpct) >= 0.05) {
    msg.compare = "new";
} else {
    msg.compare = "same";
}
return msg;

HTH someone

Edit - 0.05 is the 5% tolerance, adjust to your taste

Nice. Hope this can be included in Frigate as an option…

Edit: Now that I look at it from my PC… That should be doable in HA automations…
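
For anyone who wants to try that, here is a rough, untested sketch of what the automation could look like. It assumes the frigate/events payload shape used in the Node-RED flow above, and the notifier name is made up:

automation:
  - alias: Notify on Frigate object that actually moved (sketch)
    trigger:
      - platform: mqtt
        topic: frigate/events
    condition:
      # only continue if any bounding box coordinate changed by 5% or more
      - condition: template
        value_template: >
          {% set b = trigger.payload_json['before']['box'] %}
          {% set a = trigger.payload_json['after']['box'] %}
          {{ ((a[0] - b[0]) / b[0]) | abs >= 0.05
             or ((a[1] - b[1]) / b[1]) | abs >= 0.05
             or ((a[2] - b[2]) / b[2]) | abs >= 0.05
             or ((a[3] - b[3]) / b[3]) | abs >= 0.05 }}
    action:
      - service: notify.mobile_app_my_phone  # made-up notifier, replace with yours
        data:
          message: "New {{ trigger.payload_json['after']['label'] }} detected"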

The next version of Frigate should avoid duplicate clips. They’re generally caused by multiple objects in a single event. There have been some changes that will reduce the number of clips saved, though it may not reduce the number of events sent to MQTT.

I hate FFMPEG lol
Weird issue with just 1 camera. Image in the Frigate UI is green and repeated multiple times (size is correct).
Snapshot is green in the media browser but the clip video is absolutely fine - nothing wrong with it at all.
Any thoughts on what to adjust?

Anyone here using this with an Amcrest camera via an RTSP stream? Single camera, and even after switching over to the 2nd stream at 640x480, my logs are filled with these messages:

frigate.video                  INFO    : middle_driveway: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures
frigate.video                  INFO    : middle_driveway: ffmpeg process is not running. exiting capture thread...
[h264 @ 0xaaaae6e4c140] error while decoding MB 67 98, bytestream -5
watchdog.middle_driveway       INFO    : Terminating the existing ffmpeg process...
watchdog.middle_driveway       INFO    : Waiting for ffmpeg to exit gracefully...
frigate.video                  INFO    : middle_driveway: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures
frigate.video                  INFO    : middle_driveway: ffmpeg process is not running. exiting capture thread...
watchdog.middle_driveway       INFO    : Terminating the existing ffmpeg process...
watchdog.middle_driveway       INFO    : Waiting for ffmpeg to exit gracefully...
frigate.video                  INFO    : middle_driveway: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures
frigate.video                  INFO    : middle_driveway: ffmpeg process is not running. exiting capture thread...
watchdog.middle_driveway       INFO    : Terminating the existing ffmpeg process...
watchdog.middle_driveway       INFO    : Waiting for ffmpeg to exit gracefully...
watchdog.middle_driveway       INFO    : Terminating the existing ffmpeg process...
watchdog.middle_driveway       INFO    : Waiting for ffmpeg to exit gracefully...

I have several Amcrest cameras I’m using with frigate without issue. How solid is your network connection to the camera? What settings are you using under the “Video” tab on the camera for the main stream and/or substream?

Thanks for the reply! I stand corrected, the sub stream is 704x480.

So I have a Cat5e cable going outside to a PoE splitter to these cameras; ping-wise they tend to be solid, with no drops. This camera (and several others) streams RTSP back to my Bluecherry DVR, and that isn’t having or reporting any issues, which is interesting.

I am looking for a way to send the “http://192.168.1.***:5000/api/side_door/person/best.jpg” image as an attachment via Pushover. Unfortunately, since Pushover cannot access a local address, it does not attach the image. When I am on WiFi, I can send the URL via Pushover and then click the link to see the image, but once I am off WiFi that, of course, no longer works.
Is there any way to store the image locally first and then send it as an attachment?

Any suggestion is appreciated.

Not sure if it would make a difference, but my PoE Amcrests are set on H.264 (not H.264H) and CBR. I set my substream at 5fps - Frigate doesn’t need more than this, in my experience.

Have you checked out the notification examples in the docs? They should give you what you need to access the image.

Thanks for the reply. I did look at the docs but could not get it to work - at least not with Pushover.
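
For anyone who hits the same wall, here is a rough, untested sketch of one approach: save the snapshot locally with a shell_command, then hand Pushover the file via its attachment option. The helper name and file path are made up, the masked IP is the one from the post above, and you should double-check the Pushover integration docs for the attachment field:

shell_command:
  # made-up helper: downloads Frigate's best person snapshot (assumes /config/www exists)
  save_side_door_person: >-
    curl -s -o /config/www/frigate_side_door_person.jpg
    http://192.168.1.***:5000/api/side_door/person/best.jpg

automation:
  - alias: Side door person Pushover notification (sketch)
    trigger:
      - platform: mqtt
        topic: frigate/events
    condition:
      - condition: template
        value_template: >
          {{ trigger.payload_json['after']['camera'] == 'side_door'
             and trigger.payload_json['after']['label'] == 'person' }}
    action:
      - service: shell_command.save_side_door_person
      - service: notify.pushover
        data:
          message: Person detected at the side door
          data:
            # a local file path; Pushover then receives the image as an attachment
            attachment: /config/www/frigate_side_door_person.jpg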