Local realtime person detection for RTSP cameras

Not sure what you mean by protection mode? Are you talking about privileged mode?

Not by default, try removing -an from your output args. If that doesn't work, replace -an with -codec:a aac.

The latter worked in my case.
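
For reference, a rough sketch of what that swap looks like in the clips output args (the segment flags here are just the commonly posted values, shown for context; keep whatever you already use):

ffmpeg:
  output_args:
    # -an (audio disabled) swapped for -codec:a aac
    clips: -f segment -segment_time 10 -segment_format mp4 -reset_timestamps 1 -strftime 1 -c copy -codec:a aac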


Thank you, "-codec:a aac" worked a treat.


Hi there,

Other people seem to have got this to work on older CPUs: https://github.com/blakeblackshear/frigate-hass-addons/pull/12

So it should be possible to get Frigate working without swapping hardware.

Where do you put the output_args?
All I get is:

 * Starting nginx nginx
   ...done.
Error parsing config: expected a dictionary for dictionary value @ data['ffmpeg']['output_args']

You can place it under an individual camera, but I wanted it for all cameras, so I put it here:

ffmpeg:
  hwaccel_args:
    - -hwaccel
    - vaapi
    - -hwaccel_device
    - /dev/dri/renderD128
  output_args:
    clips: -f segment -segment_time 10 -segment_format mp4 -reset_timestamps 1 -strftime 1 -c copy -codec:a aac
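
If you only want this for a single camera, the same block can instead sit under that camera's own ffmpeg section (a sketch; the camera name is a placeholder):

cameras:
  front_door:
    ffmpeg:
      output_args:
        clips: -f segment -segment_time 10 -segment_format mp4 -reset_timestamps 1 -strftime 1 -c copy -codec:a aac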

So simple… Thanks!

You could possibly achieve that with best_image_timeout. So something like:

  Nextone:
    ffmpeg:
      inputs:
        - path: rtsp://xxx:[email protected]:2554/axis-media/media.amp
          roles:
            - detect
    width: 1920
    height: 1080
    fps: 5
    best_image_timeout: 5

The default value is 60.

The easiest way to keep all images is to assign a clips role to the camera, unless your use case prevents that.
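
A minimal sketch of what that looks like, with a placeholder camera name and URL (it also assumes clips need to be enabled for the camera):

cameras:
  driveway:
    ffmpeg:
      inputs:
        - path: rtsp://user:password@camera-ip:554/stream
          roles:
            - detect
            - clips
    clips:
      # assumption: clips enabled for this camera
      enabled: True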

Did you solve this?

Did you get it working?

Nope, unfortunately not. :frowning:


Hi

Firstly, thank you! Frigate is awesome and I can't wait to see what it becomes with further development. I'm looking at migrating away from BlueIris because of the great Home Assistant integration.

I'm just struggling with one issue.
I want to use my sub-stream for 24/7 recording and my main stream for recording clips, which works great. However, when I click on the feed in Home Assistant I'd also like to see the main stream and not the low-res sub-stream. Is there a way to do this?

My config looks like this:

cameras:
  frontyard:
    ffmpeg:
      inputs:
        - path: rtsp://admin:@[email protected]:554//h264Preview_01_sub
          roles:
            - detect
            - record
        - path: rtsp://admin:@[email protected]:554//h265Preview_01_main
          roles:
            - clips
            - rtmp

Thanks again!
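
One thing to try (a sketch, not something verified for this setup): since the main stream already has the rtmp role, Frigate re-streams it, and a Home Assistant camera can be pointed at that endpoint instead of the sub-stream. Host and paths below are placeholders based on Frigate's documented defaults:

camera:
  - platform: generic
    name: Frontyard Main
    # Frigate's snapshot API for the still image
    still_image_url: http://frigate-host:5000/api/frontyard/latest.jpg
    # Frigate's RTMP restream of whichever input has the rtmp role
    stream_source: rtmp://frigate-host/live/frontyard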

Good day gents,
anyone can suggest recommended ffmpeg settings to improve the latency with homeassistant?
basically I get a presence notification/picture about 10 seconds before I can see the object in the homeassistant camera view.

Thank you

Hi all,

I have a question about Frigate on an Odroid N2+ with a Coral USB TPU.
My CPU load is relatively high; I can even reach 90 ~ 95%.

My suspicion is that the Odroid N2+ does not do ffmpeg decoding in hardware, and that this is the cause. Is that true, and can I do something about it?

One other thing: object recognition is awesome! But only in daylight; when it gets dark it stops working. Two of the cameras are a Ubiquiti G3 and a G3 Flex, and I also have an EZViz C3W, but that doesn't work either… it works better with UniFi Protect, but I want HA to do the detection :slight_smile:
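
On the CPU question: ffmpeg decodes in software unless hwaccel_args are set, so that suspicion is plausible. A sketch of the kind of hwaccel_args Frigate accepts on ARM boards, using ffmpeg's V4L2 M2M decoder; whether the N2+ kernel actually exposes a V4L2 decoder for your codec is an assumption you would need to verify:

ffmpeg:
  hwaccel_args:
    # ffmpeg's V4L2 memory-to-memory H.264 decoder, available on some ARM SoCs
    - -c:v
    - h264_v4l2m2m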

It's complicated, but there's also this (still need to try the latter myself).

Hi all, fairly new Frigate user here! First of all, thanks for the awesome project! Unfortunately I have some trouble setting it up properly. The setup itself works fine, I have/had it running as a Hass.io add-on and in a separate Docker container (Raspberry Pi and MacBook), but I can't seem to get a stable stream to detect objects / view my camera images. I read a lot of posts and guides but none seem to answer my specific question.

  • Camera: Reolink E1 Pro (it has an RTSP stream, but apparently no RTMP, as far as I can tell).
  • Hardware: Raspberry Pi 3B (Docker version) or MacBook Pro (late 2013). Neither is the most powerful HW.

First problem: I cannot get a stable sub-stream for object detection. It is either all green screen (with some input_args) or it flickers / has artefacts. Stream size is 640x360 px, 7 fps, 192 kbit/s.

Second problem: using the main stream for detection, I think neither of my hardware options is powerful enough. Frigate only manages 0.2 to 1.3 fps (displayed in the frontend).

What are my options here? Is this purely a hardware problem that more powerful hardware, e.g. a Raspberry Pi 4 and Coral TPU, would solve? Or is it related to the camera hardware (especially the sub-stream problem), or do I need to tune the ffmpeg stream with input_args?
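
On the input_args question, a sketch of ffmpeg options commonly suggested for flaky RTSP feeds (forcing TCP transport, regenerating timestamps and dropping corrupt packets); these are generic ffmpeg flags, not a verified fix for the E1 Pro:

ffmpeg:
  input_args:
    # TCP avoids the packet loss that causes smearing/artefacts over UDP
    - -rtsp_transport
    - tcp
    # regenerate presentation timestamps and discard corrupt packets
    - -fflags
    - +genpts+discardcorrupt
    # use the wall clock when the camera's timestamps are unreliable
    - -use_wallclock_as_timestamps
    - '1'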

Hi,

so lowering the timeout for best_image produces more images… Will try that tomorrow and report back.

Regarding your suggestion about "clips": I have multiple cameras, and some show areas with heavy pedestrian traffic. If I am forced to watch all clips (I'm not really happy with the post_capture and pre_capture), evaluating all cameras takes up way more time! Checking all pictures is much faster. But thanks for the tip!

As I said, I'll be back tomorrow with more about best_image_timeout…

Thank you.
Cheers
Chris

Interesting, that makes sense. I'm not 100% certain that reducing best_image_timeout will work. I just checked the documentation and it says the following:

# Optional: timeout for highest scoring image before allowing it
# to be replaced by a newer image. (default: shown below)
best_image_timeout: 60

Another thing to play with could be max_disappeared in the detector settings:

  # Optional: Number of frames without a detection before frigate considers an object to be gone. (default: 5x the frame rate)
  max_disappeared: 25

Lowering it drastically would perhaps create a lot of unique events and snapshots per target as Frigate bounces above and below the detection threshold - which itself could be adjusted to achieve the desired effect.
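
For reference, a sketch of where that threshold lives in the config (object name and value are just an example):

objects:
  track:
    - person
  filters:
    person:
      # minimum score needed to count as a true positive (illustrative value)
      threshold: 0.7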

I've just been playing with Node-RED as another possible solution, and it did indeed save several snapshots when I walked around in front of the camera. Not sure how elegant this solution is as I've never really used Node-RED, but I'd expect it to save loads of snapshots in a high-traffic area.

Thank you very much for your response.
I'm trying the FFmpeg cameras with the Live555 RTSP proxy and testing.

For WebRTC I just can't get the card to work. It keeps saying "Custom element doesn't exist: webrtc-camera".
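
That "Custom element doesn't exist" error usually means the card's JavaScript resource isn't registered in Lovelace rather than anything being wrong with the stream. Once the resource is added (via HACS or manually under Lovelace resources), the card config is roughly this; a sketch assuming the AlexxIT WebRTC custom card, with a placeholder URL:

type: 'custom:webrtc-camera'
url: 'rtsp://user:password@camera-ip:554/stream'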

Hi,

I'm running Frigate in a dedicated Docker container,
and I'm wondering if it would be possible to display MQTT frigate/stats in an entities card.

Has anyone already explored it?

Regards
Dario
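
It should be possible by templating MQTT sensors off that topic and then listing them in an entities card. A sketch, assuming the frigate/stats payload is JSON with a top-level detection_fps field and per-detector inference_speed values (check your own payload with an MQTT client first):

sensor:
  - platform: mqtt
    name: Frigate detection FPS
    state_topic: frigate/stats
    value_template: "{{ value_json.detection_fps }}"
  - platform: mqtt
    name: Frigate Coral inference speed
    state_topic: frigate/stats
    unit_of_measurement: ms
    # detector name ("coral") is an assumption; match it to your Frigate config
    value_template: "{{ value_json.detectors.coral.inference_speed }}"

Those sensors then go into a normal entities card like any other entity.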
