Local realtime person detection for RTSP cameras

Can someone help me, please? One of my cameras has stopped working in Frigate.

2023-04-10 15:10:53.431563271  [2023-04-10 18:10:53] frigate.video                  ERROR   : camera_1: Unable to read frames from ffmpeg process.
2023-04-10 15:10:53.432104632  [2023-04-10 18:10:53] frigate.video                  ERROR   : camera_1: ffmpeg process is not running. exiting capture thread...
2023-04-10 15:11:00.780775066  [2023-04-10 18:11:00] watchdog.camera_1              ERROR   : Ffmpeg process crashed unexpectedly for camera_1.
2023-04-10 15:11:00.780850148  [2023-04-10 18:11:00] watchdog.camera_1              ERROR   : The following ffmpeg logs include the last 100 lines prior to exit.
2023-04-10 15:11:00.780965677  [2023-04-10 18:11:00] ffmpeg.camera_1.detect         ERROR   : [rtsp @ 0x564644ef6e40] method DESCRIBE failed: 401 Unauthorized
2023-04-10 15:11:00.780997434  [2023-04-10 18:11:00] ffmpeg.camera_1.detect         ERROR   : rtsp://*:*@192.168.1.62:554/user=admin&password=XXX%211&channel=1&stream=0.sdp?: Server returned 401 Unauthorized (authorization failed)

The YAML config looks like this:

mqtt:
  host: 192.168.1.134
  user: Alexey
  password: Alexey!1
  
cameras:
  camera_1: # <------ Name the camera
    mqtt:
      crop: True
      height: 500
    rtmp:
      enabled: false
    ffmpeg:
      input_args: -rtsp_transport tcp
      inputs:
        - path: rtsp://root:[email protected]:554/user=admin&password=XXX&channel=1&stream=0.sdp? # <----- Update for your camera
          roles:
            - detect
            - rtmp
            - record
            - clips
    detect:
      width: 1280 # <---- update for your camera's resolution
      height: 720 # <---- update for your camera's resolution
      fps: 14
    objects:
      track:
        - person
        - bicycle
        - car
        - motorcycle
        - bird
        - cat
        - dog
      filters:
        person:
          min_score: 0.6 # min score for object to initiate tracking (default: 0.5)
          threshold: 0.8 # min decimal percentage for tracked object's computed score to be considered a true positive (default: 0.7)
        car:
          threshold: 0.85
          min_score: 0.6
      mask:
        - 880,73,884,32,1280,26,1280,67
        - 112,175,556,134,554,327,123,360
    zones:
      parking:
        coordinates: 470,315,264,334,190,569,413,685,710,680,883,540,1236,525,1241,307,875,150,461,175
        objects:
          - car
    record:
      enabled: True
      retain:
        days: 3
        mode: motion
      events:
        pre_capture: 15
        post_capture: 15
        objects:
          - person
        retain:
          default: 10
          mode: motion
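As an aside on the 401 above: special characters in RTSP passwords usually need to be URL-encoded, and the log line does show `%21` (an encoded `!`). A minimal sketch of the input, using the IP and path from the config above with placeholder credentials (`XXX%211` is not the real password):

```yaml
# Sketch only: a '!' in a password must be percent-encoded as %21
# when it appears inside an RTSP URL. Credentials are placeholders.
ffmpeg:
  input_args: -rtsp_transport tcp
  inputs:
    - path: rtsp://192.168.1.62:554/user=admin&password=XXX%211&channel=1&stream=0.sdp?
      roles:
        - detect
```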

Hi,

can anyone help me understand how to configure the Birdseye restream? I read the documentation, but I don’t understand how to do it.

https://deploy-preview-4055--frigate-docs.netlify.app/configuration/restream#birdseye-restream

Birdseye Restream
Birdseye RTSP restream can be enabled at birdseye -> restream and accessed at rtsp://<frigate_host>:8554/birdseye. Enabling the restream will cause birdseye to run 24/7 which may increase CPU usage somewhat.

This means:

birdseye:
  restream: True

see Configuration File | Frigate for a full config example
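A slightly fuller sketch, assuming the option names from the linked docs (the `mode` values continuous/motion/objects come from the Birdseye documentation):

```yaml
birdseye:
  enabled: True
  mode: continuous   # or: motion, objects
  restream: True     # exposes rtsp://<frigate_host>:8554/birdseye
```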


Hi,

I’m running the Frigate container on a Raspberry Pi:

Raspberry Pi 4 Model B Rev 1.4
aarch64

As soon as I add the hwaccel arg, ffmpeg stops working.

cameras:
  camera1:
    ffmpeg:
      inputs:
        - path: rtsp://<user:password@IP>/videoMain
          hwaccel_args: preset-rpi-64-h264
          input_args: preset-rtsp-generic
          roles:
            - detect
2023-04-11 23:02:39.890394298  [2023-04-11 23:02:38] frigate.video                  ERROR   : camera1: Unable to read frames from ffmpeg process.
2023-04-11 23:02:39.891043423  [2023-04-11 23:02:38] frigate.video                  ERROR   : camera1: ffmpeg process is not running. exiting capture thread...
2023-04-11 23:03:26.991838334  [2023-04-11 23:03:26] ffmpeg.camera1.detect          ERROR   : Guessed Channel Layout for Input Stream #0.1 : mono
2023-04-11 23:03:26.991841723  [2023-04-11 23:03:26] ffmpeg.camera1.detect          ERROR   : [h264_v4l2m2m @ 0x55c1bee610] Failed to find Size=2304x1536, fmt=NV12 in 1 frame size enums
2023-04-11 23:03:26.991844760  [2023-04-11 23:03:26] ffmpeg.camera1.detect          ERROR   : Error while opening decoder for input stream #0:0 : Invalid argument

any suggestions?

Hi,

I have just discovered that the snapshots taken by Frigate are easily accessible without authentication via a simple link. This means that anyone with the link can access the images from my camera.

https://HOMEASSISTANT.URL/api/hassio_ingress/HASH//api/events/ID-NUM/snapshot.jpg

I am concerned about the security of my data and would like to disable this feature. I do not want my snapshots to be shared on the internet, as I fear that an error could lead to unintended access to my data.

Unfortunately, I found nothing in this direction. Is there really no option to do this?

There are settings for this directly in the integration. Unauthenticated endpoints are the only way to make it possible to include snapshots in notifications for most users.

You would have to know the event ID, which is randomly generated and not easily guessed. You can use the setting to only allow requests for events within the last X seconds, so an attacker would have to guess the ID within a small window of time before it expires.

You can also just disable the unauthenticated proxy and use a different notification service than the home assistant app.


Out of curiosity, what benefits would I get from switching to or adding WebRTC/go2rtc for restreaming?

I am using the Frigate Lovelace card, which I assume takes the feed from Frigate directly and not from the camera? I tend to view cameras through the HA app on my iOS device.

I very rarely use the Frigate web interface unless I’m looking at historical footage.

I have updated to 0.12 and haven’t changed any configs, and things appear to be working as before. However, I don’t want to miss out on any enhancements or performance improvements by not switching to the new stuff.

I have just updated my HACS integration (v4.0.0) and unRAID frigate docker ( 0.12.0-da3e197) to the newer versions, switched over to go2rtc, and removed rtmp entries from my config.

Everything seems OK.

However, I note in my HA logs I am still seeing attempts to connect to RTMP ports, which are failing?

Well, RTMP has been deprecated, so using go2rtc will be required for the camera stream in HA once RTMP is removed.

Lower latency. For me, HA had at least a 5-second delay, really more. WebRTC is realtime; maybe 0.5 seconds of lag, but too short to notice or care about.

Fewer connections to the camera. There is a single connection to the camera, and all other connections are served from go2rtc. This helps if you have a bad connection to the camera or need multiple connections.

2way audio support.

I guess at this point I am poorly repeating the readme for the project.
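For illustration, a minimal sketch of the single-connection pattern (the camera name and RTSP URL here are made up): go2rtc holds the one connection to the camera, and Frigate's ffmpeg input pulls from the local restream instead.

```yaml
# Hypothetical camera name and URL; go2rtc connects to the camera
# once, and everything else consumes the local restream on :8554.
go2rtc:
  streams:
    front_door:
      - rtsp://user:[email protected]:554/stream1

cameras:
  front_door:
    ffmpeg:
      inputs:
        - path: rtsp://127.0.0.1:8554/front_door
          input_args: preset-rtsp-restream
          roles:
            - detect
```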


I’m just wondering: is there no possibility to import and use your own environment variables?

2023-04-14 15:21:04.116940913  [2023-04-14 15:21:04] frigate.detectors.plugins.edgetpu_tfl INFO    : TPU found
Traceback (most recent call last):
  File "/usr/local/go2rtc/create_config.py", line 90, in <module>
    go2rtc_config["streams"][name][i] = stream.format(**FRIGATE_ENV_VARS)
KeyError: 'AICAM_PASSWORD'

Env vars need to start with FRIGATE_, and I think it only works for the MQTT password and camera inputs.

E.g. my Reolink password is used like so: {FRIGATE_REOLINK_PASSWORD}
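For example, something like this should work, assuming FRIGATE_REOLINK_PASSWORD is exported in the container's environment (the camera name and URL are made up):

```yaml
# Hypothetical camera; {FRIGATE_REOLINK_PASSWORD} is substituted
# from the environment because its name starts with FRIGATE_.
cameras:
  reolink:
    ffmpeg:
      inputs:
        - path: rtsp://admin:{FRIGATE_REOLINK_PASSWORD}@192.168.1.50:554/main
          roles:
            - detect
```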


The current list is:

Does it work fast like that if you use the HA mobile app?

Yes it can

Hi,

I’m running Frigate on a Raspberry Pi 4 with 8 GB of RAM

and I’m streaming the video from a camera with a main and a sub stream


Do you have any suggestions to reduce the inference speed? What would be the best resolution for detection?

This is my config.yaml:

detectors:
  coral:
    type: edgetpu
    device: usb

go2rtc:
  streams:
    videoSub: 
      - rtsp://user:[email protected]:XX/videoSub
      #- "ffmpeg:rtsp_cam#audio=opus"
    videoMain:
      - rtsp://user:[email protected]:XX/videoMain
      #- "ffmpeg:rtsp_cam_sub#audio=opus"
  webrtc:
    candidates:
      - X.X.X.X:8555
      - stun:8555

birdseye:
  enabled: True
  mode: continuous
  #restream: True

objects:
  track:
    - person
    #- car
    - bicycle
    - motorcycle
    - bird
    - cat
    - dog

cameras:
  camera1:
    enabled: True

    ffmpeg:
      inputs:
        - path: rtsp://user:[email protected]:XX/videoMain
          hwaccel_args: preset-rpi-64-h264
          #input_args: preset-rtsp-generic
          input_args: preset-rtsp-restream
          roles:
            - record

        - path: rtsp://user:[email protected]:XX/videoSub
          hwaccel_args: preset-rpi-64-h264
          #input_args: preset-rtsp-generic
          input_args: preset-rtsp-restream
          roles:
            - detect

      output_args:
        record: preset-record-generic

    detect:
      enabled: True
      width: 1280
      height: 720
      fps: 5
      #width: 2304
      #height: 1536
      #fps: 5

    record:
      enabled: True
      retain:
        days: 7
        mode: motion

      events:
        pre_capture: 15
        post_capture: 15
        retain:
          default: 5
          mode: active_objects
          objects:
            person: 5
        objects:
          - person

    snapshots:
      enabled: True
      timestamp: True
      bounding_box: True
      retain:
        default: 5
        objects:
          person: 5

    rtmp:
      enabled: False

Not sure if it’s related, but you can reduce the fps on the sub stream. You have it set to 5 in the config but 15 on the camera.

Also, you are not using your go2rtc streams; is that intentional?

I expected:

  ffmpeg:
    inputs:
      - path: rtsp://127.0.0.1:8554/videoMain?video=copy&audio=aac
        input_args: preset-rtsp-restream
        roles:
          - record

      - path: rtsp://127.0.0.1:8554/videoSub?video=copy
        input_args: preset-rtsp-restream
        roles:
          - detect

Did you ever find anything on this? I have some cameras that do the same thing when I enable detect.

Thank you, I am now using CPU acceleration for ffmpeg decoding, which helped quite a bit.


Hi @SgtBatten,

Thank you.
I’m not getting what you mean by “you are not using your go2rtc streams”; can you help me understand what I should do?