Local realtime person detection for RTSP cameras

How would the high frame rate (25fps) affect clips? If I can’t reduce the frame rate and an object is not detected for half a second but then re-detected, would two separate clips be recorded?
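For reference, the per-camera fps option (the same one that shows up in configs later in this thread) should let Frigate drop frames down to the specified rate before detection, even when the camera itself can’t be reconfigured. A minimal sketch, with a placeholder camera name and stream URL:

cameras:
  front:
    ffmpeg:
      input: 'rtsp://camera.example/stream'
    # detect at 5fps even if the source sends 25fps
    fps: 5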

I have a few days of clips now (22 or so), and the integration with the media browser is awesome. Before, I was using the basic HA media browser pointed at my clips folder, but there were no thumbnails or anything; this is much better. The pending features will further improve navigation and usability. Can’t wait!

I also love the integration’s automatic creation of the entities for sensors, etc. I was able to clean a lot of stuff out of my HA config file. Just a few things need to be cleaned up, like unit_of_measurement.

How is everyone’s playback of clips/stream looking? I notice a brief pause every so often when watching a clip. I’m still testing to see if it’s part of the source file from frigate or the playback from HA.

I’d be really curious if you figure this out, though I’m considering just purchasing a cheap SSD to move my entire VM to and expanding the storage just for this purpose, since it has been working so wonderfully thus far. I tried using the stable version and motionEye to put clips on a separate drive, but nothing really worked out with Supervised.

I never had complete success with a CIFS/SMB mount for Docker container volume mounts. It worked for most, but some containers just would not work permission-wise. I gave up, went to NFS, and haven’t looked back. I’m using it now for the 0.8.0 beta and saved clips.
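For anyone wanting to replicate the NFS approach, a named Docker volume backed by an NFS export can be created like this (a sketch; the server address and export path are placeholders for your own NAS):

docker volume create --driver local \
  --opt type=nfs \
  --opt o=addr=192.168.1.50,rw \
  --opt device=:/export/frigate \
  frigate_media

It can then be mounted into the container with -v frigate_media:/media/frigate.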

Please help a fairly new HA user with Frigate integration.

I don’t have a Coral USB yet (I ordered one today), but I wanted to familiarise myself with Frigate already, using my Hassio on a NUC with CPU-based detection.

I installed the Frigate NVR addon from the HA Supervisor page and used this as my config:

web_port: 5000
detectors:
  cpu1:
    type: cpu
  cpu2:
    type: cpu
save_clips:
  clips_dir: /media/frigate
mqtt:
  host: core-mosquitto.local.hass.io
  user: my-mqtt-username
  password: my-mqtt-password
ffmpeg: {}
cameras:
  back:
    ffmpeg:
      input: 'rtsp://192.168.1.230/ch0_0.h264'

This seems to work, as I managed to start the service and the log shows:

Fontconfig error: Cannot load default config file
On connect called
Starting detection process: 19
ffprobe -v panic -show_error -show_streams -of json "rtsp://192.168.1.230/ch0_0.h264"
Starting detection process: 20
{'streams': [{'index': 0, 'codec_name': 'h264', 'codec_long_name': 'unknown', 'profile': '77', 'codec_type': 'video', 'codec_time_base': '0/2', 'codec_tag_string': '[0][0][0][0]', 'codec_tag': '0x0000', 'width': 1920, 'height': 1080, 'coded_width': 1920, 'coded_height': 1088, 'closed_captions': 0, 'has_b_frames': 0, 'pix_fmt': 'yuvj420p', 'level': 32, 'color_range': 'pc', 'color_space': 'bt709', 'color_transfer': 'bt709', 'color_primaries': 'bt709', 'chroma_location': 'left', 'field_order': 'progressive', 'refs': 1, 'is_avc': 'false', 'nal_length_size': '0', 'r_frame_rate': '20/1', 'avg_frame_rate': '0/0', 'time_base': '1/90000', 'start_pts': 1, 'start_time': '0.000011', 'bits_per_raw_sample': '8', 'disposition': {'default': 0, 'dub': 0, 'original': 0, 'comment': 0, 'lyrics': 0, 'karaoke': 0, 'forced': 0, 'hearing_impaired': 0, 'visual_impaired': 0, 'clean_effects': 0, 'attached_pic': 0, 'timed_thumbnails': 0}}, {'index': 1, 'codec_name': 'pcm_mulaw', 'codec_long_name': 'unknown', 'codec_type': 'audio', 'codec_time_base': '1/8000', 'codec_tag_string': '[0][0][0][0]', 'codec_tag': '0x0000', 'sample_fmt': 's16', 'sample_rate': '8000', 'channels': 1, 'bits_per_sample': 8, 'r_frame_rate': '0/0', 'avg_frame_rate': '0/0', 'time_base': '1/8000', 'start_pts': 4, 'start_time': '0.000500', 'bit_rate': '64000', 'disposition': {'default': 0, 'dub': 0, 'original': 0, 'comment': 0, 'lyrics': 0, 'karaoke': 0, 'forced': 0, 'hearing_impaired': 0, 'visual_impaired': 0, 'clean_effects': 0, 'attached_pic': 0, 'timed_thumbnails': 0}}]}
Camera capture process started for back: 23
Camera process started for back: 24
Creating ffmpeg process...
ffmpeg -hide_banner -loglevel panic -avoid_negative_ts make_zero -fflags nobuffer -flags low_delay -strict experimental -fflags +genpts+discardcorrupt -rtsp_transport tcp -stimeout 5000000 -use_wallclock_as_timestamps 1 -i rtsp://192.168.1.230/ch0_0.h264 -f rawvideo -pix_fmt yuv420p pipe:
 * Serving Flask app "detect_objects" (lazy loading)
 * Environment: development
 * Debug mode: off

I have a Frigate NVR link on my sidebar, but it just opens a white page with the following info:

{"back":{"camera_fps":20.3,"capture_pid":23,"detection_fps":0.4,"frame_info":{"detect":1607533457.282656,"process":0.0},"pid":24,"process_fps":19.0,"skipped_fps":0.0},"detection_fps":0.4,"detectors":{"cpu1":{"detection_start":0.0,"inference_speed":287.7,"pid":19},"cpu2":{"detection_start":0.0,"inference_speed":285.91,"pid":20}}}

I tried to install the Frigate HACS Integration as well, but after installing and rebooting nothing happens.

According to the HACS info I should have the following, but I can see none of them:

  • Rich media browser with thumbnails and navigation
  • Sensor entities
  • Camera entities

Has anyone got Frigate working with Unifi cameras that are also connected to a CloudKey 2?

I have at my disposal

  • Various Raspberry Pis
  • A Coral USB TPU
  • A Coral Dev Board
  • An i5 Shuttle running Ubuntu 20.04
  • An i7 NUC running Ubuntu 20.04
  • A dual processor beast running Proxmox

I couldn’t get Frigate to run on the Dev Board; I discussed this previously in this thread, about 6 weeks ago.
I can’t get Frigate to run for more than a few seconds on the i5 or i7 using the USB accelerator. What commonly happens is I get a few rectangles drawn around things, then the logs fill up with

frigate | On connect called

for a few seconds and then it dies.

If I run it on a 4GB Pi 4 with the accelerator, on 32-bit Raspbian or 64-bit Ubuntu, I get out-of-memory errors.

The more I look at it, though, the less I think “Frigate” - as in the whole thing - is the problem, because lots of people are using it without issue. However, I can’t find many people using a mixture of Unifi cameras on a CloudKey (even though the RTSP stream comes from the camera), which leads me to believe that ffmpeg is the problem and that it needs very specific settings for each camera (I have 4 different Unifi models including one G4 Pro, 6 cameras in total).
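The kind of per-camera tuning I mean would look something like this (a sketch only; the camera name and stream URL are placeholders, and the input_args shown are a few of the defaults visible in the ffmpeg log earlier in this thread, not a known-working Unifi recipe):

cameras:
  g4_pro:
    ffmpeg:
      input: 'rtsp://192.168.1.10:7447/STREAMTOKEN'
      # per-camera override of the default RTSP input arguments
      input_args:
        - -rtsp_transport
        - tcp
        - -stimeout
        - '5000000'
        - -use_wallclock_as_timestamps
        - '1'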

So my opening question - does anyone have a working setup using Unifi cameras and, if so, did you have to change various parameters to get it to work with Frigate?

Thanks

Steve

Are you also using the frigate custom_component? That is what will create your bottom 3 bullet points.

Based on data from the stats endpoint, it looks like the frigate addon is running. Now you just need the HA side.

I’d like to jump in and test the 0.8.0 beta. I’m currently using the home assistant addon. Are there instructions on how to get the beta release to show up as possible to upgrade to?

I installed the “Frigate integration for homeassistant” via HACS -> Integrations. It installed fine, but where do I configure it?

Does it provide consistent fps? I am thinking about adding a few cheap cameras inside the house.

I have it working with 5 Unifi cameras, but those are on the Unifi Video Controller instead of the CloudKey. I understand the Video Controller is EOL, so I just ordered the Gen2 Plus; interested to see if that was a rash decision.

In HA, check Configuration -> Integrations -> the add button on the bottom right. Search for Frigate. If you are using the add-on, I think the prepopulated DNS name will work.

That was my missing link! Thank you so much! Now I can see the entities. What about the rich media browser? Where can I find that?

Great! Welcome!

In HA, in the list on the left (Overview, Logbook, etc.) you should see Media Browser, depending on your HA version and what is in your config. Media Browser was added in 0.115. It is part of the default_config configuration.yaml entry, so if you have that, it should show up.

If you don’t use default_config, you need to define media_source in your HA configuration.yaml file - see https://www.home-assistant.io/integrations/media_source/
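With default_config disabled, the only entry needed is the empty key from the docs above:

# configuration.yaml
media_source: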

You could also add default_config:, but that may turn on quite a few things you may not already have or want - https://www.home-assistant.io/integrations/default_config/

You are being extremely kind. I feel stupid, but I’m still taking my first steps with this…

I can see Media Browser and Frigate has a “folder” on that page with “Clips” and “Recordings” folders under it. Both are empty.

I swapped to the beta version and this is what my config looks like now:

detectors:
  coral:
    type: edgetpu
    device: 'usb:0'
mqtt:
  host: 192.168.1.2
  user: mqtt-username
  password: mqtt-password
ffmpeg: {}
cameras:
  YiCamera1:
    ffmpeg:
      inputs:
        - path: 'rtsp://192.168.1.230/ch0_0.h264'
          roles:
            - detect
            - rtmp
    height: 720
    width: 1280
    fps: 5

Is there something I should add to the config to start recording clips after movement is detected?

Try adding the save_clips config, and you also need - clips under roles:

detectors:
  coral:
    type: edgetpu
    device: 'usb:0'
mqtt:
  host: 192.168.1.2
  user: mqtt-username
  password: mqtt-password
save_clips:
  # Optional: Maximum length of time to retain video during long events. (default: shown below)
  # NOTE: If an object is being tracked for longer than this amount of time, the cache
  #       will begin to expire and the resulting clip will be the last x seconds of the event.
  max_seconds: 180
  # Optional: Retention settings for clips (default: shown below)
  retain:
    # Required: Default retention days (default: shown below)
    default: 7
    # Optional: Per object retention days
    objects:
      person: 7
ffmpeg: {}
cameras:
  YiCamera1:
    ffmpeg:
      inputs:
        - path: 'rtsp://192.168.1.230/ch0_0.h264'
          roles:
            - detect
            - rtmp
            - clips
    save_clips:
      # Required: enables clips for the camera (default: shown below)
      enabled: True
      # Optional: Number of seconds before the event to include in the clips (default: shown below)
      pre_capture: 10
      # Optional: Objects to save clips for. (default: all tracked objects)
      objects:
        - person
      # Optional: Camera override for retention settings (default: global values)
      retain:
        # Required: Default retention days (default: shown below)
        default: 1
        # Optional: Per object retention days
        objects:
          person: 1 
    height: 720
    width: 1280
    fps: 5
      

Hah, worked like a charm. Thank you for your help!

Hi,
Lovely project. I’ve been doing some testing today on my NUC10i3FNK.
Setup: 1 camera, 15fps, 1024x576.
For the moment I don’t own a Google Coral, but I was already astonished by the processing speed of the NUC itself.

If I just run the container with no hwaccel, my inference is steady at around 10-20ms.

docker run -d \
--name frigate \
--privileged \
--mount type=tmpfs,destination=/cache,tmpfs-size=1024m,tmpfs-mode=1770 \
-v /DATA/frigate:/config:ro \
-v /DATA/frigate/clips:/clips:rw \
-v /etc/localtime:/etc/localtime:ro \
-p 5000:5000 \
-e FRIGATE_RTSP_PASSWORD='password' \
blakeblackshear/frigate:stable-amd64

So I thought I would also give it a shot adding hwaccel.
I’ve added this line to the docker run:

 -e LIBVA_DRIVER_NAME=iHD

And in the config.yml:

hwaccel_args:
  - -hwaccel
  - vaapi
  - -hwaccel_device
  - /dev/dri/renderD128
  - -loglevel
  - info

But if I do that… my inference jumps to ~250ms and more…
Any idea what I might be doing wrong here?

Anyway, I am already happy with the out-of-the-box performance, but using hwaccel would be cleaner…

The inference speeds you are seeing are misleading. CPU performance is probably more in line with 250ms, not 10-20ms. It starts with a value of 10ms and computes a rolling average, so it takes a while to even out. Also, the hwaccel parameters reduce CPU usage for decoding the video stream. They will have no impact on inference speeds.
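To illustrate the warm-up effect with a hypothetical sketch (the 10ms seed matches what’s described above; the 0.9 smoothing weight is an assumption, not Frigate’s actual code):

# Rolling average seeded at 10ms converging toward a true ~250ms inference time.
def rolling_avg(samples, seed=10.0, weight=0.9):
    avg = seed
    for s in samples:
        # each new measurement only nudges the running average
        avg = avg * weight + s * (1 - weight)
    return avg

for n in (10, 50, 200):
    print(n, round(rolling_avg([250.0] * n), 1))
# 10 -> 166.3, 50 -> 248.8, 200 -> 250.0: early readings look much faster than reality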

Thanks for the quick reply! But when I enabled the hwaccel, the CPU usage was even higher…
(enabled between the red lines in the attached CPU graph)

Would it help to add more cores as detectors?

detectors:
  cpu1:
    type: cpu
  cpu2:
    type: cpu