Local realtime person detection for RTSP cameras

Thanks - I did see that, but got it confused with the HassOS addon (which I had already installed for pre-docker testing)

Just checked intel_gpu_top with the container both running and then stopped - all metrics drop to almost zero when it stops, so hardware acceleration is definitely working. I definitely need to get a Coral in play though: CPU idles around 70+% during the day (wind and shadows, I'd guess). The order has been picked up in China, so somewhere between 2 weeks and 3 months to wait :slight_smile:
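
For anyone following along, this is roughly what the Intel VAAPI hardware acceleration setup looks like in the Frigate config (a minimal sketch based on the docs; the render device path needs to match your hardware and be mapped into the container):

ffmpeg:
  hwaccel_args:
    - -hwaccel
    - vaapi
    - -hwaccel_device
    - /dev/dri/renderD128
    - -hwaccel_output_format
    - yuv420p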

Thanks again!

No problem Ben,

Good luck! I'm not quite ready for Frigate just yet, it's on my to-do list :slight_smile:
But as a new user of HA I'm still evaluating the wealth of options this community has posted. For example, in terms of TensorFlow it may be more efficient to do this through Node-RED instead of Frigate, etc.

I'm so tempted to order a Coral, but I have bigger fish to fry, like getting a stable system running before playing with topics like this.

Good luck and kind regards

Hello, dumb question: when I add the cameras generated by Frigate to my Lovelace dashboard, the "fullscreen" (more info) view of the camera is really small. I have NO stream activated. My cameras are configured as follows:

  garage:
    ffmpeg:
      inputs:
        - path: rtsp://40.40.40.220:554/media/stream1
          roles:
            - detect
            - rtmp
            - clips
            - record
    width: 1280
    height: 720
    fps: 23

BUT it is too small when opening the entity:

Any idea why it is so small, and how can I make it bigger? :frowning:

If I'm not wrong, the video (camera) I see here is the RTMP stream? Is it possible to make the resolution bigger?
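
Not a definitive answer, but for reference: Frigate lets you assign roles to multiple inputs per camera, so the rtmp role (which is where that stream comes from) could be pointed at a higher-resolution feed if your camera exposes one. A sketch based on the config above; the second stream path is hypothetical:

  garage:
    ffmpeg:
      inputs:
        # existing 1280x720 stream, kept for detection
        - path: rtsp://40.40.40.220:554/media/stream1
          roles:
            - detect
        # hypothetical higher-resolution main stream for viewing/recording
        - path: rtsp://40.40.40.220:554/media/stream0
          roles:
            - rtmp
            - clips
            - record
    width: 1280
    height: 720
    fps: 23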

Now that I have my system running on more powerful hardware, I've been doing some testing with notifications and delays, and noticed there is considerable delay when using the event trigger. AFAIK the delay is because the clip can't be sent until the event is over?

Using the binary sensor for motion seems to be a lot quicker (almost instant), so I've started using this on top of the one with the clip:

- id: Camera Back Garden Person Alert
  alias: Camera Back Garden Person Alert
  trigger:
    platform: state
    entity_id: binary_sensor.back_garden_person_motion
    to: 'on'
  action:
  - service: notify.mobile_app
    data_template:
      title: "Back Garden Camera Person Detected"
      message: "Full clip will follow"
      data:
        image: "/api/camera_proxy/camera.back_garden"
        actions:
          - action: URI
            title: View Cameras in HA
            uri: '/lovelace-security/cameras'

It's not as refined as the other one, as I need one automation per camera and per object. Ideally I'd like to include a link to events from the first notification, but that would mean getting Frigate events inside HA for external viewing or exposing the Frigate web GUI to the internet. Is there a way to turn off the clips API so it isn't openly exposed to the web?
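
For what it's worth, one way to avoid a separate automation per camera would be to list several binary sensors in one trigger and derive the camera from the trigger object. A rough, untested sketch; the front-garden sensor is made up, and it assumes sensors follow the <camera>_person_motion naming used above:

- id: camera_person_alert
  alias: Camera Person Alert
  trigger:
    platform: state
    entity_id:
      - binary_sensor.back_garden_person_motion
      - binary_sensor.front_garden_person_motion   # hypothetical second camera
    to: 'on'
  action:
  - service: notify.mobile_app
    data_template:
      title: "{{ trigger.to_state.name }} Detected"
      message: "Full clip will follow"
      data:
        # derive camera.<name> from binary_sensor.<name>_person_motion
        image: "/api/camera_proxy/camera.{{ trigger.entity_id.split('.')[1] | replace('_person_motion', '') }}"
        actions:
          - action: URI
            title: View Cameras in HA
            uri: '/lovelace-security/cameras'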


Node-RED notification flow for iOS users - send a notification when an object is detected, 3D/force touch into the notification for a live view of the camera, and use an actionable notification to send the clip (it intelligently waits for the clip to be ready)

Example:

NodeRED Flow:

Requirements:

  • Frigate HA Integration
  • Home Assistant Companion App (iOS)
  • Home Assistant accessible from the internet (or via Nabu Casa)

Installation:

Copy the flow to NodeRED (Hamburger Menu => Import):
https://paste.ubuntu.com/p/XD62VwNj7s/

Edit the first "Format notification" function node and add your external URL. Optionally, edit the case statement to customize your camera-to-location mapping. (For example, I customize the camera.doorbell location to be "front door".)

Add the following to your Home Assistant configuration for actionable notifications:

ios:
  push:
    categories:
      - name: camera
        identifier: 'camera'
        actions:
          - identifier: 'SENDSTILLANDCLIP'
            title: 'Send still and video clip'
            authenticationRequired: false

Restart HA and sync push categories to your phone (HA App => App Configuration => Notifications => Categories)

That’s it, you’re all done!

For my personal setup, I also add in history checking of my external doors and the current state of the doors to limit alerts. For example, don't notify if someone is at the front door if the front door opened in the last five minutes. I also use an input_boolean.person_notification to control whether alerts are enabled, and add an actionable notification to the category above called "SNOOZEPERSON" that temporarily disables notifications for four hours (useful when the kids are playing in the yard).
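
For reference, adding that snooze action to the push category above might look something like this; the SENDSTILLANDCLIP entry is from the config earlier in the post, the SNOOZEPERSON identifier is the one mentioned here, and the title text is just an assumption (the 4-hour timer itself still lives in the Node-RED flow/automations):

ios:
  push:
    categories:
      - name: camera
        identifier: 'camera'
        actions:
          - identifier: 'SENDSTILLANDCLIP'
            title: 'Send still and video clip'
            authenticationRequired: false
          - identifier: 'SNOOZEPERSON'
            title: 'Snooze person alerts for 4 hours'
            authenticationRequired: false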

I also push all Frigate clips to my Roku TV (if it's idle) and announce alerts on Google Homes / Amazon Dots.

That full flow, for advanced users, is here:
https://paste.ubuntu.com/p/bPjDwDC9ms/


Does anyone have an idea how to overlay the time onto the video using ffmpeg input/output parameters?
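
Not something I have tested inside Frigate, but the usual ffmpeg answer is the drawtext filter, which requires re-encoding (so it cannot be combined with the default stream-copy output args). A very rough sketch for the rtmp role, assuming your Frigate version supports per-role output_args and the bundled ffmpeg includes drawtext; audio is dropped for simplicity:

cameras:
  front:
    ffmpeg:
      output_args:
        rtmp:
          - -c:v
          - libx264
          - -vf
          # a fontfile=/path/to/font.ttf option may be needed inside the filter
          - drawtext=text='%{localtime}':x=10:y=10:fontcolor=white:box=1:boxcolor=black@0.5
          - -an
          - -f
          - flv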

Hi,
could someone please help?

I'm running Frigate on Hass.io on an RPi 4 8GB.
I have a feed showing and I can see it tracked me,


yet when I click on the Events page nothing is there, and I also don't see anything sent via MQTT.

Here are my settings:

mqtt:
  host: 10.0.0.9
  user: *****
  password: ****
  port: 1883
  topic_prefix: frigate
  client_id: frigate
cameras:
  front:
    ffmpeg:
      inputs:
        - path: rtsp://user:[email protected]:port/Streaming/channels/302/
          roles:
            - detect
            - rtmp

    width: 960
    height: 576
    fps: 12

detectors:
  cpu1:
    type: cpu
  cpu2:
    type: cpu

What am I missing, or what am I doing wrong?

Do you have the integration installed and set up?

Yes, I have MQTT set up and running.

I meant for Frigate. I'm running HassOS, so I don't know if it looks different for other types of installs, but in your HA go to Configuration -> Integrations.
Do you have one for Frigate?
Mine looks like this:


I have Frigate running on the same RPi as Home Assistant via the Supervisor add-on; do I also need an integration?

Frigate works best with Home Assistant when using the official integration. Check the docs.


Thank you @danbutter and @hawkeye217, I completely missed the fact that there is also an integration.

I thought the add-on would just post to HA via MQTT.

The integration is working

Hi @S-Przybylski, if I understand correctly you were able to enable HW acceleration on a Raspberry Pi. I'm using HA on a Raspi 4 as well, and before acquiring the costly Coral accelerator I would like to experiment a bit more with Frigate, which is always working "borderline" with no HW acceleration. Would you mind sharing the steps you took so I can obtain the same result? Thank you.

Hi all,
I have this set up, yet I could do with some help capturing the cars on my drive.

I can see that car was triggered by people walking past; how can I get it to detect both cars instead of just one?


@blakeblackshear, thank you for sharing your effort. I'm just starting to get a glimpse of all the possibilities offered by this nice piece of software. Being far from an expert, after assembling the bare minimum (Raspi 4 with HA on Docker and 4 Dahua cameras) I am evaluating the purchase of the costly Coral, but for now I have not seen much "person" identification. After playing with some settings I found that the cameras show "regions" (green rectangles which move and change over time on the screen) which I did not set. Could you please help me understand where they come from, what their purpose is, and if/how I can modify them, or point me to some sort of guide? Thank you.

For Pi4 try “-c:v h264_v4l2m2m”
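
For context, that flag goes into the ffmpeg hardware acceleration args; per the docs for the Raspberry Pi it looks like this (shown globally here, but it can also be set per camera):

ffmpeg:
  hwaccel_args:
    - -c:v
    - h264_v4l2m2m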

Thank you. I failed when first trying that, but now I've finally understood what "blake" wrote in the guide about disabling "Protection mode" (for someone with little experience like me: it's a virtual switch on the add-on's interface), and it seems to work now with no errors in the log. I will probably now have to decide on the purchase of the Coral to establish how effective all this is for my needs. Again, thank you :slight_smile: Btw, do you happen to know why the retain setting of 1 doesn't delete the clips after one day? And may I delete them manually without the DB getting cross at me?
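
For reference, the retain setting being discussed sits under clips in the config, roughly like this (the numbers are only examples; object-specific values override the default):

clips:
  retain:
    default: 1
    objects:
      person: 2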

Firstly, Frigate looks fantastic. Amazing work.

I am setting up Frigate for the first time. My set up:

  • Intel NUC 5i5RYH
  • Docker on Ubuntu 20.04
  • Coral on order, but I’ll be using CPU for now

It's my first time using docker-compose, and I'd like some advice, please. Based on the documentation, I have so far modified docker-compose.yml as follows:

amd64version: '3.9'
services:
  frigate:
    container_name: frigate
    privileged: true # this may not be necessary for all setups
    restart: unless-stopped
    image: blakeblackshear/frigate:stable-amd64
    devices:
      - /dev/bus/usb:/dev/bus/usb
      - /dev/dri/renderD128 # for intel hwaccel, needs to be updated for your hardware
    volumes:
      - /etc/localtime:/etc/localtime:ro
      - <path_to_config_file>:/config/config.yml:ro
      - <path_to_directory_for_media>:/mnt/servernvr
      - type: tmpfs # Optional: 1GB of memory, reduces SSD/SD Card wear
        target: /tmp/cache
        tmpfs:
          size: 1000000000
    ports:
      - '5000:5000'
      - '1935:1935' # RTMP feeds
    environment:
      FRIGATE_RTSP_PASSWORD: 'password'

Some questions:

  1. For the media directory, I will use a mounted Windows share via Samba accessible at /mnt/serverdvr. Is that the correct way to specify the path for docker-compose?
  2. I'm unsure what to put for Intel hwaccel. lshw -c video returns configuration: driver=i915. Should I specify /dev/dri/i915, or does hwaccel refer to something different? Googling "NUC 5i5RYH hwaccel" gives zero results.
  3. Am I likely to need to change amd64version?
  4. Should anything change in docker-compose when I start using Coral? (see the sketch after this post)

Thanks a lot!
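
Regarding question 4: not an authoritative answer, but per the Frigate docs the docker-compose side mainly needs the /dev/bus/usb device mapping that is already in the file above; the switch to the Coral happens in config.yml by swapping the CPU detector for an edgetpu one, roughly like this:

detectors:
  coral:
    type: edgetpu
    device: usb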