Local realtime person detection for RTSP cameras

Hi all,

When setting the height and width parameters, do they need to match the stream resolution, or can you set them lower to use fewer resources?

I have a Wyze Cam v3 but haven't figured out how to get a high-res stream (for clips) and a low-res stream (for detect) at the same time.

Edit: it seems the cam itself doesn’t support substreams.
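For what it's worth, the docs say width and height should match the actual resolution of the stream assigned to detect, which is why people point detect at a low-res substream rather than downscaling a high-res one. A minimal camera entry along those lines (camera name and URL are made up):

```yaml
# Sketch only - name and URL are hypothetical.
cameras:
  front_door:
    ffmpeg:
      inputs:
        - path: rtsp://user:[email protected]:554/substream
          roles:
            - detect
    # width/height should match the substream's real resolution
    width: 640
    height: 480
    fps: 5
```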

An object is represented as a box in the image when evaluated against a zone, so a large object close to a zone may be considered as having “entered” the zone.

Define a zone on the outside of the fence. If the “current zone” is outside and the “entered zone” is inside the fence, maybe ignore the event, since the object is likely outside the fence.

If the “current zone” is inside and the “entered zone” is outside, the object is inside.

If only the inside zone was entered, it’s inside.
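A rough sketch of the heuristic described above. The assumption (which mirrors how Frigate evaluates zones) is that an object is “in” a zone when the bottom center of its bounding box falls inside the zone polygon; the zone shapes here are made up:

```python
# Sketch of the inside/outside-fence heuristic.
# Assumption: zone membership is decided by the bottom-center point of
# the bounding box. Zone coordinates below are hypothetical.

def point_in_polygon(x, y, polygon):
    """Ray-casting point-in-polygon test."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def classify(box, outside_zone, inside_zone):
    """Return 'inside' or 'outside' for a box (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = box
    px, py = (xmin + xmax) / 2, ymax  # bottom center of the box
    in_out = point_in_polygon(px, py, outside_zone)
    in_in = point_in_polygon(px, py, inside_zone)
    if in_out and not in_in:
        return "outside"  # likely outside the fence -> ignore
    return "inside" if in_in else "outside"

outside_zone = [(0, 0), (10, 0), (10, 5), (0, 5)]   # area beyond the fence
inside_zone = [(0, 5), (10, 5), (10, 10), (0, 10)]  # area within the fence
print(classify((4, 1, 6, 3), outside_zone, inside_zone))  # outside
print(classify((4, 6, 6, 8), outside_zone, inside_zone))  # inside
```

The same idea extends to "current zone vs. entered zone" by remembering the previous frame's classification per tracked object.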

I see a few people have mentioned their USB Coral suddenly having issues. Does anyone have any suggestions for how I can fix this?

I'm running Frigate in a Docker container on an unRAID server. It had been running without issue for several months. Suddenly, last Friday, the container died with a Python error. I’ve tried restarting it several times; when checking the logs I see one of the following errors:

F :1150] HandleEvent failed. USB transfer error 1 [LibUsbDataInCallback]

or

F :1150] HandleQueuedBulkIn transfer in failed. Unknown: USB transfer error 1 [LibUsbDataInCallback]

I see from GitHub that a similar issue has been raised before, but the solution provided (running privileged) doesn’t apply, as the container is already running privileged.

  • Changing USB ports (USB2 → USB3 → USB2) has made no difference.
  • The cables are the ones provided with the TPU and haven’t changed.
  • The TPU's LED is on.
  • No other devices on the server use USB, so power shouldn’t be an issue.
  • The Docker container can see the device.
  • Re-installing the container has made no difference.

I’ve run out of ideas.
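One more sanity check worth trying (sketch only): see whether the Coral still enumerates on the USB bus at all. An uninitialized USB Coral shows up with vendor:product ID 1a6e:089a (Global Unichip) and re-enumerates as 18d1:9302 (Google) once the Edge TPU runtime has loaded:

```shell
# Does the Coral enumerate on the USB bus at all?
# 1a6e:089a = uninitialized Coral; 18d1:9302 = Coral after runtime init.
if lsusb 2>/dev/null | grep -Eqi '1a6e:089a|18d1:9302'; then
  echo "Coral visible on the USB bus"
else
  echo "Coral NOT visible - try a full power cycle or a different cable"
fi
```

If neither ID ever appears after a full power-off of the host (not just a reboot), the TPU hardware itself may have failed.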

Log extract:

frigate.mqtt INFO : MQTT connected
detector.coral INFO : Starting detection process: 37
frigate.app INFO : Camera processor started for drive: 40
frigate.edgetpu INFO : Attempting to load TPU as usb
frigate.app INFO : Capture process started for drive: 41
frigate.edgetpu INFO : TPU found
F :1150] HandleQueuedBulkIn transfer in failed. Unknown: USB transfer error 1 [LibUsbDataInCallback]

Fatal Python error: Aborted

Frigate 1.14

  • Allow access to side panel for non-admins

Omg omg omg xD :smiley: wooohoooo
This is freaking awesome… Finally my parents and my wife will not be admins :smiley:

Is it possible to use the Coral M.2 Accelerator with Dual Edge TPU (8-bit module) with an adapter, to plug it into a PCIe x1 port on a computer running Debian? If so, what adapters are you using? I'm a bit confused about how to connect it to a PC without an M.2 port.

@Minglarn Damn!!!

Like I said, I went to my summer cottage to rule out a possible router NAT problem (even though port probing shows that the port is open). And even inside the same WLAN I cannot get the VLC stream working from the 510WA.

Would it be possible for you to copy the stream address (just in case) “again” from VLC player for me, so I can recheck it? This is frustrating me like hell.

The RTSP stream is somehow not usable - there are some artifacts and weird lines…

Sure… Just a tiny question…
Is the WA battery- (solar-?) powered? In that case it’s not possible to get a stream from that cam.
Anyhow…
The RTMP stream copied directly from my VLC is:

rtmp://<<replace this with your IP>>/bcs/channel0_main.bcs?channel=0&stream=1&user=admin&password=TOPSECRETSUPERLONGPASSWORD

No, it is powered by cable with the included DC power supply. Thanks for this… I'll have to recheck. Reolink support hasn’t answered my question, even though I've now asked twice over 6 months ;(

Hi, did you manage to sort out this error? I am having a similar issue. I'm running HA and Frigate NVR on a headless mini server PC, managed through my Mac. I am trying to load Frigate in a container through Portainer CE, however I am receiving the same error: Error parsing config: [Errno 21] Is a directory: config/config.yml. I have a Frigate config.yml file on my Mac, however it doesn't pick up the file path. Do you know if this could be because the file is on the Mac and not the Proxmox server? Any advice would be helpful.

Man, I have such a terrible headache from trying to get my camera working in Frigate. It's terrible that it doesn't support H.265. I've tried about a thousand args and commands, but it just isn't working. In fact, I don't have a single clue what args like -f, -fflags, and -hwaccel are supposed to do, so I seem to be stuck.

When opening the stream in VLC player I get this codec info: (screenshot attached)

Can someone please help me with what to put in input_args, hwaccel_args, and output_args for this to work?

It is an Imou camera. I also have a Foscam camera which works without having to use any of the args…

Also good to mention: the low-res streams of the Imou cameras do work, but I would really like to use the high-res ones.
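Not a definitive answer without seeing the codec info, but as a hedged starting point: Frigate's bundled ffmpeg can usually decode H.265 in software with the default input/output args (it's the RTMP restream that can't carry HEVC), so it is often enough to keep the high-res HEVC stream on the detect/clips roles and optionally add hwaccel. A sketch, assuming a VAAPI-capable Intel box whose GPU supports HEVC decode; the URL and device path are hypothetical:

```yaml
# Sketch only - URL and render device are made up; drop hwaccel_args
# entirely to test plain software decode first.
cameras:
  imou_cam:
    ffmpeg:
      hwaccel_args:
        - -hwaccel
        - vaapi
        - -hwaccel_device
        - /dev/dri/renderD128
        - -hwaccel_output_format
        - yuv420p
      inputs:
        - path: rtsp://user:[email protected]:554/mainstream
          roles:
            - detect
            - clips
```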

First off I have to say that Frigate is an amazing add-on. I run my HA instance on an old mini PC with a 1.83 GHz Intel Celeron N2930 (a Zotac ZBOX CI320 nano desktop computer, to be precise :), and have been running Frigate with a USB Coral for a couple of months now.

It took me a few weeks for it to be reliable, though. When I used the recommended hardware acceleration settings for pre-10th-gen Intel, CPU consumption dropped from the mid 30s to the mid teens, but the entire mini PC would crash and lock up within a day or two. There was never anything particularly useful in the logs, but I eventually figured out that it would only occur with Frigate enabled, and more specifically only when running the hardware acceleration settings.

I decided it was obviously better to have Home Assistant constantly running at 25–35% CPU than to have it crash every couple of days, and have since let it do its thing. But I recently thought I should find out whether implementing some of the settings would result in a slight CPU reduction, or if it was an all-or-nothing situation.

The settings which relatively quickly crash my install are the ones mentioned on GitHub for pre-10th-gen Intel, i.e.:

ffmpeg:
  hwaccel_args:
    - -hwaccel
    - vaapi
    - -hwaccel_device
    - /dev/dri/renderD128
    - -hwaccel_output_format
    - yuv420p

I know that the CPU supports VA-API, so those arguments should work, but are there any other, less intensive hwaccel settings I could use?
Weirdly enough, after I upgraded my Debian install this afternoon from 10 to 11, the un-accelerated CPU usage rose from 25% to 35%, putting the overall system at around 40% with nothing happening, so I have turned all the standard settings back on and will see if the crashing persists.

Hi, did you find a solution to this? I am having the same problem.

Hi. I installed and configured Frigate on my RPi 4B running HassOS just a few days ago, and I already love it!!! It works wonderfully with 3 cameras. I ordered a Coral TPU, but even without it, it works perfectly.
I’m missing just one thing… a switch to turn 24/7 recording on/off, so I can schedule it or make it dependent on the state of the Sonoff that powers the cameras.
Great job, thank you.

I find myself in a situation comparable to @tieke’s, only worse. I'm also running on my previous HTPC with an Intel N3700 (gen 8), supervised install. I got the Frigate (latest beta) add-on with Coral and everything running without problems or hiccups. But… when I activate hwaccel with the recommended parameters, the system crashes within 5 minutes, each and every time. I’ve tried all sorts of parameter modifications, even tried the standalone Docker version, but all to no avail.
The ‘funny’ thing is that until Frigate takes the whole system down, everything works perfectly up to the last second. Nothing in the logs, CPU utilization more or less half of what it is without hwaccel, the GPU in use as shown by intel_gpu_top, etc.
As I’m hovering around 70% with only 3 cameras doing nothing (even with detection switched off!), this will be unworkable without getting hwaccel going - the final installation is supposed to monitor 6 cameras, maybe more…

I have just started playing with Frigate now that I received my Coral M.2! The first day I had a bad lockup, but since then it's been stable. I am passing these devices into my Docker container:

/dev/apex_0 (Coral via M.2-to-PCIe x1 adapter)
/dev/dri/renderD128 (an older 2nd-gen i3)

And my Frigate config has this, with no mention of hwaccel:

detectors:
  coral:
    type: edgetpu
    device: pci

I am confused about whether I should be using hwaccel AND the Coral. Are my Docker devices OK, and do I just need to add hwaccel to my Frigate config file?

Thanks for any advice

You should use both the Coral and hardware acceleration in your config, as they have different duties in Frigate: the Coral runs object detection, while hwaccel offloads video decoding.
You will need a 4th-gen Intel CPU or newer for hwaccel.
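To make the two duties concrete, a sketch of how both appear side by side in the config (the Coral under detectors for inference, VAAPI under ffmpeg for decoding); the device paths are the typical ones and may differ on your system:

```yaml
# Sketch only - device paths are the common defaults, not guaranteed.
detectors:
  coral:
    type: edgetpu
    device: pci        # matches /dev/apex_0 passed into the container

ffmpeg:
  hwaccel_args:
    - -hwaccel
    - vaapi
    - -hwaccel_device
    - /dev/dri/renderD128
    - -hwaccel_output_format
    - yuv420p
```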

The fps setting only defines the frame rate used for detection. Blake says in the docs to keep it at 5 fps.

Thank you. So for a 2nd-gen CPU there is no need to put anything in the config for hwaccel, since it isn’t supported until 4th gen?

Has anyone been able to get Google Nest cams, e.g. from a Google Home Max, into Frigate? I see there was some talk last July, but I didn’t see any resolution. Elsewhere I’ve seen some people publishing their feed publicly in order to get at it via iSpy, but I’m not too keen on that idea.

Can anyone PLEASE PLEASE help me with this problem? I have created a new container and am still receiving this error, for which I cannot find a solution:

Error parsing config: [Errno 21] Is a directory: ‘/config/config.yml’

  • Starting nginx nginx

    …done.


My config.yml file is located on the iMac at /user/mike/config/config.yml, or on Proxmox PVE storage I can use /mnt/data/frigate/config.yml.

I cannot understand why I keep getting this “Is a directory” error. Yes, it is a directory up to the point of either the config or frigate location where the config.yml sits.

mqtt:
  host: 192.168.0.74
  user: mikey
  password: mqtt6283

cameras:
  # Reolink
  garage_driveway:
    ffmpeg:
      inputs:
        - path: rtsp://admin:[email protected]:554/h264Preview_01_sub
          roles:
            - detect
            - clips
    # motion:
    #   mask:
    width: 640
    height: 480
    fps: 5
    objects:
      track:
        - person
    snapshots:
      enabled: true
      timestamp: false
      bounding_box: true
      retain:
        default: 1
    clips:
      enabled: true
      retain:
        default: 1

  front_driveway:
    ffmpeg:
      inputs:
        - path: rtsp://admin:[email protected]:554/h264Preview_01_sub
          roles:
            - detect
            - clips
    # motion:
    #   mask:
    width: 640
    height: 480
    fps: 5
    objects:
      track:
        - person
    snapshots:
      enabled: true
      timestamp: false
      bounding_box: true
      retain:
        default: 1
    clips:
      enabled: true
      retain:
        default: 1

  back_garden:
    ffmpeg:
      inputs:
        - path: rtsp://admin:[email protected]:554/h264Preview_01_sub
          roles:
            - detect
            - clips
    width: 640
    height: 480
    fps: 5
    objects:
      track:
        - person
    snapshots:
      enabled: true
      timestamp: false
      bounding_box: true
      retain:
        default: 1
    clips:
      enabled: true
      retain:
        default: 1

  koi_pond:
    ffmpeg:
      inputs:
        - path: rtsp://admin:[email protected]:554/h264Preview_01_sub
          roles:
            - detect
            - clips
    # motion:
    #   mask:
    width: 640
    height: 480
    fps: 5
    objects:
      track:
        - person
    snapshots:
      enabled: true
      timestamp: false
      bounding_box: true
      retain:
        default: 1
    clips:
      enabled: true
      retain:
        default: 1

detectors:
  cpu1:
    type: cpu
  cpu2:
    type: cpu

 I hope someone can help.
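For what it's worth, [Errno 21] is the operating system's EISDIR error: something opened /config/config.yml and found a directory with that name, which typically happens when the Docker volume mapping binds the config.yml path to a host folder instead of the file itself. A minimal reproduction (the path here is a throwaway temp directory, not your real config):

```python
import errno
import tempfile

# Reproduce "[Errno 21] Is a directory": opening a path that is actually
# a directory raises EISDIR - exactly what Frigate hits when the bind
# mount maps /config/config.yml to a host *folder* instead of the file.
d = tempfile.mkdtemp()
try:
    with open(d) as f:
        f.read()
except IsADirectoryError as e:
    caught = e.errno

print(caught, caught == errno.EISDIR)  # 21 True on Linux
```

The usual fix is to bind-mount the file itself rather than its parent folder, e.g. `-v /user/mike/config/config.yml:/config/config.yml:ro` (host path hypothetical), and to make sure no directory named config.yml already exists inside the container's /config.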

You can change the Imou cameras to H.264.

There is software you can install on Windows that gives access to all of the streaming settings. Check their website.