Local realtime person detection for RTSP cameras

Not for now. I will be adding that in v4 because there is quite a bit of processing to track objects, and there is no reason to waste CPU on things you don’t care about. I will likely also create a way to combine object types. At the moment, trucks and cars are often confused on my cameras. I want to be able to merge them into a single vehicle type without having to retrain a model.
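Merging without retraining could be done as a post-processing remap on the detector's output. A minimal sketch, assuming detections arrive as (label, score, box) tuples; the merge map and tuple format are illustrative assumptions, not Frigate's actual internals:

```python
# Sketch: remap detected labels into a combined type without retraining.
# LABEL_MERGE and the (label, score, box) format are assumptions.
LABEL_MERGE = {"car": "vehicle", "truck": "vehicle"}

def merge_labels(detections, merge_map=LABEL_MERGE):
    """Rewrite each detection's label according to merge_map."""
    return [(merge_map.get(label, label), score, box)
            for label, score, box in detections]

detections = [
    ("car", 0.81, (10, 20, 50, 60)),
    ("truck", 0.74, (12, 18, 55, 62)),
    ("person", 0.92, (100, 40, 130, 90)),
]
merged = merge_labels(detections)
# Both "car" and "truck" now report as "vehicle"; "person" is untouched.
```

Because the remap happens after inference, a car/truck confusion no longer matters for alerting purposes.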

OK cool, I wasn’t sure if it was more efficient to process them and ignore the results, or to avoid processing them in the first place.

+1 for the region labels and per-region granularity. Great idea.

Hello, it looks like a really cool project! (I’m just starting with Home Assistant)

Is there a way to test the image with CPU only (with a low frame rate, or a frame every 10–20 seconds)?
I would like to test the integration with my existing environment without buying the coral device yet :slight_smile:

sudo docker run --privileged --rm blakeblackshear/frigate:0.3.0-beta python3 -u benchmark.py
Traceback (most recent call last):
  File "benchmark.py", line 9, in <module>
    engine = DetectionEngine(PATH_TO_CKPT)
  File "/usr/local/lib/python3.6/dist-packages/edgetpu/detection/engine.py", line 72, in __init__
    super().__init__(model_path)
  File "/usr/local/lib/python3.6/dist-packages/edgetpu/basic/basic_engine.py", line 40, in __init__
    self._engine = BasicEnginePythonWrapper.CreateFromFile(model_path)
**RuntimeError: No Edge TPU device detected!**

While I understand that it’s a bit hard to get rid of the timestamp for now, is there any way to adjust the timezone, or where does it reference it from? Mine is coming in at GMT, but I need GMT+2.

Has someone tried this yet?

I would be interested to get a PCIe 1x card with two M.2 B+M slots like those:


And adding one or two of these new Coral M.2 accelerators.

Then passing through the PCI slot in ESXi to a dedicated ubuntu/debian VM and running one or two docker containers in there.
With the right driver installed (https://coral.ai/docs/m2/get-started/) one or two “/dev/apex_0” devices should pop up.

Would the frigate container recognise these devices? Or is it only looking for the USB one (/dev/bus/usb) at the moment?
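For context, if the SDK does enumerate PCIe devices automatically, exposing the device node to the container might look something like this. This is an untested sketch, using the device path from the Coral docs and the image tag mentioned in this thread:

```shell
docker run --rm --device /dev/apex_0 blakeblackshear/frigate:0.3.0-beta
```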

@blakeblackshear Do we still need to manually build the images for a RPi4? I’m running a fresh install of Hypriot but I’m getting this error:
standard_init_linux.go:211: exec user process caused "exec format error"

if I use either
image: blakeblackshear/frigate:0.3.0-beta
or
image: blakeblackshear/frigate:0.2.2
in my docker-compose.yml

However, the image @ivelin built here does work for me. I’ve tried manually building previously but was getting errors (haven’t tried again yet as it takes hours - so just wanted to check if there was a way to skip that step).

Otherwise, it’s all working great and can’t wait to use it in anger :grinning:

Yes, you can use the old CPU-only version from here. But there have been so many changes that it is very different from the new Coral versions.

It still requires a manual build for a Raspberry Pi. Automating that is on the to-do list, but it doesn’t seem like many people are running on a Pi.


It should recognize them. I’m using the official sdk from Google. I’m not specifying the location of the device. It just finds it.

Take a look at the example command in the Readme. Passing in /etc/localtime as a volume will set the time to whatever your host is.


I’m running this off of Unraid, so the compose is different and I don’t get that option.

Maybe something like this will work: https://stackoverflow.com/a/39181626
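That answer amounts to setting the TZ environment variable on the container, which Unraid’s Docker UI lets you add as an extra variable. A sketch of the equivalent docker command (the zone name is just an example, and it only takes effect if tzdata is present in the image):

```shell
docker run -e TZ=Africa/Johannesburg blakeblackshear/frigate:0.3.0-beta
```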

Hi Blake

Couple of days into running my setup and it’s great (other than discovering the old i3 laptop actually has USB 2.0 only, so I am severely limited in the number of regions and FPS I can process).

On the one hand I like the idea of regions. I have, for example, a camera facing the front of my yard. There’s a big sliding gate in the fence where I’ve set up one region that actively looks for cars (i.e. somebody waiting outside for me to open the gate for them). Then, when they park inside on my driveway, I have another region where I don’t want to detect cars, since it would keep finding them the whole time they’re parked there.

Person detection is exactly the opposite in the two regions - I want to ignore pedestrians walking on the sidewalk but I want to detect persons walking around inside my yard.

Not sure if I’d lose this ability with the dynamic regions?

I’m expecting that use case to still be supported.
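A hypothetical per-camera config expressing that use case might look something like this. The key names here are invented for illustration and are not the actual schema:

```yaml
cameras:
  front_yard:
    regions:
      - name: gate          # outside the fence
        objects:
          track: [car]      # alert on cars waiting at the gate
          ignore: [person]  # sidewalk pedestrians
      - name: driveway      # inside the yard
        objects:
          track: [person]   # people walking around inside
          ignore: [car]     # parked cars would alert forever
```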


Any chance a config option could be added to only report the objects specified in the config? That way the retained snapshots for all the mistaken detections don’t take up bandwidth when an MQTT client connects. Or possibly a blacklist of objects you don’t care about?
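The whitelist idea could be sketched as a simple filter applied before detections are retained or published. All names here are hypothetical, not Frigate's actual API; detections are assumed to be (label, score) pairs:

```python
# Sketch: drop detections whose label isn't in a configured set of
# tracked objects. TRACKED_OBJECTS would come from the config file.
TRACKED_OBJECTS = {"person", "car"}

def filter_detections(detections, allowed=TRACKED_OBJECTS):
    """Keep only detections whose label is in the configured set."""
    return [d for d in detections if d[0] in allowed]

raw = [("person", 0.92), ("car", 0.81), ("potted plant", 0.55)]
kept = filter_detections(raw)  # the "potted plant" false positive is dropped
```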

I just updated to the beta, and most objects are false detections.

That will be in the next version

@scstraus I was getting issues similar to the ones you had with your Hikvision cameras a couple of months back. I was running a substream at 640x480, 6 fps, but with a variable framerate.

I just switched over to a fixed framerate, and it does seem to improve some things.

I then tried switching on hardware acceleration for Intel processors (I’m running an Intel Core i3-2348M), as per the suggested code here, but I immediately get this warning:

WARNING: Invalid RefPicListX[] entry!!! It is not included in DPB

and I also now seem to lose both my camera feeds at the same time. In other words, I get two of these simultaneously:

 ffmpeg didn't return a frame. something is wrong.exiting capture thread...
 ffmpeg didn't return a frame. something is wrong.exiting capture thread... 

Anybody know anything about this? I’m probably going to switch off hardware acceleration again…
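For reference, the Intel VAAPI decode arguments being toggled look roughly like this. This is a sketch only; the render device path, RTSP URL, and output format are placeholders, not the exact command from the suggestion:

```shell
ffmpeg -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 \
       -hwaccel_output_format yuv420p \
       -i rtsp://camera-ip/stream \
       -f rawvideo -pix_fmt rgb24 pipe:
```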

Hello, I read the thread and I am very envious, because I use Hass.io and my knowledge is limited. Have you thought about integrating this as an add-on for Hass.io?
Greetings and thank you

I wasn’t able to get hardware acceleration working on my synology so far. I probably need to compile a custom container for it or something. Someday I will probably get around to trying to figure it out, but so far I can’t help you.

I ended up figuring out that

  1. Hardware acceleration works even though I get the weird "Invalid RefPicListX[] entry" warning.
  2. I really need to put the machine running frigate on a wired network. My ffmpeg streams would last 30 minutes and then start going down every minute; switching from WiFi to a wired connection solved this. It is much more stable now.