Local realtime person detection for RTSP cameras

Small update: I got access to the container and ran the following, which seems to show it's not port forwarding correctly:

  root@9ff2793f0e48:/opt/frigate# awk 'function hextodec(str,ret,n,i,k,c){
      ret = 0
      n = length(str)
      for (i = 1; i <= n; i++) {
          c = tolower(substr(str, i, 1))
          k = index("123456789abcdef", c)
          ret = ret * 16 + k
      }
      return ret
  }
  function getIP(str,ret){
      ret=hextodec(substr(str,index(str,":")-2,2));
      for (i=5; i>0; i-=2) {
          ret = ret"."hextodec(substr(str,i,2))
      }
      ret = ret":"hextodec(substr(str,index(str,":")+1,4))
      return ret
  }
  NR > 1 {{if(NR==2)print "Local - Remote";local=getIP($2);remote=getIP($3)}{print local" - "remote}}' /proc/net/tcp
  Local - Remote
  0.0.0.0:5000 - 0.0.0.0:0
  127.0.0.1:38171 - 127.0.0.1:42314
  127.0.0.1:42314 - 127.0.0.1:38171
  172.17.0.2:53206 - 192.168.45.230:554
  172.17.0.2:33849 - 192.168.45.249:1883

Edit: Fixed! The container was in bridge mode rather than host mode… My bad - thanks for helping!
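
For anyone who hits the same thing, a minimal docker-compose sketch with host networking; the image name, device mapping, and config mount below are illustrative assumptions, not values from this thread:

```yaml
# Illustrative only: service/image names and paths are assumptions.
version: "3"
services:
  frigate:
    image: frigate:latest            # assumed image name/tag
    network_mode: host               # host networking, so port 5000 is reachable directly
    devices:
      - /dev/bus/usb:/dev/bus/usb    # pass the Coral USB accelerator through to the container
    volumes:
      - ./config:/config:ro          # assumed mount point for config.yml
```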

I’ve finally got things going; still doing some tuning on the regions. When I walk into the view of the camera, I see the following:

queue full. moving on

Thoughts?

That would mean the processing queue is full for the Coral. If you only see it one time, that shouldn’t be a problem. With one camera you shouldn’t see that after the initial boot up at all. The Coral tops out at about 100 detections per second.
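
As a rough sanity check (the frame rate and region count below are assumptions for illustration, not values from this thread):

```yaml
# 1 camera × 3 regions per frame × 5 frames/second ≈ 15 detections/second,
# comfortably under the ~100 detections/second the Coral tops out at,
# so a single camera should not keep the queue full after startup.
```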


Ok, that makes sense. I double-checked my streaming URL and I think I had accidentally set it to the substream, which (I think?) runs at a much higher FPS.

I just got my first result today. At first I tried to run the container in a virtual machine on my Mac, but it couldn’t connect to the Coral.

Yesterday I upgraded from an RPi to a Celeron NUC running HassOS. I use Portainer to manage the container and downloaded the image from Docker Hub.

In my first test I got results straight away! Very impressive. Now I will start to figure out the regions and masking.

How difficult is it to run or build this project on an Nvidia GPU?

It would require some significant code changes. The underlying libraries for detection are specific to the Coral.

Sure, thank you! Do you have plans to support GPU too?

Maybe at some point. Using a Coral is a substantially lower cost to get started and it is extremely power efficient to run. I don’t think I would ever run a power-hungry GPU instead, even if I had a spare one. It would be possible for someone else to fork the project and modify the detection thread to use a GPU; the remaining architecture would still apply.

This is incredible! Thank you for all your work @blakeblackshear

I’m so close to getting this to work, but declaring the MQTT sensor in Home Assistant doesn’t work. I’m certain it is something simple, but I’ve been looking at it too long to fix it. It doesn’t like `device_class: moving`:

sensor:
  - name: Camera Person
    platform: mqtt
    state_topic: "frigate/<camera_name>/objects"
    value_template: '{{ value_json.person }}'
    device_class: moving
    availability_topic: "frigate/available"

I think I had the same issue. Create it as a binary sensor, not a normal sensor.

That was it; I should have looked in the dev branch. Thank you!
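
For anyone following along, a sketch of the same sensor declared as a binary sensor; the topics and value_template are carried over from the config above, and the device_class mirrors the binary_sensor example later in this thread:

```yaml
# Sketch only: same topics as the sensor config above, declared as a binary sensor instead.
binary_sensor:
  - name: Camera Person
    platform: mqtt
    state_topic: "frigate/<camera_name>/objects"
    value_template: '{{ value_json.person }}'
    device_class: motion          # matches the binary_sensor example further down in the thread
    availability_topic: "frigate/available"
```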

So close to getting this to work. My sensors within Home Assistant are unavailable. I’m guessing it has to do with Frigate not being able to communicate with Home Assistant over MQTT. I am able to pull the camera still from the last person into Home Assistant.

Within dev tools:

binary_sensor.camera_motion    unavailable    friendly_name: Camera Motion
                                              device_class: motion

**Within the config file:**

binary_sensor:
  - name: Camera Motion
    platform: mqtt
    state_topic: "frigate/back/objects"
    device_class: motion
    availability_topic: "frigate/available"

I'm getting these messages in the log of the [MQTT Add-On](https://github.com/hassio-addons/addon-mqtt/blob/master/README.md):

1557328742: Denied PUBLISH from Ej4AulnI9xfoG50n`7<eM;tUmirqt1[_\M:x@c:pdQc0qC@p]cevl?eAbL37pX>c (d0, q0, r0, m0, 'frigate/back/objects', ... (17 bytes))
1557328766: Received PINGREQ from Ej4AulnI9xfoG50n`7<eM;tUmirqt1[_\M:x@c:pdQc0qC@p]cevl?eAbL37pX>c
1557328766: Sending PINGRESP to Ej4AulnI9xfoG50n`7<eM;tUmirqt1[_\M:x@c:pdQc0qC@p]cevl?eAbL37pX>c


This is the config within the MQTT Add-On:

```
{
  "log_level": "debug",
  "certfile": "fullchain.pem",
  "keyfile": "privkey.pem",
  "web": {
    "enabled": true,
    "ssl": false
  },
  "broker": {
    "enabled": true,
    "enable_ws": false,
    "enable_mqtt": true,
    "enable_ws_ssl": false,
    "enable_mqtt_ssl": false,
    "allow_anonymous": true
  },
  "mqttusers": []
}
```

What do you do if your camera stream doesn’t require a username/password? My UniFi camera doesn’t have a user/password on the RTSP stream when it’s enabled.

You have to configure user/pw at the moment. Does setting one break your RTSP feed?

Odd, the first few times I tried it didn’t work, maybe because I set the username and password to something like null or 1. I just tried some random words for them and now it’s working.

Any chance you would consider making the detection threshold configurable in the config.yml file?

I know this isn’t the best quality to test with, since it’s from a Wyze cam that has been outside for a while and is a bit beat up, but I think the 0.5 threshold is a bit too low:

Hi!

Just got my Coral a few weeks ago and decided to try it out on an old laptop of mine (Intel i5). I installed Debian 9.9 and docker-ce, built the Docker image, and created the config.yml.
It builds okay (with some warnings).

Three issues :slight_smile:
1.) SOLVED: it is `user:` – I will make a pull request on GitHub to clarify this in config.yml. :slight_smile:
How do I define the username and password for MQTT nowadays when there is a config.yml file?
Is the following syntax correct? I tried both `user:` and `username:` and neither seems to work.

mqtt:
  host: 192.168.1.xx
  user: username
  password: password
  topic_prefix: frigate

2.)
When starting up Frigate, it throws the following error:

On connect called
On connect called
On connect called
W third_party/darwinn/driver/package_registry.cc:65] Minimum runtime version required by package (5) is lower than expected (10).
On connect called
Capture process for uppfart: 31

 * Serving Flask app "detect_objects" (lazy loading)
 * Environment: production
   WARNING: Do not use the development server in a production environment.
   Use a production WSGI server instead.
 * Debug mode: off
 * Running on http://0.0.0.0:5000/ (Press CTRL+C to quit)
Exception in thread Thread-5:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/opt/frigate/frigate/object_detection.py", line 92, in run
    cropped_frame_rgb = cv2.cvtColor(cropped_frame, cv2.COLOR_BGR2RGB)
cv2.error: OpenCV(4.0.1) /usr/local/src/opencv-4.0.1/modules/imgproc/src/color.cpp:181: error: (-215:Assertion failed) !_src.empty() in function 'cvtColor'

3.) SOLVED: Do not use a trailing "/" … just http://192.168.1.xx:5000/uppfart
When trying to access http://192.168.1.x:5000/uppfart/ my browser says "error 404" and Docker throws the following error:

192.168.1.120 - - [10/May/2019 13:38:57] "GET /uppfart/ HTTP/1.1" 404 -
/usr/local/lib/python3.5/dist-packages/werkzeug/filesystem.py:60: BrokenFilesystemWarning: Detected a misconfigured UNIX filesystem: Will use UTF-8 as filesystem encoding instead of 'ascii'
  BrokenFilesystemWarning,
192.168.1.120 - - [10/May/2019 13:38:58] "GET /favicon.ico HTTP/1.1" 500 -
Error on request:
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/werkzeug/serving.py", line 302, in run_wsgi
    execute(self.server.app)
  File "/usr/local/lib/python3.5/dist-packages/werkzeug/serving.py", line 292, in execute
    for data in application_iter:
  File "/usr/local/lib/python3.5/dist-packages/werkzeug/wsgi.py", line 507, in __next__
    return self._next()
  File "/usr/local/lib/python3.5/dist-packages/werkzeug/wrappers/base_response.py", line 45, in _iter_encoded
    for item in iterable:
  File "/opt/frigate/detect_objects.py", line 79, in imagestream
    frame = cameras[camera_name].get_current_frame_with_objects()
KeyError: 'favicon.ico'
192.168.1.120 - - [10/May/2019 13:38:59] "GET /uppfart/ HTTP/1.1" 404 -

My full config.yml:

web_port: 5000

mqtt:
  host: 192.168.1.10
  username: username
  password: password
  topic_prefix: frigate

cameras:
  uppfart:
    rtsp:
      user: camop
      host: 192.168.1.30
      port: 554
      # values that begin with a "$" will be replaced with environment variable
      password: ultrasecret!
      path: /Streaming/Channels/2
    mask: back-mask.bmp
    regions:
      - size: 350
        x_offset: 0
        y_offset: 300
        min_person_area: 5000
      - size: 400
        x_offset: 350
        y_offset: 250
        min_person_area: 2000
      - size: 400
        x_offset: 750
        y_offset: 250
        min_person_area: 2000

Thanks! Hope I got it sorted! :slight_smile:

I added thresholds per region in the 0.1.2 release. See here: https://github.com/blakeblackshear/frigate/blob/master/config/config.yml#L24
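
For example, assuming the key is named `threshold` as in the linked example config (the 0.6 value is just illustrative):

```yaml
regions:
  - size: 350
    x_offset: 0
    y_offset: 300
    min_person_area: 5000
    threshold: 0.6    # per-region confidence cutoff; raise it to drop low-confidence detections
```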

Error #2 would mean that one of the regions results in an empty set of pixels. What is the resolution of your camera? Can you view the RTSP stream in VLC?
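
To make the "empty set of pixels" point concrete: a region whose offset lies beyond the edge of the frame produces an empty crop, which is exactly what cv2.cvtColor complains about with !_src.empty(). A sketch of the arithmetic, assuming a 704x480 substream resolution (the actual resolution is not stated in the thread):

```yaml
# Assumed frame size: 704x480 (not confirmed; /Streaming/Channels/2 is usually a low-res substream).
# Third region from the config above:
#   x_offset = 750 is already beyond a 704 px frame width, so the cropped slice has zero width
#   (and y_offset + size = 250 + 400 = 650 would also overrun a 480 px frame height)
# A zero-size crop triggers the "(-215:Assertion failed) !_src.empty()" error in the traceback above.
```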