Local realtime person detection for RTSP cameras

OK, will try. Anyway, I can see this in the logs:

2020-12-25T16:55:14.026782560Z frigate.app                    INFO    : Creating tmpfs of size 256m

so it seems the size is not really used…
EDIT: forget this, I found it and disabled it in the config file.

My MQTT issues seem to be resolved by adding a username and password to the mqtt config. I use the Mosquitto add-on and it is set to allow anonymous connections, but that seems intermittent. Setting up a frigate user and using that user seemed to fix my issues. I wonder if MQTT was failing the whole time; I noticed there was another issue submitted indicating the connection status doesn’t properly log a failure. Either way, it seems to be working now.
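
For anyone hitting the same thing, this is roughly what the relevant mqtt section of the Frigate config looks like with a dedicated user; the host and credentials here are placeholders, not values from this thread:

mqtt:
  host: <broker ip or hostname>
  port: 1883
  topic_prefix: frigate
  user: frigate
  password: <password for the frigate user created in Mosquitto>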

That tmpfs message is referring to the memory storage for frames during processing. That is separate from the video cache for clips.

OK, I also had the tmpfs_cache_size setting below set; I've commented it out now. It seems to be running without issues. I'll let it run for a few days, then move from VAAPI to QSV to check the CPU load difference.

EDIT: briefly tested, and I'm seeing worse performance using QSV than VAAPI. Intel gen7 NUC. I need to check whether QSV is enabled on the docker host, I suppose…
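
For reference, a hedged sketch of the two hwaccel variants being compared, assuming an Intel iGPU exposed to the container at /dev/dri/renderD128; exact flags depend on your ffmpeg build and Frigate version:

ffmpeg:
  # VAAPI decode
  hwaccel_args:
    - -hwaccel
    - vaapi
    - -hwaccel_device
    - /dev/dri/renderD128
    - -hwaccel_output_format
    - yuv420p
  # QSV alternative (swap in place of the VAAPI args above):
  #  - -hwaccel
  #  - qsv
  #  - -qsv_device
  #  - /dev/dri/renderD128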

# Notice: If you have mounted a tmpfs volume through docker, this value should not be set in your config
# tmpfs_cache_size: 256m
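
And for the "mounted a tmpfs volume through docker" case that the comment refers to, a minimal docker-compose sketch; the image tag and size are placeholders (size is in bytes):

version: "3.6"
services:
  frigate:
    image: blakeblackshear/frigate:stable-amd64   # use whatever image/tag you run
    volumes:
      - ./config:/config
      - type: tmpfs
        target: /tmp/cache
        tmpfs:
          size: 1000000000   # ~1 GB for the clip cache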

Strange thing: with 0.8.x and the custom component, I get the previous detection image for the MQTT cameras created by the custom component. I use the camera.snapshot service in an automation to send a notification with the image. Each notification gives the previous detected image.

It’s working with 0.7.x and with manually created MQTT cameras (topic frigate/back/person/snapshot)

I solved this (probably in quite an inefficient manner, haha) by using a folder watcher sensor in HA and an automation that calls ffmpeg when a new clip is found, trims the clip down to 8 seconds, and sends it to my phone via Telegram. Again, this is probably really inefficient, but it works with very high reliability.
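
A rough sketch of that kind of setup, assuming a folder_watcher on the Frigate clips directory, a hypothetical trim_clip shell_command, and notify.telegram; all names and paths are placeholders rather than the poster's exact automation:

# configuration.yaml
folder_watcher:
  - folder: /media/frigate/clips

shell_command:
  # -t 8 keeps only the first 8 seconds; -c copy avoids re-encoding
  trim_clip: 'ffmpeg -y -i "{{ src }}" -t 8 -c copy /config/www/tmp/last_clip.mp4'

# automation
alias: Send trimmed Frigate clip
trigger:
  - platform: event
    event_type: folder_watcher
    event_data:
      event_type: created
action:
  # give Frigate a moment to finish writing the clip to disk
  - delay: '00:00:10'
  - service: shell_command.trim_clip
    data:
      src: "{{ trigger.event.data.path }}"
  - service: notify.telegram
    data:
      message: New Frigate clip
      data:
        video:
          - file: /config/www/tmp/last_clip.mp4
            caption: New Frigate clip
mode: single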

Anyone able to help me out with zones? I originally used a mask for everything I didn’t want notifications on and that worked great, but I still wanted to record all detected objects, so I removed the mask and added a zone. I’m trying to detect using "entered zone", however it’s detecting objects outside my zone, and it keeps detecting the car in my neighbor’s driveway (randomly?) that’s not even moving. Thoughts/ideas?

Zones don’t prevent objects from being detected or tracked. They just act as a sensor to let you know when an object enters the zone. You should set up your notifications to use a condition that checks the entered zones list for your zone before notifying. Remember that the zone is based on the bottom center of the bounding box. Based on the neighbor’s car location, I can see how the bottom center would be inside your zone. You can also create multiple zones that overlap if you want one for person and a separate one for car.
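
A hedged sketch of what that condition could look like in an HA automation, assuming an MQTT trigger on frigate/events and a zone named yard (entered_zones is part of the event payload, as visible in the example payloads later in this thread):

trigger:
  - platform: mqtt
    topic: frigate/events
condition:
  - condition: template
    value_template: >-
      {{ trigger.payload_json['after']['label'] == 'person'
         and 'yard' in trigger.payload_json['after']['entered_zones'] }}
action:
  - service: notify.telegram
    data:
      message: Person entered the yard
mode: single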


Why is so much RAM needed? Is there no cleanup being done?
How can I calculate the approximate size needed? It seems to top out at around 250 MB.

root@2f7fee5081df:/opt/frigate# df -h
Filesystem                                  Size  Used Avail Use% Mounted on
...
tmpfs                                       954M  247M  708M  26% /tmp/cache

Of course there is cleanup. In the save_clips config section, the max_seconds value determines the maximum number of seconds of video that can be stored for each camera. The default is 300 seconds, so you will need enough space to store 5 minutes of video from each camera for creating clips. You don’t technically need to store that cache in memory.
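
A hedged sketch of the 0.8.x-era options described above; option names and defaults may differ slightly between versions:

save_clips:
  # maximum seconds of video cached per camera for clip assembly (default 300)
  max_seconds: 300

cameras:
  back:
    save_clips:
      enabled: True
      # seconds of footage to keep from before the object was detected
      pre_capture: 30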


I’m attaching best.jpg to a Pushover notification triggered by an automation, but occasionally I’m seeing the previous detection’s best snapshot instead of the current one. Is there a way to fix this? Would it be possible to add a last.jpg as an endpoint?

The way I’m doing it is to take the MQTT camera for whatever object and send the camera snapshot after the sensor fires.
I’m using beta 3 and the Home Assistant integration to do this.
If anyone needs more info, let me know, and when I’m on a PC I can post more.

Thanks Jon.
Just solved this the way I was planning. I could have finished it a lot more efficiently if I had realized that the file was only available after the post_detection time plus some writing-to-disk time. Anyway, solved now. See below for the flow; maybe it’s useful to someone else too.

The next problem to solve is that the video file isn’t opening on my iPhone. Format issue? Because in my Telegram web app it plays normally.
The MP4 is in YUV 4:2:0 format. Can I change this while recording the event to something playable on my iPhone?
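
If the clip’s codec or pixel format turns out to be the issue, one workaround is to re-encode the saved clip to H.264/yuv420p (the most widely playable combination on iOS) before sending it; a hypothetical shell_command sketch, with placeholder src/dst variables:

shell_command:
  # re-encode to H.264 + yuv420p with faststart so iOS players can stream it
  reencode_clip: >-
    ffmpeg -y -i "{{ src }}" -c:v libx264 -pix_fmt yuv420p
    -movflags +faststart -c:a aac "{{ dst }}"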

[{"id":"ed123003.62d77","type":"tab","label":"Flow 2","disabled":false,"info":""},{"id":"6eaaeada.8cda3c","type":"mqtt in","z":"ed123003.62d77","name":"frigate/events","topic":"frigate/events","qos":"2","datatype":"auto","broker":"77a96594.e33e9c","x":270,"y":620,"wires":[["7358db7e.c2dfdc","abd868da.34c728"]]},{"id":"7358db7e.c2dfdc","type":"json","z":"ed123003.62d77","name":"test","property":"payload","action":"","pretty":false,"x":490,"y":620,"wires":[["9b33a525.55e91","74c0fabc.591064"]]},{"id":"9b33a525.55e91","type":"switch","z":"ed123003.62d77","name":"start switch","property":"payload.after.end_time","propertyType":"msg","rules":[{"t":"nnull"}],"checkall":"true","repair":false,"outputs":1,"x":650,"y":620,"wires":[["a81edade.31f6b8","b681eb88.5fb2f8","dbbe21dd.50b34"]]},{"id":"53875984.567dc8","type":"template","z":"ed123003.62d77","name":"","field":"payload","fieldType":"msg","format":"handlebars","syntax":"mustache","template":"192.168.1.77:5000/clips/{{payload.after.camera}}-{{payload.after.id}}.mp4","output":"str","x":940,"y":620,"wires":[["20e67aa5.0a8a46","5b16fdb.020f904"]]},{"id":"20e67aa5.0a8a46","type":"debug","z":"ed123003.62d77","name":"","active":false,"tosidebar":true,"console":false,"tostatus":false,"complete":"false","statusVal":"","statusType":"auto","x":1070,"y":700,"wires":[]},{"id":"c06d9cfc.564628","type":"debug","z":"ed123003.62d77","name":"","active":false,"tosidebar":true,"console":false,"tostatus":false,"complete":"false","statusVal":"","statusType":"auto","x":1130,"y":440,"wires":[]},{"id":"a81edade.31f6b8","type":"debug","z":"ed123003.62d77","name":"","active":false,"tosidebar":true,"console":false,"tostatus":false,"complete":"false","statusVal":"","statusType":"auto","x":830,"y":440,"wires":[]},{"id":"5b16fdb.020f904","type":"http request","z":"ed123003.62d77","name":"","method":"GET","ret":"bin","paytoqs":"ignore","url":"{{{payload}}}","tls":"","persist":false,"proxy":"","authType":"","x":1090,"y":620,"wires":[["b17648d6.250b98","c06d9cfc.564628"]]},{"id":"b17648d6.250b98","type":"function","z":"ed123003.62d77","name":"","func":"msg.payload = {\n    chatId: 11111111,\n    type: 'video',\n    caption : 'You must have a look at this!',\n    content: msg.payload,\n}\nreturn msg;","outputs":1,"noerr":0,"initialize":"","finalize":"","x":1300,"y":620,"wires":[["e76305e0.290538"]]},{"id":"e76305e0.290538","type":"telegram sender","z":"ed123003.62d77","name":"","bot":"8e57f19d.161d7","haserroroutput":false,"outputs":1,"x":1480,"y":620,"wires":[[]]},{"id":"b681eb88.5fb2f8","type":"debug","z":"ed123003.62d77","name":"","active":false,"tosidebar":true,"console":false,"tostatus":false,"complete":"false","statusVal":"","statusType":"auto","x":830,"y":760,"wires":[]},{"id":"abd868da.34c728","type":"debug","z":"ed123003.62d77","name":"","active":false,"tosidebar":true,"console":false,"tostatus":false,"complete":"false","statusVal":"","statusType":"auto","x":430,"y":720,"wires":[]},{"id":"48286e1e.ea221","type":"inject","z":"ed123003.62d77","name":"","props":[{"p":"payload"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"{\"before\": {\"id\": \"1609055550.60505-q70b6d\", \"camera\": \"bdc_frigate\", \"frame_time\": 1609055551.1853, \"label\": \"person\", \"top_score\": 0.66796875, \"false_positive\": false, \"start_time\": 1609055550.60505, \"end_time\": 1609061581.059547, \"score\": 0.76171875, \"box\": [1251, 687, 1414, 879], \"area\": 31296, \"region\": [1176, 633, 1476, 933], \"current_zones\": [], \"entered_zones\": [], \"thumbnail\": 
null}, \"after\": {\"id\": \"1609055550.60505-q70b6d\", \"camera\": \"bdc_frigate\", \"frame_time\": 1609055551.411159, \"label\": \"person\", \"top_score\": 0.66796875, \"false_positive\": false, \"start_time\": 1609055550.60505, \"end_time\": 1609061581.059547, \"score\": 0.640625, \"box\": [1245, 693, 1415, 900], \"area\": 35190, \"region\": [1180, 629, 1480, 929], \"current_zones\": [], \"entered_zones\": [], \"thumbnail\": null}}","payloadType":"str","x":280,"y":540,"wires":[["7358db7e.c2dfdc"]]},{"id":"74c0fabc.591064","type":"debug","z":"ed123003.62d77","name":"","active":false,"tosidebar":true,"console":false,"tostatus":false,"complete":"false","statusVal":"","statusType":"auto","x":630,"y":560,"wires":[]},{"id":"dbbe21dd.50b34","type":"delay","z":"ed123003.62d77","name":"","pauseType":"delay","timeout":"30","timeoutUnits":"seconds","rate":"1","nbRateUnits":"1","rateUnits":"second","randomFirst":"1","randomLast":"5","randomUnits":"seconds","drop":false,"x":800,"y":620,"wires":[["53875984.567dc8"]]},{"id":"77a96594.e33e9c","type":"mqtt-broker","name":"Eclips Mosquito","broker":"192.168.1.77","port":"1883","clientid":"","usetls":false,"compatmode":false,"keepalive":"60","cleansession":true,"birthTopic":"","birthQos":"0","birthPayload":"","closeTopic":"","closeQos":"0","closePayload":"","willTopic":"","willQos":"0","willPayload":""},{"id":"8e57f19d.161d7","type":"telegram bot","botname":"Steenbank_Telegram","usernames":"","chatids”:”11111111”,”baseapiurl":"","updatemode":"polling","pollinterval":"300","usesocks":false,"sockshost":"","socksport":"6667","socksusername":"anonymous","sockspassword":"","bothost":"","botpath":"","localbotport":"8443","publicbotport":"8443","privatekey":"","certificate":"","useselfsignedcertificate":false,"sslterminated":false,"verboselogging":false}]

I forgot about the adjustments needed for RTMP cameras.
Working now.

If you don’t mind pasting that part of your automation, I’d appreciate it!

Same problem here: I get the previous detection image for the MQTT cameras created by the custom component.

This is the automation I use. It should be pretty easy to edit to work for you with Pushover.
Just keep in mind that this is using the 0.8.0 beta 3 and the Home Assistant integration; otherwise these cameras don’t exist to pull snapshots from.
If anyone knows where these snapshots are stored by MQTT, it would be even simpler.
This is just my workaround.

alias: Frigate Person Frontyard
description: ''
trigger:
  - platform: state
    entity_id: sensor.frigate_frontyard_person
    to: '1'
condition: []
action:
  - service: camera.snapshot
    data:
      filename: /config/www/tmp/frigate_frontyard_person.jpg
    entity_id: camera.frigate_frontyard_person
  - data:
      data:
        photo:
          - caption: Person In Front
            file: /config/www/tmp/frigate_frontyard_person.jpg
      service: notify.telegram
mode: single

Hope that helps!


If you are using the beta with the integration, you should look at the new notification examples.

I may be wrong, but maybe best.jpg is the highest-ranked image from that camera, which may not be from the most recent event. My theory: event 1’s highest score is 97%, event 2’s is 95%, so best.jpg is serving event 1’s 97% image.

Blake can tell us for sure.

That’s correct. I would suggest using the integration and looking at the new notification examples. It was designed to solve exactly this outdated image problem and eliminate the need to use the camera snapshot service.
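
For reference, a hedged sketch in the spirit of those notification examples, assuming an MQTT trigger on frigate/events and the integration’s notification proxy; the thumbnail URL path is an assumption based on the integration docs of that era and may differ by version:

alias: Frigate person notification
trigger:
  - platform: mqtt
    topic: frigate/events
condition:
  - condition: template
    value_template: "{{ trigger.payload_json['after']['label'] == 'person' }}"
action:
  - service: notify.mobile_app_your_phone   # placeholder notify service
    data:
      message: "Person detected on {{ trigger.payload_json['after']['camera'] }}"
      data:
        # assumed proxy path served by the Frigate integration
        image: "/api/frigate/notifications/{{ trigger.payload_json['after']['id'] }}/thumbnail.jpg"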