Local realtime person detection for RTSP cameras

Got the new integration installed (I added your custom repository through HACS and it came in perfectly). Also got the new image up and refactored my config file. All looks great. This is SO much cleaner.

I need to wrap my head around what’s happening in the media browser now. I see my clips going to local media still, but I now have this fancy frigate interface in there. How does that actually work? Is that something your integration is handling?

Also, I take it my DVR will be happier if I start using the restreams for everything in the house, so each device isn't opening its own stream? Do I understand this right? One set of connections from the DVR to the HA server, then all connections downstream from there?

The integration adds a custom interface to the media browser. Technically, you don't need to worry about sharing the directories with homeassistant anymore. The integration pulls everything from frigate directly. Information about stored clips is maintained in a database, and frigate now has an API that the integration uses to render everything in the media browser UI. As you start to get more clips in the new database, you will see how it intelligently lets you drill down and filter what's available. It will show you the most recent 50 events. If you have more than that, it will start to show virtual folders that let you view clips by object type, camera, zone, this month, last month, this year, etc. I am trying to keep the number of thumbnails shown to 50 or fewer to save bandwidth, since homeassistant isn't very cache friendly.
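To make the drill-down concrete, here is a rough sketch of the kind of event queries the integration would make against frigate's HTTP API. The `/api/events` endpoint exists in 0.8, but the exact host, camera name, and filter combinations below are illustrative assumptions, not confirmed behavior of the integration:

```shell
# Hypothetical helper that builds an event query URL the way the media
# browser is described above: 50 most recent events, plus optional
# filters for the "virtual folder" views. Names here are placeholders.
frigate_events_url() {
  # $1 = frigate base URL, $2 = optional extra query filters
  local base="$1" filters="$2"
  local url="${base}/api/events?limit=50"
  if [ -n "$filters" ]; then
    url="${url}&${filters}"
  fi
  printf '%s\n' "$url"
}

# Top level of the media browser: the 50 most recent events.
frigate_events_url "http://frigate.local:5000"
# A virtual folder maps to extra filters, e.g. person events on one camera:
frigate_events_url "http://frigate.local:5000" "camera=back&label=person"
```

You could feed either URL to `curl` to see the raw JSON the media browser UI is rendered from.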

You can use the rtmp feeds however you want. They are served up using the nginx rtmp module, so it should be able to handle many simultaneous connections. You can access them at rtmp://<frigate_host>/live/<camera_name> from anywhere as long as port 1935 is exposed. I am only connecting to them with homeassistant because it handles converting it to an HLS feed and making it work with chromecasts with permissions, etc.
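For anyone wanting to sanity-check a restream outside of homeassistant, a quick sketch (host and camera name are placeholders you would substitute for your own):

```shell
# Build the restream URL described above; nginx's rtmp module serves
# it on port 1935 by default.
CAMERA=back
RTMP_URL="rtmp://frigate.local/live/${CAMERA}"
echo "$RTMP_URL"
# ffplay "$RTMP_URL"   # uncomment to view it with ffmpeg's ffplay tool
```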


All I have to say is excellent work. I now have all 6 cameras:

  • recording 24/7 @ 720p
  • detection @ 720p
  • rtmp @ 720p
  • clips @ 2592x1520

I went from almost 2 cores being CONSTANTLY utilized at 60-70% to 1 core at 30%. I might get a quick spike when there is motion, but it's not even close to what I was doing before.


Can’t wait for this release! What’s the timeline you have in mind?

The main features are done. There are some small changes I want to clear out of the backlog next, but I expect to have the release candidate ready before the end of the month. I usually let the release candidate run across all the systems I manage for a week prior to an official release.


ffmpeg.back.detect             ERROR   :   Metadata:
ffmpeg.back.detect             ERROR   :     title           : Media Presentation
ffmpeg.back.detect             ERROR   :   Duration: N/A, start: 1607391463.243189, bitrate: N/A
ffmpeg.back.detect             ERROR   :     Stream #0:0: Video: h264, yuvj420p(pc, bt709, progressive), 640x480 [SAR 126:95 DAR 168:95], 6 fps, 25 tbr, 90k tbn, 12 tbc
ffmpeg.back.detect             ERROR   : [h264_v4l2m2m @ 0x5596266e70] Could not find a valid device
ffmpeg.back.detect             ERROR   : [h264_v4l2m2m @ 0x5596266e70] can't configure decoder
ffmpeg.back.detect             ERROR   : Stream mapping:
ffmpeg.back.detect             ERROR   :   Stream #0:0 -> #0:0 (copy)
ffmpeg.back.detect             ERROR   :   Stream #0:0 -> #1:0 (h264 (h264_v4l2m2m) -> rawvideo (native))
ffmpeg.back.detect             ERROR   : Error while opening decoder for input stream #0:0 : Invalid argument
frigate.video                  INFO    : back: ffmpeg sent a broken frame. something is wrong.
frigate.video                  INFO    : back: ffmpeg process is not running. exiting capture thread...

Hi Blake, I'm getting this error on a Pi 4 running the latest HassOS and Frigate 0.8.0 Beta 1:

ffmpeg:
  hwaccel_args:
    - -c:v
    - h264_v4l2m2m

Were there any changes to the above setting? It was alright on 0.7.3. When I remove the setting and let the CPU handle decoding, everything works just fine.
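For anyone else hitting this on a Pi 4, the workaround described here is just to drop the hwaccel args so ffmpeg falls back to software decoding; a sketch of the relevant config:

```yaml
ffmpeg:
  # hwaccel_args removed: ffmpeg falls back to CPU decoding, which
  # works but costs more CPU. To retry 64-bit Pi hardware decode:
  # hwaccel_args:
  #   - -c:v
  #   - h264_v4l2m2m
```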

I wasn’t expecting a change there, but I haven’t been able to do much testing on the RPi4 yet. Is HassOS 64bit? Those are the 64bit args.

Yes, 64-bit. I did some searching. Could it be anything related to '--enable-shared --enable-libx264' during the addon build?

Hi Blake,
Do you have plans to add more metadata for the clips in the media browser? Right now I see the object type and percentage. I saw your comment above about virtual folders. I also think date and time would be useful if you are looking for input.

Thanks!

I don’t see anything obvious that would have caused that issue in my changes.


I agree. I can play with some options and see what looks good.


Hi Blake,

I tried to get 0.8.0-beta1-amd64nvidia up and running, but it seems that libcuda.so.1 is missing and it can't load the library when ffmpeg is launched. However, when I tried the 0.8.0-beta1-amd64 tag, it seems that libcuda.so.1 is there but ffmpeg doesn't have cuda support on that tag. I also tried nvidia-smi from both containers and it only worked in 0.8.0-beta1-amd64. Any clues?

Thank you,
John

Disregard, for some reason when I switched back to the 0.8.0-beta1-amd64nvidia tag it started working. Perhaps the container just needed to be re-created. Sorry, new to this, but it seems to be working for ffmpeg now:

+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A     19831      C   ffmpeg                           211MiB  |
|    1   N/A  N/A     19831      C   ffmpeg                             0MiB  |
|    2   N/A  N/A     19831      C   ffmpeg                             0MiB  |
+-----------------------------------------------------------------------------+
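The "needed to be re-created" behavior makes sense if the container was originally created without the nvidia runtime, since that runtime is what mounts libcuda.so.1 from the host driver into the container. A docker-compose sketch (the service layout and environment values here are common-practice assumptions, not from this thread):

```yaml
# Hedged sketch: run the nvidia tag with the nvidia container runtime
# so the host's libcuda.so.1 is injected at container creation time.
services:
  frigate:
    image: blakeblackshear/frigate:0.8.0-beta1-amd64nvidia
    runtime: nvidia        # requires nvidia-container-runtime on the host
    environment:
      NVIDIA_VISIBLE_DEVICES: all
      NVIDIA_DRIVER_CAPABILITIES: compute,video,utility
```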

Thank you for your answer.
Frigate is an incredible piece of work!

The reason I'd like to be able to configure clip names is simple: I'm not using Homeassistant, so I can't use the Frigate-HA integration and the media browser.
I'm using OpenHAB. Thanks to the connection between Frigate and MQTT, I can conveniently pass information to OpenHAB. The object detection works well with a smart home.

I like the ability to save clips, and I would like to be able to view saved clips in Windows via a shared folder and sort them by filename. It would help if the filename were the timestamp of creation.

Hi,

I've just tried to move to the new 0.8 version, and it took a few attempts to convert my setup (Pi 4/x64/HassOS), but it looks like I've succeeded. Definitely an improvement over the previous version.

One thing I’m not clear on is the integration. The readme states:
When configuring the integration, you will be asked for the Host of your frigate instance.
Where do I configure this?

Edit:
Found this under integrations; didn't expect it to be that easy :-S.
In my case it didn’t like the http://ccab4aaf-frigate:5000 address, so for now I just hardcoded the ip of my home assistant instance http://<ha_ip_address>:5000/

Because you are using the beta addon, it probably needs to be http://ccab4aaf-frigate-beta:5000


The filename includes the epoch time. See this post for an example of renaming the files with bash: Local realtime person detection for RTSP cameras
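In the same spirit as that linked post, a hedged sketch of the renaming idea. The exact clip naming scheme is an assumption here (something like `back-1607391463.243189.mp4`, i.e. camera name, then the epoch timestamp); adjust the parsing to whatever your files actually look like:

```shell
# Convert a clip name with an embedded epoch timestamp into a
# sortable, human-readable name. Uses GNU date (-d "@epoch").
rename_clip() {
  local name
  name=$(basename "$1" .mp4)
  local camera=${name%%-*}                   # part before the first dash
  local epoch=${name#*-}
  epoch=${epoch%%.*}                         # whole-second part of the epoch
  local stamp
  stamp=$(date -u -d "@${epoch}" +%Y-%m-%d_%H-%M-%S)
  printf '%s-%s.mp4\n' "$camera" "$stamp"
}

rename_clip back-1607391463.243189.mp4
# → back-2020-12-08_01-37-43.mp4 (UTC)
```

Wrap it in a loop with `mv "$f" "$(rename_clip "$f")"` to rename a whole shared folder; Windows will then sort the clips chronologically by name.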


I’m trying to adapt my HA automation to notify Frigate Event on my Android Phone.
The example says to get the snapshot using something like: https://your.public.hass.address.com/api/frigate/notifications/{{trigger.payload_json["after"]["id"]}}.jpg?format=android
Few questions:

  • I understand that this is a public API, so the security comes from the randomness of the id, right?
  • Is there a similar API to get a public URL for the video of the event? My previous automation included a link to trigger video playback on the phone.
  • These /api/frigate/* APIs lack documentation.
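For context, the snapshot URL from the example slots into an automation roughly like this. The `frigate/events` MQTT topic matches frigate's published events, but the notify service name and the label condition are illustrative assumptions:

```yaml
# Hedged sketch of an Android notification automation using the
# /api/frigate/notifications snapshot URL from the docs.
automation:
  - alias: Notify on frigate person event
    trigger:
      platform: mqtt
      topic: frigate/events
    condition:
      - condition: template
        value_template: "{{ trigger.payload_json['after']['label'] == 'person' }}"
    action:
      - service: notify.mobile_app_android_phone   # your device's service
        data:
          message: Person detected
          data:
            image: >-
              https://your.public.hass.address.com/api/frigate/notifications/{{ trigger.payload_json["after"]["id"] }}.jpg?format=android
```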

It relies on the randomness and is public. How were you viewing the video? I think the camera entities support mjpeg and should work with force touch to view the video, as shown in the companion docs for iOS. Unfortunately, I don't have an iOS device to test with yet. I do have one coming in the next few weeks.