It will be a 10x speed improvement for inference AND it will eliminate the CPU usage spikes from object detection. An inference speed of 130ms can't add more than 130ms to the notification delay by itself, but it often takes multiple runs of object detection to home in on the object in question. That can cause skipped frames and make it take longer to confirm the object is not a false positive.
Updates are posted to the MQTT topic as better images are found in subsequent frames. Take a look at the event type field to limit your automation to the initial detection or the end of the detection. Also, see the blueprint posted recently.
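For example, a minimal sketch of an automation that only fires on the initial detection by checking the `type` field of Frigate's events payload (the notify service and message are placeholders):

```yaml
automation:
  - alias: "Notify once per Frigate event"
    trigger:
      - platform: mqtt
        topic: frigate/events
    condition:
      # "new" = initial detection; "update"/"end" are posted as the event evolves
      - condition: template
        value_template: "{{ trigger.payload_json['type'] == 'new' }}"
    action:
      - service: notify.mobile_app_myphone  # placeholder notify target
        data:
          message: "{{ trigger.payload_json['after']['label'] }} detected on {{ trigger.payload_json['after']['camera'] }}"
```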
That lines up with my 17-19ms with a single USB Coral on a Pi 4. I pre-ordered the PoE+ HAT (the switch is already PoE+ compliant) so I can hopefully run a pair plus a USB NVMe.
So maybe the delay comes from skipped frames. The Coral should bring an improvement, right?
Sorry, but I am not that good with MQTT. I understand that there is an event which is only fired once, but how can I change the automation to use that?
I think the problem is that the notification from your example automation gets replaced when the automation fires multiple times. Introduction | Home Assistant Companion Docs
The other notification is a critical notification, so it isn't replaceable. I'll test it tomorrow.
Then I still need a workaround for critical notifications. Maybe clearing the old notification before sending the new one, as sketched below.
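A hedged sketch of that workaround (the notify service, tag, and sound settings are placeholders): send `clear_notification` with the same tag first, then re-send the critical notification:

```yaml
action:
  - service: notify.mobile_app_myphone  # placeholder notify target
    data:
      message: clear_notification
      data:
        tag: frigate-car  # same tag as the notification being replaced
  - service: notify.mobile_app_myphone
    data:
      message: "Car detected"
      data:
        tag: frigate-car
        push:
          sound:  # iOS critical notification settings
            name: default
            critical: 1
            volume: 1.0
```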
I'm already running the Pi 4 8GB with the PoE HAT and a USB Coral, and it works. I'm also using a data SSD. I'll be curious to find out whether PoE will be enough to run another USB Coral…
Btw, my main pain point is that the Pi 4 can't manage hardware acceleration for h.264 when the resolution is higher than 1080p (thanks to @mr6880 for spotting that).
So if you want to use hardware acceleration and keep CPU usage low, you have to work at 1080p; if you want a higher resolution, you will pay for it in CPU usage.
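For reference, the Raspberry Pi hardware decode config documented by Frigate looks like this (a sketch; apply it to a stream at 1080p or below, per the limit above):

```yaml
ffmpeg:
  hwaccel_args:
    - -c:v
    - h264_v4l2m2m  # Pi 4 V4L2 memory-to-memory h.264 decoder
```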
If ffmpeg supports it, so does Frigate. It's possible I may have to compile ffmpeg with another flag, but that's the benefit of using ffmpeg for intake: it already supports almost every known source of image data.
# Getting ffmpeg with libx265 support
ffmpeg needs to be built with the `--enable-gpl` `--enable-libx265` configuration flags and requires `x265` to be installed on your system.
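A sketch of that build, using the flags quoted above (your install prefix and any additional flags will vary by system):

```bash
# x265 must already be installed; --enable-gpl is required for libx265
./configure --enable-gpl --enable-libx265
make -j"$(nproc)"
sudo make install
```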
There have been a couple of other posts regarding hardware acceleration problems with the Raspberry Pi.
I am also facing the same issue. When I have the hardware acceleration configuration in the Frigate yml file, I get a green screen and plenty of different errors. If I delete the hardware acceleration config, I get no errors and the stream works.
Does anybody have advice?
What I have checked:
Stream settings in camera and Frigate
Protection mode is disabled in Frigate Add-On
My setup:
Raspberry Pi 4 (4GB)
Home Assistant OS 5.13 (64-bit)
Frigate Add-on 1.13
Error logs from Frigate Add-on:
frigate.video INFO : frontyard: ffmpeg process is not running. exiting capture thread...
ffmpeg.frontyard.detect ERROR : [h264_v4l2m2m @ 0x5590337990] Could not find a valid device
ffmpeg.frontyard.detect ERROR : [h264_v4l2m2m @ 0x5590337990] can't configure decoder
ffmpeg.frontyard.detect ERROR : Error while opening decoder for input stream #0:0 : Operation not permitted
frigate.video INFO : frontyard: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have
Exactly, I'm seeing the same thing. I have tried a lot of things, but nothing works for me.
It only works when I remove the hardware acceleration, but then the CPU usage is very high.
My setup:
Raspberry Pi 4 (4GB)
Debian 10 (Buster)
Home Assistant in Docker
Frigate Add-on 1.13
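One thing worth checking on a Docker setup like this: the "Could not find a valid device" error can occur when the container can't see the Pi's decoder device nodes, and the green screens are often down to too little GPU memory (the documented fix is raising gpu_mem to at least 128 in /boot/config.txt). A hedged docker-compose sketch, assuming the standard Raspberry Pi kernel device paths (service name is a placeholder):

```yaml
services:
  frigate:
    devices:
      - /dev/video10:/dev/video10  # bcm2835-codec V4L2 M2M decode
      - /dev/video11:/dev/video11  # encode
      - /dev/video12:/dev/video12  # ISP
```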
I run HassOS in a VirtualBox VM (though Frigate isn't in its own VM) with a Wyze cam, so I'm not sure if this applies to you or not, but I have no issues. What is your config? I would start with no HW accel and see if that works. The VM can add some unnecessary layers for accessing some of the underlying hardware, though I do use these on my 6th gen i3:
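For reference, the VAAPI hwaccel args Frigate documents for Intel-based CPUs look like this; the exact args used on the i3 may differ, and the render device path can vary per system:

```yaml
ffmpeg:
  hwaccel_args:
    - -hwaccel
    - vaapi
    - -hwaccel_device
    - /dev/dri/renderD128  # render node; may differ on your system
    - -hwaccel_output_format
    - yuv420p
```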
One thing I've noticed over the past (almost) year is that the Wyze RTSP firmware might be the culprit: it's terribly unstable at times, causing the lvalue and rvalue issue to pop up. Meanwhile, I've tested other means of capturing the video: TinyCam streaming the mjpeg stream, and TinyCam -> Blue Iris -> Frigate. Those two work much better, but require more overhead. I wish TinyCam could restream RTSP instead of mjpeg, since it uses the Wyze credentials to pull the video; maybe one day.
The RTSP firmware works for the most part directly with Frigate, but admittedly there were a couple of times it may have skipped frames and didn't catch what it was supposed to. While the Wyze cams are nice and cheap, the RTSP firmware can be tough to work with at times.
I need some help…
When taking the latest picture from MQTT (frigate//car/snapshot), I always get bounding boxes, a timestamp, and cropping. But when I view the same event in the media browser, the image is free of the bounding box, crop, and timestamp.
How do I disable them in snapshots from MQTT?
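In recent Frigate versions (0.9+), the image published to `frigate/<camera>/<object>/snapshot` is controlled by the camera-level `mqtt` section; in 0.8.x the equivalent options live under the camera's `snapshots` section (`draw_bounding_boxes`, `show_timestamp`, `crop_to_region`). A sketch for 0.9+ (the camera name is a placeholder):

```yaml
cameras:
  frontyard:  # placeholder camera name
    mqtt:
      timestamp: false     # no timestamp overlay
      bounding_box: false  # no bounding boxes drawn
      crop: false          # publish the full frame, not a crop
```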