That camera cannot be changed to H.264, that's the reason for asking…
I could never get H.265 to work. I don't think it is supported, so I had to switch the cameras I have dedicated to Frigate over to H.264.
Thank you, I will look for a USB-powered hub. Do you have both the SSD and the Coral on the hub, and then the hub on the RPi4?
In hindsight an Intel NUC wasn't so bad price-wise, with all the extra parts I now have on my RPi4 to keep it cool, powered, booting from SSD, etc. lol.
Yes, I moved both the Coral and the SSD to the hub, so the Pi only has to power itself.
My plan was to use Frigate, MQTT, Node-RED and Pushover to get a picture of the motion, or, instead of MQTT, to use Node-RED to push the picture of a person entity. So far I've spent a lot of time with no success. Can anyone give me some pointers?
I'm using H.265 with Frigate with no issues, since ffmpeg handles the decoding; the issue is trying to play it back in the web interface. Microsoft Edge works great, but Chrome and Safari don't support it, and of course neither do the mobile apps, since they are wrappers around either Safari or Chrome. I'd like to come up with a way either to transcode on demand, or maybe use intents to open the stream in VLC, which will play H.265. There is a solution out there, I just need to find it; it seems a shame to give up the reduced memory and disk space because of a licensing dispute.
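For the transcode-on-demand idea, the basic conversion is straightforward with plain ffmpeg. This is only a sketch; the file names and encoder settings are illustrative, not something Frigate does for you:

```shell
# Hypothetical example: re-encode an H.265 (HEVC) clip to H.264 so that
# Chrome/Safari-based clients can play it. Paths and quality settings are
# placeholders; adjust for your setup.
ffmpeg -i clip_h265.mp4 \
  -c:v libx264 -preset veryfast -crf 23 \
  -c:a copy \
  clip_h264.mp4
```

The downside is that re-encoding costs CPU, which is exactly what recording in H.265 was saving elsewhere, so it only makes sense on demand for the occasional clip you actually want to view in a browser.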
Ahh that is what my problem was. I couldn’t remember what didn’t work with H.265 before I gave up on it but that must have been it!
I created a simple node red flow as follows. It’s not the most elegant but it does exactly what I need it to.
- On state change of the binary sensor, I call home assistant to take a snapshot of the Frigate camera (person, car, whatever)
- Save snapshot to media folder
- Send notification to my phone
It's nice because it overwrites the old image every time a new one is taken, so there's no purging of old images, which is fine for me since I'm recording clips and 24/7 footage anyway. The downside is that it captures the very first image on state change, so it's not ALWAYS the best image, but it's enough to tell that someone or something is in your driveway/yard/street/whatever, and you can then go look at your camera or clips.
There are a few other examples in the thread above that might work for your needs as well.
```json
[{"id":"ab163638.e9e6a8","type":"server-state-changed","z":"c9f81df1.37961","name":"","server":"d0a84ee3.dd70e","version":1,"exposeToHomeAssistant":false,"haConfig":[{"property":"name","value":""},{"property":"icon","value":""}],"entityidfilter":"binary_sensor.front_person_motion","entityidfiltertype":"exact","outputinitially":false,"state_type":"str","haltifstate":"on","halt_if_type":"str","halt_if_compare":"is","outputs":2,"output_only_on_state_change":true,"for":"","forType":"num","forUnits":"minutes","ignorePrevStateNull":false,"ignorePrevStateUnknown":false,"ignorePrevStateUnavailable":false,"ignoreCurrentStateUnknown":false,"ignoreCurrentStateUnavailable":false,"x":190,"y":860,"wires":[["eb08b85f.3ba2a8"],[]]},{"id":"44f1615a.02085","type":"api-render-template","z":"c9f81df1.37961","name":"Time","server":"d0a84ee3.dd70e","template":"{{ as_timestamp(now())|timestamp_custom('%I:%M %p') }}","resultsLocation":"payload","resultsLocationType":"msg","templateLocation":"template","templateLocationType":"msg","x":770,"y":840,"wires":[["806aeedf.9b31c"]]},{"id":"806aeedf.9b31c","type":"api-call-service","z":"c9f81df1.37961","name":"Send Notification to phone","server":"d0a84ee3.dd70e","version":1,"debugenabled":false,"service_domain":"notify","service":"mobile_app_pixel_3","entityId":"","data":"{\"message\":\"Person Detected\",\"title\":\"{{payload}}\",\"data\":{\"image\":\"/media/local/person.jpg\",\"priority\":\"high\",\"ttl\":0}}","dataType":"json","mergecontext":"","output_location":"","output_location_type":"none","mustacheAltTags":false,"x":960,"y":840,"wires":[[]]},{"id":"eb08b85f.3ba2a8","type":"throttle","z":"c9f81df1.37961","name":"","throttleType":"time","timeLimit":"1","timeLimitType":"minutes","countLimit":0,"blockSize":0,"locked":false,"x":450,"y":840,"wires":[["e6dded1b.983e3"]]},{"id":"e6dded1b.983e3","type":"api-call-service","z":"c9f81df1.37961","name":"snapshot","server":"d0a84ee3.dd70e","version":1,"debugenabled":false,"service_domain":"camera","service":"snapshot","entityId":"camera.front_person","data":"{\"entity_id\":\"camera.front_person\",\"filename\":\"/media/person.jpg\"}","dataType":"json","mergecontext":"","output_location":"","output_location_type":"none","mustacheAltTags":false,"x":600,"y":840,"wires":[["44f1615a.02085"]]},{"id":"d0a84ee3.dd70e","type":"server","name":"Home Assistant"}]
```
Anecdotal, and I can't suggest one CPU arch vs. another, but I'm using Frigate with four 1080p cams on an Intel NUC8i5BEH and CPU usage hovers around 10%. That's with no hardware acceleration, because I have HA in a VM, and Proxmox passthrough of the NUC's iGPU is (IMO) difficult.
My (Wyze) cam produces only a 1080p RTSP stream. Does Frigate automatically scale this down to a lower resolution for motion detection and object detection?
If so, how do I make Frigate use hardware acceleration for the downscaling? I followed https://blakeblackshear.github.io/frigate/configuration/nvdec and got things working, but CPU usage is around 10 to 15% with just one camera. Any help is appreciated.
Oops. You said PCIe and I read M.2 for some reason. I was talking about the slot keying. Good to know the PCIe version works well.
Are you running frigate in an LXC or are you running it as an add-on in your home assistant VM? If it’s the latter, do you have a USB coral?
Hi,
I'm currently struggling to configure my ffmpeg hwaccel_args.
- disabled protection mode
- added to my config:
```yaml
ffmpeg:
  hwaccel_args:
    - -c:v
    - h264_mmal
```
Frigate is showing green images when using the HD stream. The SD streams are working (same camera, 640x360).
What could be the cause? Do I have to configure hwaccel_args for each stream? Where do I start?
Thank you!
What is the resolution of the HD stream?
Did you use VLC to examine the video stream and compare the resolution VLC lists against what you entered in the Frigate configuration file? The stream you send to Frigate via the config.yaml will be the resolution that is used for detection. You also have to specify the resolution of the stream you are pulling in the config.yaml. Here is the example posted in the docs:
```yaml
mqtt:
  host: mqtt.server.com
cameras:
  back:
    ffmpeg:
      inputs:
        - path: rtsp://viewer:{FRIGATE_RTSP_PASSWORD}@10.0.10.10:554/cam/realmonitor?channel=1&subtype=2
          roles:
            - detect
            - rtmp
        - path: rtsp://viewer:{FRIGATE_RTSP_PASSWORD}@10.0.10.10:554/live
          roles:
            - clips
            - record
    width: 1280
    height: 720
    fps: 5
```
CPU usage depends on the device you are running Frigate on and whether you're using a Coral device. Not using a Coral will result in much higher CPU usage.
The resolution is OK (1920x1080).
When I remove hwaccel_args from the config file, the streams show up.
But my system performance is bad; I hope I can reduce the CPU load using hardware acceleration.
It's a good sign that the streams show up when you remove hwaccel_args. That means the other parts of the Frigate config file are probably OK.
I am assuming you are running on a Raspberry Pi.
I had to do the following to get hardware acceleration running successfully on my Raspberry Pi 4 (I use 32-bit hardware acceleration):
- in raspi-config enable the camera interface
- in /boot/config.txt add/change a line with ‘gpu_mem=256’. Don’t forget to reboot the Pi after you edited config.txt
Perhaps these changes work for you as well?
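For reference, both steps above can also be done from the command line. This is a sketch assuming raspi-config is installed and you have sudo rights; reboot afterwards either way:

```shell
# Enable the camera interface non-interactively (same as the raspi-config menu item).
sudo raspi-config nonint do_camera 0
# Reserve 256 MB for the GPU; append the line only if it isn't already there.
grep -q '^gpu_mem=' /boot/config.txt || echo 'gpu_mem=256' | sudo tee -a /boot/config.txt
# The setting only takes effect after a reboot.
sudo reboot
```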
Do I have to configure hwaccel_args for each stream?
No, I configure the hwaccel_args globally.
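In config terms, "globally" means putting the ffmpeg block at the top level rather than under an individual camera. A sketch (the camera name is illustrative; the per-camera override is optional and only needed if one camera requires different args):

```yaml
# Top-level block: applies to every camera.
ffmpeg:
  hwaccel_args:
    - -c:v
    - h264_mmal

cameras:
  front:            # illustrative camera name
    ffmpeg:
      hwaccel_args: []   # per-camera override, e.g. to disable hwaccel for this one
```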
@blakeblackshear with the new version, what has the path of http://IPADDRESS:5000/pool/person/best.jpg changed to?
I have a Node-RED flow that sends the best.jpg via Pushover if a person is detected while the alarm is armed. Now I'm just getting a blank image.
All those endpoints moved under /api
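So for the URL above, the path should just gain an /api prefix. A quick sketch, with the host and camera name copied from the question rather than verified against a live server:

```shell
OLD_URL="http://IPADDRESS:5000/pool/person/best.jpg"       # pre-upgrade path, now blank
NEW_URL="http://IPADDRESS:5000/api/pool/person/best.jpg"   # same endpoint under /api
echo "$NEW_URL"
# To fetch it (requires a reachable Frigate server):
# curl -o best.jpg "$NEW_URL"
```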