Hi guys, I just want to share my configuration. I needed to move my Frigate NVR installation to Windows (running on a mini PC with a 10th-gen Intel i9 and an external RTX 3060 12 GB GPU). Frigate worked on the Raspberry Pi 5, but the inference times were too high.
I used WSL 2 and Docker Desktop on Windows 11. After performing the updates, I created the following folders and subfolders inside WSL:
mkdir -p ~/frigate/config
mkdir -p ~/frigate/media
mkdir -p ~/frigate/models
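Before going further, it's worth checking that the GPU is actually visible from inside WSL. This is a quick sanity check, not part of the original setup; it assumes you have the Windows NVIDIA driver with WSL support installed (no driver is installed inside WSL itself):

```shell
# Verify the NVIDIA driver is visible from WSL; prints GPU and driver info if OK
if command -v nvidia-smi >/dev/null 2>&1; then
  nvidia-smi || true
else
  echo "nvidia-smi not found - install the Windows NVIDIA driver with WSL support"
fi
```

If nvidia-smi doesn't show your RTX 3060 here, Docker won't be able to pass it through either.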
I created the config.yaml from scratch using the structure shown below and saved it in the previously created config folder:
mqtt:
  host: # IP address where MQTT is running
  port: 1883
  user: # user created in Home Assistant
  password: # password for the username
  topic_prefix: frigate

cameras:
  camera:
    ffmpeg:
      hwaccel_args: preset-nvidia
      inputs:
        - path: rtsp://...
          roles:
            - detect
    detect:
      enabled: true
      width: 800
      height: 450
      fps: 5
      stationary:
        threshold: 20
    objects:
      track:
        - person
        - cat
      filters:
        person:
          min_score: 0.5
          threshold: 0.7
          min_area: 10000
          max_area: 180000
        cat:
          min_score: 0.35
          threshold: 0.4
    record:
      enabled: false
    snapshots:
      enabled: false
      timestamp: true
      bounding_box: true
      retain:
        default: 1
    motion:
      threshold: 38
      contour_area: 25
      improve_contrast: 'true'

detectors:
  tensorrt:
    type: tensorrt
    device: 0
Once the file is saved in the config folder, we can run the docker run command from the WSL terminal:
docker run -d \
--name frigate-test \
--restart unless-stopped \
--runtime nvidia \
--gpus all \
--shm-size=512m \
-v /etc/localtime:/etc/localtime:ro \
-v /home/davide/frigate/config:/config \
-v /home/davide/frigate/media:/media/frigate \
-v /home/davide/frigate/cache:/tmp/cache \
-p 5000:5000 \
-p 8554:8554 \
-p 8555:8555 \
-e YOLO_MODELS=yolov7-320 \
-e USE_FP16=true \
-e NVIDIA_VISIBLE_DEVICES=all \
-e NVIDIA_DRIVER_CAPABILITIES=compute,video,utility \
ghcr.io/blakeblackshear/frigate:stable-tensorrt
Frigate will then be available at http://localhost:5000/.
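Before opening the web UI, you can check from the same terminal that the container came up cleanly (this is an extra check, not part of the original steps; the name frigate-test matches the docker run command above):

```shell
# Show the container's status and skim its most recent startup log lines
if command -v docker >/dev/null 2>&1; then
  docker ps --filter name=frigate-test --format '{{.Names}}: {{.Status}}'
  docker logs --tail 20 frigate-test 2>&1 | tail -n 20
else
  echo "docker not found in this shell"
fi
```

On the first start the logs will also show TensorRT converting the YOLO model, which can take a few minutes.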
Once Frigate is started, you'll notice that it runs, but your GPU may not actually be in use. Stop Frigate and open config.yaml. Frigate will have created a section like the following:
model:
  path: /config/model_cache/tensorrt/yolov7-320.trt
  input_tensor: nchw
  input_pixel_format: rgb
  width: 320
  height: 320
Make sure that the yolov7-320.trt model actually exists at the path specified. In my case, the path Frigate generated automatically in config.yaml pointed to an incorrect 1 KB file; the correct file (about 160 MB) had been created inside another folder named 8.5.3.
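A quick way to spot this from the WSL terminal is to list the generated engine files with their sizes (the path below assumes the /config bind mount from the docker run command above; the usable engine is roughly 160 MB, while a failed build leaves a ~1 KB stub):

```shell
# List any TensorRT engines Frigate generated, with human-readable sizes
find ~/frigate/config/model_cache/tensorrt -name '*.trt' -exec ls -lh {} \; 2>/dev/null || true
echo "search complete"
```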
So I changed the path entry under model to point to the correct file:
model:
  path: /config/model_cache/tensorrt/8.5.3/yolov7-320.trt
  input_tensor: nchw
  input_pixel_format: rgb
  width: 320
  height: 320
Restart the container.
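After the restart, you can confirm the TensorRT detector is really doing the work by querying Frigate's stats API (an extra verification step on top of the original instructions; with the GPU in use, the detector's inference_speed should drop to just a few milliseconds):

```shell
# Query Frigate's runtime stats; look at detectors.tensorrt.inference_speed
curl -s --max-time 5 http://localhost:5000/api/stats || echo "Frigate not reachable on localhost:5000"
```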
When you set up the Frigate integration in Home Assistant, enter the IP address of the computer hosting Frigate.
Now everything works well. You can then use the Frigate Proxy add-on to view Frigate directly in Home Assistant.