Maybe someone with an Nvidia GPU can help me configure this correctly. When I put this in my config file:

```yaml
ffmpeg:
  input_args:
    - -c:v
    - h264_cuvid
```

I keep getting:

```
ffmpeg sent a broken frame. something is wrong.
```
I’ve also tried it with hwaccel_args; same thing. As soon as I remove those arguments from the config (so Frigate runs on the CPU), everything works.
I’m trying to get this to work on my GTX 1050 Ti. I’m running Unraid (yes, all Nvidia arguments are passed to the container; other containers are able to use the video card), in combination with an RTSP stream coming from a Ubiquiti UVC-G3 Bullet camera.
Hopefully someone can shed some light on how to configure this properly, because it has already cost me two evenings of trial and error.
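For reference, a minimal sketch of where the decode arguments sit in a Frigate config (indentation matters in YAML). Whether they belong under `hwaccel_args` or `input_args`, and the exact argument list for NVIDIA decode, varies by Frigate version, so treat this as an assumption to check against the hardware acceleration docs for your release:

```yaml
ffmpeg:
  # h264_cuvid is ffmpeg's NVIDIA H.264 hardware decoder.
  # Placing it under hwaccel_args keeps Frigate's default input_args intact.
  hwaccel_args:
    - -c:v
    - h264_cuvid
```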
You can use the mjpeg streams, but those are generated from the input assigned to the detect role, not at the resolution of the rtmp role. There is no way to get bounding boxes on streams other than the detect stream. The resolution can be specified with the `?h=` parameter, but it won’t do you any good to increase it beyond the resolution of the source feed.
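As an illustration, assuming the default port 5000 and a camera named `front_door` (both hypothetical here), an mjpeg debug stream URL with the height parameter looks roughly like this; the exact endpoint path depends on your Frigate version, so verify it against the docs:

```
http://<frigate-host>:5000/front_door?h=480
```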
Everybody is talking about the new web GUI, but when I press “Open Web UI” on the add-on page I don’t get anything, and I couldn’t find any reference to the actual address.
We’re working on it. In short, Home Assistant ingress is very cumbersome. For now you can get to the UI by visiting http://<your-home-assistant-machine-ip>:5000/
@blakeblackshear I just opened the new GUI and noticed all of my cameras’ images were frozen at around 7 am this morning. No events were captured today even though I had been out and around the cameras a few times, and none of the HA device history shows any activity for today.
In the logs I see this:

```
Exception in thread detected_frames_processor:
Traceback (most recent call last):
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "/opt/frigate/frigate/object_processing.py", line 517, in run
    camera_state.update(frame_time, current_tracked_objects, motion_boxes, regions)
  File "/opt/frigate/frigate/object_processing.py", line 355, in update
    c(self.name, removed_obj, frame_time)
  File "/opt/frigate/frigate/object_processing.py", line 439, in end
    event_data = obj.to_dict(include_thumbnail=True)
  File "/opt/frigate/frigate/object_processing.py", line 167, in to_dict
    'thumbnail': base64.b64encode(self.get_thumbnail()).decode('utf-8') if include_thumbnail else None
  File "/opt/frigate/frigate/object_processing.py", line 174, in get_thumbnail
    jpg_bytes = self.get_jpg_bytes(timestamp=False, bounding_box=False, crop=True, height=175)
  File "/opt/frigate/frigate/object_processing.py", line 186, in get_jpg_bytes
    best_frame = cv2.cvtColor(self.frame_cache[self.thumbnail_data['frame_time']], cv2.COLOR_YUV2BGR_I420)
KeyError: 1611057140.753266
```
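To illustrate the failure mode (a simplified sketch, not Frigate’s actual code): the thumbnail lookup indexes a frame cache by `frame_time`, and when that frame has already been evicted from the cache, the bare dictionary access raises `KeyError` and kills the whole `detected_frames_processor` thread. A `.get()`-style lookup with a fallback would survive a single missing frame; the names below are hypothetical.

```python
# Hypothetical frame cache keyed by frame_time, as in the traceback above.
frame_cache = {1611057139.5: b"jpeg-bytes-for-an-older-frame"}


def get_best_frame(cache, frame_time):
    # Bare indexing: raises KeyError when the frame was already evicted,
    # which is what took down the processing thread in the log.
    return cache[frame_time]


def get_best_frame_safe(cache, frame_time, fallback=None):
    # Defensive variant: .get() returns a fallback instead of raising,
    # so one evicted frame cannot crash the thread.
    return cache.get(frame_time, fallback)


try:
    get_best_frame(frame_cache, 1611057140.753266)
except KeyError as e:
    print("KeyError:", e)

print(get_best_frame_safe(frame_cache, 1611057140.753266))  # prints None
```

The fix here is only a sketch of the defensive pattern; whether Frigate should skip the thumbnail or fall back to another cached frame is a design decision for the actual code.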
Thanks for the hard work, @blakeblackshear! I just installed the RC2 Hass.io add-on and I can access the web UI via ingress now, but after clicking on “Cameras” and then on one of my cameras, the mjpeg stream doesn’t show.
Has anyone been able to run this on a Dev Board or Dev Board Mini? I just got one today and can’t figure out what to do.
The device shows up as /dev/cu.usbmodemmocha_tang1 and /dev/tty.usbmodemmocha_tang1 on my Mac. I’ve tried mounting both of these as volumes along with /dev/bus/usb, like so.
Hello, I just moved my whole Hass.io installation to a newer NUC and installed the driver for the Coral as documented on their site. I have the PCIe version, and without it appearing in the exposed PCI devices in Hass.io it is somehow working directly. I guess it’s Frigate exposing the PCIe device? Great job! I now have a TPU… wow… today is a good day. Thanks so much for all your hard work!