Video processing with live view on HA and object detection - how to do it right?

Hello everyone

Looking for advice and suggestions. I want to integrate HA with a webcam: live view from the webcam in HA, object detection on the live stream, recordings of those streams stored on HA for later access, and notifications via Telegram.

What I really want is to offload the video processing with GPU acceleration on a Jetson Nano (I have one sitting on my desk unused), or on a Coral once I receive it (current wait time: 9 months; ordered last August).

I want the ability to:

  1. Stream the live view from the webcam directly to HA, so I can view it on demand by connecting to HA
  2. Stream the live view in parallel to the Jetson (or a separate device with the Coral) for object detection
  3. Have the Jetson (Coral) notify/stream to HA over a separate channel/method while a detected object is in view, and save the clip on HA
  4. Have HA send a Telegram notification about detected objects along with the related stream

#1 above could be done directly with the webcam and HA, as long as HA supports the webcam's stream format. My understanding is that only H.264 is currently supported.
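For reference, a minimal sketch of what #1 could look like with HA's Generic Camera integration — the RTSP and snapshot URLs are placeholders for whatever your webcam actually exposes, and newer HA versions set this up through the UI rather than YAML:

```yaml
# configuration.yaml — minimal sketch; URLs are placeholders
camera:
  - platform: generic
    name: front_webcam
    stream_source: rtsp://192.168.1.50:554/stream1
    still_image_url: http://192.168.1.50/snapshot.jpg
```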

#2 - a question: can the stream be broadcast to two different destinations simultaneously? If not, the stream should be sent to the Jetson, which would transcode it into a format the HA frontend/browser supports for live view.

#3 - I don’t know how to do this on the Jetson using the GPU - via Frigate or by some other means.
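If Frigate ends up being the route for #3, a minimal camera definition might look like the sketch below. The camera name, broker address, and stream URL are assumptions, and the detector section is deliberately omitted, since the right choice depends on whether detection runs on the Jetson GPU or a Coral:

```yaml
# frigate.yml — minimal sketch; detector section intentionally omitted
mqtt:
  host: 192.168.1.10        # address of your MQTT broker (assumption)

cameras:
  front_webcam:
    ffmpeg:
      inputs:
        - path: rtsp://192.168.1.50:554/stream1   # placeholder URL
          roles:
            - detect
            - record
    objects:
      track:
        - person
    record:
      enabled: true
```

Frigate then publishes detection events over MQTT, which is the "separate channel" HA can listen on.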

#4 - should be easy to do as long as everything else is working.
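For #4, a rough sketch of an HA automation. The entity and notifier names are assumptions: Frigate's HA integration exposes per-object occupancy sensors, and notify.telegram assumes a Telegram bot already configured in HA:

```yaml
# automations.yaml — sketch only; entity/service names are assumptions
- alias: "Telegram notification on person detection"
  trigger:
    - platform: state
      entity_id: binary_sensor.front_webcam_person_occupancy
      to: "on"
  action:
    - service: notify.telegram
      data:
        message: "Person detected on the front webcam"
```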

The question is: how to set all of the above up correctly so that everything works together…

If anyone has a working setup, I would appreciate it if you could share it, along with step-by-step setup instructions.

H.265 is also supported, depending on your choice of browser.

Yes, using something like go2rtc (which is built into Frigate 0.12) or rtsp-simple-server.
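To illustrate the fan-out idea: with go2rtc the camera is defined once, and every consumer (HA's live view, the Jetson detector) pulls its own copy from the restreamer, so the camera itself only ever serves one client. Stream name and camera URL below are placeholders:

```yaml
# go2rtc.yaml — sketch; the camera URL is a placeholder
streams:
  front_webcam:
    - rtsp://192.168.1.50:554/stream1
```

HA and the Jetson would then both consume rtsp://<go2rtc-host>:8554/front_webcam.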

GPU support for the Jetson with FFmpeg does not exist AFAIK; you need to use GStreamer, which adds complications.
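A sketch of what a hardware-decoded GStreamer pipeline on the Jetson might look like — the nvv4l2decoder and nvvidconv elements ship with NVIDIA's JetPack, the RTSP URL is a placeholder, and fakesink stands in for whatever would feed the detector:

```shell
# Decode an RTSP H.264 stream with the Jetson's hardware decoder
gst-launch-1.0 rtspsrc location=rtsp://192.168.1.50:554/stream1 ! \
  rtph264depay ! h264parse ! nvv4l2decoder ! \
  nvvidconv ! 'video/x-raw,format=BGRx' ! videoconvert ! fakesink
```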