Image processing with USB acceleration - all pi! ARCHIVED

FYI this project is now merged with my Deepstack Object integration (forum thread).
Cheers

hi @robmarkcole,

so the HASS-Google-Coral custom component is not working anymore?
I see it doesn't work with the newer coral-rest-server.

Migration to Deepstack is great, but as far as I understand this requires a decent amount of RAM even when using the Coral USB stick, doesn't it?

You can simply use the newer version of the Google Coral TPU API server (which responds to the same API requests as Deepstack) with the Deepstack Home Assistant component. The Coral API server is now mostly "plug compatible" with the Deepstack server, though you need to set it up separately with the model you want to use, etc. It's working great for me!
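Since the two servers answer the same `/v1/vision/detection` endpoint, a few lines of Python are enough to query either one outside Home Assistant. This is a minimal sketch, not code from either project: the URL and the helper names are my own placeholders, and it assumes the Deepstack-style JSON response (a `predictions` list whose items carry `label` and `confidence` keys).

```python
def count_targets(predictions, target="person", confidence=0.5):
    """Count predictions for a given label above a confidence cutoff."""
    return sum(
        1
        for p in predictions
        if p.get("label") == target and p.get("confidence", 0) >= confidence
    )


def detect(image_path, url="http://192.168.1.149:5000/v1/vision/detection"):
    """POST an image to a Deepstack-compatible server and return its predictions."""
    import requests  # imported here so count_targets stays dependency-free

    with open(image_path, "rb") as f:
        response = requests.post(url, files={"image": f}, timeout=10)
    response.raise_for_status()
    return response.json().get("predictions", [])
```

For example, `count_targets(detect("test.jpg"), target="person")` would give the number of people the server found in that image.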


Hi Louis,

Thanks for this.
So basically you mean: update to the latest version of this:
https://github.com/robmarkcole/coral-pi-rest-server
and set up locally the model I want to use (which I already did, but I'm still on version 0.4).

And use the Deepstack component, which will send requests to the above server (based on TensorFlow and using the USB stick for inference) instead of running Deepstack itself.

Correct?

I also have a question for @robmarkcole: if I understand correctly, TensorFlow Lite uses 300x300 images for inference, is that correct?
Or does it use full-resolution images from the cameras?

@mspinolo your understanding is correct. Also, yes, images are resized; this is true of all models.
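To make the resizing concrete: a detector with a 300x300 input squashes the whole frame down, so an object that is small in a high-resolution frame ends up only a handful of pixels for the model. A back-of-the-envelope sketch (the function name and numbers are my own illustration, not from the project):

```python
def resized_extent(obj_w, obj_h, frame_w, frame_h, model_size=300):
    """Approximate pixel size of an object once the frame is resized
    to a square model_size x model_size model input."""
    return obj_w * model_size / frame_w, obj_h * model_size / frame_h


# A 100x200 px person in a 1920x1080 frame shrinks to roughly 16x56 px,
# which is why small or distant objects become hard to detect.
w, h = resized_extent(100, 200, 1920, 1080)
```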

What are the advantages/ disadvantages when compared to Frigate?

Frigate allows processing on a stream (i.e. high FPS), but this comes at the cost of a more complex setup.


Is there any way to increase the size of images?
Or does it require a new model?

You could crop the regions of interest you care about; this is what Frigate allows you to do. You should use the proxy camera for cropping.
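If you do crop a region of interest before sending frames for detection, it is worth clamping the crop rectangle so it never falls outside the frame. A stdlib-only sketch (the function name and semantics are my own, not from either project):

```python
def clamp_crop(left, top, width, height, frame_w, frame_h):
    """Clamp a crop rectangle to the frame, returning a (left, top,
    right, bottom) box in the style PIL's Image.crop() expects."""
    right = max(0, min(left + width, frame_w))
    bottom = max(0, min(top + height, frame_h))
    left = max(0, min(left, frame_w))
    top = max(0, min(top, frame_h))
    return left, top, max(left, right), max(top, bottom)
```

For example, a 300x300 crop anchored near the bottom-right corner of a 1920x1080 frame is trimmed to stay inside the image instead of raising an out-of-bounds error downstream.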

OK, I published release v0.7, which works with Buster and the Pi 4, so now you can use any of pi-zero/pi3/pi4. The setup process is also simplified by making use of a disk image released by Google. Inference times on the Pi 3 are around half a second (when queried from another computer), so this is suitable for 1 FPS image processing in Home Assistant. Surprisingly, inference times are not any faster on a Pi 4 despite USB 3; I am chasing this up with Google, but suspect USB 3 support is not yet implemented in the Coral library.
Cheers
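If you want to verify inference times like these on your own network, a simple wall-clock wrapper is enough; note the ~0.5 s figure above includes the network round trip, so measure from the machine that will actually make the requests. A generic sketch (the function is my own; pass it whatever callable performs your detection request):

```python
import time


def average_seconds(fn, repeats=5):
    """Average wall-clock seconds per call of fn, e.g. one detection request."""
    start = time.perf_counter()
    for _ in range(repeats):
        fn()
    return (time.perf_counter() - start) / repeats


# Usage idea (my_detection_request is your own function):
# print(average_seconds(lambda: my_detection_request(), repeats=10))
```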

Hi @robmarkcole, is facial recognition still planned? Or should I rather go with Deepstack and a Movidius stick?

Well, if you have a Movidius you should try out Deepstack. I am using Coral with Deepstack as performance is better (at the moment). I may implement face recognition, but there are already multiple solutions for that, so I may do something else instead.

This is some great work, but I'm struggling to get things running, although I feel I'm pretty close!

I've got the coral-pi-rest-server running on a Pi, and I can call it from my NUC running HA and get a response/result using the following command:

curl -X POST -F [email protected] 'http://192.168.1.149:5000/v1/vision/detection'

I've got a camera (Xiaofang with fanghacks) set up in Home Assistant and displaying on a card with the following code. I use Zoneminder to provide the jpg snapshot, but I get the stream directly from the camera:


camera:
  - platform: generic
    still_image_url: https://192.168.1.146/cgi-bin-zm/nph-zms?mode=single&monitor=3&user=xxxxxxxx&pass=xxxxxxxx
    stream_source: rtsp://192.168.1.148/unicast
    name: fang
    verify_ssl: false

The yaml for the deepstack custom component looks like:

image_processing:
  - platform: deepstack_object
    ip_address: 192.168.1.149
    port: 5000
    scan_interval: 5000
    save_file_folder: /var/www/deepstack_person_images
    target: person
    confidence: 50
    source:
      - entity_id: camera.fang
        name: person_detector

All I get in HA is the card with "person_detector Unknown" and the following when I click on the card:

target: person
target confidences:
all predictions: {}
save file folder: /var/www/deepstack_person_images/

And the only warning I get in the HA log is:

2019-09-08 00:57:46 WARNING (MainThread) [homeassistant.loader] You are using a custom integration for deepstack_object which has not been tested by Home Assistant. This component might cause stability problems, be sure to disable it if you do experience issues with Home Assistant.

I've been through the READMEs on both GitHubs, but can't see where I might be going wrong.

You have set a 5000-second scan interval, so reduce that, or call the scan service from an automation.
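For anyone preferring the service route: `scan_interval` is in seconds, and you can instead trigger scans yourself through Home Assistant's REST API (`POST /api/services/image_processing/scan`). A sketch assuming a long-lived access token; the host, token, and function names here are my own placeholders:

```python
import json
import urllib.request

HA_URL = "http://homeassistant.local:8123"  # placeholder for your HA instance
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"      # placeholder


def build_scan_request(entity_id):
    """Build the POST request for the image_processing.scan service."""
    return urllib.request.Request(
        f"{HA_URL}/api/services/image_processing/scan",
        data=json.dumps({"entity_id": entity_id}).encode(),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
    )


def trigger_scan(entity_id):
    """Fire the scan service; returns the HTTP status code."""
    with urllib.request.urlopen(build_scan_request(entity_id)) as resp:
        return resp.status
```

Calling `trigger_scan("image_processing.person_detector")` from a script or cron then gives you full control over how often frames are processed.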

That did the trick! For some reason I had it in my head the interval was in ms


Is there any clean-up done on the image folder, e.g. clear after 24 hours? Or is there an option to save the latest person jpg but not the older files? I'm just thinking that if I scan every 5 seconds, that's a lot of image space that's going to get used up.

You could just add a cronjob which looks for old files and deletes them.
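The cleanup is also only a few lines of Python, which you could schedule however you like (cron, a `shell_command`, etc.). A stdlib-only sketch; the function name and defaults are my own:

```python
import time
from pathlib import Path


def purge_old_images(folder, max_age_hours=24, suffix=".jpg"):
    """Delete images in folder older than max_age_hours; return paths removed."""
    cutoff = time.time() - max_age_hours * 3600
    removed = []
    for path in Path(folder).glob(f"*{suffix}"):
        if path.stat().st_mtime < cutoff:
            path.unlink()
            removed.append(path)
    return removed


# e.g. purge_old_images("/var/www/deepstack_person_images")
```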


OK, trying to get this going. I have:
- the Google Coral stick plugged in and working
- if I post curl -X POST -F [email protected] 'http://192.168.0.101:5000/v1/vision/detection' in a terminal, I get back a result that it's 98% sure it's a bird (great!)
- but when I add the below into my HA

image_processing:
  - platform: deepstack_object
    ip_address: 192.168.0.101
    port: 5000
    scan_interval: 5000
    save_file_folder: /var/www/deepstack_person_images
    target: person
    confidence: 50
    source:
      - entity_id: camera.fang
        name: person_detector

I get this error on validation:
Platform error image_processing.deepstack_object - No module named 'deepstack'

What do I need to add? I have the custom_components folder set up with deepstack_object in it…

It requires the deepstack-python module, which should be installed automatically. I suggest you delete the contents of custom_components, check you have the latest code from GitHub, and load it afresh.
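The "No module named 'deepstack'" error means the component could not import that library, so a quick diagnostic is to check whether it is importable in the same Python environment Home Assistant runs in. A stdlib-only check (the helper name is mine):

```python
import importlib.util


def module_available(name):
    """True if `name` can be imported in the current Python environment."""
    return importlib.util.find_spec(name) is not None


# Run this inside the environment Home Assistant uses:
# print(module_available("deepstack"))
```

If it prints False, the library is missing from that environment, not from your custom_components folder.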

Jeez, I don't mean to be dense, but I am not sure what the "deepstack-python" module is; I tried to Google it but can't seem to figure it out.
I see something here from you

but it cannot run because Coral uses port 5000 as well.