This is my attempt at integrating Coral AI local, on-device inferencing with Home Assistant. My goal was to detect certain objects and present the results in the Lovelace UI in a fully automated fashion.
This is my “Home” view:
Clicking on the “Object Detections” button gives you this view:
Clicking on the “Person” button gives you this view:
This is where you see the “person” detections from different sources. An indoor Amcrest camera and outdoor Nest cameras provide the “motion” images that are run through inference, and green boxes are drawn around the detected objects. The confidence threshold is selectable, and the confidence is displayed along with the green box. Of course, if you click on a thumbnail, a full-sized image is displayed.
Here is an example of the full-size “Dog” object with a confidence of 58.2%.
Hardware for this project includes two Raspberry Pi 4s: one running Home Assistant and the other running the Google Coral AI USB Accelerator.
Instructions
Here is how I accomplished this. My instructions will be in chronological order so that features work in the order that they are needed. Some instructions will be detailed and others less so; if someone asks I will provide more detail.
1. Set up the Raspberry Pi 4 Coral machine with the USB accelerator.
1.1. First install Raspbian Buster from https://www.raspberrypi.org/downloads/raspbian/. Change the network configuration to use a static IP address.
1.2 Install the Google Coral USB Accelerator and its software packages. Excellent instructions are provided here: https://coral.ai/docs/accelerator/get-started
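Before going further, it is worth a quick sanity check that the accelerator is actually visible. A minimal sketch, assuming you installed the pycoral Python library as the get-started guide describes:

```python
# Sanity check: list the Edge TPU devices the pycoral library can see.
# Assumes pycoral was installed per the Coral get-started guide.
from pycoral.utils.edgetpu import list_edge_tpus

tpus = list_edge_tpus()
if tpus:
    for tpu in tpus:
        print("Found Edge TPU:", tpu)
else:
    print("No Edge TPU found - check the USB cable and the runtime install.")
```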
1.3. Install the code that “exposes” the Coral AI inferencing engine as a REST service: https://github.com/robmarkcole/coral-pi-rest-server. I suggest you install it in your /home/pi directory and run it as a service. You should now have /home/pi/google-rest-server.
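Once the service is running you can test it from any machine on your network. Here is a minimal sketch of a test client, assuming the server listens on its default port (5000) and exposes the DeepStack-style /v1/vision/detection endpoint; check the repository README for the exact port and route on your install:

```python
# Minimal test client for the Coral REST server.
# HOST, PORT and the endpoint path are assumptions - adjust to your install.
import requests

HOST = "192.168.2.xx"   # IP of the Pi running the Coral REST server
PORT = 5000

with open("test-image.jpg", "rb") as f:
    response = requests.post(
        f"http://{HOST}:{PORT}/v1/vision/detection",
        files={"image": f},
    )

for prediction in response.json().get("predictions", []):
    print(prediction["label"], prediction["confidence"])
```

If you get back a list of labels and confidences, both the accelerator and the REST layer are working.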
1.4. Make a folder called “python_scripts” or whatever you like in your home directory. So
/home/pi/python_scripts.
1.5. Download the following repositories and install them in the python_scripts location.
1.5.1. Install the Python script that periodically scans a folder, detects objects, and draws boxes around the discovered objects according to your selected parameters (a rough sketch of the idea is shown after this list): https://github.com/ijustlikeit/coral_object_detection
1.5.2. Install the Python script that uses yattag to create the HTML pages displayed in your Home Assistant Lovelace UI (a second sketch after this list illustrates the idea): https://github.com/ijustlikeit/python-scripts
1.5.3 Install the Python script that saves the .jpg images attached to your Nest Cam notification emails in Gmail (more on this later): https://github.com/ijustlikeit/extract-gmail-attachments
1.5.4 All of my scripts can be found here: https://github.com/ijustlikeit?tab=repositories
1.5.5 Install Nginx (or Apache) to serve up your HTML. I suggest you use Duck DNS to create a domain that points to your static IP (the one you set up in step 1.1), and create an SSL certificate for that domain using Let's Encrypt. You should end up with something like https://yourduck.duckdns.org/yourhtml.html serving HTML from the www/duckdns/ directory.
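To make step 1.5.1 a little more concrete, here is a rough sketch of the scan-and-annotate idea. It is not the actual coral_object_detection script, just an illustration: each image in the watched folder is sent to the REST server, and a green box plus the confidence is drawn for every target object above the chosen threshold. The folder paths, endpoint, and prediction key names here are assumptions.

```python
# Illustrative sketch only - the real logic lives in the coral_object_detection repo.
# Assumes a DeepStack-style /v1/vision/detection endpoint returning 0-1 confidences.
import glob
import os
import requests
from PIL import Image, ImageDraw

HOST, PORT = "192.168.2.xx", 5000
TARGETS = {"person", "car", "dog"}
MIN_CONFIDENCE = 55                       # percent
SRC = "/home/pi/gmail-attachmnts"         # folder being scanned
DST = "/mnt/Hassio/coralsnap"             # where annotated images are saved

for path in glob.glob(os.path.join(SRC, "*.jpg")):
    with open(path, "rb") as f:
        predictions = requests.post(
            f"http://{HOST}:{PORT}/v1/vision/detection",
            files={"image": f},
        ).json().get("predictions", [])

    image = Image.open(path)
    draw = ImageDraw.Draw(image)
    hits = 0
    for p in predictions:
        confidence = p["confidence"] * 100
        if p["label"] in TARGETS and confidence >= MIN_CONFIDENCE:
            draw.rectangle(
                (p["x_min"], p["y_min"], p["x_max"], p["y_max"]),
                outline="green", width=3,
            )
            draw.text((p["x_min"], p["y_min"]), f"{p['label']} {confidence:.1f}%", fill="green")
            hits += 1

    if hits:
        image.save(os.path.join(DST, "annotated_" + os.path.basename(path)))
```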
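Likewise, step 1.5.2 essentially boils down to walking the saved images and writing a simple thumbnail page with yattag. Again, this is a minimal sketch rather than the author's actual script; the paths and thumbnail size are placeholders.

```python
# Minimal yattag sketch of the HTML generator idea from step 1.5.2.
# Paths and sizes are placeholders - the real script takes them as arguments.
import glob
import os
from yattag import Doc

SRC = "/media/object_images/person"       # annotated "person" images
OUT = "/var/www/duckdns.org/person.html"  # page served by Nginx

doc, tag, text = Doc().tagtext()
with tag("html"):
    with tag("body"):
        with tag("h2"):
            text("Person detections")
        for path in sorted(glob.glob(os.path.join(SRC, "*.jpg")), reverse=True):
            with tag("a", href=os.path.basename(path)):
                doc.stag("img", src=os.path.basename(path), width=480, height=270)

with open(OUT, "w") as f:
    f.write(doc.getvalue())
```

The resulting page is what the iframe card in step 2.4 points at.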
1.6 Install Samba. You will need this to access the /config folder of your Home Assistant instance (mounted in step 1.11).
1.7 Now perform step 2.
1.8 Install crontab.
1.9 Install your favorite email package (so you can view results of the crontab jobs).
1.10 Create a Gmail account that has only the Nest camera notifications forwarded to it. This is easy to do with a filter.
1.11 Mount the Home Assistant /config share at /mnt/Hassio (create the /mnt/Hassio mount point first if it does not exist):
sudo mount -t cifs //192.168.1.xx/config /mnt/Hassio -o user=u,pass=p,dom=yourworkgroup
1.12 Create a folder called gmail-attachmnts in your Pi home directory (this is the folder the crontab entries below point at).
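For reference, the extraction mentioned in step 1.5.3 amounts to polling the dedicated Gmail account over IMAP and saving any .jpg attachments into this folder. Here is a rough imaplib sketch of the idea (not the actual dlAttachments.py script; the account name is a placeholder and you will need an app password or similar for IMAP access):

```python
# Rough sketch of pulling .jpg attachments from the dedicated Gmail account.
# Illustrative only - the real logic lives in the extract-gmail-attachments repo.
import email
import imaplib
import os

USER = "your.nest.notifications@gmail.com"   # dedicated account from step 1.10
PASSWORD = "your-app-password"
SAVE_DIR = "/home/pi/gmail-attachmnts"

mail = imaplib.IMAP4_SSL("imap.gmail.com")
mail.login(USER, PASSWORD)
mail.select("INBOX")

_, data = mail.search(None, "UNSEEN")
for num in data[0].split():
    _, msg_data = mail.fetch(num, "(RFC822)")
    message = email.message_from_bytes(msg_data[0][1])
    for part in message.walk():
        filename = part.get_filename()
        if filename and filename.lower().endswith(".jpg"):
            with open(os.path.join(SAVE_DIR, filename), "wb") as out:
                out.write(part.get_payload(decode=True))

mail.logout()
```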
1.13 Add the scripts from 1.5.1, 1.5.2 and 1.5.3 to the crontab using crontab -e, changing the parameters and execution frequency to your liking:
*/15 * * * * sudo python /home/pi/python_scripts/object_html_generator.py --src /media/object_images --dst /var/www/duckdns.org/ --size 0480 270 --object 'person,car,dog' # Object image file Generator script
*/8 * * * * sudo python /home/pi/python_scripts/coral_image_processsing.py --host 192.168.2.xx --port x000 --folder '/home/pi/gmail-attachmnts' --targets 'person,car,dog,tree' --confidence 55 --save_folder /mnt/Hassio/coralsnap --timestamp true # Coral inference and annotation script
*/6 * * * * sudo python /home/pi/python_scripts/dlAttachments.py --dir /home/pi/gmail-attachmnts --thrash True # Gmail attachment download script
1.14 That should be about it. Start your testing. If you installed a mail package (e.g. Dovecot) in step 1.9, check your mail for the script output.
2. On your Home Assistant instance install and configure the following.
2.1 Install Samba.
2.2 Install the following custom component into Home Assistant, following the instructions in its README: https://github.com/ijustlikeit/HASS-Deepstack-object
2.3 Before you try to create the sensors sensor.person and sensor.car, make the subfolders coralsnap/person/ and coralsnap/car/. Normally the script will make these folders for you, but not until after you have ‘detected’ your first objects.
2.4 To view your results, create a Lovelace view:
- title: Person
  path: person
  panel: true
  visible: false
  cards:
    - type: iframe
      url: https://yourduckdns.duckdns.org/person.html
      aspect_ratio: 60%
2.5 Go back and continue on from step 1.8.