UPDATES
June 10, 2019: I’ve updated the guide to use a simpler flow with Amazon Rekognition over here.
Summary
I’ve used Blue Iris for quite a while and have multiple PoE cameras outside of the home. The functionality of BI is amazing, especially for the price. However, it doesn’t have a native method of detecting people, just motion. I don’t want my phone blowing up with motion alerts when it’s the kids outside or a tree blowing in the wind. The end result was combining Home Assistant, TensorFlow, and Pushover to only send alerts when a person is detected while everyone is away from the home.
Please keep in mind that I run all of the components in Docker on Ubuntu, not Hass.io, so the instructions will be different if you’re running the latter.
Components
There are a few components to this. I’ll cite as much as possible since this would become a small novel if I covered everything from start to finish. Here are the important parts:
Blue Iris Integration
I’ve previously written how to accomplish this. It’s fairly straightforward and shouldn’t be too difficult if you already have Blue Iris and Home Assistant up and running.
Presence Detection
There’s a ton of coverage on this. I use Life360. You’re free to use whatever works best. Please note that I use a template that shows Home or Away so I can change a bunch of underlying components without having to modify any of my automations.
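For what it’s worth, that template sensor is just a few lines of YAML. Here’s a minimal sketch, assuming a hypothetical Life360 device_tracker entity and sensor name; swap in your own:
sensor:
  - platform: template
    sensors:
      family_presence:
        friendly_name: Family Presence
        # Report Home if the tracker is home, otherwise Away
        value_template: >-
          {{ 'Home' if is_state('device_tracker.life360_me', 'home') else 'Away' }}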
Node-RED
I use this for all of my automations, specifically the node-red-contrib-home-assistant-websocket plugin. The assumption at this point is that you’re familiar with Node-RED, have the module installed, and have Node-RED talking to Home Assistant.
Pushover
This should work with any of the notification platforms, but I’m a big fan of Pushover since Pushbullet no longer seems to be developed. IMO it’s outstanding and well worth the minimal cost. Please note: this is using a Pushover custom component found in this thread. Unfortunately the attachment functionality is not in the default component. You can just do the following while you’re in the root home-assistant directory:
# Create a folder for the custom notify component
mkdir -p custom_components/pushover_custom
cd custom_components/pushover_custom
# Pull down the patched component with attachment support
wget -O notify.py https://raw.githubusercontent.com/brkr19/home-assistant/dev/homeassistant/components/notify/pushover.py
It also seems like the ability to override default components was disabled in the 0.91.0 release. Using the component name above, you’ll want something like this under the notify key in configuration.yaml:
notify:
  - platform: pushover_custom
    name: pushover_alert
    api_key: !secret pushover_api
    user_key: !secret pushover_user
You’ll need to restart the Home Assistant container after making the change. Hopefully we’ll see the functionality baked into the default component.
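Once it’s back up, you can sanity-check the notifier before wiring it into anything. In script form, a test call looks something like this (the message text is arbitrary):
service: notify.pushover_alert
data:
  message: "Test alert from Home Assistant"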
TensorFlow
This is done through the TensorFlow image processing component. It can be pretty damn taxing on the CPU and it doesn’t make sense to run it full-time. Instead, I just call the service when motion is detected by Blue Iris. I’d never used it before, and its ability to identify objects, specifically people, is super impressive.
Setup
Here’s the fun part. I’ll cover parts of this in detail when necessary, but a lot of it will refer to other setup guides. The assumption is that you already have BI up and running, Node-RED talking to Home Assistant, and some type of presence detection.
Configure the Cameras in Home Assistant
This is based on my previous guide. Home Assistant pulls the camera feeds directly from Blue Iris instead of the PoE cameras. I have a separate yaml file for each of the 4 PoE cameras. I’d highly recommend !secret here, but I’ll show you a mock-up of what it looks like. In the configuration.yaml file I have this:
camera: !include_dir_merge_list cameras
In the cameras directory I have a yaml for each camera, with most looking like this, e.g. bdc.yaml:
- platform: mjpeg
  mjpeg_url: http://bi.home.local:23456/mjpg/BDC
  name: BDC
  username: insert_username
  password: insert_password
  authentication: basic
Again, one yaml per camera. BI makes it easy to access them with the short names. The 23456 port was one I selected for the built-in BI web server.
Configure TensorFlow
Just follow the official guide to get it running. I used the faster_rcnn_inception_v2_coco model. The Home Assistant config resides in /opt/home-assistant and is the /config directory from the container’s perspective. My directory looks like this:
root@docker:~# ls -la /opt/home-assistant/tensorflow/
total 113516
drwxr-xr-x 4 root root 4096 Nov 22 22:52 .
drwxr-xr-x 23 root root 4096 Nov 22 23:12 ..
-rw-r--r-- 1 root root 460 Nov 22 01:29 camera-tf.yaml
-rw-r--r-- 1 root root 77 Nov 22 01:20 checkpoint
-rw-r--r-- 1 root root 57153785 Nov 22 01:20 frozen_inference_graph.pb
-rw-r--r-- 1 root root 53348500 Nov 22 01:20 model.ckpt.data-00000-of-00001
-rw-r--r-- 1 root root 15927 Nov 22 01:20 model.ckpt.index
-rw-r--r-- 1 root root 5685731 Nov 22 01:20 model.ckpt.meta
drwxr-xr-x 6 root root 4096 Nov 22 01:20 object_detection
-rw-r--r-- 1 root root 3244 Nov 22 01:20 pipeline.config
drwxr-xr-x 3 root root 4096 Nov 22 01:20 saved_model
I have this in my configuration.yaml file:
image_processing: !include_dir_merge_list tensorflow
In the /opt/home-assistant/tensorflow directory, I have a file named camera-tf.yaml that contains the following:
- platform: tensorflow
  source:
    - entity_id: camera.fdc
    - entity_id: camera.fyc
    - entity_id: camera.byc
    - entity_id: camera.bdc
  file_out:
    - "/config/www/tensorflow/{{ camera_entity.split('.')[1] }}_latest.jpg"
    - "/config/www/tensorflow/{{ camera_entity.split('.')[1] }}_{{ now().strftime('%Y%m%d_%H%M%S') }}.jpg"
  scan_interval: 604800
  model:
    graph: /config/tensorflow/frozen_inference_graph.pb
    categories:
      - person
The scan_interval is set to a week. You really don’t need it to automatically scan anything, but it defaults to every 10 seconds if you don’t specify a value. The categories setting is important: if you don’t specify anything, it’ll scan for every known object. I only care about people and don’t need alerts for plants, chairs, tables, etc. Most importantly, it writes out a file ending in _latest.jpg to designate the latest results from the scan. This will be used later.
At this point you can restart the Home Assistant container so it loads all of the components.
Configure Node-RED
I’ll use screenshots here since dumping the flow is going to look like garbage :).
Building the Flow
Now the fun part: building the flow to tie everything together. You end up with this:
Let’s start at the top with Camera Motion. These binary_sensor components were created from the Blue Iris guide I mentioned above. It only activates when there’s motion, which flips it to on. It looks like this:
This is the regex from above:
binary_sensor.\w{3}_motion
My cameras all have a 3-letter designation that the \w{3} matches, i.e. it’ll work with binary_sensor.abc_motion, binary_sensor.xyz_motion, etc. No need for a separate state node for each camera.
The next steps are basically logic checks: I only want to get alerted if everyone is away or it’s late at night. You can bake in whatever you’d like here. For the presence check, I’m using the following:
I’ve got a template sensor that only spits out Home or Away. If you don’t want to go that route, you could just create a node that checks for not Home. It’s important that the boxes at the end are unchecked. The time check just says to pass the value if it’s after midnight and before dawn.
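For reference, the same checks expressed as plain Home Assistant conditions would look roughly like this (sensor.family_presence and the time window are stand-ins for whatever you use):
condition:
  - condition: or
    conditions:
      # Everyone is away...
      - condition: state
        entity_id: sensor.family_presence
        state: 'Away'
      # ...or it's the middle of the night
      - condition: time
        after: '00:00:00'
        before: '05:00:00'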
The next Change node is labeled Convert and was a lot of trial and error. It basically takes the 3-letter identifier and injects the camera and image_processing entities into the payload so we can use them later. This was how I got around needing a separate flow for each camera.
Please note that you’ll need to specify JSONata for the to field. I’ll paste these out in order:
SET data.base_id to:
$match(topic, /binary_sensor.(\w{3})_motion/).groups[0].$string()

SET data.camera to:
"camera." & $.data.base_id

SET data.image_processing to:
"image_processing.tensorflow_" & $.data.base_id

DELETE payload

SET payload.entity_id to:
"image_processing.tensorflow_" & $.data.base_id
You now have one Change node that inserts all of the values you’ll need for processing. Next, onto the Call Service node for triggering the TensorFlow scan:
This is the Entity ID from the service call that uses the previous value: {{data.image_processing}}
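In other words, the node ends up firing the equivalent of this service call, shown here in script form for the bdc camera:
service: image_processing.scan
data:
  entity_id: image_processing.tensorflow_bdc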
The next State node, labeled Person Check, checks to see if any people were detected in the scan. If there are no people detected then the flow terminates.
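This works because the TensorFlow entity’s state is the number of matched objects, so the node just needs to check that it’s above zero. Expressed as a plain Home Assistant condition, it would be roughly:
condition:
  - condition: numeric_state
    entity_id: image_processing.tensorflow_bdc
    above: 0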
Almost done :). We now use a Template node labeled Set Alert to put in the values we’ll pass to the notification. This is slightly specific to Pushover, but can easily be modified for any notification system that supports attachments.
This is the text from the template node:
{
  "data": {
    "message": "{{data.new_state.attributes.friendly_name}}",
    "title": "Person Detected",
    "data": {
      "file": {
        "path": "/config/www/tensorflow/{{data.base_id}}_latest.jpg"
      },
      "device": "MiniNater",
      "priority": "1"
    }
  }
}
Please note the path can be changed to wherever you keep the image files. This is from the perspective of the Docker container, not where it actually resides on the file system.
The last step is to pass it to a Call Service node labeled Pushover. Everything is mostly set from the template, so there are only a few values you need to plug into the node.
The End Result
I now get super-accurate people alerts sent to my iPhone and Apple Watch if someone is near my house when nobody is around. It even shows me the image in the alert so I don’t have to open another app to check the cameras directly. I’ve had zero false positives since I got this configured. Much, much better than the motion alerts that I eventually stopped looking at. Here’s an example of the end result:
Please note the message and title will be slightly different since I redid the guide and didn’t feel like taking an updated screenshot. FYI the red box is from Blue Iris flagging motion. The yellow box is from TensorFlow. You can’t read it, but it says person 99.8% at the top. Did I mention that it was accurate?
Hope this helps someone!