I installed an IP camera in my garage and did not have a sensor to monitor whether the garage door was open or not. After some experimenting, I am now using TensorFlow to determine the garage door status.
I followed the doc at https://codelabs.developers.google.com/codelabs/tensorflow-for-poets/#0. After getting the tutorial working, I created the following structure:
- tf_files
  - garagedoor_photos
    - dooropen
    - doorclosed
I then captured pictures and moved them into the respective folders to train the model.
(Captured like:
curl "https://XXXXXXXX/api/camera_proxy/camera.garage_camera?api_password=XXXX" > ~/tf_files/unsorted-$(date +"%Y%m%d%H%M%s%S").jpg
)
I used the following to train the model ($trainingsteps holds the number of training steps to run, e.g. 500):
IMAGE_SIZE=224
ARCHITECTURE="mobilenet_0.50_${IMAGE_SIZE}"
tensorboard --logdir tf_files/training_summaries &
python -m scripts.retrain \
  --bottleneck_dir=tf_files/bottlenecks \
  --how_many_training_steps=$trainingsteps \
  --model_dir=tf_files/models/ \
  --summaries_dir=tf_files/training_summaries/"${ARCHITECTURE}" \
  --output_graph=tf_files/retrained_door_graph.pb \
  --output_labels=tf_files/retrained_door_labels.txt \
  --architecture="${ARCHITECTURE}" \
  --image_dir=tf_files/garagedoor_photos
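Once training finishes, the model can be sanity-checked against a single picture with the codelab's label_image script (the image path here is just a placeholder; swap in one of your own captures):

python -m scripts.label_image \
  --graph=tf_files/retrained_door_graph.pb \
  --labels=tf_files/retrained_door_labels.txt \
  --input_height=224 --input_width=224 \
  --image=tf_files/garagedoor_photos/dooropen/example.jpg

It should print the dooropen / doorclosed scores for that picture.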
I'm not thrilled with this yet, but I am using Flask + bash to integrate with HA. (I'd like to avoid jumping out to a subprocess in the future; there's a rough sketch of an in-process approach after webit.sh below.)
import subprocess
import sys
import os
import http.client
from flask import Flask

app = Flask(__name__)

@app.route('/')
def hello_world():
    return 'Hello, World!'

@app.route('/tf/<model>/<camera>')
def show_user_profile(model, camera):
    # Fetch the current frame from the Home Assistant camera proxy
    # (the bytes aren't used here yet; webit.sh re-downloads the image itself).
    conn = http.client.HTTPConnection("XXXXXXX")
    headers = {}
    url = "/api/camera_proxy/camera." + camera + "?api_password=XXXXXXXX"
    conn.request("GET", url, headers=headers)
    res = conn.getresponse()
    data = res.read()
    print(os.getcwd())
    # Classify the image by shelling out to webit.sh, which prints the JSON scores.
    output = subprocess.check_output(["./webit.sh", model, camera])
    print(output.decode('ascii'))
    return '%s' % output.decode('ascii')
And webit.sh (label_image_json is a small variant of the codelab's label_image script that prints the scores as JSON so the REST sensor below can parse them):
#!/bin/bash
cd tensorflow-for-poets-2
# Grab the current frame from the Home Assistant camera proxy.
curl -s "https://XXXXXXXX/api/camera_proxy/camera.$2?api_password=XXXXX" > tf_files/temp.jpg
# Activate the virtualenv and score the frame against the requested model.
source ../bin/activate
python -m scripts.label_image_json \
  --graph=tf_files/retrained_$1_graph.pb \
  --image=tf_files/temp.jpg \
  --labels=tf_files/retrained_$1_labels.txt 2> /dev/null
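To eventually drop the subprocess call, the retrained graph could be loaded once at startup and run directly from the Flask process. This is only an untested sketch: it assumes TensorFlow 1.x plus Pillow, the 224x224 MobileNet input size, and the input / final_result tensor names the codelab's retrain script produces (the preprocessing constants would need to match whatever label_image_json does).

import io
import numpy as np
import tensorflow as tf
from PIL import Image

GRAPH_PATH = 'tf_files/retrained_door_graph.pb'
LABELS_PATH = 'tf_files/retrained_door_labels.txt'

# Load the retrained graph and labels once at startup instead of per request.
graph = tf.Graph()
with graph.as_default():
    graph_def = tf.GraphDef()
    with open(GRAPH_PATH, 'rb') as f:
        graph_def.ParseFromString(f.read())
    tf.import_graph_def(graph_def, name='')

with open(LABELS_PATH) as f:
    labels = [line.strip() for line in f]

session = tf.Session(graph=graph)

def classify(image_bytes):
    """Return {label: score} for a JPEG byte string from the camera proxy."""
    # MobileNet preprocessing: resize to 224x224 and scale pixels roughly to [-1, 1].
    image = Image.open(io.BytesIO(image_bytes)).convert('RGB').resize((224, 224))
    array = np.expand_dims((np.asarray(image, dtype=np.float32) - 128) / 128, axis=0)
    # Tensor names assume the retrain script's defaults; adjust if yours differ.
    scores = session.run(graph.get_tensor_by_name('final_result:0'),
                         {graph.get_tensor_by_name('input:0'): array})[0]
    return {label: float(score) for label, score in zip(labels, scores)}

The Flask route above could then call classify(data) on the image bytes it already fetched and return the result with json.dumps(), so the REST sensor still sees the dooropen / doorclosed keys.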
I then added the following to HA. My example needs 90% confidence in one state or the other:
sensor:
  - platform: rest
    name: garage_door
    resource: http://localhost:5000/tf/door/garage_camera
    value_template: '{% if value_json.dooropen > 0.89 %} open {% elif value_json.doorclosed > 0.89 %} closed {% else %} unknown {% endif %}'
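One way to put the sensor to work (assuming you have a notify platform configured; notify.notify and the 15-minute window are placeholders, and the trigger state has to match exactly what the value_template renders) is an automation that warns when the door is left open:

automation:
  - alias: Garage door left open
    trigger:
      - platform: state
        entity_id: sensor.garage_door
        to: 'open'
        for:
          minutes: 15
    action:
      - service: notify.notify
        data:
          message: 'The garage door has been open for 15 minutes.'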