I wanted to integrate nest-like person detection into homeassistant with my existing IP camera so I could do things like turn on exterior/interior lights while the alarm was armed. Ended up working well, so I thought I would post here. It detects a person in <1s.
Attempting to make a TensorFlow Hass.io add-on by implementing the TensorFlow module with gRPC support.
I am on an RPi 3B+ with Hass.io installed on Raspbian Stretch.
Can I install this on the RPi?
You can try with a single detection region and a low fps stream. Not sure what the limits will be on an rpi.
What hardware do you use?
Currently using a 2013 Macbook Pro, but it isn’t working hard most of the time.
Really hoping to use this - thanks for the contribution!
I also checked out your HA repository - holy moly… that must have been a long process to get everything done!
I just started, actually still waiting for my first round of sensors to arrive (door sensors). But I have an IP camera I wanted to build some triggers around. Unfortunately there is a tree and passing cars that generate a ton of shadows - so I have been looking for a good way to do object detection… preferably in docker.
Anyway - I hope to glean some knowledge from everything you have accomplished!
Very cool. I really like this.
I run my Hass.io on ESX on a nice i5 desktop machine at 3.5 GHz that isn't doing very much, so I have some CPU cycles to spare for this.
Trying to set this up. Fingers crossed
Hmm I didn’t succeed.
Can you suggest a model and label map for me to start playing with?
Also, is it possible to authenticate with my mqtt broker?
Download the archive here, extract its contents, and mount the frozen_inference_graph.pb file: http://download.tensorflow.org/models/object_detection/ssdlite_mobilenet_v2_coco_2018_05_09.tar.gz
Then save this file as your labelmap: https://raw.githubusercontent.com/tensorflow/models/master/research/object_detection/data/mscoco_label_map.pbtxt
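If you want to script the extraction step above, here is a small helper as a sketch. It assumes the tarball's layout (a top-level folder containing frozen_inference_graph.pb), which is how the object detection model zoo archives are typically packaged:

```python
import pathlib
import tarfile

def extract_graph(tar_path, dest_dir="."):
    """Pull only frozen_inference_graph.pb out of the model tarball,
    flattening the archive's internal directory structure."""
    with tarfile.open(tar_path) as tar:
        for member in tar.getmembers():
            if member.name.endswith("frozen_inference_graph.pb"):
                # Drop the leading folder so the file lands directly in dest_dir.
                member.name = pathlib.Path(member.name).name
                tar.extract(member, dest_dir)
                return pathlib.Path(dest_dir) / "frozen_inference_graph.pb"
    raise FileNotFoundError("frozen_inference_graph.pb not found in archive")
```

The extracted .pb file is what gets mounted into the container; the labelmap .pbtxt is used as-is.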
I didn’t implement mqtt authentication, but it can be added easily.
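For anyone wanting to wire that in, a minimal sketch of the usual pattern: read optional credentials from environment variables (MQTT_USER / MQTT_PASS are hypothetical names, not something the project defines) and only apply them when set. With paho-mqtt, the credentials would then be passed via `client.username_pw_set(username, password)` before connecting:

```python
import os

def mqtt_credentials():
    """Return (username, password) from the environment, or None when
    no auth is configured. MQTT_USER / MQTT_PASS are hypothetical names."""
    user = os.environ.get("MQTT_USER")
    if not user:
        return None
    return (user, os.environ.get("MQTT_PASS", ""))

# With a paho-mqtt client (not shown here), you would then do:
# creds = mqtt_credentials()
# if creds:
#     client.username_pw_set(*creds)
```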
This is just what I have been looking for, but I'm having trouble with the regions environment variable format. Do you have an example of the regions env variable?
Here is my regions env variable as an example:
First region broken down (all are required):
350 - size of the square (350px by 350px)
0 - x coordinate of upper left corner (top left of image is 0,0)
300 - y coordinate of upper left corner (top left of image is 0,0)
5000 - minimum person bounding box size (width*height for bounding box of identified person)
200 - minimum number of changed pixels to trigger motion
mask-0-300.bmp - a bmp file with the masked regions as pure black, must be the same size as the region
I get the following error message when trying to up my docker-compose.yaml:
$ docker-compose up -d
Pulling frigate (blakeblackshear/frigate:latest)…
ERROR: pull access denied for blakeblackshear/frigate, repository does not exist or may require ‘docker login’
Previously you had to build the container yourself with docker build. I went ahead and pushed my local build to Docker Hub, so you should be able to pull it now. I haven't done any work to reduce the container size yet.
Regarding the docker container provided: is it armhf-ready? I would like to install it on my Raspberry Pi.
I don't have an armhf build, sorry. You could build it yourself; I don't have an RPi readily available to build one.
Are there any special recommendations for building it on the RPi?
You should be able to just check out the repo and run the build command. I expect it to take quite a while to build on an RPi.
@blakeblackshear, this sounds like a great solution, thanks for sharing! However, before investing in a NUC to run it on, I’d like to share my usecase.
I have a few Foscam cameras which, upon detecting motion, FTP MPEG-4 videos to a Linux box (which could be the NUC in the future). These 2-minute MPEGs contain some motion, and I'd like to detect what's on them (e.g. cats or birds shouldn't trigger the alarm notification from Home Assistant, which is triggered via MQTT).
Would that also be a possible way of doing this (i.e. not RTSP)?
Awesome, thank you for the example.