Simplisafe v2 sensors

I have been considering getting my father to use Home Assistant for a while now, and one feature that would make it worth it is integrating all of his Simplisafe v2 sensors. I know that the v2 system integrates as an alarm system (I actually made the initial integration), but being able to use the door sensors would be amazing. Have any Simplisafe users out there managed to find a solution to get these sensors into HA? Thanks.

Also posted on Reddit, in case someone comes across this and there is an answer over there: https://www.reddit.com/r/homeassistant/comments/abnb49/simplisafe_v2_sensor_integration/?utm_source=reddit-android

If you have a Raspberry Pi and an RTL-SDR, this project is quite easy (via rtl_433 and a simple Python script). I got all my v2 sensors (window/door, water leak, and motion detectors) into Home Assistant in less than a couple of hours. You can even use the Pi (via rpitx) to broadcast HOME/AWAY/DISARM commands, but I have not found that to be super reliable. Almost makes up for the fact that our v2 systems are so insecure :slight_smile:

Here is the Python script I am using:

#!/usr/bin/python
# -*- coding: UTF-8 -*-

import subprocess
import sys
import time
import paho.mqtt.client as mqtt
import os
import json

from config import *

rtl_433_cmd = "/usr/local/bin/rtl_433 -G -F json" # linux

# Define MQTT event callbacks
def on_connect(client, userdata, flags, rc):
    print("Connected with result code "+str(rc))

def on_disconnect(client, userdata, rc):
    if rc != 0:
        print("Unexpected disconnection.")

def on_message(client, obj, msg):
    print(msg.topic + " " + str(msg.qos) + " " + str(msg.payload))

def on_publish(client, obj, mid):
    print("mid: " + str(mid))

def on_subscribe(client, obj, mid, granted_qos):
    print("Subscribed: " + str(mid) + " " + str(granted_qos))

def on_log(client, obj, level, string):
    print(string)

# Setup MQTT connection

mqttc = mqtt.Client()
# Assign event callbacks
#mqttc.on_message = on_message
mqttc.on_connect = on_connect
#mqttc.on_publish = on_publish
mqttc.on_subscribe = on_subscribe
mqttc.on_disconnect = on_disconnect

# Debug messages are enabled; comment out the next line to silence them
mqttc.on_log = on_log

# Remove or comment out the next line if your MQTT broker does not require authentication
mqttc.username_pw_set(MQTT_USER, password=MQTT_PASS)
mqttc.connect(MQTT_HOST, MQTT_PORT, 60)

mqttc.loop_start()

# Start RTL433 listener
rtl433_proc = subprocess.Popen(rtl_433_cmd.split(),stdout=subprocess.PIPE,stderr=subprocess.STDOUT,universal_newlines=True)


while True:
    subtopic = None
    device = None
    for line in iter(rtl433_proc.stdout.readline, '\n'):
        if "time" in line:
            #mqttc.publish(MQTT_TOPIC, payload=line,qos=MQTT_QOS)
            json_dict = json.loads(line)
            for item in json_dict:
                value = json_dict[item]
                if "model" in item:
                    subtopic = value
                if "device" in item:
                    device = value
            if not subtopic:
                mqttc.publish(MQTT_TOPIC, payload=line,qos=MQTT_QOS,retain=True)
            else:
                if not device:
                    mqttc.publish(MQTT_TOPIC+"/"+subtopic, payload=line,qos=MQTT_QOS,retain=True)
                else:
                    mqttc.publish(MQTT_TOPIC+"/"+subtopic+"/"+device, payload=line,qos=MQTT_QOS,retain=True)

You also need a config.py containing:

MQTT_USER="{your MQTT user}"
MQTT_PASS="{your MQTT pass}"
MQTT_HOST="{your MQTT serverIP}"
MQTT_PORT=1883
MQTT_TOPIC="sensors/rtl_433"
MQTT_QOS=0
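
To sanity-check that readings are actually reaching the broker, something like this minimal subscriber should work (a rough sketch, assuming the same config.py and paho-mqtt as above; it just prints everything published under MQTT_TOPIC):

#!/usr/bin/python
# Minimal test subscriber - watches everything the bridge script publishes.
# Assumes the same config.py and paho-mqtt client library as the script above.
import paho.mqtt.client as mqtt

from config import MQTT_HOST, MQTT_PORT, MQTT_USER, MQTT_PASS, MQTT_TOPIC

def on_message(client, userdata, msg):
    # Print each topic and its raw JSON payload as it arrives
    print(msg.topic, msg.payload.decode(errors="replace"))

client = mqtt.Client()
client.username_pw_set(MQTT_USER, password=MQTT_PASS)
client.on_message = on_message
client.connect(MQTT_HOST, MQTT_PORT, 60)
client.subscribe(MQTT_TOPIC + "/#", qos=0)  # e.g. sensors/rtl_433/#
client.loop_forever()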

This is awesome, thanks. I have been planning on getting an SDR and just haven't gotten around to it. Do you have any recommendations on which one to get?

Not really. Mine is a few years old, an older NooElec similar to this one [amazon]. The official v3 should also be good [amazon].

Would this work for intercepting Simplisafe v3 sensor signals as well?

Thanks!

No. The v3 wireless signals are encrypted, which is a good thing for security, but it blocks this method.

Update to the code: there was a bug in how I was using the device and/or id values.

#!/usr/bin/python
# -*- coding: UTF-8 -*-

import subprocess
import sys
import time
import paho.mqtt.client as mqtt
import os
import json

from config import *

rtl_433_cmd = "/usr/local/bin/rtl_433 -G -F json" # linux

# Define MQTT event callbacks
def on_connect(client, userdata, flags, rc):
    print("Connected with result code "+str(rc))

def on_disconnect(client, userdata, rc):
    if rc != 0:
        print("Unexpected disconnection.")

def on_message(client, obj, msg):
    print(msg.topic + " " + str(msg.qos) + " " + str(msg.payload))

def on_publish(client, obj, mid):
    print("mid: " + str(mid))

def on_subscribe(client, obj, mid, granted_qos):
    print("Subscribed: " + str(mid) + " " + str(granted_qos))

def on_log(client, obj, level, string):
    print(string)

# Setup MQTT connection

mqttc = mqtt.Client()
# Assign event callbacks
#mqttc.on_message = on_message
mqttc.on_connect = on_connect
#mqttc.on_publish = on_publish
mqttc.on_subscribe = on_subscribe
mqttc.on_disconnect = on_disconnect

# Debug messages are enabled; comment out the next line to silence them
mqttc.on_log = on_log

# Remove or comment out the next line if your MQTT broker does not require authentication
mqttc.username_pw_set(MQTT_USER, password=MQTT_PASS)
mqttc.connect(MQTT_HOST, MQTT_PORT, 60)

mqttc.loop_start()

# Start RTL433 listener
rtl433_proc = subprocess.Popen(rtl_433_cmd.split(),stdout=subprocess.PIPE,stderr=subprocess.STDOUT,universal_newlines=True)


while True:
    for line in iter(rtl433_proc.stdout.readline, '\n'):
        subtopic = None
        device = None
        if "time" in line:
            #mqttc.publish(MQTT_TOPIC, payload=line,qos=MQTT_QOS)
            json_dict = json.loads(line)
            for item in json_dict:
                value = json_dict[item]
                if "model" in item:
                    subtopic = value
                if "device" in item:
                    device = value
                if not device:
                    if "id" in item:
                        device = str(value)
            if not subtopic:
                mqttc.publish(MQTT_TOPIC, payload=line,qos=MQTT_QOS,retain=True)
            else:
                if not device:
                    mqttc.publish(MQTT_TOPIC+"/"+subtopic, payload=line,qos=MQTT_QOS,retain=True)
                else:
                    mqttc.publish(MQTT_TOPIC+"/"+subtopic+"/"+device, payload=line,qos=MQTT_QOS,retain=True)

Awesome, thanks. I still haven't bought the SDR yet; I'm trying to find a good computer to run this on for my father.

I'm trying to figure out if my sensors are compatible with this project. I have the older Simplisafe, which does not use Wi-Fi, and my keypad panel is the old LCD-style one.

My sensors are u9k-es1000, 433 MHz. Is this gen 1 and incompatible with the code above?

I ended up buying a NooElec NESDR Mini (https://www.amazon.com/NooElec-NESDR-Mini-Compatible-Packages/dp/B009U7WZCA/) after having some concerns about which dongle to get, due to complaints about them running hot; this one doesn't get hot, and I had it shipped from China to get it cheaper. I used an old Raspberry Pi 2 with a custom image that already had rtl_433 and some other tools installed.

I downloaded a Raspberry Pi image that already had everything I would need (rtl_433 for sniffing, supervisor for launching rtl_433 on startup) from this video, and followed the first half of the guide ( https://www.youtube.com/watch?v=z1y6j8-V7J0&feature=emb_title ) for setting up pushing the data to my Home Assistant MQTT server. rtl_433 has a built-in decoder for Simplisafe and built-in MQTT output, so you don't need to mess with Python scripts.

Using that guy's image, I used PuTTY/KiTTY to SSH in. I then set the device to use a static IP (google this if you don't know how) and edited my supervisor config (to make rtl_433 autostart on boot and restart on crashes):

sudo nano /etc/supervisor/conf.d/rtl_433.conf

with:

command=/home/pi/rtl_433/build/src/rtl_433 -R 102 -F "mqtt://IPOFYOURMQTTSERVER:1883,user=MQTTUSERNAME,pass=MQTTPASSWORD,events=Simplisafe"

and also ran

sudo raspi-config

to change my timezone on the preconfigured image and enable auto-login on the Pi. For reference, here is the complete supervisor config again:

sudo nano /etc/supervisor/conf.d/rtl_433.conf

with these contents:

command=/home/pi/rtl_433/build/src/rtl_433 -q -R 102 -F "mqtt://LOCALMQTTSERVERIPHERE:1883,user=YOURMQTTBROKERUSERNAMEHERE,pass=YOURMQTTPASSWORDHERE,events=Simplisafe"
user=pi
autostart=yes
autorestart=yes
startretries=100
stderr_logfile=/var/log/rtl_433/rtl_433.err.log
stdout_logfile=/var/log/rtl_433/rtl_433.log

And then you need an automation for each sensor.
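
Before writing those automations, it can help to watch the raw MQTT traffic to find each sensor's serial and the exact status strings it reports. A rough sketch that should do the job (assuming paho-mqtt, the broker placeholders from the supervisor config above, and the device/extradata fields used in the automations below; your rtl_433 version may name fields slightly differently):

#!/usr/bin/python
# Watch the Simplisafe events topic to discover sensor serials and status
# strings. Host and credentials are placeholders - use the same values as
# in the supervisor config above.
import json
import paho.mqtt.client as mqtt

MQTT_HOST = "LOCALMQTTSERVERIPHERE"
MQTT_USER = "YOURMQTTBROKERUSERNAMEHERE"
MQTT_PASS = "YOURMQTTPASSWORDHERE"

def on_message(client, userdata, msg):
    try:
        payload = json.loads(msg.payload)
    except ValueError:
        print("non-JSON payload:", msg.payload)
        return
    # "device" should be the sensor serial and "extradata" the status string
    # (e.g. "Contact Open"), matching the fields used in the automations below.
    print(payload.get("device"), "->", payload.get("extradata"), payload)

client = mqtt.Client()
client.username_pw_set(MQTT_USER, password=MQTT_PASS)
client.on_message = on_message
client.connect(MQTT_HOST, 1883, 60)
client.subscribe("Simplisafe", qos=0)
client.loop_forever()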

Here are two separate automations for automations.yaml. One pushes notifications to an LG TV and the other announces sensor-open messages to a Google Home speaker. Each sensor has its own automation because I also have a page in my Lovelace UI that allows me to toggle each one on and off, in case I don't want to monitor a certain sensor on certain days.

- id: '1610264389167'
  alias: Justin's TV - Justin's Room South Window Sensor
  description: 1P8JN
  trigger:
  - platform: mqtt
    topic: Simplisafe
  condition:
  - condition: template
    value_template: '{{ trigger.payload_json.device == "1P8JN" }}'
  action:
  - choose:
    - conditions:
      - condition: template
        value_template: '{{ trigger.payload_json.extradata == "Contact Open" }}'
      sequence:
      - service: notify.bedroom_tv
        data:
          message: Justin's Room South Window Opened
          data:
            icon: /config/www/images/window-open.png
- id: '1610330537969'
  alias: CCMini - Jon South Window
  description: 1PBWF
  trigger:
  - platform: mqtt
    topic: Simplisafe
  condition:
  - condition: template
    value_template: '{{ trigger.payload_json.device == "1PBWF" }}'
  action:
  - choose:
    - conditions:
      - condition: template
        value_template: '{{ trigger.payload_json.extradata == "Contact Open" }}'
      sequence:
      - service: tts.google_translate_say
        data:
          entity_id: media_player.home_assistant_speaker
          message: South Window Open, Jon's Room
    - conditions:
      - condition: template
        value_template: '{{ trigger.payload_json.extradata == "Contact Closed Giving
          False Alerts" }}'
      sequence:
      - service: tts.google_translate_say
        data:
          entity_id: media_player.home_assistant_speaker
          message: South Window Open, Jon's Room
    default: []
  - delay: '6'
  mode: single

And for a Lovelace sensor setup:

In configuration.yaml, we'll add some input_text labels for each sensor so that a label's initial state is blank instead of "Unknown"; that way our sensor cards aren't cluttered with "Unknown" messages for sensors that haven't been triggered yet:

input_text:
  mostrecentlyusedsensor:
    name: Unavailable
    initial: ''
  lastusedsidedoor:
    name: Unavailable
    initial: ''
  lastusedgaragehallwaydoor:
    name: Unavailable
    initial: ''

In automations.yaml, we will make a single automation that watches the MQTT messages for each sensor reporting a "Contact Open" status. When it sees one, it writes the date to the friendly-name labels we made above:

- id: '1610658657522'
  alias: Most Recent Entry Sensor
  description: Most Recently Triggered
  trigger:
  - platform: mqtt
    topic: Simplisafe
  condition:
  - condition: template
    value_template: '{{ trigger.payload_json.extradata == "Contact Open" }}'
  action:
  - choose:
    - conditions:
      - condition: template
        value_template: '{{ trigger.payload_json.device == "1P7U5" }}'
      sequence:
      - service: input_text.set_value
        data:
          value: '{{ now().strftime("%a, %b %d at %I:%M %p")}}'
        entity_id: input_text.lastusedsidedoor
    - conditions:
      - condition: template
        value_template: '{{ trigger.payload_json.device == "1M6WC" }}'
      sequence:
      - service: input_text.set_value
        data:
          value: '{{ now().strftime("%a, %b %d at %I:%M %p")}}'
        entity_id: input_text.lastusedgaragehallwaydoor
    default: []
  mode: single

Repeat with a new condition: template entry for every sensor you have (trigger.payload_json.device is the serial of the respective sensor).

For a card that shows the most recently triggered sensor, we will use this automation snippet. Append a new condition: template for each sensor.

- id: '1610658647527'
  alias: Most Recent Entry Sensor Overall
  description: Most Recently Triggered Overall
  trigger:
  - platform: mqtt
    topic: Simplisafe
  condition:
  - condition: template
    value_template: '{{ trigger.payload_json.extradata == "Contact Open" }}'
  action:
  - choose:
    - conditions:
      - condition: template
        value_template: '{{ trigger.payload_json.device == "1P7U5" }}'
      sequence:
      - service: input_text.set_value
        data:
          value: Side Door on {{ now().strftime("%a, %b %d at %I:%M %p")}}
        entity_id: input_text.mostrecentlyusedsensor
    - conditions:
      - condition: template
        value_template: '{{ trigger.payload_json.device == "1M6WC" }}'
      sequence:
      - service: input_text.set_value
        data:
          value: Garage Hallway Door on {{ now().strftime("%a, %b %d at %I:%M %p")}}
        entity_id: input_text.mostrecentlyusedsensor
    default: []
  mode: single

And the Lovelace markdown card code for the most recently used sensor:

type: markdown
content: |
  <ha-icon icon="mdi:alert"></ha-icon> {{ states('input_text.mostrecentlyusedsensor') }}
title: Most Recently Activated

And for doors and windows (two separate cards):

type: markdown
content: |-
  <ha-icon icon="mdi:door"></ha-icon> Side Door -
  {{ states('input_text.lastusedsidedoor') }}
  ---
  <ha-icon icon="mdi:door"></ha-icon> Front Door - 
  {{ states('input_text.lastusedfrontdoor') }}
  ---
  <ha-icon icon="mdi:door"></ha-icon> Patio Door - 
  {{ states('input_text.lastusedpatiodoor') }}
title: Doors

type: markdown
content: |
  <ha-icon icon="mdi:window-closed-variant"></ha-icon> Left Bay Window - 
  {{ states('input_text.lastusedleftbaywindow') }}
  ---
  <ha-icon icon="mdi:window-closed-variant"></ha-icon> Right Bay Window - 
  {{ states('input_text.lastusedrightbaywindow') }}
  ---
  <ha-icon icon="mdi:window-closed"></ha-icon> Kitchen Window - 
  {{ states('input_text.lastusedkitchenwindow') }}
title: Windows

This is really cool! We bought a home with a ton of v2 entry sensors and motion/glass-break sensors as well, and the Simplisafe app is pretty clunky (I can't believe you can't set a schedule to change modes at night); I don't want to pay the monthly fee either. I was wondering if you're still using your sensors and whether you've had any hiccups since your post. Also, do you still have to use the Simplisafe v2 base station for all the sensors to function? I'd prefer some open-source base station replacement that could support more flexible armed/disarmed modes and schedules, if one exists.

We don't use the arm/disarm feature, so I can't speak to it. There's a major security flaw in the v2; I can't remember the details, but it allowed remotely disabling the entire system from within a certain distance or line of sight of the base station. There's info on the web. My base station stays on with this running alongside it; all this does for me is monitor when a contact sensor opens or a motion sensor is triggered.

I still use my setup. I have a dedicated Pi running the RTL-SDR USB dongle 24/7 that sends the sensor data via MQTT to my main Pi (Home Assistant). Your Home Assistant dashboard running on a cheap tablet or your phone would be your new base station UI. I also use Sweet Home 3D to make an overhead view of my home and took a screenshot of each room lit up (by adjusting the exposure on the other rooms), which changes in real time on my Home Assistant Lovelace UI when a Simplisafe sensor is triggered. I've also added phone alerts since my last post because they are super easy.

The biggest flaw of v2 is that I couldn't get it to reliably show real-time sensor state. It knows when a contact is separated (window or door opened) but cannot tell whether it remains open or closed thereafter. It kind of sucks, and you may be better off buying Zigbee sensors and running Home Assistant on a battery backup alongside your router as your "base station" to withstand outages.

Thanks for alerting me to that vulnerability. I'd prefer to just use the sensors without the base station, since it's clunky and has problems like these; I'll do some experimenting when I get the chance. Good to know about the flakiness of the sensor states after the initial open/close too; I've noticed that behavior a bit as well.

What a cool idea, using Sweet Home 3D + HA! If you've published any content on that, I'd love to check it out, but even that screenshot is enough to see the potential of building out an interactive home UI.

I found the Sweet Home 3D idea from someone else in this community. Here are the contents of my Lovelace card:
sweethome3d lovelace rooms - Pastebin.com.

I also have this in my configuration.yaml to define and set all the states to "off" when Home Assistant starts up: configuration yaml for most recent image simplisafe - Pastebin.com

And finally, an additional automation that sets an input boolean for the most recently toggled sensor (this is the value the image card watches): most recent image sensor automation for boolean toggle (for image card) - Pastebin.com

This is probably a really unoptimized way of doing it, and I don't have a great understanding of building things in Home Assistant, so it's sloppily cobbled together, but it works for me. The images live as separate files, one per room. I basically took one screenshot of the top-down view of my 3D model, then used the marquee tool in Photoshop to lower the exposure of everything except the room I wanted to highlight, and saved each variant separately so I had an image highlighting every possible room. The end result is that when a Simplisafe sensor on a given window or door is triggered, its boolean toggles on and you get an at-a-glance 3D model image in the frontend.

I've seen people take the Sweet Home 3D method a step further by making clickable elements on the image that activate entities and IoT devices in their home, but I haven't gotten that far yet.