Migration to hassio in docker on ubuntu

Hello folks,

I thought I’d share my journey (warts and all) from hassio running on a raspberry pi 3b+ to its new home running as a docker container under ubuntu 18.04.3 on an old Fujitsu Esprimo SFF PC I had lying around.

I ordered 8GB of matched non-ECC memory (4x2GB) for around £10 and a Crucial 120GB SSD for £20. For testing purposes (z-wave wise) I have a couple of spare Everspring SA413 USB adapters, which are cheap - when you can get them - and work with everything I’ve attached them to so far, so I used one of these plugged directly into the back just to make sure that the z-wave adapter would be passed through to the docker container without issues.

That leaves the spec as:

  • Fujitsu Esprimo E5916 with essentially a Core2 duo processor
  • 8GB Kingston RAM
  • 120GB Crucial SATA 3 SSD
  • Onboard Gigabit Ethernet
  • Onboard VGA for a terminal when needed
  • Everspring SA413 z-wave USB adapter

I took it out to the garage for a good blow-out with an air gun part way through the move, which has actually reduced the CPU core temperatures from around 45C at idle to around 40C at idle. Compressed air for the win!

Apart from the table-top yesterday at around 60C - caused by a runaway something-or-other in another docker container taking the CPU to 50% until I spotted it - it’s been pretty stable temperature-wise. Early days, I know. Please forgive the current colour scheme BTW - I needed something standoutish so that I did not make configuration changes on the wrong instance :slight_smile: You can clearly see the point at the far right-hand end where I blew out the dust bunnies.

One of my main driving reasons for moving off the pi hardware was my dissatisfaction with the performance of logbook/history/(recorder). I’d previously moved (several months ago) from local storage for the home assistant database to a separate database within a MySQL jail running on a FreeNAS box, which improved things ever so slightly, but not to the point that I was happy. This I believe is a limitation of the pi in some shape or form.

I am pleased to report that in its current form, with very little (bar some annoying z-wave nonsense for all of my 16 x TRVs) excluded from logbook/history/recorder, it is far more performant. I’ve got 31 days with a 7 day purge cycle configured, so I’ll give it a week or two and report back on this front!
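For anyone curious what that looks like in configuration.yaml, the shape of the recorder block is roughly this - a sketch only, where the database URL and the excluded entity are illustrative placeholders rather than my exact config:

```yaml
# Sketch only: db_url credentials/host and the exclude list are placeholders.
recorder:
  db_url: mysql://hassuser:hasspass@freenas.local/homeassistant?charset=utf8
  purge_keep_days: 7
  exclude:
    entities:
      - zwave.eurotronic_eur_spiritz_wall_radiator_thermostat
```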

FYI: bringing up 24hrs worth of logbook entries is now achieved in 5s (five seconds) and around 6s to render 24hrs of history, which in comparison to the before picture on the pi is a monumental improvement. No disrespect to the pi intended - it is what it is and has done well to this point.

I have been using node-red for my automations for around a year, which was a steep learning curve for the first week or so! I started with node-red within hassio on the pi, but as part of figuring out what might be causing the slooooow logbook/history rendering, I spun up a freebsd jail, built node-red and migrated all my flows over to there. I saw a very marginal improvement, but it was so subjective that I could not say definitively that I saw a real-world improvement.

One of my better decisions has been to use the Export>All Flows>Formatted>Download feature within node-red to effectively back up my flows. It has paid dividends a number of times! I personally save them encrypted (using Boxcryptor) on Amazon Drive. It has indeed allowed me to be rather “gung-ho” node-red wise, as apart from occasionally having to sort out control nodes, recovery when I’ve screwed up has been swift and painless. This is what I’m talking about:

Actually this is now reasonably redundant, because before I do anything daft now, I have the facility to create a new docker image from my running container, which has all my modifications and (importantly for node-red) all my flows within the image. From that image, I can then spin up a new container identical to my (now) trashed original. I’ve used this feature already!
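Under the hood that is just docker commit followed by a fresh docker run from the committed image; roughly like this, where the container/image/volume names are illustrative placeholders, not my actual ones:

```shell
# Snapshot the running container (modifications, flows and all) into a new image...
docker commit nodered nodered-snapshot:backup
# ...and if the original ever gets trashed, recreate an identical container from it.
docker rm -f nodered
docker run -d --name nodered -v nodered_data:/data nodered-snapshot:backup
```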

I appreciate I am jumping around a bit, but one thing I recently discovered when migrating my last bits of node-red automation from my FreeNAS jail to the new docker container is that in order for my homekit service nodes to be accessible (think iptables/firewall/bridged network nightmare), node-red is best run attached to the host network rather than the default bridge network, IMHO. The default method of spinning up a node-red container results in a bridged container, and adding new port mappings to a container once it is running is not easy - some folk have had success in creating another bloody container to act as a go-between. Instead, after committing my existing node-red container to an image, I simply spun up a duplicate using the “replace” option within Portainer, changed the network from “bridge” to “host”, and used my new image as the source for the new container. Perfect! This is my Portainer container view:

I use the Eve app on our iOS phones (yup, I’m an iOS fanboy!) having shared a bunch of elementary “accessories” within the official iOS Home app with my family, who then have the ability to boost the heating for half an hour. First world problems people :wink:

One thing that didn’t work off the bat in docker was my CPU temperature sensors. I installed lm-sensors on the ubuntu server host, but that is not passed through to containers. So plan B was a change to the existing pi CPU sensor. This is the simple sensor code for my Esprimo SFF, which works a treat:

# CPU Temperature Core 0
  - platform: command_line
    name: CPU Temperature Core 0
    command: "cat /sys/class/hwmon/hwmon0/temp2_input"
    # If errors occur, remove degree symbol below
    unit_of_measurement: "°C"
    value_template: '{{ value | multiply(0.001) | round(1) }}'

# CPU Temperature Core 1
  - platform: command_line
    name: CPU Temperature Core 1
    command: "cat /sys/class/hwmon/hwmon0/temp3_input"
    # If errors occur, remove degree symbol below
    unit_of_measurement: "°C"
    value_template: '{{ value | multiply(0.001) | round(1) }}'
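The kernel exposes those hwmon readings in millidegrees, hence the multiply(0.001)/round(1) in the value_template. If you want to sanity-check a raw reading by hand, the same conversion in shell:

```shell
# Millidegrees C -> degrees C to one decimal place, mirroring the value_template.
raw=40500    # stand-in for: cat /sys/class/hwmon/hwmon0/temp2_input
awk -v r="$raw" 'BEGIN { printf "%.1f\n", r * 0.001 }'    # prints 40.5
```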

I’ve seen various solutions for z-wave battery monitoring, from the simple to the sublime, and having experimented with them all (I think) I’ve settled on my own simple solution. I create a new sensor for things with a battery I want to monitor, like this:

# Battery Level sensors for all z-wave devices
  - platform: template 
    sensors:
      office_trv_battery_level:
        value_template: "{{ state_attr('zwave.eurotronic_eur_spiritz_wall_radiator_thermostat', 'battery_level') | int }}"
        friendly_name: 'Office TRV Battery Level'
        device_class: battery
      hall_trv_battery_level:
        value_template: "{{ state_attr('zwave.eurotronic_eur_spiritz_wall_radiator_thermostat_2', 'battery_level') | int }}"
        friendly_name: 'Hall TRV Battery Level'
        device_class: battery
      living_room_trv_battery_level:
        value_template: "{{ state_attr('zwave.eurotronic_eur_spiritz_wall_radiator_thermostat_3', 'battery_level') | int }}"
        friendly_name: 'Living Room TRV Battery Level'
        device_class: battery
      toilet_trv_battery_level:
        value_template: "{{ state_attr('zwave.eurotronic_eur_spiritz_wall_radiator_thermostat_4', 'battery_level') | int }}"
        friendly_name: 'Toilet TRV Battery Level'
        device_class: battery
      sun_room_trv_battery_level:
        value_template: "{{ state_attr('zwave.eurotronic_eur_spiritz_wall_radiator_thermostat_5', 'battery_level') | int }}"
        friendly_name: 'Sun Room TRV Battery Level'
        device_class: battery
      utility_room_trv_battery_level:
        value_template: "{{ state_attr('zwave.eurotronic_eur_spiritz_wall_radiator_thermostat_6', 'battery_level') | int }}"
        friendly_name: 'Utility Room TRV Battery Level'
        device_class: battery
      dining_room_trv_battery_level:
        value_template: "{{ state_attr('zwave.eurotronic_eur_spiritz_wall_radiator_thermostat_7', 'battery_level') | int }}"
        friendly_name: 'Dining Room TRV Battery Level'
        device_class: battery
      landing_trv_battery_level:
        value_template: "{{ state_attr('zwave.eurotronic_eur_spiritz_wall_radiator_thermostat_8', 'battery_level') | int }}"
        friendly_name: 'Landing TRV Battery Level'
        device_class: battery
      master_bedroom_left_trv_battery_level:
        value_template: "{{ state_attr('zwave.eurotronic_eur_spiritz_wall_radiator_thermostat_9', 'battery_level') | int }}"
        friendly_name: 'Master Bedroom Left TRV Battery Level'
        device_class: battery
      master_bedroom_right_trv_battery_level:
        value_template: "{{ state_attr('zwave.eurotronic_eur_spiritz_wall_radiator_thermostat_10', 'battery_level') | int }}"
        friendly_name: 'Master Bedroom Right TRV Battery Level'
        device_class: battery
      porch_trv_battery_level:
        value_template: "{{ state_attr('zwave.eurotronic_eur_spiritz_wall_radiator_thermostat_11', 'battery_level') | int }}"
        friendly_name: 'Porch TRV Battery Level'
        device_class: battery
      spare_bedroom_trv_battery_level:
        value_template: "{{ state_attr('zwave.eurotronic_eur_spiritz_wall_radiator_thermostat_12', 'battery_level') | int }}"
        friendly_name: 'Spare Bedroom TRV Battery Level'
        device_class: battery
      den_right_trv_battery_level:
        value_template: "{{ state_attr('zwave.eurotronic_eur_spiritz_wall_radiator_thermostat_13', 'battery_level') | int }}"
        friendly_name: 'Den Right TRV Battery Level'
        device_class: battery
      den_left_trv_battery_level:
        value_template: "{{ state_attr('zwave.eurotronic_eur_spiritz_wall_radiator_thermostat_14', 'battery_level') | int }}"
        friendly_name: 'Den Left TRV Battery Level'
        device_class: battery
      megan_left_trv_battery_level:
        value_template: "{{ state_attr('zwave.eurotronic_eur_spiritz_wall_radiator_thermostat_15', 'battery_level') | int }}"
        friendly_name: 'Megan Left TRV Battery Level'
        device_class: battery
      megan_right_trv_battery_level:
        value_template: "{{ state_attr('zwave.eurotronic_eur_spiritz_wall_radiator_thermostat_16', 'battery_level') | int }}"
        friendly_name: 'Megan Right TRV Battery Level'
        device_class: battery

Then I have two automations in node-red that are crude, but do the job. The first sends a pushover notification for any battery at or below 30%, and the second kindly informs me at the next z-wave battery update that a battery has been replaced, using some really crude logic! Notice the wildcard in the Entity ID:

This is what the function node looks like:


This is the code for those automations:

[
    {
        "id": "7d3b27a2.c777f8",
        "type": "tab",
        "label": "Batteries",
        "disabled": false,
        "info": ""
    },
    {
        "id": "a05bdfc5.0451c8",
        "type": "trigger-state",
        "z": "7d3b27a2.c777f8",
        "name": "trv battery level <=30%",
        "server": "f4dddbca.91055",
        "entityid": "sensor.*_trv_battery_level",
        "entityidfiltertype": "regex",
        "debugenabled": false,
        "constraints": [
            {
                "id": "fthrrst2f3d",
                "targetType": "this_entity",
                "targetValue": "",
                "propertyType": "current_state",
                "propertyValue": "new_state.state",
                "comparatorType": ">=",
                "comparatorValueDatatype": "num",
                "comparatorValue": "5"
            },
            {
                "id": "st6tiwn6toh",
                "targetType": "this_entity",
                "targetValue": "",
                "propertyType": "current_state",
                "propertyValue": "new_state.state",
                "comparatorType": "<=",
                "comparatorValueDatatype": "num",
                "comparatorValue": "30"
            }
        ],
        "constraintsmustmatch": "all",
        "outputs": 2,
        "customoutputs": [],
        "outputinitially": true,
        "state_type": "num",
        "x": 170,
        "y": 60,
        "wires": [
            [
                "25630eeb.b5ea2a"
            ],
            []
        ]
    },
    {
        "id": "e3b6cf9f.66efb8",
        "type": "api-call-service",
        "z": "7d3b27a2.c777f8",
        "name": "low battery notification",
        "server": "f4dddbca.91055",
        "version": 1,
        "service_domain": "notify",
        "service": "pushover",
        "entityId": "",
        "data": "",
        "dataType": "json",
        "mergecontext": "",
        "output_location": "",
        "output_location_type": "none",
        "mustacheAltTags": false,
        "x": 840,
        "y": 60,
        "wires": [
            [
                "725b52ef.259ba4"
            ]
        ]
    },
    {
        "id": "25630eeb.b5ea2a",
        "type": "function",
        "z": "7d3b27a2.c777f8",
        "name": "convert trigger:state message for pushover",
        "func": "newmsg = {};\nsfn = msg.data.event.new_state.attributes.friendly_name\nsfn = sfn.replace (\" Battery Level\", \"\");\nnewmsg.payload = { data: {\"title\": \"Battery Low in \"+\n    //msg.data.new_state.attributes.friendly_name,\n    sfn,\n    \"message\":\"The battery in the \"+\n    //msg.data.new_state.attributes.friendly_name+\n    sfn+\n    \" has fallen to \"+msg.payload+\n    \"%. Please replace it soon!\"\n} };\nreturn newmsg;",
        "outputs": 1,
        "noerr": 0,
        "x": 510,
        "y": 60,
        "wires": [
            [
                "e3b6cf9f.66efb8"
            ]
        ]
    },
    {
        "id": "725b52ef.259ba4",
        "type": "debug",
        "z": "7d3b27a2.c777f8",
        "name": "",
        "active": false,
        "tosidebar": true,
        "console": false,
        "tostatus": false,
        "complete": "true",
        "targetType": "full",
        "x": 1050,
        "y": 60,
        "wires": []
    },
    {
        "id": "a83cf137.c6ad88",
        "type": "trigger-state",
        "z": "7d3b27a2.c777f8",
        "name": "trv battery replaced",
        "server": "f4dddbca.91055",
        "entityid": "sensor.*_trv_battery_level",
        "entityidfiltertype": "regex",
        "debugenabled": false,
        "constraints": [
            {
                "id": "w6maft8lo9i",
                "targetType": "this_entity",
                "targetValue": "",
                "propertyType": "previous_state",
                "propertyValue": "old_state.state",
                "comparatorType": ">=",
                "comparatorValueDatatype": "num",
                "comparatorValue": "5"
            },
            {
                "id": "uxzimozkudm",
                "targetType": "this_entity",
                "targetValue": "",
                "propertyType": "previous_state",
                "propertyValue": "old_state.state",
                "comparatorType": "<=",
                "comparatorValueDatatype": "num",
                "comparatorValue": "30"
            },
            {
                "id": "xzx1ygb9hpd",
                "targetType": "this_entity",
                "targetValue": "",
                "propertyType": "current_state",
                "propertyValue": "new_state.state",
                "comparatorType": ">=",
                "comparatorValueDatatype": "num",
                "comparatorValue": "90"
            }
        ],
        "constraintsmustmatch": "all",
        "outputs": 2,
        "customoutputs": [],
        "outputinitially": true,
        "state_type": "num",
        "x": 150,
        "y": 160,
        "wires": [
            [
                "ffdf49f0.8e1b"
            ],
            []
        ]
    },
    {
        "id": "ffdf49f0.8e1b",
        "type": "function",
        "z": "7d3b27a2.c777f8",
        "name": "convert trigger:state message for pushover",
        "func": "newmsg = {};\nsfn = msg.data.event.new_state.attributes.friendly_name\nsfn = sfn.replace (\" Battery Level\", \"\");\nnewmsg.payload = { data: {\"title\": \"Battery replaced in \"+\n    //msg.data.new_state.attributes.friendly_name,\n    sfn,\n    \"message\":\"The battery in the \"+\n    //msg.data.new_state.attributes.friendly_name+\n    sfn+\n    \" has been replaced and is now at \"+msg.payload+\n    \"%.\"\n} };\nreturn newmsg;",
        "outputs": 1,
        "noerr": 0,
        "x": 510,
        "y": 160,
        "wires": [
            [
                "bd0c5bec.b0bb88"
            ]
        ]
    },
    {
        "id": "bd0c5bec.b0bb88",
        "type": "api-call-service",
        "z": "7d3b27a2.c777f8",
        "name": "battery replaced notification",
        "server": "f4dddbca.91055",
        "version": 1,
        "service_domain": "notify",
        "service": "pushover",
        "entityId": "",
        "data": "",
        "dataType": "json",
        "mergecontext": "",
        "output_location": "",
        "output_location_type": "none",
        "mustacheAltTags": false,
        "x": 860,
        "y": 160,
        "wires": [
            [
                "483281d9.03b208"
            ]
        ]
    },
    {
        "id": "483281d9.03b208",
        "type": "debug",
        "z": "7d3b27a2.c777f8",
        "name": "",
        "active": false,
        "tosidebar": true,
        "console": false,
        "tostatus": false,
        "complete": "true",
        "targetType": "full",
        "x": 1090,
        "y": 160,
        "wires": []
    },
    {
        "id": "f4dddbca.91055",
        "type": "server",
        "z": "",
        "name": "Home Assistant",
        "legacy": false,
        "hassio": false,
        "rejectUnauthorizedCerts": false,
        "ha_boolean": "y|yes|true|on|home|open",
        "connectionDelay": true
    }
]
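Stripped of the node-red JSON, the constraints in those two trigger nodes boil down to very simple threshold logic; sketched here in shell (the function names are mine, and the >=5 lower bound presumably filters out spurious near-zero readings):

```shell
# battery_low NEW: true when 5 <= NEW <= 30 (the first trigger node).
# battery_replaced OLD NEW: true when OLD was in the low band and NEW >= 90
# (the second trigger node). Function names are mine, not node-red's.
battery_low()      { [ "$1" -ge 5 ] && [ "$1" -le 30 ]; }
battery_replaced() { battery_low "$1" && [ "$2" -ge 90 ]; }

battery_low 25 && echo "low battery - notify"
battery_replaced 25 100 && echo "battery replaced - notify"
```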

At the moment, the only thing missing from the new docker container is Bluetooth presence detection. I’ve ordered a couple (always have a spare, a-la-z-wave-usb!) of known-to-work-with-ubuntu 20M range USB transceivers, so that will be a future update.

The one thing that I thought was guaranteed to cause me pain was the z-wave move… however, having tested the pass-through of a spare SA413 to the hassio container, I:

  • backed up my z-wave config on the pi
  • shut down the pi
  • removed the SA413 containing my paired z-wave TRVs
  • plugged the same SA413 into the back of the Fujitsu Esprimo SFF
  • rebooted ubuntu

…and voila! It just worked! I didn’t believe it at first, but it simply worked - of course there was no reason for it not to work!
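If you fancy some reassurance before making the same leap, it is easy to check that the stick enumerates on the new host first (output is obviously machine-specific; the by-id path is handy because it survives re-plugs, unlike a bare /dev/ttyACM0):

```shell
# Look for the z-wave adapter after plugging it in.
lsusb                      # the adapter should appear in the device list
ls -l /dev/serial/by-id/   # stable symlinks to the underlying /dev/tty* node
```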

None of the above would have been possible (without a great deal of pain) had I not taken hassio snapshots. This is the single most important element of the whole migration and I have left it until last - hassio snapshots for the win (as well as air compressors with air guns, of course). I used the samba share addon, copied the snapshot from the backup directory on the pi and pushed it to the backup directory on the docker container. I did have to restart the container to get it to pick up the backup, but thereafter I used the erase option and the rest is history.

If you’ve read all the above, thanks for sticking with it - as always, it’s turned into a bit of an essay!

bikefright.

So…Bluetooth through docker is an unreliable nightmare! It works…until it doesn’t.

All is not lost - a silver lining of moving hassio to docker under ubuntu is that I had a “spare” pi3.

Roll forward a week and I have disabled the bluetooth device_tracker platform under hassio and instead implemented a single-node monitor instance (courtesy of @andrewjfreyer - what a fabulous solution this is! Truly!), which has been faultless since install.

That looooong thread is here: [monitor] Reliable, Multi-User, Distributed Bluetooth Occupancy/Presence Detection

For reference, my raspberry pi hostname is: home. How original.

Again, a steep learning curve, but once the basics are in hand like:

  1. a naming convention
  2. starting monitor with the -x option (makes published topics persistent - retain flag set in mqtt terms)
  3. a single automation (now my only automation within hassio) to restart monitor when hassio restarts

How I achieved the above:

I went with name_devicetype, e.g. rowlands_iphone, rowlands_ipad etc., 'cos I have a simple setup for now. I’m literally interested in family phones as they come and go.

Edit the systemd service config for monitor with:

sudo nano /etc/systemd/system/monitor.service

My monitor service now looks like this, the only change being the “-x”:

[Unit]
Description=Monitor Service
After=network.target

[Service]
User=root
ExecStart=/bin/bash /home/pi/monitor/monitor.sh -x &
WorkingDirectory=/home/pi/monitor
Restart=always
RestartSec=10

[Install]
WantedBy=multi-user.target network.target
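One gotcha worth noting: systemd will not pick up edits to a unit file until you reload it, so after saving:

```shell
# Reload unit files, restart monitor and check it came back up.
sudo systemctl daemon-reload
sudo systemctl restart monitor.service
systemctl status monitor.service --no-pager
```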

My hassio automation looks like this:

  - id: '1571175825529'
    alias: monitor restart
    trigger:
    - event: start
      platform: homeassistant
    condition: []
    action:
    - data:
        topic: monitor/scan/restart
      service: mqtt.publish
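You can poke the same topic by hand to confirm monitor restarts, assuming the mosquitto-clients package is installed somewhere handy (the broker host/user/password here are the same placeholders as in my mqtt_preferences below):

```shell
# Publish an empty message to monitor's restart topic.
mosquitto_pub -h yourhostname.yourdomainname -p 1883 \
  -u yournewuser -P yournewpassword \
  -t 'monitor/scan/restart' -m ''
```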

For reference, my mqtt_preferences file looks like this:

# ---------------------------
#
# MOSQUITTO PREFERENCES
#
# ---------------------------

# IP ADDRESS OR HOSTNAME OF MQTT BROKER
mqtt_address=yourhostname.yourdomainname

# MQTT BROKER USERNAME
mqtt_user=yournewuser

# MQTT BROKER PASSWORD
mqtt_password=yournewpassword

# MQTT PUBLISH TOPIC ROOT
mqtt_topicpath=monitor

# PUBLISHER IDENTITY
mqtt_publisher_identity=''

# MQTT PORT
mqtt_port='1883'

# MQTT CERTIFICATE FILE
mqtt_certificate_path=''

#MQTT VERSION (EXAMPLE: 'mqttv311')
mqtt_version=''

I have migrated from the mqtt addon within hassio, through mqtt running on my new monitor node… and now to its final resting place on another docker container to provide separation.

I followed this tutorial to set up the container and create my new username: installing mosquitto in docker

The only addition to behavior_preferences is enabling the device_tracker reporting at the bottom, which adds a device_tracker mqtt sub-topic for each device in your known_xxxx_addresses files (for example: monitor/home/rowlands_iphone/device_tracker) with the value “home” or “not_home”. Very handy:

# ---------------------------
#
# BEHAVIOR PREFERENCES
#
# ---------------------------

#MAX RETRY ATTEMPTS FOR ARRIVAL
PREF_ARRIVAL_SCAN_ATTEMPTS=1

#MAX RETRY ATTEMPTS FOR DEPART
PREF_DEPART_SCAN_ATTEMPTS=2

#SECONDS UNTIL A BEACON IS CONSIDERED EXPIRED
PREF_BEACON_EXPIRATION=240

#MINIMUM TIME (IN SECONDS) BETWEEN THE SAME TYPE OF SCAN (ARRIVE SCAN, DEPART SCAN)
PREF_MINIMUM_TIME_BETWEEN_SCANS=15

#ARRIVE TRIGGER FILTER(S)
PREF_PASS_FILTER_ADV_FLAGS_ARRIVE=".*"
PREF_PASS_FILTER_MANUFACTURER_ARRIVE=".*"

#ARRIVE TRIGGER NEGATIVE FILTER(S)
PREF_FAIL_FILTER_ADV_FLAGS_ARRIVE="NONE"
PREF_FAIL_FILTER_MANUFACTURER_ARRIVE="NONE"

# If true, this value will cause monitor to report a 'home' or 'not_home' message to
# /device_tracker conforming to device_tracker mqtt protocol
PREF_DEVICE_TRACKER_REPORT="true"

I wanted to keep things simple, so within hassio, I have simply configured my device trackers as follows:

# Device Tracker
device_tracker:
  - platform: mqtt
    devices:
      rowlands_iphone_mqtt: 'monitor/home/rowlands_iphone/device_tracker'
      rowlands_ipad_mqtt: 'monitor/home/rowlands_ipad/device_tracker'
      rowlands_ipod_mqtt: 'monitor/home/rowlands_ipod/device_tracker'
      sandras_iphone_mqtt: 'monitor/home/sandras_iphone/device_tracker'
      sandras_ipad_mqtt: 'monitor/home/sandras_ipad/device_tracker'
      megans_iphone_mqtt: 'monitor/home/megans_iphone/device_tracker'
      kyles_ipad_mqtt: 'monitor/home/kyles_ipad/device_tracker'

This results in a simple device_tracker entity for each device that I have added to the relevant hassio “person”.

If you want to use the mqtt sensors and get at the confidence level etc, you can also do what I did initially, which is to add the relevant sensors to sensors.yaml. Most (if not all) of this is covered off in the readme or within the original post/replies in the user community:

# mqtt sensors that enable the monitor platform running on a separate raspberry pi to
# communicate "presence confidence" values to homeassistant
  - platform: mqtt
    state_topic: 'monitor/home/status'
    name: 'Home Monitor Instance Status' # overall health of the monitor service
    force_update: true

  - platform: mqtt
    state_topic: 'monitor/home/rowlands_iphone'
    value_template: '{{ value_json.confidence }}'
    unit_of_measurement: '%'
    name: 'Rowlands iPhone MQTT'

  - platform: mqtt
    state_topic: 'monitor/home/rowlands_ipad'
    value_template: '{{ value_json.confidence }}'
    unit_of_measurement: '%'
    name: 'Rowlands iPad MQTT'

  - platform: mqtt
    state_topic: 'monitor/home/rowlands_ipod'
    value_template: '{{ value_json.confidence }}'
    unit_of_measurement: '%'
    name: 'Rowlands iPod MQTT'

  - platform: mqtt
    state_topic: 'monitor/home/sandras_iphone'
    value_template: '{{ value_json.confidence }}'
    unit_of_measurement: '%'
    name: 'Sandras iPhone MQTT'

  - platform: mqtt
    state_topic: 'monitor/home/sandras_ipad'
    value_template: '{{ value_json.confidence }}'
    unit_of_measurement: '%'
    name: 'Sandras iPad MQTT'

  - platform: mqtt
    state_topic: 'monitor/home/megans_iphone'
    value_template: '{{ value_json.confidence }}'
    unit_of_measurement: '%'
    name: 'Megans iPhone MQTT'

  - platform: mqtt
    state_topic: 'monitor/home/kyles_ipad'
    value_template: '{{ value_json.confidence }}'
    unit_of_measurement: '%'
    name: 'Kyles iPad MQTT'

  - platform: min_max
    name: "Home Occupancy Confidence"
    type: max
    round_digits: 0
    entity_ids:
     - sensor.rowlands_iphone_mqtt
     - sensor.rowlands_ipad_mqtt
     - sensor.rowlands_ipod_mqtt
     - sensor.sandras_iphone_mqtt
     - sensor.sandras_ipad_mqtt
     - sensor.megans_iphone_mqtt
     - sensor.kyles_ipad_mqtt

This has been stable for four days. If anything changes, I’ll update my own thread :slight_smile:

bikefright.