Internal position tracking with co-ordinates using an iPhone and room-assistant

THIS IS CURRENTLY A WORK IN PROGRESS
However, I'm sharing it now so that others smarter than me can weigh in and add their insights :slight_smile:

Project status: working co-ordinates of an iPhone via BLE and room-assistant
To-do:
1. 2nd floor (and a way to select the right floor based on distance or another sensor)
2. Room division
3. Overlay floorplan strategy

Hi Everyone,
So I've been trying to solve this FOREVER but never had the time to devote to it (or to get the help I needed).
First off, big thanks to @Kermit for help with Node-Red and @mKeRix for room-assistant

My eventual goal is to overlay people's positions onto a floorplan. My inspiration was Xandem and their positioning system, which is great - however it currently requires a lot of nodes, doesn't distinguish between people, and won't detect you if you are stationary.

Knowing where the regular house occupants are means I can get much more granular with automations.

The premise is to use 3 room-assistant nodes on each floor to triangulate the position of a BLE device, in my case an iPhone.

So first up the hardware:
3 x Raspberry Pi 3s (you can use Pi Zeros, but I've found WiFi and BT together cause issues, so I hardwired my Pis to avoid this). I've also got a couple of Ethernet Pi Zero hats coming to test out :slight_smile:
3 x MicroSD cards (8GB is more than enough)

Software Needed:
Room-Assistant beta
Node-Red & HA Palette
Optional - HA NodeRed custom_component
iPhone BLE companion app

Step 1 - Infrastructure stuff
I'm not going to walk people through their HA, Node-RED or Raspberry Pi image setup - there are plenty of guides on that online.

Step 2 - Get BLE device IDs
If using an iPhone, join the beta TestFlight with this link, approve the Bluetooth permissions, and copy the device ID for each device you want to track.

Step 3 - Raspberry Pi
Install Docker, nano, docker-compose and room-assistant via SSH:

sudo apt update
sudo apt upgrade
curl -sSL https://get.docker.com | sh
sudo usermod -a -G docker $USER
sudo apt -y install nano
sudo apt -y install docker-compose
cd ~
mkdir room-assistant
cd room-assistant
mkdir config
nano docker-compose.yml

Use this docker-compose (detailed info on the room-assistant site):

version: '3'
services:
  room-assistant:
    container_name: room-assistant
    image: mkerix/room-assistant:beta
    restart: unless-stopped
    network_mode: host
    volumes:
      - /var/run/dbus:/var/run/dbus
      - /home/pi/room-assistant/config:/room-assistant/config

[Ctrl+X and then hit Y to save]

nano ~/room-assistant/config/local.yml

Use this in your local.yml.
Be sure to change:
1. instanceName - must be different for each node (you can use actual room names if you prefer, just be sure to change them!)
2. mqttUrl - the IP of your MQTT server, and the credentials if you use them; if not, remove the mqttOptions chunk
3. whitelist - should include the BLE device IDs from the room-assistant companion app.
NOTE: I have specified the network interface as the Ethernet port to overcome any WiFi issues (I also need to look into whether I should disable WiFi altogether).

global:
  instanceName: ra0
  integrations:
    - homeAssistant
    - bluetoothLowEnergy
cluster:
  networkInterface: eth0
homeAssistant:
  mqttUrl: 'mqtt://X.X.X.X:1883'
  mqttOptions:
    username: user
    password: pass
bluetoothLowEnergy:
  whitelist:
    - ble-device-id-1 #name_here
    - ble-device-id-2 #name_here

Now start the docker-compose process (with sudo, assuming you haven't logged out yet so the usermod won't have kicked in for docker):

cd ~/room-assistant
sudo docker-compose up

The image will be pulled down and the container will start up. Be sure to check for any errors before moving on.
To validate that the phone is detected and visible, use either HA or MQTT.fx to subscribe to the following topic:
room-assistant/sensor/ble-{device-id}/#

If the distance value within the attributes JSON changes when you move the phone, you can move on.
If not, pause for troubleshooting and re-read the docs.
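If you want to sanity-check the payload programmatically, a tiny helper like this parses an attributes message and pulls out the distance (illustrative only; the payload shape matches the MQTT messages shown later in this thread):

```javascript
// Parse a room-assistant attributes payload and return its distance value.
// Payload example: {"distance": 2.3, "last_updated_at": "..."}
function extractDistance(payloadJson) {
  const attrs = JSON.parse(payloadJson);
  return attrs.distance;
}

const d = extractDistance('{"distance": 2.3, "last_updated_at": "2020-12-29T02:48:23.579Z"}');
// d → 2.3
```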

This part is a little repetitive: repeat Steps 1-3 to build the other 2 Raspberry Pis, remembering to change the instanceName in each local.yml.

Step 4 - Node-Red and triangulation
I'm not going to pretend I understand the maths involved, but if anyone cares, check out this link.

Essentially you need 3 fixed, known locations; the distances of an object from those locations then yield a triangulated position.
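For anyone curious about the maths, here is a minimal self-contained sketch of 2D trilateration in plain JavaScript (my own illustration, not the internals of the npm trilateration package): subtracting the three circle equations pairwise leaves two linear equations in x and y, which a simple 2x2 solve answers.

```javascript
// 2D trilateration: three beacons at known positions, one measured
// distance to each. Subtracting circle i from circle j eliminates the
// x^2 and y^2 terms, giving two linear equations A*x + B*y = C and
// D*x + E*y = F, solved by Cramer's rule.
function trilaterate2D(beacons, distances) {
  const [p1, p2, p3] = beacons;
  const [r1, r2, r3] = distances;
  const A = 2 * (p2.x - p1.x);
  const B = 2 * (p2.y - p1.y);
  const C = r1 * r1 - r2 * r2 - p1.x * p1.x + p2.x * p2.x - p1.y * p1.y + p2.y * p2.y;
  const D = 2 * (p3.x - p2.x);
  const E = 2 * (p3.y - p2.y);
  const F = r2 * r2 - r3 * r3 - p2.x * p2.x + p3.x * p3.x - p2.y * p2.y + p3.y * p3.y;
  const denom = A * E - B * D; // zero if the beacons are collinear
  return { x: (C * E - F * B) / denom, y: (A * F - D * C) / denom };
}

// Example using the same beacon co-ordinates as the flow later in this
// post, with exact distances to the point (5, 5):
const pos = trilaterate2D(
  [{ x: 2, y: 4 }, { x: 5, y: 13 }, { x: 11, y: 2 }],
  [Math.sqrt(10), Math.sqrt(64), Math.sqrt(45)]
);
// pos → { x: 5, y: 5 }
```

With noisy real-world distances the three circles won't intersect in a single point, which is why the measured values need smoothing first.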

Now, I'm using the Unraid Node-RED docker, so my settings.js is in /data/; yours may be different.

cd /data
cd node_modules
npm install trilateration
sudo apt -y install nano
nano /data/settings.js

Now scroll down until you find:

functionGlobalContext: {
        // os:require('os'),
        // jfive:require("johnny-five"),
        // j5board:require("johnny-five").Board({repl:false})
    },

To use the trilateration package globally in NodeRed we need to add it here:

functionGlobalContext: {
        trilateration:require('trilateration')
        // os:require('os'),
        // jfive:require("johnny-five"),
        // j5board:require("johnny-five").Board({repl:false})
    },

[CTRL+X and hit y]

I'm not sure if it's required at this point, but I rebooted the Node-RED docker.

Now for the flow:
At the end of the flow I use an HA sensor (which requires the HA Node-RED integration to be installed from HACS - instructions), but you can just leave the co-ordinates as a msg if you want.

[{"id":"1abe71e.6c6f98e","type":"http request","z":"fa164293.781af","name":"","method":"GET","ret":"obj","paytoqs":"ignore","url":"http://X.X.X.X:6415/entities","tls":"","persist":false,"proxy":"","authType":"","x":413.9745178222656,"y":2304.657470703125,"wires":[["eaeec6bd.a20238"]]},{"id":"b5830319.8d31c","type":"inject","z":"fa164293.781af","name":"","props":[{"p":"payload"}],"repeat":"1","crontab":"","once":true,"onceDelay":0.1,"topic":"","payload":"","payloadType":"date","x":210.23840713500977,"y":2303.8727588653564,"wires":[["1abe71e.6c6f98e"]]},{"id":"eaeec6bd.a20238","type":"split","z":"fa164293.781af","name":"","splt":"\\n","spltType":"str","arraySplt":1,"arraySpltType":"len","stream":false,"addname":"","x":616.2383193969727,"y":2303.8723487854004,"wires":[["41f11a3d.f7ef04"]]},{"id":"41f11a3d.f7ef04","type":"switch","z":"fa164293.781af","name":"","property":"payload.id","propertyType":"msg","rules":[{"t":"eq","v":"ble-device-id-1","vt":"str"},{"t":"eq","v":"ble-device-id-2","vt":"str"}],"checkall":"true","repair":false,"outputs":2,"x":770.2383155822754,"y":2301.8723487854004,"wires":[["2c62a50f.cdf57a"],["e429bd13.904"]]},{"id":"2c62a50f.cdf57a","type":"function","z":"fa164293.781af","name":"","func":"var trilateration = global.get('trilateration');\n\ntrilateration.addBeacon(0, trilateration.vector(2, 4));\ntrilateration.addBeacon(1, trilateration.vector(5, 13));\ntrilateration.addBeacon(2, trilateration.vector(11, 2));\n\ntrilateration.setDistance(0, msg.payload.distances.ra0.distance);\ntrilateration.setDistance(1, msg.payload.distances.ra1.distance);\ntrilateration.setDistance(2, msg.payload.distances.ra2.distance);\n\nmsg.payload = trilateration.calculatePosition();\nreturn msg;","outputs":1,"noerr":0,"initialize":"","finalize":"","x":947.2381324768066,"y":2255.3531799316406,"wires":[["4764d31f.17f95c","fca290bf.e2bb7"]]},{"id":"e429bd13.904","type":"function","z":"fa164293.781af","name":"","func":"var trilateration = global.get('trilateration');\n\ntrilateration.addBeacon(0, trilateration.vector(2, 4));\ntrilateration.addBeacon(1, trilateration.vector(5, 13));\ntrilateration.addBeacon(2, trilateration.vector(11, 2));\n\ntrilateration.setDistance(0, msg.payload.distances.kitchen.distance);\ntrilateration.setDistance(1, msg.payload.distances.livingroom.distance);\ntrilateration.setDistance(2, msg.payload.distances.serverroom.distance);\n\nmsg.payload = trilateration.calculatePosition();\nreturn msg;","outputs":1,"noerr":0,"initialize":"","finalize":"","x":949.2382392883301,"y":2343.8723850250244,"wires":[["a07e06d7.b84318","3770931a.35aa2c"]]},{"id":"4764d31f.17f95c","type":"ha-entity","z":"fa164293.781af","name":"Person1 Y Coordinate","server":"b89bdbfd.f73988","version":1,"debugenabled":false,"outputs":1,"entityType":"sensor","config":[{"property":"name","value":"person1_y_coordinate"},{"property":"device_class","value":""},{"property":"icon","value":""},{"property":"unit_of_measurement","value":""}],"state":"payload.y","stateType":"msg","attributes":[],"resend":true,"outputLocation":"","outputLocationType":"none","inputOverride":"allow","x":1207.238437652588,"y":2283.8727588653564,"wires":[[]]},{"id":"fca290bf.e2bb7","type":"ha-entity","z":"fa164293.781af","name":"Person1 X Coordinate","server":"b89bdbfd.f73988","version":1,"debugenabled":false,"outputs":1,"entityType":"sensor","config":[{"property":"name","value":"person1_x_coordinate"},{"property":"device_class","value":""},{"property":"icon","value":""},{"property":"unit_of_measurement","value":""}],"state":"payload.x","stateType":"msg","attributes":[],"resend":true,"outputLocation":"","outputLocationType":"none","inputOverride":"allow","x":1209.238437652588,"y":2225.8727588653564,"wires":[[]]},{"id":"3770931a.35aa2c","type":"ha-entity","z":"fa164293.781af","name":"Person2 Y Coordinate","server":"b89bdbfd.f73988","version":1,"debugenabled":false,"outputs":1,"entityType":"sensor","config":[{"property":"name","value":"person2_y_coordinate"},{"property":"device_class","value":""},{"property":"icon","value":""},{"property":"unit_of_measurement","value":""}],"state":"payload.y","stateType":"msg","attributes":[],"resend":true,"outputLocation":"","outputLocationType":"none","inputOverride":"allow","x":1190.238437652588,"y":2419.8727588653564,"wires":[[]]},{"id":"a07e06d7.b84318","type":"ha-entity","z":"fa164293.781af","name":"person2 X Coordinate","server":"b89bdbfd.f73988","version":1,"debugenabled":false,"outputs":1,"entityType":"sensor","config":[{"property":"name","value":"person2_x_coordinate"},{"property":"device_class","value":""},{"property":"icon","value":""},{"property":"unit_of_measurement","value":""}],"state":"payload.x","stateType":"msg","attributes":[],"resend":true,"outputLocation":"","outputLocationType":"none","inputOverride":"allow","x":1192.238437652588,"y":2361.8727588653564,"wires":[[]]},{"id":"b89bdbfd.f73988","type":"server","name":"Home Assistant","legacy":false,"addon":false,"rejectUnauthorizedCerts":true,"ha_boolean":"y|yes|true|on|home|open","connectionDelay":true,"cacheJson":true}]

There are a couple of things you will need to change:

  1. the http request node - add the IP of one of your Raspberry Pis
  2. the switch node - add the device IDs from above ("ble-xxxxxx-xxxxx…")
  3. the function nodes - they both need your instanceNames:
trilateration.setDistance(0, msg.payload.distances.ra0.distance);
trilateration.setDistance(1, msg.payload.distances.ra1.distance);
trilateration.setDistance(2, msg.payload.distances.ra2.distance);

Replace ra0, ra1 and ra2 with the 3 instanceName strings you used.

The inject node is set to fire every second, so you may want to change it to a single inject for testing.
When you hit inject you should see the resulting values.

Now that the API is working, in theory we can turn off the homeAssistant integration (removing the MQTT load).

If it's working correctly, you now need to calculate the room co-ordinates and the locations of the Pi 3 devices.

(Update to follow)


This sounds super cool… following for updates.

Found another Pi :slight_smile:

Going to try 3D trilateration over the next few days
lm-trilateration3d - npm (npmjs.com)

Hi,
great project, I would really need it to find my glasses, wallet, hearing aid and other things of that size again.

Are there any RFID or BLE beacons which could be put on glasses?

By the way, are you aware of posts on Reddit aiming for the same goal?

BLE tags are getting smaller, but yes, in theory you could attach one to anything.

Tile does a wallet beacon, but the smaller the object, the more cumbersome the BLE tag may be.

I have seen those, thanks for sharing. There are a lot of options out there, but none really did what I needed (FIND, Happy Bubbles, monitor, etc). They don't triangulate or provide an actual location.

The limited number of RPi devices doesn't make it cm-accurate (but it is working fairly well at the moment with some smoothing).

Crazy goal, but imagine in a fire or smoke the app could guide you out of the house using your position. Or if fall detection triggers on a family member's Apple Watch, you could find them immediately.

However, in the meantime I just want to know reliably who is in what room :slight_smile:

Ok little update

I have now got 3D trilateration working, and actually I'm pretty impressed with the results. I need to work on fine-tuning the sensitivity and smoothing.

To use 3D trilateration you will need the following:

  1. Install a 4th Raspberry Pi (Step 3 above), remembering to give it a unique instanceName for room-assistant.

  2. Install the lm-trilateration3d package and add it to functionGlobalContext.
    From within the Node-RED container:

cd /data
cd node_modules
npm install lm-trilateration3d
nano /data/settings.js

And as before, look for functionGlobalContext and amend it to look like this:

functionGlobalContext: {
        trilateration:require('trilateration'),
        lmtrilateration3d:require('lm-trilateration3d')
        // os:require('os'),
        // jfive:require("johnny-five"),
        // j5board:require("johnny-five").Board({repl:false})
    },

Again, not sure if a restart is needed, but it's not an issue for me to restart the Node-RED docker, so I always do.

  3. Update the Node-RED flow.
    You'll now be getting 4 distances and will need to input them in the new function node.
    Enter this in the function node:
const trilat = global.get('lmtrilateration3d'); // must match the key in functionGlobalContext
 
// 3D
// At least 4 beacon positions and distances are needed to locate in 3D;
// with only 3 beacons z cannot be calculated, so replace it with undefined
var input3D = { data: [
//         X       Y       Z                  R
    [      10,     55,     100,   msg.payload.distances.kitchen.distance],
    [    100,    80,      10,   msg.payload.distances.livingroom.distance],
    [    90,    100,      100,   msg.payload.distances.office.distance],
    [     55,   30,        10,   msg.payload.distances.serverroom.distance]
]};
msg.payload = trilat.locate3D(input3D);
return msg;

The above accounts for the 4 RPis:
the (x, y, z) co-ordinates of each beacon and the distance (R) to the phone.

The switch node (keyed on the device ID) separates the messages, so you need to copy the function node to each branch for now (until we tidy up the flow).

For more information on the trilateration 3d package:

  4. Room assignment
    Now we can add another function node to calculate which room the person is in.
    This is useful to validate the results before we look at overlaying on a floorplan.
if (msg.payload.z <= 50 ) {
    msg.floor = "first"
}
if (msg.payload.z > 50 ) {
    msg.floor = "second"
}
if ((msg.floor == "first") && (msg.payload.x < 50) && (msg.payload.y < 140) && (msg.payload.y >= 120)) {
    msg.room = "diningroom"
} 
if ((msg.floor == "first") && (msg.payload.x < 50) && (msg.payload.y <= 120) && (msg.payload.y >= 60)) {
    msg.room = "kitchen"
} 
if ((msg.floor == "first") && (msg.payload.x > 50) && (msg.payload.y < 120) && (msg.payload.y >= 60)) {
    msg.room = "livingroom"
} 
if ((msg.floor == "first") && (msg.payload.x < 50) && (msg.payload.y < 60)) {
    msg.room = "garage"
} 
if ((msg.floor == "second") && (msg.payload.x < 50) && (msg.payload.y < 140) && (msg.payload.y >= 80)) {
    msg.room = "master"
} 
if ((msg.floor == "second") && (msg.payload.x > 50) && (msg.payload.y < 120) && (msg.payload.y >= 80)) {
    msg.room = "office"
} 
if ((msg.floor == "second") && (msg.payload.x > 50) && (msg.payload.y < 80) && (msg.payload.y >= 50)) {
    msg.room = "nursery"
} 
if ((msg.floor == "second") && (msg.payload.x < 50) && (msg.payload.y < 60) ) {
    msg.room = "guestroom"
} 
return msg;

You will need to adjust for your rooms and your house - but hopefully this will be enough code for people like me (who can't write their own code, but can modify others' :slight_smile: ).
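The same room lookup can also be written data-driven, which makes adding rooms easier (a sketch using the author's example boundaries; the exact >= vs < behaviour at boundary values is my assumption, so adjust to taste):

```javascript
// Each room is a box on one floor; undefined bounds mean "no limit".
const rooms = [
  { floor: "first",  room: "diningroom", xMax: 50, yMin: 120, yMax: 140 },
  { floor: "first",  room: "kitchen",    xMax: 50, yMin: 60,  yMax: 120 },
  { floor: "first",  room: "livingroom", xMin: 50, yMin: 60,  yMax: 120 },
  { floor: "first",  room: "garage",     xMax: 50, yMax: 60 },
  { floor: "second", room: "master",     xMax: 50, yMin: 80,  yMax: 140 },
  { floor: "second", room: "office",     xMin: 50, yMin: 80,  yMax: 120 },
  { floor: "second", room: "nursery",    xMin: 50, yMin: 50,  yMax: 80 },
  { floor: "second", room: "guestroom",  xMax: 50, yMax: 60 },
];

function assignRoom(pos) {
  // Z is effectively binary here: below the threshold is the first floor.
  const floor = pos.z <= 50 ? "first" : "second";
  const match = rooms.find(r =>
    r.floor === floor &&
    (r.xMin === undefined || pos.x > r.xMin) &&
    (r.xMax === undefined || pos.x < r.xMax) &&
    (r.yMin === undefined || pos.y >= r.yMin) &&
    (r.yMax === undefined || pos.y < r.yMax)
  );
  return { floor, room: match ? match.room : "unknown" };
}

// assignRoom({ x: 30, y: 90, z: 20 }) → { floor: "first", room: "kitchen" }
```

Pasting this into the function node body with `msg.payload` as `pos` and returning `msg` would replace the if-chain.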

Essentially the Z axis (in my case) is binary: you are on floor 1 or 2. So if Z is above a certain value you can mark the floor as second, else mark it as first. I used an explicit value check rather than an else in case there are multiple floors.

Now we have dealt with Z, it's time to focus on X & Y.
This took some time to get right, as I had to make sure I used a common scale across the house to avoid distortion of the values (they became less linear if my scale was off).
It was a mix of trial and error to get the RPi location co-ordinates correct, and then see the values changing suitably for the rooms.

Please note there is still some work to do, but it's getting closer!


They are small, but still too big to put in glasses etc.

The more things you have, and the older the child becomes, the more you want to know what is in what room :smile:

There is your business case :wink:


Happy to see this. Today I was messing with room-assistant and FIND trying to reach a similar conclusion. I'm going to keep playing around to see if I can come up with a non-Docker, non-Node-RED method, just because I currently don't use either and would rather not introduce them.

I will say that after 5 hours of training, FIND3 was still putting me 15 feet off my actual position, so I was not pleased with those results. Hopefully the triangulation will work better.

I'm going to give esp32-room a crack too, but we use iPhones in our house so FIND3 itself won't work.

I think the solution is a mix of distance triangulation and an AI model that can either smooth the values or be trained for the specific house.

I'm not an RF expert, but I imagine BT and 2.4GHz WiFi being so close in frequency will cause periodic changes in values, producing phantom internal "movement". I'm guessing any smart WiFi or mesh setup that is constantly monitoring and adjusting is likely to create some issues.

I'm wondering how sensitive the phones are to movement detection - so where possible, if the device is stationary, smooth out the positioning.
Realistically I don't care WHERE in the room I am, just that I'm within its boundary. It's also unlikely I'm ever less than 2ft from a wall - so that should allow me to create room bubbles or zones, rather than a room space that changes on crossing a single co-ordinate line (like floorplan boundaries). It's also been nearly impossible to keep the scale of the floorplan layout consistent (so the room spacing does get a little skewed until you test and manually adjust).
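The "smooth out the positioning" idea can be sketched as a simple exponential moving average (my own illustration; alpha and the coordinate names are assumptions):

```javascript
// Blend each new coordinate with the previous estimate so measurement
// jitter doesn't look like movement. alpha near 0 = heavy smoothing,
// alpha near 1 = almost raw values.
function makeSmoother(alpha = 0.2) {
  let prev = null;
  return (pos) => {
    prev = prev === null
      ? { ...pos }
      : { x: prev.x + alpha * (pos.x - prev.x),
          y: prev.y + alpha * (pos.y - prev.y) };
    return prev;
  };
}

const smooth = makeSmoother(0.5);
const first = smooth({ x: 10, y: 10 });  // → { x: 10, y: 10 }
const second = smooth({ x: 14, y: 10 }); // → { x: 12, y: 10 }
```

In Node-RED this state could live in flow context, with the smoother applied just after the trilateration function node.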

I'm not sure of the best way to implement the probability.

Let's say I'm in the middle of the kitchen and walk to the living room.
My co-ordinates are changing toward the living room, so the probability of me moving to that room increases (a trained AI model would be able to identify that). The device stops moving and is within 10ft of the centre (or resting/seating area) of the living room.
In most rooms people occupy a limited number of areas, so I'm trying to work out some 'hot zones'.
It throws things off if I'm lying on the floor playing with my kids, however.

My next test is to 'redraw' the floorplan co-ordinates with hot zones in the middle of each space. The nearer I am to a hot zone, the higher the probability I'm in that space (and then ideally reduce calculations or increase smoothing/debounce if the device is stationary).
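A minimal sketch of that hot-zone scoring (illustrative only; the room names, centre co-ordinates and inverse-distance score are all made-up assumptions, and a trained model could replace the scoring function):

```javascript
// One centre point per room; closer to a centre = higher score.
const hotZones = {
  kitchen:    { x: 25, y: 90 },
  livingroom: { x: 75, y: 90 },
  garage:     { x: 25, y: 30 },
};

function likelyRoom(pos) {
  let best = null;
  for (const [room, c] of Object.entries(hotZones)) {
    const d = Math.hypot(pos.x - c.x, pos.y - c.y);
    const score = 1 / (1 + d); // simple inverse-distance score
    if (!best || score > best.score) best = { room, score };
  }
  return best;
}

const best = likelyRoom({ x: 70, y: 85 });
// best.room → "livingroom"
```

Normalising the scores so they sum to 1 would turn them into the room probabilities discussed above.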

Quite the ramble, but I figured it might help others think about possible solutions too.

Let me first say that I will likely abandon doing this a different way and go ahead with your Docker/Node-RED example, but in the spirit of trying to be different…

I have set up 3 Bluetooth Classic sensors right now (turned off the one in HA due to no sensor) and have a couple more on the way (Pi Zero W). I have the device entities set up for 3 phones, and I fear I have something set up wrong.

I'm not terribly educated on MQTT. How do I determine which room-assistant sensor is sending the distance from the message?
Or do I have to get it from the API?

Also, what is the unit of measure? What is 0.5 compared to 2? Meters?

Message 470 received on room-assistant/sensor/bluetooth-classic-xx-xx-yy-yy-zz-zz/attributes at 8:48 PM:
{
    "distance": 0,
    "last_updated_at": "2020-12-29T02:48:31.806Z"
}
QoS: 0 - Retain: false
Message 469 received on room-assistant/sensor/bluetooth-classic-xx-xx-yy-yy-zz-zz/attributes at 8:48 PM:
{
    "distance": 23,
    "last_updated_at": "2020-12-29T02:48:23.579Z"
}
QoS: 0 - Retain: false

Another thing I was not expecting: I guess the leader is the room-assistant instance that is reflected in the 'presence' sensor, as I cannot seem to get this to change to a different instance even when inches from it.

Yeah, right now room-assistant reports which room you are essentially nearest, which doesn't help our efforts to get all 3 distances.

I've spoken to the dev of room-assistant, and they are looking at providing all values/distances via MQTT by device ID.
This would mean you would have:
room-assistant/phone1/summary/

{ "node1": { "distance": 22, "status": "online" },
  "node2": { "distance": 5, "status": "online" },
  "node3": { "distance": 8, "status": "online" } }

Then you could easily parse those JSON distances and states in Node-RED for calculation.

However, in the meantime I'm using the room-assistant API, which works but means I have to query it from Node-RED.

I'm actually running 2 queries:

  1. Hit every node's /status API endpoint and check there are no errors.
  2. Hit the API of one node that is online to get all the distances. To make it a tad more reliable, it would make sense to have 1 or 2 more nodes than you need; then, based on the results, in theory pick the 3 lowest values (i.e. the nodes closest to the phone, or 4 for 3D trilateration) and drop the ones that may be near the edge of their range and cause issues, or in a big house may not see the phone at all.
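That "pick the N closest nodes" step can be sketched as a small helper (illustrative only; the object shape mirrors the distances map used in the function nodes earlier, e.g. { ra0: { distance: ... } }):

```javascript
// Keep only the n nodes with the smallest reported distance,
// skipping entries with no distance (e.g. a node that can't see the phone).
function closestNodes(distances, n) {
  return Object.entries(distances)
    .filter(([, v]) => v && v.distance != null)
    .sort(([, a], [, b]) => a.distance - b.distance)
    .slice(0, n);
}

const picked = closestNodes(
  { ra0: { distance: 22 }, ra1: { distance: 5 }, ra2: { distance: 8 }, ra3: { distance: 31 } },
  3
);
// picked → [["ra1", {distance: 5}], ["ra2", {distance: 8}], ["ra0", {distance: 22}]]
```

The picked entries would then feed setDistance (or the 3D input array) instead of hard-coded instance names.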

I had this delusion of grandeur that I'd have some free time over Christmas to test and investigate - my toddler has other plans lol

Oh, re the distance - it's an arbitrary measure derived from the signal-strength calculation room-assistant uses. I haven't figured out what the optimum measure is yet, so I kept the scale consistent: starting with 1 node and checking the distance in real life (say 10 was the difference in distance from my office door to the office wall), then using that same scale to work out the other distances in the house. It does take some tinkering and some patience.

I've now got 7 ESP32s flashed with the same iBeacon info and am testing the Rooms app.
The issue with that is it all runs on the phone, so it will use more battery - but the AI part of it allows for training, which basically says "I know I'm in this room because I can see these beacons at these strengths".
I haven't worked out how to apply a similar AI model from Node-RED, so I'm hoping to teach myself how to do that over the next few weeks, as I prefer the RPi room-assistant setup, but the AI training process would learn which room you are most likely in.


Any idea how to install trilateration into Home Assistant Core with Node-RED, or does it need to be an external Node-RED (like yours)?

I found the settings.js.

I'm not sure you could integrate it with HA; it's Node.js, so presumably it could be called somehow, but that is outside my area of knowledge.

I like to keep things separate (I've had too many bad experiences attempting to get things working on my HA box, installing a bunch of stuff I didn't understand which created issues further down the line). Plus I run it in Docker, so anything non-persistent would be hosed after an update.

Can't wait to see the end result - it looks promising. It would be better if the results could be presented in a visual format.

Well done mate :clap:

Thanks - I would very much like a visual result (as it would make the scale/calibration better).

That said, I have no clue where to even start overlaying the co-ordinates onto a floorplan. I've been trying to find some similar existing projects (as I'm not a code-from-scratch kind of person, merely an edit/learn/evolve type). If you know of anything that would make a good overlay, LMK.

My rough plan was to use a 3D floorplan, as some already do for HA, and then overlay an icon that moves (but I'm not sure what platform would allow me to pipe in co-ordinates to move the icons around).

Have you checked out the Xiaomi Cloud Vacuum Map Extractor and how he was able to get data and show it on a floorplan?

Wouldn't it be better to use a smartwatch? What if your phone dies…

Great idea! I'll investigate :slight_smile:


By all means, feel free to try it with a smartwatch and report back.
I can manage to keep my phone charged, as I have wireless chargers where I work/sit/sleep.

Do you own an Apple Watch?