Internal position tracking with coordinates using an iPhone and room-assistant

Yeah, right now RA reports which room you are essentially nearest to, which doesn't help our efforts to get all 3 positions.

I’ve spoken to the dev of RA, and they are looking at providing all values / distances via MQTT by device ID.
This would mean you would have:
Room-assistant/phone1/summary/

{
  "node1": { "distance": 22, "status": "online" },
  "node2": { "distance": 5,  "status": "online" },
  "node3": { "distance": 8,  "status": "online" }
}

Then you could easily parse those JSON distances and states in Node-RED for the calculation.
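For what it's worth, a rough sketch of what that parsing could look like in a Node-RED function node, assuming the summary topic delivers a JSON object shaped like the example above (the topic and node names are hypothetical, and the MQTT-in node is set to output a parsed JSON object):

// Hypothetical sketch: turn the proposed summary payload into an array of
// { node, distance } entries for the nodes that are currently online.
const summary = msg.payload;   // e.g. { "node1": { "distance": 22, "status": "online" }, ... }
msg.payload = Object.keys(summary)
    .filter(name => summary[name].status === "online")
    .map(name => ({ node: name, distance: summary[name].distance }));
return msg;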

However, in the meantime I'm using the room-assistant API, which works but means I have to query it from Node-RED.

Running 2 queries actually:

  1. Hit all the nodes' /status API endpoints and check there are no errors.
  2. Hit the API of one of the online nodes and you get all the distances. To make it a bit more reliable, it makes sense to have 1 or 2 more nodes than you need. Then, based on the results, in theory you pick the 3 lowest values (i.e. the nodes closest to the phone), or 4 for 3D trilateration, and drop the ones that may be close to the edge of their range and cause issues, or that in a big house may not actually see the phone at all (see the sketch below).
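The "pick the lowest values" part could look something like this in a function node (a rough sketch, assuming the distances object has the same shape as the /entities payload shown further down the thread):

// Rough sketch: keep the N closest nodes and drop anything flagged out of range.
const N = 4;                                         // 4 for 3D trilateration, 3 for 2D
const distances = msg.payload.distances || {};
const closest = Object.entries(distances)
    .filter(([name, d]) => !d.outOfRange)            // drop nodes near the edge of their range
    .sort((a, b) => a[1].distance - b[1].distance)   // nearest to the phone first
    .slice(0, N);
msg.payload.closest = Object.fromEntries(closest);
return msg;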

I had this delusion of grandeur that I’d have some free time over Christmas to test and investigate - my toddler has other plans lol

Oh, re the distance - it's an arbitrary measure derived from the signal-strength calculation RA uses. I haven't figured out what the optimum measure is yet, so I've kept the scale consistent: I started with 1 node and checked the distance against real life (say 10 was the distance from my office door to the office wall), then used that same scale to work out the other distances in the house. It does take some tinkering and some patience.
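As a toy example of that scaling in a function node (the 3 metre figure is made up, purely to show the idea):

// Toy calibration sketch: convert RA's arbitrary distance units to metres
// using one reference measurement. All numbers here are example values.
const referenceUnits  = 10;   // RA distance between my office door and the office wall
const referenceMetres = 3;    // the same gap measured with a tape measure (assumed)
const metresPerUnit = referenceMetres / referenceUnits;
msg.payload.metres = msg.payload.distance * metresPerUnit;
return msg;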

I've now got 7 ESP32s flashed with the same iBeacon info and I'm testing the Rooms app.
The issue with that is it all runs on the phone, so it will use more battery - but the AI part of it allows for training, which basically says "I know I'm in this room because I can see these beacons at these strengths".
I haven't worked out how to apply a similar AI model from Node-RED, so I'm hopefully going to teach myself how to do that over the next few weeks, as I prefer the RPi room-assistant setup, but the AI training process would teach it which room you are most likely in.
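The underlying idea can at least be approximated in a function node as nearest-neighbour matching against recorded fingerprints; a very rough sketch (the rooms and sample values are invented, and this is nothing like the actual model the Rooms app trains):

// Very rough fingerprint sketch: compare the current beacon distances against
// one recorded sample per room and pick the room with the smallest difference.
const fingerprints = {
    office:  { node1: 5,  node2: 20, node3: 25 },
    lounge:  { node1: 22, node2: 4,  node3: 12 },
    bedroom: { node1: 30, node2: 15, node3: 6  }
};
const current = msg.payload;   // e.g. { node1: 6, node2: 19, node3: 27 }
let best = null;
for (const [room, sample] of Object.entries(fingerprints)) {
    let score = 0;
    for (const name of Object.keys(sample)) {
        // beacons we can't see at all get a fixed penalty
        score += current[name] === undefined ? 100 : Math.abs(current[name] - sample[name]);
    }
    if (best === null || score < best.score) best = { room: room, score: score };
}
msg.payload = best;
return msg;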


Any idea how to install trilateration into Home Assistant Core with Node-RED, or does it need to be an external Node-RED (like you did)?

I found the settings.js.

I'm not sure you could integrate it with HA; it's Node.js, so I assume it could be called somehow, but that's outside my area of knowledge.

I like to keep things separate (I've had too many bad experiences attempting to get things working on my HA box and installing a bunch of stuff I didn't understand, which caused me issues further down the line). Plus I run it in Docker, so anything non-persistent would be hosed after an update.

Can't wait to see the end result; it looks promising. It would be better if the results could be presented in a visual format.

Well done mate :clap:

Thanks - I would very much like a visual result (as it would make the scale/calibration better).

That said, I have no clue where to even start to overlay the coordinates onto a floorplan. I've been trying to find some existing similar projects (as I'm not a code-from-scratch kind of person, more of an edit/learn/evolve situation). If you know of anything that would make for a good overlay, LMK.

My rough plan was to use a 3D floorplan like some people already use for HA, and then overlay an icon that moves (but I'm not sure what platform would allow me to pipe in coordinates to move the icon around).
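The overlay maths itself is straightforward whatever ends up drawing it; a sketch of converting trilaterated x/y values into percentage offsets over a floorplan image (the house dimensions are assumptions, as is the { x, y } shape of the trilateration output):

// Sketch: convert coordinates into percentage offsets for positioning an icon
// over a floorplan image. Dimensions are assumed to be in the same arbitrary
// units as the trilateration inputs.
const houseWidth = 100;   // assumed width of the floorplan, in RA units
const houseDepth = 100;   // assumed depth of the floorplan, in RA units
const x = msg.payload.x;  // assumed output of the trilateration step
const y = msg.payload.y;
msg.payload = {
    left: (x / houseWidth) * 100 + "%",   // CSS-style left offset
    top:  (y / houseDepth) * 100 + "%"    // CSS-style top offset
};
return msg;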

Have you checked out Xiaomi Cloud Vacuum Map Extractor and how he was able to get data and show it on a floor plan?

Wouldn't it be better to use a smart watch? What if your phone dies…

Great idea! I’ll investigate :slight_smile:


By all means feel free to try it with a smart watch and report back.
I can manage to keep my phone charged, as I have wireless chargers where I work/sit/sleep.

Do you own an Apple Watch?

I do indeed

Can you try it to see if it works?

It won't work for my house, as I'm the only one who owns one.
Plus it doesn't broadcast a fixed BLE address (Apple seems to randomize them), so on the surface it won't work.
The reason this is possible at all is the room-assistant app, which broadcasts a consistent BLE address.

If you can find a way around that issue, I'll try it out.

The room-assistant app you use says it supports Apple Watches.

The beta app I'm using on the iPhone to broadcast BLE (released by the dev to work around the random BLE MAC broadcast issue I mentioned above) doesn't currently support running on an Apple Watch to broadcast its BLE, I believe.

Bluetooth Classic will work with an Apple Watch, and in the past I have tried and succeeded with that for detection, but not triangulation. I've also experienced more interference with 2.4 GHz Wi-Fi using BT Classic than with BLE.

Your post is getting quite a bit of attention :slight_smile:

I started over and went through the 3D part… 4 Pi nodes, all Ethernet connections.
I installed Node-RED on one (Pi 4 8 GB) and created the flow.

I'm not sure if I missed something or not, but I am not getting any sensors from Node-RED into HA, although the Person X & Y nodes show as running. Since this weekend has been a learning experience for me with Node-RED, I assume I did something wrong.

Any thoughts?


I did see this in the Node-RED logs. I really do not understand the http://supervisor/core entries, as I did not enter that anywhere - just the 192.168.30.177 in the flow.

4 Jan 01:33:25 - [info] [server:Home Assistant] Connecting to http://192.168.30.177:8123
4 Jan 01:33:25 - [info] [server:Home Assistant] Connecting to http://supervisor/core
4 Jan 01:33:30 - [info] [server:Home Assistant] Connecting to http://supervisor/core
4 Jan 01:33:30 - [info] [server:Home Assistant] Connecting to http://supervisor/core

Are you getting values at the sensor?
Throw a debug node in place of one of the sensor nodes, set it to show the entire msg, and trigger the inject node on the left/start.

Then we can debug based on whether you are getting a value.

Also, you need to add the Node-RED integration in the HA UI:
Configuration > Integrations > + button (bottom right), then search for node-red.

Well crap… I had 2 other 'node-red's installed but not that last one… smh.

Doing that got your first example running.

Now back to the 3D version…

I get what appears to be a good message from the switch:

"payload":{"attributes":{"distance":0.4,"lastUpdatedAt":"2021-01-05T01:18:17.664Z"},"id":"bluetooth-classic-a8-34-6a-87-6b-91","name":"a8:34:6a:87:6b:91 Room Presence","distributed":true,"distances":{"sanctumBT":{"lastUpdatedAt":"2021-01-05T01:17:53.464Z","distance":21.3,"outOfRange":false},"lrBT":{"lastUpdatedAt":"2021-01-05T00:50:22.601Z","distance":5.9,"outOfRange":false},"lr2BT":{"lastUpdatedAt":"2021-01-05T01:18:17.664Z","distance":0.4,"outOfRange":false},"officeBT":{"lastUpdatedAt":"2021-01-05T01:18:22.004Z","distance":6.8,"outOfRange":false}},"timeout":40,"state":"lr2BT"},"statusCode":200,"headers":{"x-powered-by":"Express","content-type":"application/json; charset=utf-8","content-length":"2513","etag":"W/\"9d1-w1aHnT1BQNmLh6bm5zq3ebfp7MA\"","date":"Tue, 05 Jan 2021 01:18:32 GMT","connection":"close","x-node-red-request-node":"7f20469b"},"responseUrl":"http://192.168.30.98:6415/entities","redirectList":[],"parts":{"id":"ed8a1a0f.086018","type":"array","count":9,"len":1,"index":8},"_msgid":"9f9e2faf.22191"}

If I go to the 1st function after the switch, I get:

TypeError: Cannot read property 'distance' of undefined

const trilat = global.get('lmTrilateration3d');
 
// 3D
// Need at least 4 beacons' positions and distances to locate in 3D; with only 3 beacons, z cannot be calculated and is replaced by undefined
var input3D = { data: [
//         X       Y       Z                  R
    [      10,     55,     100,   msg.payload.distances.lr2BT.distance],
    [    100,    80,      10,   msg.payload.distances.lrBT.distance],
    [    90,    100,      100,   msg.payload.distances.officeBT.distance],
    [     55,   30,        10,   msg.payload.distances.sanctum2BT.distance]
]};
msg.payload = trilat.locate3D(input3D);
return msg;

I have the trilateration package installed on the Pi (running Node-RED) and referenced in settings.js.

pi@lr2BT:~/.node-red $ npm install lm-trilateration3d
+ [email protected]
updated 1 package and audited 78 packages in 3.363s

19 packages are looking for funding
  run `npm fund` for details

found 0 vulnerabilities
    //    global.get("os")
    functionGlobalContext: {
        trilateration:require('trilateration'),
        lmtrilateration3d:require('lm-trilateration3d')
        // os:require('os'),
        // jfive:require("johnny-five"),
        // j5board:require("johnny-five").Board({repl:false})
    },
    //

At first glance it looks like your payload only has sanctumBT, not the sanctum2BT your function references.
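For reference, a small guard like this at the top of that function would surface the mismatch instead of throwing a TypeError (the node names just mirror the ones the function expects):

// Optional guard sketch: check that every node the function expects actually
// appears in the payload before reading its distance.
const wanted = ["lr2BT", "lrBT", "officeBT", "sanctum2BT"];
const distances = msg.payload.distances || {};
const missing = wanted.filter(name => !distances[name]);
if (missing.length > 0) {
    node.warn("Missing distances for: " + missing.join(", "));
    return null;   // stop the flow instead of throwing
}
return msg;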