Bermuda - Bluetooth/BLE Room Presence and tracking [custom integration]


Howdy!
Bermuda is a custom integration (available via HACS as a custom repository) which takes the Bluetooth advertisements from ESPHome bluetooth-proxies (or Shelly devices) in HA, and lets you track your devices by Area and Distance.

Ultimately I plan to use it as a platform to experiment with trilateration / triangulation, locating devices based on relative signal strengths (like building a map of your home based on visible ble devices and their relative distances, solving for all the triangles that creates) - but for now it provides reasonably good room-based presence detection (assuming you have a proxy in that room).

It’s still in early development but I think it’s ready for some more people to poke holes in it :slight_smile:

The easiest way to install is definitely in HACS, by adding agittins/bermuda as a custom repository.

Some screenshots…

[screenshots: integration setup, device info, sensor info]

If you wish to support this work financially, you can:

Any support is extremely welcome - but completely optional!


I’ve been playing around with Bermuda today. I had/have the problem of the names not showing up for devices which makes it hard to figure out what each device is. However, I notice you addressed this in the FAQ.

I configured my bluetooth proxies via the easy esphome page at: Ready-Made Projects — ESPHome

When you do the above it gives the following YAML:

substitutions:
  name: esp32-bluetooth-proxy-93a710
packages:
  esphome.bluetooth-proxy: github://esphome/bluetooth-proxies/esp32-generic.yaml@main
esphome:
  name: ${name}
  name_add_mac_suffix: false


wifi:
  ssid: !secret wifi_ssid
  password: !secret wifi_password

It’s then not obvious where you would add:

active: true

I decided to change this config to the full manual YAML config, and now have the correct section:

bluetooth_proxy:
  active: true

I then turned on BLE beacons on my watch for the first time, and it did appear in the list of detected devices as a Garmin Instinct next to the MAC address. Unfortunately this is the only device that has a name next to it; all the others, which were detected before altering the YAML in the proxy, generally just show MAC addresses.

I am trying to identify my phone beacon, which is an Android Pixel 4a running the Home Assistant Companion app with BLE beacons transmitting. I can't identify it in the list of 15+ MAC addresses shown. Its beacon was correctly detected previously by an ESPresense node, but there I used the UUID to add it to monitored devices. The UUID is shown in the Companion app, but it isn't an option for adding the device to Bermuda.

I did have to alter the default options to stop my watch constantly being shown as Unknown. I now have:

Max Radius = 4
Timeout = 30
Environment attenuation factor = 5
Default RSSI = -62

I notice in your screenshot your figures are much higher than the defaults, and higher than mine. I will continue to play with the above settings while testing with my other proxies, but it's working OK in my office.
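For context, options like these typically feed a log-distance path-loss estimate that converts RSSI into metres. A rough sketch of how "Default RSSI" (the reference power at 1 m) and the attenuation factor interact - the exact formula Bermuda uses may differ, so treat this as illustrative only:

```python
def estimate_distance(rssi: float, ref_power: float = -62.0, attenuation: float = 5.0) -> float:
    """Log-distance path-loss model: given the RSSI expected at 1 m
    (ref_power) and an environmental attenuation exponent, estimate the
    distance to the transmitter in metres."""
    return 10 ** ((ref_power - rssi) / (10 * attenuation))

# A reading equal to ref_power implies roughly 1 m:
print(estimate_distance(-62.0))            # 1.0
# Weaker signal, larger estimated distance:
print(round(estimate_distance(-82.0), 2))  # 2.51
```

A higher attenuation factor compresses the distance scale, which is why tuning it changes how quickly a device drifts past Max Radius.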

So to summarise the above: Bermuda is now detecting my watch successfully after changing the options; names are not showing for most of my devices which were detected before changing the YAML of my proxy (possibly cached); and could UUID be added as a way to identify devices?


Hi Steven!

Thanks for trying it out, and thanks especially for the detailed run-down on your experience with it!

Looks like I got totally mixed up with the active: true setting, and my FAQ had it completely backwards :astonished: - it's esp32_ble_tracker > scan_parameters that needs active: true, not bluetooth_proxy; sorry for the confusion there. The template from ready-made projects appears to have that set up OK, so it's a bit odd that it wasn't working for you, but since you have a manual config now, maybe check you have:

esp32_ble_tracker:
  scan_parameters:
    active: true

(true is the default, so strictly speaking it would only matter if you had specifically set it to false previously).

I do find it a bit hit-and-miss as to which devices report names and which do not, so if it still doesn’t work for you I might need to do a bit more digging.

Are you saying that some devices had names before changing the yaml and now do not, or just that all the ones that showed in the list previously that had no names continue to not have names? From your summary I think you’re saying you used to have some names that you no longer have. That could well be caching, but also possibly something else. Do you have a bluetooth adaptor attached to your homeassistant box? I’ll do some digging.

From memory I think I have one device that quite rarely sends its local name in a broadcast, so it can be a bit hit-and-miss, but it sounds like you're having a more consistent problem.

I’ll do some testing with my Android and the HA companion app - I haven’t been using it for tracking (my watch seems to broadcast more reliably, and I only carry my phone between rooms 97% of the time :sweat_smile: ) but it’s worth fixing. I’ve raised an issue to work on supporting identity via UUID as I think that’s a good idea - Support identifying devices by UUID, possibly IRK · Issue #44 · agittins/bermuda · GitHub

I’ve also been planning to have per-device options for ref_power and for naming, so I’ve created a ticket for that as well.

I’ve added some info to the FAQ on how the different default options work. I had trouble with my watch being marked as “Away” too quickly, which caused my “arriving home” automation to trigger falsely when I had never left. So I set max_radius and timeout to “silly large” values (~40m and 300 seconds / 5 minutes) - the screenshot still shows 35 seconds for the timeout; I should update them when I have ironed out some bugs :slight_smile:

Setting a large radius should help avoid devices going Unavailable, the only downside is that it will tend to show you as being in an “Area” even if you are away from it - unless another area is closer. It’s not ideal.

Actually, I think I’ll make max_radius a per-proxy option, so that you can limit how far you can be from an area while still considered “inside” it, and split the global setting into one as a default for areas and one for home/not_home for the device_tracker.

Thanks again for taking the time to detail your experience!

I’ve just set up this integration; my proxy is working and devices are being found, however none of them correspond to my Apple device.

It’s also hard to tell because I have 10 Apple devices showing up, but none of them match the Bluetooth MAC address of my device.

Is there an easy way to find out and track an iPhone and Apple Watch?

Not yet - because apple devices rotate their MAC addresses for privacy reasons. There is a new integration in HA core called Private BLE Device which can let you resolve the address in real time. Ultimately I hope to integrate Bermuda with that so it can offer that functionality, but I don’t have time currently to work on it.

However, the integration already provides some estimated distance to the nearest BLE receiver, so it might be worth having a look at since it might do what you need already.

Just a heads-up for anyone following this: HomeAssistant 2024.1 introduces a breaking change affecting existing versions of Bermuda, so after receiving a fix from jaymunro I’ve created a new release, v0.3.2, which should work with both previous and future releases of HA.

alias: Pannenkoekenplant area follower
description: "When Bermuda detects this plant moved area, update the plant and relevant sensors to that area so room temperature and light intensity aggregates make sense again."
trigger:
  - platform: state
    entity_id:
      - sensor.pannenkoekenplant_1_area
    attribute: area_id
action:
  - service: ha_registry.update_entity
    data:
      area_id: |
        {{ state_attr('sensor.pannenkoekenplant_1_area','area_id') }}
      entity_id:
        - plant.pannenkoekenplant
        - sensor.plant_sensor_6bf38a_temperature
        - sensor.plant_sensor_6bf38a_illuminance
mode: restart

:heart_eyes:

This can most definitely be improved, but first I need to get more bluetooth_proxies :upside_down_face:

Haha that’s awesome! Nicely done :smiley:

Thanks! One more thing I was wondering about: a lot of my proxies (Shellys) are exactly (in the wall) in between areas. I guess when strategically placed this would allow for accurate area detection, however I don't think the current implementation supports that (e.g. each proxy = in 1 area).
I have noticed that devices jump between proxies (and as such, so does the distance reported to the connected proxy). I wonder whether we can keep that information separate, and with that, actually use the information from more than 1 proxy to have a better understanding of where that device is.

Understandably, rewriting based on a modern device_tracker integration has higher priority (having the devices merged would be great: Bermuda tracker & tracked device).

Yeah, definitely using a 1:1 mapping for proxy/area initially, but ultimately I want to try and be able to support trilateration to place items anywhere between proxies.

Right now you could use the dump_devices service (Developer Tools → Services → Bermuda BLE Trilateration: Dump Devices), which will give you a dump of everything it knows. Part of that is the distances from each proxy receiver for each device (even the ones you don’t have entities set up for). From that you could start trying out some calculations, or even create template sensors - if you’re up for that level of messing around.
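As an example of that level of messing around: if the dump gives you per-scanner distances for a device, picking the nearest proxy is a one-liner. The data shape here is hypothetical (check your own dump output for the real field names); only the ranking logic is the point:

```python
# Hypothetical shape for dump_devices output - the real YAML structure may
# differ, so check your own dump before relying on these field names.
dump = {
    "aa:bb:cc:dd:ee:ff": {"kitchen_proxy": 3.2, "office_proxy": 1.1, "hall_proxy": None},
}

def nearest_scanner(distances):
    """Return (scanner, distance) for the closest proxy, skipping scanners
    with no current reading (missed packets show up as None here)."""
    seen = {name: d for name, d in distances.items() if d is not None}
    scanner = min(seen, key=seen.get)
    return scanner, seen[scanner]

print(nearest_scanner(dump["aa:bb:cc:dd:ee:ff"]))  # ('office_proxy', 1.1)
```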

I like the idea of being able to get the distances from adjacent areas though. I could probably change the distance sensor so that it lists the other areas as attributes. That might make an easier way to create automations against - but I think it will also create a lot more database activity, if I understand it correctly.

The device_tracker stuff probably won’t get much more attention unless some bugs come up - device_tracker only does home/not-home, and that feature seems fairly solid now. The main directions I am interested in are:

  • sorting out things that don’t just track by MAC address (eg iBeacons, iPhones/Private BLE etc) and
  • trilateration or more accurately tracking devices in relation to proxies.

The latter is currently just “which proxy am I closest to”, but I would love to see how close we can get to “exactly where am I in the house?”. The closest-proxy thing jumps about too much currently, especially when the closest proxy just misses a packet. I have some basic filtering already but it definitely needs improvement.

I’m hesitant to start creating lots of sensors and a bit hesitant about creating lots of attributes (although maybe that’s not too bad), but using the yaml from dump_devices is certainly worth looking at if you’re keen.

It’s definitely an area I’m thinking about though, so it’s good to hear different ideas.


Hi and great work @agittins!

I see so much potential for this. However, for me the ability to track iOS devices (especially the Apple Watch) is critical. I will have approx. 20 ESP proxy devices and a combination of cameras and mmWave sensors. By combining these I hope I can get really close to tracking who is where in the house, even the kids…

Some initial thoughts from me (not a math geek or programmer/developer), so don’t judge me.

Potential ways forward:
Alt1: Use a web interface where the user (1) adds the floor plan, (2) “draws” room boundaries and (3) places all sensors to be used. With this we get the relative X,Y coordinates. This should be enough to then use n sensors and the relevant formulas to retrieve the relative X,Y coordinate, which can then be compared to the floor plan to determine the location.

I started to look into trilateration. One thing that immediately came to mind was that there may be no need to track in 3D - it might even be favourable not to - i.e. if a device is seen by proxies on different levels, there should be a way to only use data from sensors on the same floor. Since distance is measured by RSSI, my gut feeling says cross-floor signals can likely fool the system.

Alt2: Since the tracking is based on RSSI signals that can be disturbed by a number of things, I was thinking there might be a use case for AI. My thinking is that in a UI the user enters the name of the zone/room and then presses a “record” button. The system stores all the RSSI data from all sensors while the user walks, jumps, lies in bed etc. for, let’s say, 5 minutes at a 2-second scan interval (which would give 150 data points; a longer period or tighter scan interval could be used depending on how many data points are necessary for accurate analysis) in that room/zone. By doing this for all zones it should be possible to train an AI model to determine which room/zone the user is in.

With my limited abilities I’m not sure what I could possibly contribute, but I really like the idea of having a reliable tracking system.

Hi Hogster and thanks for the input!

TL/DR: I really do waffle on here, and none of it is likely to be relevant to anyone who just wants to use the integration - so it can be safely skipped! If you’re interested in helping / understanding my pathology though, read on…

This is one of those areas where coming up with ideas is one thing but actually implementing them to find out if they work is quite another thing altogether!

The data we get is a set of “distance” measurements, which are:

  • subject to the dark magiks of RF propagation.
  • very noisy, with noise that is massively skewed to overshoots (ie, almost all measurements are actual distance plus a positive noise figure, a few might have near-zero noise, and a tiny amount might have a negative noise offset due to in-phase reflections).
  • unreliably delivered. They might be late, or more likely missing altogether. A missing measurement is also indistinguishable on its own from an infinite distance measurement.

The unreliable delivery and asymmetric noise cause trouble straight away with the classical algorithms like Kalman filtering or even just plain averaging/smoothing. Luckily though, accurately putting certain things at the exact same place as a certain other thing, in as short a time as possible, is something that militaries are quite interested in, so there is (funded) research around, for those able to read the math. I am not skilled at the math, but I can sometimes grok the principles.

Re users drawing maps, that is a (long term) possibility, but I have no plans to do that any time soon. The main thing is that I think it will be unnecessary, but I’m in the wrong part of the Dunning-Kruger curve to know for sure. I am (un?)reasonably confident that the system should be able to “solve” the spatial arrangement, and potentially-inaccurate user measurements would make that more difficult rather than less. I could be wrong, though. Ultimately I expect we’ll have our own “point cloud” which could then be “pinned” to a user’s map using translate/rotate/scale values.

My thinking is that if two proxies each measure the distance to one beacon, then the distance between the proxies can not be further than the sum of the two measurements (this is the “Triangle Inequality”). If the beacon moves around, that estimate will gradually improve, and if the beacon is placed right at one of the proxies, we get our “answer” for the distance between those two proxies. From then on, when we receive measurements to that beacon, we have the lengths of three sides of a triangle from those proxies, so we can arrive at two solutions in space for the beacon, relative to the proxies (the location is mirrored about the axis between the proxies).

You can think about this by drawing circles - choose an arbitrary point as proxy-a. Draw a circle from that point with the radius of the distance between proxy-a and proxy-b, and choose anywhere on that circle as the location of proxy-b. Now, if you draw a circle centred on each proxy with the beacon distance it measured, those two circles should intersect at two locations, either side of the line between the proxies (or on the line, occasionally). In a way you have the “actual” position, and an ambiguous “reflection” of that position.

The reason I think this is “better” than a user-defined map is that the distances then all come from the same, faulty system - RSSI measurements - which I think might be easier to integrate than two different measurement systems. Some basic constraints from user input might still be really helpful though, things like “these three proxies all lie along a single line, and this fourth proxy is on a line perpendicular to them, intersecting at the first one”, say.
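That circle picture is easy to put into code. This is only an illustration of the geometry (not Bermuda's implementation): proxy A sits at the origin, proxy B at distance d along the x-axis, and we solve for the two mirrored candidate positions of the beacon:

```python
import math

def circle_intersections(d: float, r1: float, r2: float):
    """Proxy A at (0, 0), proxy B at (d, 0); r1 and r2 are the measured
    beacon distances. Returns the two candidate positions (mirrored about
    the A-B axis), or None if the circles don't meet (noisy data)."""
    x = (d * d + r1 * r1 - r2 * r2) / (2 * d)
    y_sq = r1 * r1 - x * x
    if y_sq < 0:
        return None  # overshooting measurements can leave no intersection
    y = math.sqrt(y_sq)
    return (x, y), (x, -y)

# Proxies 10 m apart, beacon 8 m from A and 6 m from B:
print(circle_intersections(10.0, 8.0, 6.0))
```

Note the `None` branch: with real RSSI noise the two circles frequently fail to intersect at all, which is one reason the raw data needs filtering before geometry is attempted.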

When we have distance “measurements” between three points, we have solved the shape of that triangle but not which “reflection” of that triangle we have. If we bring in another proxy we get a third “fixed” point and solve where the beacon is in relation to those three points (in a 2D plane).

Once we achieve this, it stands to reason we can achieve this with every combination of proxy and beacon, at which point we have “solved” the spatial arrangement of all the beacons and proxies, relative to each other. At that stage one just needs to define a point in real space for any proxy, and an angle in real space to point at another proxy and the entire “cloud” is now aligned with real space. eg “consider my office proxy as being at [0,0,0], and the laundry proxy is exactly North of that”.
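For what it's worth, once three non-collinear proxies have known positions, the "third fixed point" step collapses to solving two linear equations: subtracting the first circle equation from the other two cancels the squared unknowns. A sketch (again just the textbook math, not Bermuda's code):

```python
def trilaterate(p1, p2, p3, r1, r2, r3):
    """Solve for the single 2-D point at distances r1, r2, r3 from three
    known proxy positions. Subtracting the first circle equation from the
    other two leaves a 2x2 linear system, solved here by Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # zero if the three proxies are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Beacon at (3, 4), proxies at three corners of the room:
print(trilaterate((0, 0), (10, 0), (0, 10), 5.0, 65 ** 0.5, 45 ** 0.5))
```

With perfect distances this gives an exact answer; with noisy ones the same linearisation is usually fed into a least-squares solve over more than three proxies instead.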

This is why I think we don’t need a user-supplied map to solve the space, and you can probably see that with the noisy nature of the data, trying to fit a user-dimensioned map adds a lot of complexity to something that hasn’t yet solved its existing complexities! :smiley:

I’m much keener for being able to visualise a 3D cloud of points in HA rather than going down the user-provided floorplan, but it would be a long time before I have time for either, I suspect.

So, being do-able for a single floor seems… waves hands likely? Solving in 3D for multiple floors is… less certain… certainly harder. I think that mathematically (as far as I can think mathematically!) there’s no reason why it shouldn’t work. With distances from two proxies you get a circle perpendicular to the plane (so it looks like two points in 2D), while distances from three proxies, if perfect, give you 2 points in 3D, which looks like a single point in 2D. Adding another proxy that’s not in the same plane (say, on a different floor) then gives us a hint at the “point”'s distance from the original plane. Except, none of these “points” are actually “points”, they’re all clouds of confusion, due to the noise and uncertainty. I think that adding that third dimension is likely stretching the already low-quality data a lot more than going to two did. I still think it’s possible, just laying out the difficulties so people can manage their expectations! :slight_smile:

As for at which point these uncertainties cause everything to fall apart (or simply be less than useful) I am not sure, but I am pretty confident that there is a lot of useful space between “not perfectly correct” and “entirely busted”.

My hesitation around solving for 3D is that while we work in only two dimensions, we sort of just fold the third dimension into the noise and try to ignore it (since any distance in the third dimension just causes an increase in all the measurements from the origin plane, which is exactly what RF noise/loss does, too). So in a way, extracting the third dimension is trying to extract meaningful data out of our left-over garbage. It should be entirely possible, but I suspect that this data will be even less reliable than that from a 2D solution - and I’m not entirely sure yet how reliable even that will be!

For now, I am going to stick to working out the 2D solution, because at this stage I believe 3d will need all the same solutions that 2D does, plus more - so it seems like a “progressive” path - we need to solve 2D first, and that work won’t be wasted in solving 3D, I think.

In practical terms, there’s probably a lot of functionality possible even without a 3D solution. For room-level occupancy, the nearest-proxy will still be pretty reliable (I expect typical inter-floor materials will add enough RF loss to make this easier) and combining with other sensors (PIR, mm-wave etc) will probably always be an important part, even for 2D (I am using Bermuda plus PIR for my office, and it’s rudimentary, but still great).

Perhaps grouping proxies by floor (which might be akin to what you were suggesting) in order to “constrain” each floor of proxies onto separate, parallel planes would then give us a per-floor location from which we’d choose by closest distance… I don’t know. It’s certainly something I am keeping in mind, but we are a long way from being able to worry about it in a productive way.

Re AI, I’m not sure how much benefit it might provide - again we’d need someone on the other side of the Dunning-Kruger curve to weigh in. Maybe it would be really good for tuning algorithms that clean up the distance measurements, or for real-time estimation of distances based on previously observed patterns - but maybe it’s not a good fit for any current ML patterns - I really don’t know. My gut feeling is that in these earlier stages there’s a lot to do in the basic mathematical realm of cleaning up and interpreting data, and that AI might be more useful down the road - but I could be wrong about that - maybe we could throw a sack of numbers into a GPU and it will work it all out! It would seem likely that it might be well-matched for estimating positions within a room given that RF reflections are so insanely complicated and simple at the same time. I’d love to hear from anyone who might have relevant experience in that area.

Sorry for the long rant, but part of it is just me mulling things over in my mind and thinking out loud.

My current thinking about next-steps that I can/will actually do are:

  • Beacon UUID / phone IRK/PrivateBLE matching

  • create per-proxy distance sensors for each beacon so that I can easily use the history tool to visualise what’s happening with the data, including missing packets. The current distance and area sensors are good, but their data makes it clear that

    • missing packets are causing spurious area switches
    • noisy data makes it hard to choose between nearby areas that are open plan (line-of-sight)
    • having multiple proxies in one area really helps with both issues (per the graph of (effectively) area-of-nearest-proxy and distance-to-nearest-proxy over time for my watch).
  • experiment with some smoothing/prediction filters on those distances to fill in missed packets and reduce the noise without losing responsiveness. I’m thinking Kalman filters, or maybe just a “One Euro” filter might be enough. Having the data easily graph-able right in HA will be a good start for evaluating different methods. I like the idea that a Kalman filter gives a “predicted” result, which might be particularly useful for making a decision when a given packet is missing. I know it’s sort of a low-pass filter that’s required, but given the noise is additive rather than symmetrical, and I also want to preserve responsiveness, I feel that simple smoothing might be a poor fit.
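To make the filtering idea concrete, here is a minimal 1-D constant-value Kalman-style smoother - a generic textbook sketch, not anything Bermuda ships, and the variances are made-up numbers to tune:

```python
class SimpleKalman1D:
    """Minimal 1-D Kalman filter for smoothing distance readings.
    process_var controls responsiveness; measurement_var reflects how noisy
    the RSSI-derived distances are. On a missed packet, call predict() alone:
    the estimate persists while its uncertainty grows."""

    def __init__(self, process_var=0.05, measurement_var=4.0):
        self.x = None        # current distance estimate (metres)
        self.p = 1.0         # variance of that estimate
        self.q = process_var
        self.r = measurement_var

    def predict(self):
        self.p += self.q     # uncertainty grows between measurements

    def update(self, z):
        if self.x is None:   # first measurement seeds the estimate
            self.x = z
            return self.x
        self.predict()
        k = self.p / (self.p + self.r)   # Kalman gain: trust vs. smoothing
        self.x += k * (z - self.x)
        self.p *= (1 - k)
        return self.x
```

The "predicted" value mentioned above is just `self.x` after `predict()`: it can stand in for a missed advert while `self.p` records how stale it is.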


Hello @agittins, I installed your integration yesterday after several trials with other solutions (ESPresense, room-assistant, some of my own statistics sensors based on RSSI), all falling short due to the high variance of the RSSI data. Your integration has the same issues, but it is far more elegant than the others, taking full advantage of the ESPHome integration and the new HA BLE stack.

I’m using a Mi Band 6 as a beacon and not only is the RSSI erratic, it also has periods (sometimes 2-3 minutes) when it is not advertising, and so gets “away” status (unless I increase the consider_home setting a lot). But this happens with all solutions, so it is likely a problem in the device (can someone confirm they have the same problem?).

Other than that, I’m still not able to determine whether the Shellys are really working or not. Checking the logs of their aiot script, the Mi Band 6 MAC address does not appear.

Anyway, apart from the testing, I welcome your active research into this topic. You could take a look at the room-assistant spec and code; they were the most advanced and successful at doing statistical filtering of the data. I tried applying a simple mean to the data myself (60-second window); this increased the reliability of the location but decreased the speed of room changes.
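For anyone wanting to reproduce that experiment, a time-windowed mean is only a few lines (a generic sketch, not room-assistant's actual code):

```python
from collections import deque
from time import monotonic

class WindowedMean:
    """Rolling mean over a fixed time window (seconds): smoother area
    decisions, at the cost of slower response to real room changes."""

    def __init__(self, window_s=60.0):
        self.window_s = window_s
        self.samples = deque()  # (timestamp, rssi) pairs, oldest first

    def add(self, rssi, now=None):
        now = monotonic() if now is None else now
        self.samples.append((now, rssi))
        # Drop anything older than the window before averaging.
        while self.samples and now - self.samples[0][0] > self.window_s:
            self.samples.popleft()
        return sum(v for _, v in self.samples) / len(self.samples)
```

Shrinking `window_s` trades smoothness back for responsiveness, which is exactly the tension described above.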

Good job!

Ok. Some more testing and I also dumped the data of your integration with the dedicated service (thanks).
I remembered I had a Mi Band 2 in a drawer and was able to charge it and integrate it into Bermuda. The Shelly scanners are working fine with Bermuda (they do not see the Mi Band 6 but they do see the Mi Band 2; the only difference is that the Mi Band 6 is paired with my mobile, although it does appear using “hcitool lescan” and via ESPHome).

I have the impression that Bermuda takes the last advertisement, and areas are changing constantly and sometimes surprisingly: I have a Shelly in the bathroom 4 m away from me and an ESPHome proxy 50 cm away, and the area constantly changes to the bathroom, which seems rather unexpected.

The Mi Band 2 is more stable for home/away presence: it advertises constantly and is scanned by the Shellys. In my home I have 4 ESPHome and 4 Shelly proxies evenly distributed among the rooms.

I will try to use the dump to give you some more factual feedback.

Watching this project with interest! I tried it out today, not realising that it didn’t support the UUIDs published by the Android Companion app. So obviously it didn’t detect the device as I was expecting; it did show a bunch of MACs under the ‘Configure’ window for the integration, but none appeared to relate to my phone’s BT MAC - so I assume it’s rotating them and I’d need a different app to publish a fixed MAC.

Anyway, looks very promising - for the moment I’ll continue using the FormatBLE Tracker but will be switching to this once UUID support is there!

Also - not sure if it’s intentional, but I noticed that the drop-down box for selecting MACs in the integration has no scroll bar when it overflows the height of the page… it took me a while to spot there were more MACs that weren’t visible, but I could view them using the arrow keys. :slight_smile:

Hi Andy, glad you’ve had some success at least, and good to get some feedback re where the other projects are at!

Making full use of the esphome / ha ble stack was a key desire for me, I didn’t want to have to dedicate any hardware just to the proximity stuff.

Glad you found the dump_devices service, I was going to suggest that for troubleshooting.

Yes, missed adverts are a key issue, as you’ve discovered :slight_smile: The algo currently is a bit too basic and switches too readily when the closest proxy misses a packet. My previous post above hints at what I have in mind for improving this on the software side. If I can smooth, and estimate for missing adverts, that might improve things greatly.

On the hardware/firmware side, a bit of experimenting with the interval and window parameters of ESPHome’s scan_parameters might help (I’m currently using 1000ms and 900ms respectively). The defaults of 320ms/30ms regularly result in up to 10-second gaps between captured packets, which is pretty awful. The closer window and interval are to each other, the higher the percentage of time the ESP spends “listening” for packets - but the less time it has for keeping wifi alive (incl. sending data to HA) and doing other tasks.
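For reference, in a manual config those values go under scan_parameters, something like this (these are the values from my experimenting above; the defaults differ, so tune to taste):

```yaml
esp32_ble_tracker:
  scan_parameters:
    interval: 1000ms  # how often a scan cycle starts
    window: 900ms     # how long the radio listens within each interval
    active: true      # request scan responses (extra info such as names)
```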

Interesting differences with the two Mi Bands. Many devices will change how often they advertise (or stop adverts altogether) when they are actively connected. But given you are seeing differences between the Shelly and the ESPHome, I wonder if that’s more to do with the listening setup. I’d be quite interested to see the dump info (particularly the hist_interval part) from the Shelly and ESPHome for anything you know does 1s adverts, to see if it performs better than I’ve managed with the ESPHomes.


Howdy Swifty,

Yes, hopefully I’ll have UUIDs sorted before toooooo long :slight_smile:

Re the scrollbar, I’m just using the default selector tools provided by HA, but on my system at least I do see the scrollbars showing up:


(I’m running chrome on linux)
It could be some old cached css or something (a CTRL-F5 can sometimes solve that), or a particular browser incompatibility. You could try raising it with HA, but you might need to find a core integration that behaves the same way to find a way to reproduce it.

Ah interesting - I wonder if it’s a theme thing, as I’m using ‘Bubble’ - no particularly weird browser or anything, just Chrome on Windows… will have a play and report back if needed.

Really looking forward to watching the development on this - like you I think espresence is great but when I’ve already got esphome devices in every room (running LEDs, lux sensors etc) it annoys me to have to deploy another dedicated esp to each room just for the presence stuff :laughing:


Debugging is really a nightmare. You were right about the Mi Band 6: I disconnected it from the smartphone and it started to advertise at the right frequency (and, for some reason, the Shelly started to “see” it). But yesterday I stumbled on another problem: one of the ESPHome scanners was not proxying anymore (maybe the BLE stack crashed). Nothing was reported in the logs; restarting it fixed the issue. Maybe they are not coping with the workload of all the BLE devices. I’m still trying to find the right parameters for Bermuda. There are too many factors affecting the reliability of the “area” and “distance” sensors, but your integration provides the most elegant “presence” and “room” architecture, so I have decided to stick with it.

Same for me, chrome on windows