I found a way to “integrate” Apple AirTag into Home Assistant.
My aim was for Android users to be able to use these devices (actually cheap AirTag clones) and have access to their location through Home Assistant, both to check where they are and to create automations.
What I have done is create a shortcut that sends the information to a helper in HA, then create a REST sensor to get the coordinates, and finally get a device tracker (through a blueprint I made, easy to configure in a few seconds), so you can use it in a map card and in your automations.
Good enough for me. It is very easy to do and totally user-friendly.
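For anyone who wants the short version of the plumbing before reading the full guide: the shortcut takes a screenshot of Find My, runs OCR on it, and POSTs the recognized text to Home Assistant so it lands in a text helper (calling the REST API's `/api/services/input_text/set_value` endpoint with a long-lived access token is one way to do it; you can see that exact payload further down in this thread). A minimal sketch of the helper, assuming you name it `airtag` as the guide does:

```yaml
# configuration.yaml (the helper can also be created in the UI)
input_text:
  airtag:
    name: AirTag OCR text
    max: 255  # the default of 100 can truncate longer OCR results
```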
UPDATE:
I finally took my time to implement some improvements:
I have changed the approach the shortcut uses to extract the information. This way it should work correctly in any language (no modifications needed) and it should prevent some failures caused by the OCR.
The guide now includes tips and lessons learned over these months (e.g., how to use it with iPads or with more than 3 AirTags).
I have modified the main sensor, so now you can (see the sketch after this list):
a) Identify frequent locations (home, work, …)
b) Limit the address by city, country… and thus avoid false matches.
c) Identify when the AirTag is unavailable (and show “Last seen…”)
d) Include the sensor's last update date.
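To give you an idea, here is a minimal single-tag sketch of that logic, not the packaged sensor itself. It assumes the raw OCR text is in input_text.airtag and a Spanish Find My screen ("Casa • Ahora" = "Home • Now", "hace 38 min" = "38 min ago"); the entity name and keywords are illustrative, adapt them to your language:

```yaml
template:
  - sensor:
      - name: "AirTag Keys"
        state: >
          {% set raw = states('input_text.airtag') %}
          {# point b: you could also require a city match, e.g. 'Madrid' in raw #}
          {% if raw in ['unknown', 'unavailable', ''] %}
            unavailable
          {% elif 'hace' in raw %}
            {# point c: the tag has not reported recently, show "Last seen…" #}
            Last seen {{ raw.split('hace')[1].split('\n')[0] | trim }} ago
          {% elif 'Casa' in raw %}
            {# point a: frequent location match #}
            home
          {% else %}
            away
          {% endif %}
        attributes:
          # point d: when the helper last received new OCR text
          last_update: "{{ states.input_text.airtag.last_updated }}"
```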
Since it has been tricky for some users, creating the device tracker is now optional. In any case, they can use the main sensor, which also indicates the address and coordinates.
I have included options and tips in the guide on how to implement coordinate updates. I hope it helps you and gives you some ideas.
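One way those coordinate updates can feed the optional device tracker is the built-in device_tracker.see service. This is a hypothetical sketch, not the blueprint itself: the sensor entity and its latitude/longitude attributes are made-up names standing in for whatever your setup exposes.

```yaml
automation:
  - alias: "AirTag: update device tracker"
    trigger:
      - platform: state
        entity_id: sensor.airtag_keys  # hypothetical coordinates sensor
    action:
      - service: device_tracker.see
        data:
          dev_id: airtag_keys  # creates/updates device_tracker.airtag_keys
          gps:
            - "{{ state_attr('sensor.airtag_keys', 'latitude') }}"
            - "{{ state_attr('sensor.airtag_keys', 'longitude') }}"
```

You can then drop device_tracker.airtag_keys on a map card or use it in zone-based automations.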
For the future, I have some ideas on how to get the battery level of the AirTag and the last seen address when it gets lost. We will see!
I have included the link to the printable model of the 'aguatracker' in the post, to combine the functions of the AirTag and an NFC keychain (to open doors, trigger automations…). Keep in mind that it is designed to perfectly fit these cheap clones, because not all of them have the same chip.
I think the best way to do that is to set up automations on your iPhone / iPad. As far as I know, the details of what you can do depend on your iOS version (check this).
Since I am not an Apple user I cannot give you details, but from the tests I have done, if I were you I would set it to trigger the shortcut when your device is unlocked but you are not interacting with the screen (e.g., CarPlay, always-on display…).
Pretty cool hack! Kudos for doing it even without being an Apple user!
It works, with a few limitations:
findmy://items doesn't always open with the full list-of-items drawer expanded. When it doesn't, you have to swipe it up before the screenshot is taken. Also, I've only got 3 tags showing, so I'm not sure what happens if you have more than one screen's worth of items.
Even though there are ways of running it without requiring intervention, it does take over the screen for a few seconds, so if you are using your phone when the shortcut runs it's going to be distracting.
Are you able to explain how to create this automation?
@TiToTB
Thank you so much!
I have been looking for a solution to import my AirTags into HA for ages.
I am struggling a little, though.
The input_text.airtag is in place and I imported the shortcut.
In the "Object on Index" part of this shortcut there is an index of "2", but I get an error on my phone saying that only 1 object is available.
I tried it a few times (delete / re-import), but it does not work.
If I change it to "1", I also don't get any data into input_text.airtag.
Any suggestions?
EDIT I:
My bad - I needed to adapt it to my German screenshot (Objetos → Objekte).
However, I only get gIch in my input_text.airtag. Ich is the 4th tab in Find My…
If I choose "Personen" (the first tab in FMF), I get GeräteObjektegIch,
where Geräte = Devices, Objekte = Objetos, Ich = Me.
I don't know where the g between Objekte and Ich is coming from.
So it seems the text is extracted from the tab bar and not from the AirTag view.
I guess the main issue is the differences between the languages!?
EDIT II:
I played around a bit and found out that the OCR obviously reads Obiekte instead of Objekte.
The shortcut itself works well (and I am also considering making more use of Apple Shortcuts for devices without an HA integration).
The entity definition works well, too, but I still face the issue mentioned by @thebunk.
So I wonder if there is a function on the iPhone to swipe the screen upwards?
EDIT: @TiToTB There is still something.
The string retrieved from the screenshot is only about 165 characters long, which is not sufficient for my 5 AirTags. So I am wondering if there is a limitation in the OCR?
By the way: My Orbea bike is recognized as Orhaa
Does it show all 5 of your AirTags at once or do you need to scroll? Have you seen that you can debug Shortcuts automations by adding a Show results step immediately after the step you want to see the results for?
I need to swipe the bottom part up to see all 5 AirTags.
If I close the app (to the background) and trigger the shortcut, the item view will be opened in the default view (40% items / 60% Map).
So I would need a swipe up command in the shortcut, I guess.
I didn’t know about this option and will try it.
EDIT I: Correction: The screen seems to be in the same condition (swiped up).
I have created an automation to execute the shortcut at a specific time, but when the iPhone is locked it does not work. Does it work on your side with a locked iPhone?
But as long as you have the information on the screen you should be able to get it. You probably need to adjust the shortcut to ignore the names of streets (as I did with the iPhone).
I'll tell you my experience after playing with it for a couple of days:
My device is an iPad. For debugging problems with the shortcut it is important to include the "Show Results" step; in my case the OCR sometimes recognizes the word "Objetos" as "Obietos", depending on whether the iPad is in night mode.
I also have problems depending on whether the iPad is horizontal or vertical: when the text captured after the word "Objetos" is very long, nothing is passed to input_text.airtag; otherwise the text is passed.
One more thing: any hint on how I can "adjust the shortcut to ignore names of streets"?
I see, it would be useful to have a modified shortcut for iPads, since they seem to work a little differently because of the larger screen.
Have you tried including a condition to change the word "Objetos" when night mode is on? (I am not sure exactly how this works in iOS, whether you can check that condition, or maybe you can use the hour.)
The shortcut splits the whole text on the word "Objetos" and keeps everything after it. If the result is too long, try to find another keyword to split on again, so you keep only the information you need and make it shorter.
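Roughly, the splitting logic looks like this in template form. This is only a sketch: "Objetos" is the Spanish tab label, "Personas" as the second keyword is just an example taken from the tab bar in your output, and the regex line is an idea to tolerate OCR misreads like "Obietos", not part of the published shortcut.

```yaml
template:
  - sensor:
      - name: "AirTag trimmed text"
        state: >
          {% set raw = states('input_text.airtag') %}
          {# optional: normalise OCR misreads such as "Obietos" first #}
          {% set norm = raw | regex_replace(find='Ob\w+tos', replace='Objetos') %}
          {# first split: keep everything after the keyword #}
          {% set after = norm.split('Objetos')[-1] %}
          {# second split: cut before the next keyword to drop the map noise #}
          {{ after.split('Personas')[0] | trim }}
```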
This is the output string. It seems to take all the text across the entire horizontal band: the left part with "objeto" and the right part with the names of the streets.
Screenshot attached, plus a close-up crop of the screenshot.
You can see "parque de", "Trafalgar", "Princesa", etc. recognized by the OCR:
```json
{"value":"\nUr\nMi\nAirtag_B1\nCasa • Ahora\nAirtag w1\nAirtag w1_position_censored • hace 38 min\nIdentificar objeto encontrado\n+\nContigo\n4 km\nGoll\nlis\nParque de\nCALLE PRINCESA\nPla Bombilla\ndel Oeste O\nTRAFALGAR\nMercado\nSan Antón\nPALACIO REAL\nPLAZA MAYOR\n• Puerta del Sol\nMUSEO DEL\nLa Riviera\nIMPERIAL\nMUSEO\nSOFÍA\n•Parque de\nSan Isidro\nParque\n9 Madrid Rio\nCHOPERA\nOPAÑEL\nMOSCARDÓ\nLEGAZPI\nALMENDRALES\nParque\n• Emperatriz María de Austria\n• Padotongo\nLADOS\nM-40\nCarrefour\n1 0 64%\nHAATREEEAN\nRECOLETOS\nIBIZA\nES@UERDO\nParque de\nPACÍFICO\nATOCHA\nNUN\nSAN DIEGO\nENTREVIAS\n* Caja Mágica\nObramat Usera\n(Bricomart)\nE Mercadona\n因\nParque de la\nDehesa Boyal\nE-5\nParq\nM-40\n苁\nPersonas\nDispositivos\n","entity_id":"input_text.airtag"}
```