AirTag "integration" (user-friendly + device tracker)

Hi everybody!

I found a way to “integrate” Apple AirTag into Home Assistant.

My aim was to let Android users use these devices (in practice, cheap AirTag clones) and access their location through Home Assistant, both to check where they are and to create automations.

What I have done is create a shortcut that sends the information to a helper in HA, then create a REST sensor to extract the coordinates, and finally create a device tracker (through a blueprint I made, easy to configure in a few seconds), so you can use it in a map card and in your automations.
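For orientation, here is a minimal sketch of the HA side of that chain, assuming the helper is called input_text.airtag (the real helper, sensor and blueprint are in the linked guide, so treat the names below as illustrative):

```yaml
# Minimal sketch, not the author's actual code: the shortcut writes the OCR
# text into a helper, and a template sensor exposes it for further parsing.
input_text:
  airtag:
    name: AirTag
    max: 255

template:
  - sensor:
      - name: "AirTag raw text"
        state: "{{ states('input_text.airtag') }}"
```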

Good enough for me :smiley: It is very easy to do and totally user-friendly.

(Screenshot, 2024-02-10)

:bell:UPDATE:

I finally took the time to implement some improvements:

  • I have changed the approach the shortcut uses to extract information. This way it should work correctly in any language (no modifications needed) and it should prevent some failures due to OCR.

  • The guide now includes tips and lessons learned over these months (e.g., how to use it with iPads or with more than 3 AirTags).

  • I have modified the main sensor, so now you can:
    a) Identify frequent locations (home, work, …)
    b) Limit the address by city, country… and thus avoid false matches.
    c) Identify when the AirTag is unavailable (and show “Last seen…”)
    d) Include sensor’s last update date.

  • Since it has been tricky for some users, creating the device tracker is now optional. In any case, you can use the main sensor, which also indicates the address and coordinates.

  • I have included options and tips in the guide on how to implement coordinate updates. I hope it helps you and gives you some ideas.

  • I have written an English version of the guide for the community.

  • In the future, I have some ideas to get the battery level of the AirTag and the last seen address when it gets lost. We will see!

  • I have included in the post the link to the printable model of the 'aguatracker', which combines the functions of the AirTag and an NFC keychain (to open doors, trigger automations…). Keep in mind that it is designed to fit these cheap clones perfectly, because not all of them have the same chip.
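For anyone curious what the device tracker step amounts to: it can be done with Home Assistant's built-in device_tracker.see service. A hedged sketch (the entity and attribute names here are made up for illustration; the blueprint handles this for you):

```yaml
# Illustrative only: push parsed coordinates into a device tracker.
# "sensor.airtag_keys" and its latitude/longitude attributes are assumptions.
service: device_tracker.see
data:
  dev_id: airtag_keys
  gps:
    - "{{ state_attr('sensor.airtag_keys', 'latitude') }}"
    - "{{ state_attr('sensor.airtag_keys', 'longitude') }}"
```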

You can read the full explanation (in English or Spanish) and get the code here:

Also, maybe experienced Apple users can provide some ideas to improve this experiment.

Hope you like it!

If you find it useful, you can help me create more things like this if you:

:coffee: Consider buying me a coffee
:email: Subscribe to my blog to get new ideas
:speech_balloon: Join our Telegram community (we have an English room)
:avocado: Follow me on Facebook, Twitter, YouTube or Instagram

Thanks!! :smiley:

Other cool stuff I made

:ferris_wheel: ARC Reactor integration
:watch: Home Assistant companion for Zepp Devices
:label: Dynamic automations with NFC
:shopping_cart: Improved Shopping List + Bring! + Google Assistant
:muscle: Personal Trainer
:tv: Google Echo Show


Thank you for that easy and handy approach. It is working, and I have the data in HA and a dashboard card configured.

But so far I could not find a solution to trigger the shortcut automatically, e.g. every hour or every two hours between 8 am and 3 pm.

Can you help me create this automation?

Thank you


I’m glad you like it :slight_smile:

I think the best way to do that is to set up automations on your iPhone / iPad. As far as I know, the details of what you can do depend on your iOS version (check this).

Since I am not an Apple user I cannot give you details, but from the tests I have done, if I were you I would set the shortcut to trigger when your device is unlocked but you are not interacting with the screen (e.g. CarPlay, always-on display…).

Hope it helps!

Pretty cool hack! Kudos for doing it even without being an Apple user!

It works, with a few limitations:

  1. findmy://items doesn't always open with the full items drawer expanded. When it doesn't, you have to swipe it up before the screenshot is taken. Also, I've only got 3 tags showing, so I'm not sure what happens if you have more than one screen's worth of items.
  2. Even though there are ways of running it without requiring intervention, it does take over the screen for a few seconds, so if you are using your phone when the shortcut runs it's going to be distracting.

"Are you able to support how to create this automation?"

@smartmatic one way is to automate the service call @TiToTB detailed in his blog post. Something like this maybe?

description: "Run AirTag shortcut"
mode: single
trigger:
  - platform: time_pattern
    hours: "/1" # every hour; use "/2" for every two hours
condition:
  - condition: time
    after: "08:00:00"
    before: "15:00:00"
    weekday:
      - mon
      - tue
      - wed
      - thu
      - fri
action:
  - service: notify.mobile_app_iphone # adjust to your device's notify service
    data:
      message: ¿Dónde están los AirTag?
      data:
        shortcut:
          name: AirTag to HA

Thanks! I am sure Apple users can help to solve these limitations :smiley:

@TiToTB
Thank you so much!
I have been looking for a solution to import my AirTags into HA for ages :wink:

I struggle a little, though.
The input_text.airtag is in place and I imported the shortcut.

In the "Object on Index" part of this shortcut there is an index of "2", but I get an error on my phone saying that only 1 object is available.

I tried it a few times (delete / re-import), but it does not work.
If I change it to "1", I also don't get data into input_text.airtag.

Any suggestion?

EDIT I:
My bad: I needed to adapt it to my German screenshot (Objetos → Objekte).
However, I only get gIch in my input_text.airtag.
Ich is the 4th tab in Find My…
If I choose "Personen" (the first tab in Find My), I get GeräteObjektegIch,
where Geräte = Devices, Objekte = Objetos, Ich = Me.
I don't know where the g between Objekte and Ich comes from.

So it seems the text is extracted from the tab bar and not from the AirTag view.

I guess the main issue is the differences between the languages!?

EDIT II:
I played around a bit and found out that the OCR apparently reads Obiekte instead of Objekte.


Thanks, also for sharing your process!!

Did you solve it in the end? Let me know otherwise!

Yes, I did! Thanks for asking.

The shortcut itself works well (and I am also considering making more use of Apple Shortcuts for devices without an HA integration :slight_smile: )

The entity definition works well too, but I still face the issue mentioned by @thebunk.
So I wonder if there is a function on the iPhone to swipe the screen upwards?

EDIT:
@TiToTB There is still something.
The string retrieved from the screenshot is only about 165 characters long, which is not sufficient for my 5 AirTags. So I am wondering if there is a limitation in the OCR?
By the way: my Orbea bike is recognized as Orhaa :crazy_face:

@nc03 if you close the Find My app with the items drawer fully showing (like this):

Does it show all 5 of your AirTags at once, or do you need to scroll? Have you seen that you can debug Shortcuts automations by adding a Show Results step immediately after the step you want to inspect?

If I take a look at my items, I see three of them, as half of the screen is used by the map, like this:

I need to swipe the bottom part up to see all 5 AirTags.
If I close the app (to the background) and trigger the shortcut, the item view opens in the default layout (40% items / 60% map).

So I would need a swipe-up command in the shortcut, I guess.

I didn’t know about this option and will try it.

EDIT I: Correction: the screen seems to be in the same state (already swiped up).

I have created an automation to execute the shortcut at a specific time, but when the iPhone is locked it does not work. Does it work on your side with a locked iPhone?

Android user here.

Can't make it work on the iPad, because the iPad screen is larger: 70% of it is the map, so the OCR gets street names, etc.

Hi!

I don't have an iPad nearby, so I can't test it.

But as long as you have the information on the screen, you should be able to get it. You probably need to adjust the shortcut to ignore street names (as I did with the iPhone).

Hope it helps!


Can you please post an example of the resulting text string that is sent to HA? I'm trying to focus the OCR on the "Objetos" part.

Thanks!

Sure. This is what I get:

“+ Coche Calle DoesntExist, 67 • Ahora Mochila Tito Calle DoesntExist, 67 • Ahora Dog Calle DoesntExist, 67 • Ahora Contigo Contigo Contigo Identificar objeto encontrado Personas Dispositivos”

Also consider that you have to replace key words like "Objetos" if your app is in another language, as explained above by @NCO3.
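As a hedged illustration (not the guide's actual code), a template like this can break such a string into per-item chunks; the "•" separator that follows each address and the helper name are assumptions taken from the example above:

```yaml
# Illustrative: split the OCR text on the "•" bullet that follows each address.
# Each resulting chunk ends with one item's name and address.
{% set raw = states('input_text.airtag') %}
{{ raw.split('•') | map('trim') | list }}
```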


What helped me a lot was to use Show Results, as mentioned by @TiToTB in post #9.

As soon as you can see the recognized text string, it’s easier to troubleshoot.


Thanks Tito for this integration.

I'll tell you my experience after playing with it for a couple of days:

My device is an iPad. It is important for debugging problems with the shortcut to include the "Show Results" step; in my case the OCR sometimes recognizes the word "Objetos" as "Obietos", depending on whether the iPad is in night mode.

I also have problems depending on whether the iPad is horizontal or vertical: when the text captured after the word "Objetos" is very long, nothing is passed to input_text.airtag; otherwise the text is passed.

One more thing: any hint on how I can "adjust the shortcut to ignore names of streets"?

Thanks for sharing!

I see. It would be useful to have a modified shortcut for iPads, since they seem to work a little differently because of the larger screen.

Have you tried including a condition to change the word "Objetos" if night mode is on? (I'm not sure exactly how that works in iOS, whether you can check it directly, or maybe you can use the hour.)

The shortcut splits the whole text on the word "Objetos" and keeps everything after it. If the result is too long, try to find another key word to split on again, so you can keep the information you need and make it shorter.
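That split-and-trim idea can be sketched as a Jinja template (the key words "Objetos" and "Personas" are examples and must match your app's language; the helper name is an assumption):

```yaml
# Illustrative: keep only the text between two key words to shorten the string
# before any further parsing.
{% set raw = states('input_text.airtag') %}
{{ (raw.split('Objetos') | last).split('Personas') | first | trim }}
```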

Hope it helps!

This is the output string. It seems to take all the text in the entire horizontal band: the left part with "Objetos" and the right part with the street names.

Screenshot attached, plus a close-up crop of the screenshot.

You can see "parque de", "Trafalgar", "Princesa", etc., recognized by the OCR.

{“value”:“\nUr\nMi\nAirtag_B1\nCasa • Ahora\nAirtag w1\nAirtag w1_position_censored • hace 38 min\nIdentificar objeto encontrado\n+\nContigo\n4 km\nGoll\nlis\nParque de\nCALLE PRINCESA\nPla Bombilla\ndel Oeste O\nTRAFALGAR\nMercado\nSan Antón\nPALACIO REAL\nPLAZA MAYOR\n• Puerta del Sol\nMUSEO DEL\nLa Riviera\nIMPERIAL\nMUSEO\nSOFÍA\n•Parque de\nSan Isidro\nParque\n9 Madrid Rio\nCHOPERA\nOPAÑEL\nMOSCARDÓ\nLEGAZPI\nALMENDRALES\nParque\n• Emperatriz María de Austria\n• Padotongo\nLADOS\nM-40\nCarrefour\n1 0 64%\nHAATREEEAN\nRECOLETOS\nIBIZA\nES@UERDO\nParque de\nPACÍFICO\nATOCHA\nNUN\nSAN DIEGO\nENTREVIAS\n* Caja Mágica\nObramat Usera\n(Bricomart)\nE Mercadona\n因\nParque de la\nDehesa Boyal\nE-5\nParq\nM-40\n苁\nPersonas\nDispositivos\n”,“entity_id”:“input_text.airtag”}



Thanks for that! I have a better understanding now.

I think you can crop the image in the shortcut right after taking the screenshot. Maybe that could solve this issue.

Have you tried that?