Security Analysis of Apple’s “Find My…” Protocol

Interesting research: “Who Can Find My Devices? Security and Privacy of Apple’s Crowd-Sourced Bluetooth Location Tracking System”:

Abstract: Overnight, Apple has turned its hundreds-of-million-device ecosystem into the world’s largest crowd-sourced location tracking network called offline finding (OF). OF leverages online finder devices to detect the presence of missing offline devices using Bluetooth and report an approximate location back to the owner via the Internet. While OF is not the first system of its kind, it is the first to commit to strong privacy goals. In particular, OF aims to ensure finder anonymity, untrackability of owner devices, and confidentiality of location reports. This paper presents the first comprehensive security and privacy analysis of OF. To this end, we recover the specifications of the closed-source OF protocols by means of reverse engineering. We experimentally show that unauthorized access to the location reports allows for accurate device tracking and retrieving a user’s top locations with an error in the order of 10 meters in urban areas. While we find that OF’s design achieves its privacy goals, we discover two distinct design and implementation flaws that can lead to a location correlation attack and unauthorized access to the location history of the past seven days, which could deanonymize users. Apple has partially addressed the issues following our responsible disclosure. Finally, we make our research artifacts publicly available.

There is also code available on GitHub, which allows arbitrary Bluetooth devices to be tracked via Apple’s Find My network.
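At a high level, the paper describes lost devices broadcasting a rotating elliptic-curve public key over Bluetooth LE, with finder devices uploading encrypted location reports that owners later retrieve by a hash of that key. The following is a minimal sketch of that rendezvous idea only; the advertisement byte layout (header values, field order) is assumed for illustration and is not Apple’s published format.

```python
import hashlib
import os

def build_advertisement(public_key: bytes) -> bytes:
    """Pack a stand-in 28-byte public key into a toy BLE payload.

    The two-byte header below is Apple's Bluetooth company identifier
    (0x004C); everything else about this layout is illustrative.
    """
    if len(public_key) != 28:  # P-224 keys are 28 bytes, per the paper
        raise ValueError("expected a 28-byte P-224 x-coordinate")
    header = bytes([0x4C, 0x00])
    return header + public_key

def report_index(public_key: bytes) -> bytes:
    """Owners look up finder reports by SHA-256 of the advertised key."""
    return hashlib.sha256(public_key).digest()

# Stand-in random bytes in place of a real rotating P-224 public key.
key = os.urandom(28)
adv = build_advertisement(key)
print(len(adv), report_index(key).hex()[:16])
```

Because only a hash of the ephemeral key is used for lookup, the server never learns which long-term device a report belongs to; the key rotation is what the privacy goals (untrackability, finder anonymity) hinge on.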

Posted on March 15, 2021 at 6:16 AM

Comments

One AnonEngineer March 15, 2021 11:00 AM

Why did these guys think that

world’s largest crowd-sourced location tracking network called offline finding (OF)

was a good idea?

Winter March 15, 2021 11:29 AM

I see a lot of cases where people could reap enormous benefits from their data being used for some public good, e.g., medical information for health care, financial data, location data for all kinds of logistics and transport improvements, and mobile phone location data for finding lost or stolen devices.

The one thing that blocks most such uses is the danger that lies in sharing data that can be traced back to its origin, i.e., privacy risks.

That is most easily seen in medical data: without medical privacy, people would avoid seeking medical care. A recent example is the unwillingness of people to use corona tracing apps for fear that it would trace their whereabouts (which their phones do anyway). But the same holds for other data sets.

I applaud it when someone finds a way to share data for public goods while protecting the privacy of the individuals.

So, if Apple succeeded in making a secure cryptographic protocol and implementation, I can only support it.

Sadly, I am totally unqualified to say anything useful about the security of this application.

Hideous March 15, 2021 2:36 PM

Apple has created the world’s greatest “bumper beeper” tracking system. To track someone wherever she goes, you just hide a tiny Bluetooth LE beacon in her car, her keyfob, her Mophie case, or whatever, then let Apple’s Find-My network update you on its every move.

You can be even more subtle: hide a beacon which doesn’t activate until it picks up a short-range signal broadcast only in certain places or from the pockets of certain people.

SpaceLifeForm March 16, 2021 5:19 PM

@ Hideous, David Olrik, Clive

In this instant case, it appears that Hideous is correct, and that David Olrik read it wrong.

Actually, David Olrik brings up a point.

It’s not obvious. Must think outside of the box.

Think. Seriously think.

Who is the ‘owner’?

Who is the ‘user’?

Are you sure?

Fool buys tracker. Is the fool the owner? Is the fool the user?

Maybe the fool is neither.

Goat March 16, 2021 9:24 PM

re:”I applaud it when someone finds a way to share data for public goods while protecting the privacy of the individuals.”

@Winter it seems like a risky trail to tread on… “Public good” is a slippery term, and there is nothing like 100% protection. It is not unusual for surveillance to expand under the guise of public good, and for penetrable technologies to be developed to use the data.

As @Bruce has restated many times, “Anonymizing is difficult,” and since the data would be used in some way, de-anonymizing is always a risk.

Winter March 19, 2021 5:49 AM

@Goat
“As @Bruce has restated many times, ‘Anonymizing is difficult’”

I tend to say that anonymization is impossible.

But there is always a trade-off. Yes, my flu/covid status might leak. On the other hand, data about my flu/covid status might be part of a successful effort to contain the spread of flu/covid. Maybe the risk is worth it.
