Like the thoroughbred Rich Strike at the 2022 Kentucky Derby, one category of personal data recently broke from the rear and galloped its way to the forefront of awareness, astonishing the grandstands. You may hold its source in the palm of your hand. It is precise geolocation data1, collected from mobile devices.
The analogy presumes that the grandstands are packed with privacy nerds. For the rest of you, here’s a quick setup: Modern privacy laws2 define personal information very broadly3. Among the examples given is the physical location of an identifiable human being4 (“location” or “geolocation” data). Certain categories of personal information are deemed riskier to handle than others5. An increase in the level of risk attributed to precise geolocation data is the topic of this article.
Also presumed is a memory of Rich Strike’s epic victory. Picture a horse making moves like a running back, cutting a path through the field like he’s the only steed with a sense of urgency. Then he’s over the line and like: Whoa, what just happened?
But we’re getting ahead of ourselves.
Entering the gates at post time, geolocation data seemed to merit the same long odds as Rich Strike (80:1) against what was about to transpire. After all, GDPR6 itself (the OG of privacy laws) deemed it to be nothing special.7
Let’s trace its path as it makes its astonishing run. Then we’ll circle back to GDPR and answer the obvious question: did it really (as it appears) fail to back the right horse? (Spoiler alert: the answer is no.) Finally, we’ll explore whether a silver bullet might exist to address the core concern underlying the discussion. (Spoiler alert: the answer is yes.)
A Word About Geolocation Data
Normally, geolocation data collected from cell phones is used to serve targeted ads to consumers who have consented to the process. The ideal recipient delights in getting a coupon for the precise cup of joe (for example) that happens to be his favorite, just as he happens to pass a store that happens to offer it.8 Yay to that.
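For the technically curious, the coupon scenario boils down to a simple proximity test: does the device’s latitude/longitude ping fall within a geofence drawn around the store? Here’s a minimal sketch of that logic. The function names and data structures are hypothetical, not any ad platform’s actual API; the 1,850-foot radius is borrowed from the CCPA’s definition of “precise geolocation.”

```python
import math

# Hypothetical sketch of a geofence check. The 1,850-foot radius mirrors the
# CCPA's definition of "precise geolocation"; everything else is illustrative.

PRECISE_RADIUS_M = 1850 * 0.3048  # 1,850 feet in meters (~564 m)

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def in_geofence(device, store, radius_m=PRECISE_RADIUS_M):
    """True if the device ping lands inside the store's geofence."""
    return haversine_m(*device, *store) <= radius_m

# A device ping roughly 110 meters from the store: inside the fence,
# so the coffee coupon fires.
store = (40.7128, -74.0060)
ping = (40.7138, -74.0060)
print(in_geofence(ping, store))  # → True
```

The same trivially simple test, of course, works just as well when the geofence is drawn around an abortion clinic as when it’s drawn around a coffee shop, which is the problem the rest of this article is about.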
But unfortunately, a sketchier use came to light at about the same time that GDPR was published (2016). It seemed like a niche concern at the time, more of a culture-war skirmish than anything broader. The story appeared in the Rewire News Group, a special interest publication with a narrowly focused readership9:
Anti-Choice Groups Use Smartphone Surveillance to Target ‘Abortion-Minded Women’ During Clinic Visits.10
It garnered little attention.11 Following in GDPR’s footsteps, the 1.0 version of CCPA12 (2018) mentions “geolocation data” as one example of personal information, but declines to single it out as anything special.
That changed in 2020 when CCPA 2.0 was adopted.13 Among the amendments, a newly created category of “sensitive personal information” debuted, including a “consumer’s precise geolocation.” However, the added protections were limited.14
The Sprint to Prominence
The day that corresponds (in our analogy) to the sixth furlong at Churchill Downs, and the start of the homestretch, is May 2, 2022.
That’s when a draft of the SCOTUS decision in Dobbs v. Jackson15 was leaked to the press. The very next day, Vice Media published a story entitled Data Broker Is Selling Location Data of People Who Visit Abortion Clinics.16 The article warned of “an increase in vigilante activity or forms of surveillance and harassment against those seeking or providing abortions.”17 A cascade of similar reporting ensued.18
Following the lead of the fourth estate, the other three soon got involved.19 A handful of pro-choice states quickly passed laws restricting the use of geolocation data associated with family planning centers.20
Meanwhile, the Federal Trade Commission entered the fray, deeming certain uses of geolocation data to be unfair.21 In 2022, it floated a novel position: that using precise geolocation data associated with sensitive locations is an invasion of privacy22 prohibited by law.23 By 2024, it had firmed up a list of locations it deemed within the scope of the prohibition, including medical facilities, religious organizations, ethnic/religious group social services, etc. (The full list appears in the table below.)
Effectively, the FTC consigned “Sensitive Location Data” to the highest rank of sensitivity: personal data so sensitive that even informed consent can’t sanction its processing. Other rule-makers would go even further, proposing to ban the sale of precise geolocation altogether (sensitive or not)24, which brings us to the present day – and to a present-day head-scratcher:
Are the risks so dire that our hypothetical coffee consumer must be denied the targeted coupon that so delights him?
Circling back to GDPR provides a helpful perspective.
Did GDPR Really Back the Wrong Horse?
GDPR deems certain types of personal data to be sensitive,25 including data concerning a person’s health, religion, political affiliation, etc. (The full list appears in the table below.) Location data isn’t included.
Nevertheless, if and when location data reveals or concerns sensitive data, it transforms into sensitive data ipso facto.
For example, data that locates a patient within a hospital is sensitive data, because it concerns their health. But data that locates an attending physician within the same hospital is not sensitive data, because it doesn’t.
That’s one difference between GDPR and the FTC rule: the latter deems all location data associated with a Sensitive Location to be sensitive, whereas GDPR deems location data sensitive only if it actually reveals the sensitive data of a consumer.
Here’s another difference:
Even when GDPR deems personal data to be sensitive, it doesn’t prohibit its use altogether. Rather, sensitive data may be used in accordance with a consumer’s explicit consent.
If that just caused you to raise an eyebrow, you’re probably not alone. GDPR isn’t known for permissive standards. And indeed, there’s a catch. The permissiveness comes at a cost in the form of rigorous duties imposed on businesses wishing to use sensitive data.
A threshold duty is to check local laws. GDPR hedges on its permissiveness by granting member-state lawmakers the right to raise the bar: to outlaw particular uses of sensitive data altogether (as the FTC did with Sensitive Location Data).26
Furthermore, it falls to the business to adjudge whether the risks of using the sensitive data outweigh the benefits.27 A formal Data Protection Impact Assessment is required, which is no small feat. Any green light given to the use of sensitive data is likely to be closely scrutinized, should it catch the attention of a Supervisory Authority. The takeaway: businesses must avoid hanging themselves with the rope GDPR provides.
Finally, a heightened standard is likely to govern the validity of any consents purported to authorize the use of sensitive data,28 which brings us to the crux of the matter:
A Crisis of Confidence in Consents
Modern privacy laws set a high bar for what constitutes valid consent. In a nutshell, the person providing it must understand – really and truly – what they’re saying “yes” to.29
If the high bar is met, targeted ads may properly be served to consenting consumers, assuming any applicable red lines regarding sensitive data are respected.30 No current privacy framework31 rejects this principle. Rather, what’s been called into question, in particular cases, is the proviso – i.e., whether purported consents are valid in the first place.32
Some rule makers are skeptical to the extreme. They would dispense with consent as a legal basis for using location data in targeted advertising altogether. So flawed is the system, in their view, that consumers – for their own protection – must be denied the agency to proffer consent. Sorry coffee lover, no just-in-time coupon for you!
There are reasons to think that position would go too far.
Why Consent Matters in Principle
Here’s a reality check: the right to privacy is not absolute. Even under GDPR, it must be balanced against other fundamental rights, including the freedom to conduct a business.33 This may be why GDPR stops short of an outright ban on the use of sensitive data, consent notwithstanding. Taken too far, such a ban might infringe on the rights of individuals to determine how their personal data (which they own) may be used, and the rights of businesses to use personal data in accordance with the wishes of consenting adults.
Big Improvements in Managing Consents
A protocol is currently being rolled out by a nonprofit consortium of digital advertising businesses, the IAB Tech Lab.34 Known as the Global Privacy Platform (GPP), it establishes a method for digitally recording a consumer’s consent to the use of their data. The resulting “consent string” attaches to the personal data, accompanying it on its journey through the auction houses of cyberspace. Businesses that receive the data also receive the consent string, so there’s little excuse for exceeding consumer permissions.
Universal adoption of the GPP would establish the state-of-the-art in consent management for digital advertising businesses. It would be a significant milestone.
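To make the mechanics concrete, here’s a toy sketch of the idea behind a consent string: a compact, machine-readable record of the consumer’s permissions that travels with the personal data through the ad-tech supply chain. To be clear, the real GPP format is a bit-packed, versioned encoding maintained by the IAB Tech Lab; the field names and the base64/JSON encoding below are hypothetical simplifications, not the actual specification.

```python
import base64
import json
from dataclasses import dataclass, asdict

# Toy model of a "consent string" traveling with personal data.
# The real GPP encoding (IAB Tech Lab) is bit-packed and versioned;
# this JSON-over-base64 scheme is an illustrative stand-in.

@dataclass
class ConsentRecord:
    consumer_id: str
    targeted_ads: bool          # consented to targeted advertising?
    precise_geolocation: bool   # consented to use of precise geolocation?

def encode(record: ConsentRecord) -> str:
    """Pack the record into a URL-safe string that rides along with the data."""
    raw = json.dumps(asdict(record)).encode()
    return base64.urlsafe_b64encode(raw).decode()

def decode(consent_string: str) -> ConsentRecord:
    """A downstream recipient recovers the permissions before using the data."""
    raw = base64.urlsafe_b64decode(consent_string.encode())
    return ConsentRecord(**json.loads(raw))

def may_use_location(consent_string: str) -> bool:
    """Gatekeeping check a recipient would run before serving a location-based ad."""
    rec = decode(consent_string)
    return rec.targeted_ads and rec.precise_geolocation

s = encode(ConsentRecord("abc123", targeted_ads=True, precise_geolocation=False))
print(may_use_location(s))  # → False: geolocation permission was withheld
```

The design point is the one made above: because the permissions are attached to the data itself, every business downstream receives them along with the data, leaving little excuse for exceeding them.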
Give Consent a Chance
From that milestone, improvements in the granularity of consent, and the effectiveness of consent management processes, might soon blow our minds. Or so we are led to expect, at this point in history, the dawn of the AI era. Consent-management “copilot” bots nestled in our pockets like Tinkerbell – only a Luddite would doubt it. Or so it seems.
This is the promised silver bullet: consents so robust and manageable that even the most privacy-conscious consumer might have the confidence to grant them – present company included.
* * * *
When is Location Data Deemed Sensitive?
| FTC “Sensitive Location Data” is precise geolocation data associated with:35 | GDPR “Location Data” becomes Sensitive Data when it reveals or concerns an individual’s: |
| --- | --- |
| Medical facilities | Health |
| Religious organizations | Religious or philosophical beliefs |
| Correctional facilities | Data relating to criminality is not Special Category data under Art.9, but might be effectively bucketed into this column. See Art.10. |
| Labor union offices | Trade union membership |
| Locations held out to the public as predominantly providing education or childcare services to minors | The personal data of children is not Special Category data under Art.9, but might be effectively bucketed into this column. See Art.8 and Recital 75. |
| Locations held out to the public as predominantly providing services to LGBTQ+ individuals such as service organizations, bars and nightlife | Sex life or orientation |
| Locations held out to the public as predominantly providing services based on racial or ethnic origin | Racial or ethnic origin |
| Locations held out to the public as predominantly providing temporary shelter or social services to homeless, survivors of domestic violence, refugees, or immigrants | No direct corollary. But the ordinary risk assessment required for non-sensitive data may result in adding data about homelessness, etc. to this column. See also the previous row, which may apply to data of refugees and immigrants. |
| Locations of public gatherings of individuals during political or social demonstrations, marches, and protests | Political opinions |
| Military installations, offices, or buildings | No direct corollary. But the ordinary risk assessment required for non-sensitive data may result in adding data about military installations, etc. to this column. |
| Similar protections are accorded to the location of an individual’s private residence | No direct corollary, though the ordinary risk assessment required for non-sensitive data may result in adding domicile data to this column. |
1 Typically defined as latitude & longitude coordinates derived from a device such as a cellphone, which place the device at a physical location within a small radius (e.g., a radius of 1,850 feet under the CCPA’s definition of “precise geolocation”).
2 E.g., the EU’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA).
3 Any information that relates to an identified or identifiable human being is personal data.
4 For example, the following (by itself) is not personal information, because the subject is not identifiable: “A person was present in Grand Central Station at 5:30 pm every workday last month.” By contrast, this data (by itself) is personal information, because the subject is likely identifiable: “A person was present at 1234 Residential Lane at 1:00 AM every workday last month.”
5 The risks are to the “rights and freedoms” of the individual whose data is collected.
6 The European Union’s General Data Protection Regulation, the final text of which was published in 2016 (two years before it became effective).
7 “Location data” doesn’t appear among GDPR’s “special categories” of personal data deemed especially sensitive.
8 In reality, happenstance has little to do with it, which is the point.
9 The “only national publication exclusively dedicated to” reproductive rights, according to its website. https://rewirenewsgroup.com/about-us/
10 Coutts, Sharona (2016, May 25). “Anti-Choice Groups Use Smartphone Surveillance to Target ‘Abortion-Minded Women’ During Clinic Visits.” Rewire News Group. https://rewirenewsgroup.com/2016/05/25/anti-choice-groups-deploy-smartphone-surveillance-target-abortion-minded-women-clinic-visits/. See also Office of the Attorney General of the Commonwealth of Massachusetts. (2017, April 4). AG Reaches Settlement with Advertising Company Prohibiting ‘Geofencing’ Around Massachusetts Healthcare Facilities [Press release]. https://www.mass.gov/news/ag-reaches-settlement-with-advertising-company-prohibiting-geofencing-around-massachusetts-healthcare-facilities
11 A Google search for pages that cite to the article returns almost nothing prior to 2022, and little until 2024.
12 The California Consumer Privacy Act.
13 The amendments were made by the California Privacy Rights Act (CPRA).
14 Consumers are given the right to limit the use and disclosure of Sensitive Personal Information to that “which is necessary to perform the services or provide the goods” they requested.
15 Dobbs v. Jackson Women’s Health Organization. Dobbs overturned Roe v. Wade, rescinding the constitutional right of access to abortion.
16 Cox, Joseph (2022, May 3). “Data Broker Is Selling Location Data of People Who Visit Abortion Clinics.” Vice Media. https://www.vice.com/en/article/location-data-abortion-clinics-safegraph-planned-parenthood/.
17 The article surfaced another issue as well: reportedly, the US military had purchased location data of cellphones running apps with names like Muslim Pro and Muslim Mingle. The former is a call-to-prayer app, the latter a dating app.
18 E.g., Tau, Byron; Mollica, Andrew; Haggin, Patience and Volz, Dustin (2023, Oct 13). “How Ads on Your Phone Can Aid Government Surveillance.” Wall Street Journal. https://www.wsj.com/tech/cybersecurity/how-ads-on-your-phone-can-aid-government-surveillance-943bde04. This piece exposes the sale of cellphone location data by Near Intelligence Inc. to private intermediaries that passed it along to the US Department of Defense. On December 8, 2023, Near Intelligence went into bankruptcy. In March of 2024, its assets were sold to a newly-formed company, Azira LLC. As of the date of this article, Near Intelligence continues to exist as a debtor-in-possession under Chapter 11 of the US Bankruptcy Code.
19 Historically the “other three” estates referred to the clergy, the nobility, and the commoners. In modern American usage, they refer to the branches of government.
20 Some of these laws apply to a broader set of healthcare services.
21 The FTC is tasked with enforcing 15 U.S.C. § 45(a), which prohibits “unfair or deceptive acts or practices in or affecting commerce.”
22 At a minimum, the FTC’s phrasing of what’s at stake is novel; comprehensive privacy laws don’t treat locations as sensitive per se, nor do they term the line that separates lawful from unlawful processing as an “invasion” of privacy, nor do they use “privacy” as a defined term-of-art.
23 Complaint (2022), Federal Trade Commission v. Kochava, Inc., Case No. 2:22-cv-377 (United States District Court for the District of Idaho).
24 E.g., Commonwealth of Massachusetts House Bill no. 357 (2023-2024).
25 “Sensitive data” is more commonly referred to as “special category” data. See Recital 10, which equates the two terms. Presumably, the former term is deemphasized to avoid the impression that “regular” personal data isn’t sensitive.
26 Accordingly, a variety of rules may apply across EU member states.
27 In which case the sensitive data may not be used.
28 See the term “explicit” in Art.9(2)(a) (regarding consent to the use of sensitive data). The same term doesn’t appear in Art.7 (regarding consent to the use of non-sensitive data). It’s unclear what difference that makes, since Art.7 doesn’t countenance implicit consents any more than Art.9 does. Nevertheless, the general rules of statutory interpretation deem the difference presumptively substantive. Hence the conclusion that a heightened standard is likely to apply.
29 They must also be empowered to revoke their consent expeditiously.
30 E.g., as applicable, Sensitive Location Data barred by the FTC or “special category” data barred by an EU member state.
31 …that I know of…
32 Including whether they even exist. See, e.g., Compl. ¶ 46, Case No. 2:22-cv-00377-BLW (U.S. Dist. Ct. for the District of Idaho, 07/15/2024) (“Consumers do not expect or want data brokers to collect their precise geolocation… Consumers also do not consent to such collection or disclosure. And because consumers do not know … this data [is being collected], consumers cannot avoid the harm resulting from the collection, use, or subsequent disclosure.”)
33 This is a Fundamental Right under the Charter of Fundamental Rights of the European Union and the Treaty on the Functioning of the European Union. See Recital 4 of GDPR.
34 The Interactive Advertising Bureau Technology Lab.
35 See Proposed Order, In the Matter of Mobilewalla, Inc. (United States of America Before the Federal Trade Commission) https://www.ftc.gov/system/files/ftc_gov/pdf/2023196mobilewallaorder.pdf