CPPA Advances Proposed Regulations for Data Broker Deletion Mechanism

On March 7, 2025, the California Privacy Protection Agency (“CPPA”) voted to advance its proposed data broker regulations concerning the Delete Request and Opt-Out Platform (“DROP”) to formal rulemaking.
The CPPA’s proposed DROP regulations are part of the agency’s efforts to implement California’s Delete Act. The Delete Act requires the CPPA to establish an accessible deletion mechanism that allows consumers, through a single deletion request submitted to the CPPA, to request that registered data brokers delete all non-exempt personal information related to the consumer. The proposed DROP regulations coincide with the California AG’s enforcement sweep targeting the location data industry and recent enforcement activity by the CPPA against data brokers.
The accessible deletion mechanism provisions of the Delete Act apply to a data broker, meaning a business that knowingly collects and sells to third parties the personal information of a consumer with whom the business does not have a direct relationship. Notably, the proposed regulations would change the definition of “direct relationship” to clarify that a business does not have a direct relationship with a consumer simply because it collects personal information directly from the consumer. Instead, for a direct relationship to exist, the consumer must intend to interact with the business. The revision would therefore bring within the scope of covered data brokers those businesses that collect, and sell to third parties, the personal information of consumers who did not intend to interact with them.
It is anticipated that DROP will be accessible to consumers by January 1, 2026, and to data brokers by August 1, 2026.

Navigating DORA Compliance: Recent Developments

The EU Digital Operational Resilience Act (DORA) took effect on 17 January 2025 after a two-year implementation period. DORA sets out new requirements for financial entities (FEs) and their information and communication technology (ICT) third-party service providers (TPPs). This note highlights recent developments in the EU’s efforts to facilitate in-scope firms’ compliance with DORA and authorities’ attempts to avoid duplication of operational resilience requirements.
Further information regarding DORA developments can be found in our previous articles (available here, here, here and here).
EBA Amends Guidelines on ICT and Security Risk Management 
On 11 February 2025, the European Banking Authority (EBA) amended its existing 2019 guidelines on ICT and security risk management measures (Guidelines) to align them with DORA.
The EBA has narrowed the scope of its Guidelines to cover:

only FEs subject to DORA, including credit institutions, payment institutions, account information service providers, exempted payment institutions and exempted e-money institutions; and
relationship management of the payment service users in relation to the provision of payment services.

The EBA’s aim is to simplify the ICT risk management framework and provide legal clarity for the industry, by avoiding duplication of requirements and ensuring consistency across the EU single market.
However, other types of payment service providers (PSPs), such as post-office giro institutions and credit unions, which are not covered by DORA, will still have to comply with the security and operational risk management requirements under the revised Payment Services Directive (PSD2), which has been in force since March 2018. In addition, PSPs that remain subject to the PSD2 security and operational risk management requirements can potentially be subject to additional national requirements.
The Guidelines will apply two months after the publication of the translated versions.
The Guidelines and accompanying press release are available here and here, respectively.
Commission Adopts Delegated Regulation on Threat-led Penetration Testing under DORA
On 11 February 2025, the European Central Bank (ECB) published an updated version of its framework for threat intelligence-based ethical red teaming (TIBER-EU Framework) that aligns with the DORA regulatory technical standards on threat-led penetration testing (TLPT RTS). This follows the ECB’s publication of a paper considering the TIBER-EU Framework in the context of DORA. Further information on this earlier paper can be found in our previous article (available here).
DORA mandates the European Supervisory Authorities (ESAs), together with the ECB, to develop draft RTS in accordance with the TIBER-EU Framework to specify the following:

the criteria to identify FEs required to perform TLPT;
the requirements regarding test scope, testing methodology and results of TLPT;
the requirements and standards governing the use of internal testers; and
the rules on supervisory and other co-operation needed for the implementation of TLPT and for mutual recognition of testing. 

On 13 February 2025, the European Commission (Commission) adopted a delegated regulation (Delegated Regulation), with accompanying annexes 1-8, supplementing DORA in relation to the TLPT RTS. The Delegated Regulation will enter into force and apply 20 days after its publication in the Official Journal of the European Union.
The Delegated Regulation and updated version of the TIBER-EU Framework are available here and here, respectively. 
ESAs Publish Roadmap on the Designation of CTPPs under DORA
On 18 February 2025, the ESAs published a roadmap (Roadmap) for the designation of critical ICT TPPs (CTPPs), which will be subject to direct EU supervision under DORA.
Notably, the Roadmap sets out four steps to designation of CTPPs in 2025:

by 30 April 2025, the ESAs will collect the registers of information on ICT third-party arrangements submitted by FEs to their national competent authorities;
by the end of July 2025, the ESAs will perform the criticality assessments mandated by DORA and notify ICT TPPs if they are classified as critical;
by mid-September 2025, there will be a six-week hearing period where TPPs can object to the assessment, with a reasoned statement and supporting information; and
by the end of 2025, the ESAs will have designated and published a list of CTPPs and commenced oversight engagement. 

The accompanying press release notes that TPPs that are not designated as critical can voluntarily request to be designated once the list of CTPPs is published, with details on how to raise such a request to be provided soon. 
The ESAs expect to organise an online workshop with TPPs in Q2 2025 to provide further clarity on preparatory activities, the designation process and the ESAs’ oversight approach.
The Roadmap and press release are available here and here, respectively.
Delegated and Implementing Regulations on Major ICT-Related Incidents and Cyber Threats Under DORA Published
On 20 February 2025, Delegated and Implementing Regulations (together, the Regulations) supplementing DORA were published in the Official Journal of the European Union, setting out the detailed requirements and procedures for reporting and notifying ICT-related incidents and cyber threats. The Commission adopted the Regulations in October 2024.

The Delegated Regulation specifies the content and time limits for the initial notification of, and intermediate and final report on, major ICT-related incidents by FEs, and the content of the voluntary notification for significant cyber threats. 
The Implementing Regulation sets out the standard forms, templates and procedures for FEs to report a major ICT-related incident and to notify a significant cyber threat.

Both Regulations will enter into force on 12 March 2025.
The Regulations are available here and here, respectively.
ESAs Publish Opinion on Commission’s Rejection of Draft RTS on Sub-contracting ICT Services Supporting Critical or Important Functions
On 7 March 2025, the ESAs published an opinion (Opinion) on the Commission’s rejection of their draft RTS on the elements an FE needs to determine and assess when sub-contracting ICT services supporting critical or important functions.
The Commission notified the ESAs that it had rejected the draft RTS in January 2025 on the basis that certain requirements introduced by the draft RTS went beyond the mandate given to the ESAs under DORA. The Commission noted that Article 5 of the draft RTS, and the related recital 5, should be removed from the draft RTS. The Commission then stated it would adopt the RTS once the ESAs had made the necessary modifications.
In the Opinion, the ESAs acknowledge that the Commission’s amendments will ensure that the draft RTS are fully in line with their mandate under DORA. The ESAs do not recommend changes to the Commission’s proposed amendments. They note that FEs are expected to adhere to the provisions on subcontractors as set out in Article 29(2) of DORA and Article 3(6) of the implementing technical standards on the register of information.
The Opinion is available here.

BEHIND THE FILTERS: CapCut And TikTok Are Making The Cut And Maybe Your Personal Data Too

Greetings CIPAWorld!
Let’s get techy with it. Ever edited a TikTok or Instagram Reel using CapCut? It turns out that you might have handed over more than just your creativity. The Northern District of Illinois has delivered a mixed but consequential ruling in Rodriguez v. ByteDance, Inc., about how video editing apps collect and utilize our personal data. See Rodriguez v. ByteDance, Inc., No. 23 CV 4953, 2025 U.S. Dist. LEXIS 37355 (N.D. Ill. Mar. 3, 2025). You guessed it. TikTok is at issue here. If you’ve ever used CapCut to perfect a TikTok video or Instagram reel, this decision deserves your attention!
Yikes. Imagine editing a quick vacation video only to discover the app might be scanning every photo in your gallery and capturing your facial features! Those are precisely the kinds of privacy implications at the center of this case. Make sure to always check your app permissions!
The Opinion in ByteDance, Inc. offers a nuanced examination of modern privacy law. It allows several significant claims to proceed while dismissing others. So, let’s get into a brief background first.
CapCut, developed and operated by Chinese technology giant ByteDance (which also owns TikTok), has exploded in popularity since its 2020 U.S. launch. Now, it’s one of the most downloaded apps globally, with over 200 million monthly active users! CapCut allows users to create, edit, and customize videos using templates, filters, and visual effects. Everyone wants to look good, right? While predominantly free, users can access premium features through subscription models. It’s remarkable how quickly CapCut became essential for content creators, yet this case forces us to confront the reality that the most user-friendly tools might also be the most invasive.
However, according to the Plaintiffs, this seemingly innocent video editor allegedly harbors a more problematic function—collecting vast amounts of user data without proper authorization. The Complaint alleges that CapCut collects everything from registration information and social network contacts to location data, photos, videos, and even biometric identifiers like face geometry scans and voiceprints. Yes, you read that right… biometric identifiers.
First, the Court’s reasoning behind allowing the California constitutional and common law privacy claims to proceed reveals evolving judicial thinking about digital privacy. Judge Alexakis emphasized that privacy violations don’t depend solely on the sensitivity of the content collected but also on the manner of collection. See Davis v. Facebook, Inc. (In re Facebook Inc. Internet Tracking Litig.), 956 F.3d 589, 603 (9th Cir. 2020).
Many of us miss this critical distinction in our everyday tech interactions. We often focus on what data apps collect rather than how they collect it. The Court’s analysis suggests that even innocuous data could trigger privacy concerns if gathered through deceptive or overly invasive methods—a crucial lesson for developers and users alike.
Here, the Judge found the allegations that CapCut accesses and collects all the videos and photos stored on users’ devices, not just those they voluntarily uploaded to the CapCut app, particularly troubling. If proven true, this broad data collection practice would violate reasonable user expectations. Ringing any bells here? Drawing parallels to Riley v. California, 573 U.S. 373, 397-99 (2014), which recognized that individuals have a reasonable expectation of privacy in the contents of their cell phones, Judge Alexakis noted that a reasonable CapCut user would not expect the app to access and collect all the photos and videos on their devices, regardless of whether they use those photos and videos to create content within the app. Makes perfect sense, right?
Let that sink in for a minute. An app potentially scans your entire photo library when you only intend to edit a single clip! This broad access would be like handing a stranger your family photo album when they only ask to see one vacation picture. The Court rightly recognized how this violates our intuitive sense of privacy.
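For contrast, modern mobile platforms already support narrowly scoped media access. The Swift sketch below is purely illustrative (it is not CapCut’s code and is not drawn from the opinion, and the class name is our invention); it uses Apple’s PHPickerViewController, which runs in a separate system process and hands an app only the single clip the user affirmatively selects, with no blanket photo-library permission at all:

```swift
import PhotosUI
import UniformTypeIdentifiers

// Hypothetical illustration of scoped media access. PHPickerViewController
// runs out-of-process, so the app receives only the items the user picks
// and never needs full photo-library authorization.
final class ClipPickerCoordinator: NSObject, PHPickerViewControllerDelegate {

    // Build a picker scoped to a single video.
    func makePicker() -> PHPickerViewController {
        var config = PHPickerConfiguration()
        config.filter = .videos       // only surface videos to the user
        config.selectionLimit = 1     // one clip, not the whole camera roll
        let picker = PHPickerViewController(configuration: config)
        picker.delegate = self
        return picker
    }

    // Only the user's selection ever reaches the app.
    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        picker.dismiss(animated: true)
        guard let provider = results.first?.itemProvider,
              provider.hasItemConformingToTypeIdentifier(UTType.movie.identifier) else { return }
        _ = provider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, _ in
            // Process the single chosen clip; the rest of the library stays off limits.
            print("User granted access to:", url?.lastPathComponent ?? "nothing")
        }
    }
}
```

The design point is the one the court made: an editor built this way simply cannot see the rest of your camera roll, which is exactly the expectation a reasonable user brings to the app.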
For the California constitutional and common-law privacy claims regarding user identifiers and registration information, the Court specifically relied on United States v. Soybel, 13 F.4th 584, 590-91 (7th Cir. 2021) for the principle that a person “has no legitimate expectation of privacy in information he voluntarily turns over to third parties,” which was central to dismissing claims based on this type of information.
Next, the California Invasion of Privacy Act (“CIPA”) claims represented a significant but ultimately unsuccessful component of Plaintiffs’ case. As we know, under CIPA, individuals are protected against unauthorized electronic interception of communications. Section 631(a) prohibits any person from using electronic means to “learn the contents or meaning” of any “communication” without consent or in an “unauthorized manner.” Critically, neither CIPA nor the federal Electronic Communications Privacy Act (“ECPA”) imposes liability on a party to the communication; as Judge Alexakis noted, Warden v. Kahn, 99 Cal. App. 3d 805, 811, 160 Cal. Rptr. 471 (1979), held that section 631 “has been held to apply only to eavesdropping by a third party and not to recording by a participant to a conversation.”
Here’s where Plaintiffs ran into a fascinating legal hurdle. When you voluntarily use an app, the law often treats that app as a communication “participant” rather than an eavesdropper. Think about how different this is from our intuitive understanding… Few of us would consider a video editor an equal “participant” in our creative process, yet that’s essentially the legal fiction applied here.
Plaintiffs attempted to circumvent this limitation by asserting that Defendants effectively intercepted their data by “redirecting” communications to unauthorized third parties, including the Chinese Communist Party. They relied on legal authorities like Davis, 956 F.3d at 596, 607-08, where Facebook used plugins to track browsing histories even after users logged out.
Conversely, Judge Alexakis found two factual flaws in this theory. First, Plaintiffs failed to plausibly allege that any communications were intercepted during transmission rather than merely shared after collection. Though the Court acknowledged that the Seventh Circuit hadn’t definitively ruled whether interception must be contemporaneous with transmission, it noted that every court of appeals to consider the issue had concluded that it must. See Peters v. Mundelein Consol. High Sch. Dist. No. 120, No. 21 C 0336, 2022 WL 393572, at *11 (N.D. Ill. Feb. 9, 2022). Second, Plaintiffs’ allegations fell short of the specific software-tracking mechanisms found sufficient in cases like Facebook Tracking. They identified no particular mechanism by which ByteDance contemporaneously redirected communications to third parties.
Let’s dig a little deeper so this makes sense. There’s a (legal) difference between an app intercepting your data in transit (like wiretapping a phone call) versus collecting it at the endpoint and sharing it later. Most individuals would see little practical difference in the outcome. Your private data ends up in unexpected hands either way, yet courts maintain this technical distinction that significantly impacts your legal protections.
Next, the Court rejected claims under Section 632 of CIPA, which imposes liability on parties who use an electronic amplifying or recording device to eavesdrop upon or record confidential communication. Beyond the conclusory assertions that ByteDance intercepted and recorded videos without consent, Plaintiffs failed to allege that Defendants used any electronic amplifying or recording device to eavesdrop on conversations.
Let’s switch it up now. We are going to talk about something a little different. Perhaps the most meaningful survival in the decision concerns claims under Illinois’ Biometric Information Privacy Act (“BIPA”). The Court rejected ByteDance’s argument that BIPA only applies when companies use biometric data to identify individuals. Looking at the statute’s plain language, which defines “biometric identifier” to include “voiceprint[s]” and “scan[s] of… face geometry,” the Court found no requirement that the data be used for identification purposes.
This is genuinely interesting to me. Illinois lawmakers created one of the strongest biometric protection laws in the country, and the Court’s ruling reinforces just how far those protections extend. The practical effect here is enormous. Companies can’t just escape liability by claiming they collected your facial geometry or voiceprints for purposes other than identification. The mere collection without proper consent is enough to trigger liability. This reasoning aligns with an emerging consensus in the Northern District of Illinois. In Konow v. Brink’s, Inc., 721 F. Supp. 3d 752, 755 (N.D. Ill. 2024), the Court held that a defendant may violate BIPA without using technology to identify an individual; instead, BIPA bars the collection of biometric data that could be used to identify a plaintiff.
The Court also found persuasive Plaintiffs’ detailed allegations that ByteDance employs engineers specializing in “computer vision, convolutional neural network, and machine learning, all of which are used to generate the face geometry scans that Defendants derive from the videos of CapCut users.” These technical specifications helped elevate the claims beyond mere conclusory allegations.
What’s particularly impressive here is how Plaintiffs connected the dots between ByteDance’s engineering talent, their patent applications for voiceprint technology, and the actual functions of CapCut. This level of technical detail is increasingly necessary in privacy litigation. In turn, vague claims of data collection often fail without demonstrating the underlying mechanisms involved.
For BIPA’s Section 15(c) claim, the Court relied on the statutory interpretation principle of ejusdem generis in interpreting “otherwise profit.” In Circuit City Stores v. Adams, 532 U.S. 105, 114-15, 121 S. Ct. 1302, 149 L. Ed. 2d 234 (2001), the Supreme Court noted that when general words follow specific words, they “embrace only objects similar in nature to those objects enumerated by the preceding specific words.” This interpretive principle was key to the Court’s narrower reading of the statute, limiting “otherwise profit” to commercial transactions similar to selling, leasing, or trading data. Accordingly, the Court rejected ByteDance’s argument that internal use of biometric data—such as improving CapCut’s editing features—constitutes “otherwise profiting” under BIPA. It is fascinating how this determination narrows the scope of liability under Section 15(c), signaling that plaintiffs must show an actual external transaction involving biometric data to succeed on these claims.
Next, the Court analyzed various consumer protection claims that failed because Plaintiffs couldn’t demonstrate economic injury. For their claims under California’s Unfair Competition Law (“UCL”) and False Advertising Law (“FAL”), Judge Alexakis emphasized that, unlike the broader Article III standing requirements, these statutes demand a showing that plaintiffs lost money or property. This highlights one of the most frustrating aspects of privacy litigation for consumers: proving financial harm from privacy violations is extraordinarily difficult. We intuitively understand that our personal data has value (why else would companies collect it so aggressively?). Yet, courts often struggle to quantify it or recognize its loss as economic injury. It’s like recognizing theft only when something tangible is taken.
The Court was particularly unpersuaded by theories based on the diminished value of personal data, noting Plaintiffs hadn’t alleged they attempted to sell their data or received less than market value. Davis, 956 F.3d at 599, rejected a similar argument that a loss of control over personal data constituted economic harm. The Court also referenced Cahen v. Toyota Motor Corp., 717 F. App’x 720, 723 (9th Cir. 2017), which held that speculative claims about diminished data value, without concrete evidence of lost economic opportunity, are insufficient to establish standing under consumer protection statutes.
Moreover, like in Griffith v. TikTok, Inc., No. 5:23-cv-00964-SB-E, 2023 U.S. Dist. LEXIS 223098, at *6 (C.D. Cal. Dec. 13, 2023), the Court observed that Plaintiffs failed to show they attempted or intended to participate in the market for their data. Additionally, the Court noted that Plaintiffs failed to allege any direct financial loss tied to CapCut’s data practices, distinguishing their claims from cases where courts recognized economic harm due to specific monetary expenditures, such as fraudulent charges or paid services rendered worthless by deceptive conduct.
Next, the Court turned to the core issue underlying many of Plaintiffs’ claims—what ByteDance actually did with the data it collected and whether users had truly consented to these practices. One of the most fascinating aspects of the Opinion is the battle over consent. ByteDance mounted an aggressive defense centered on its Terms of Service and Privacy Policies, arguing that users effectively waived their rights by agreeing to these documents. The company submitted three distinct versions of its Privacy Policy from 2020, 2022, and 2023, each making various disclosures about data collection practices.
Let’s be honest… When was the last time any of us actually read a privacy policy before clicking “agree”? Side note: you should. ByteDance, like many tech companies, is staking its legal defense on documents it knows full well most users never read. What’s remarkable is that courts are increasingly skeptical of this fiction, recognizing the reality of how users actually interact with digital products.
Judge Alexakis’s detailed analysis here offers insights for both app developers and users alike. She recognized that while the policies could potentially be incorporated by reference since Plaintiffs mentioned them in their Complaint, she refused to dismiss the case based on consent at this early stage. The Court emphasized that dismissing claims based on affirmative defenses like waiver is only appropriate if “the allegations of the complaint itself set forth everything necessary to satisfy the affirmative defense.” See United States v. Lewis, 411 F.3d 838, 842 (7th Cir. 2005).
Here, there are several critical factual questions that prevented the Court from accepting ByteDance’s consent defense. Most notably, there was no conclusive evidence about exactly when and how the plaintiffs agreed to the terms. While ByteDance simply asserted that “[u]sers expressly or impliedly consent to the policy upon downloading and using the app,” Plaintiffs countered that they “were able to access the CapCut platform without having to scroll through and read such policies before they were allowed to sign up for the services.”
The Court was particularly skeptical of ByteDance’s reliance on what appeared to be a browsewrap agreement—where terms of service are presented passively and users are presumed to agree simply by using the service. Judge Alexakis emphasized that browsewrap agreements are only enforceable if users have actual or constructive notice of the terms. See Specht v. Netscape Commc’ns Corp., 306 F.3d 17, 30-31 (2d Cir. 2002). This means that merely linking to a privacy policy at the bottom of a webpage or app interface is insufficient to establish consent. As such, actual consent requires more than theoretical access to terms.
Additionally, the Court noted that the placement and formatting of ByteDance’s consent prompts were unclear, raising doubts about whether the plaintiffs were ever explicitly informed of the policy’s existence before using CapCut. This aligns with precedent from Nguyen v. Barnes & Noble Inc., 763 F.3d 1171, 1177 (9th Cir. 2014), where courts declined to enforce arbitration clauses hidden in inconspicuous terms of service.
This dispute highlights a pervasive problem I’m recognizing in digital consent. There is a gap between technical legal compliance and actual user understanding. Despite ByteDance presenting screenshots showing a prompt requiring users to click “Agree and continue,” Judge Alexakis noted this evidence couldn’t establish whether these particular plaintiffs had seen and agreed to these specific terms, especially because they explicitly alleged they never read any privacy policy or terms of use.
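For product teams wondering what this distinction looks like in practice, the browsewrap/clickwrap line maps onto a concrete UI pattern. The SwiftUI sketch below is a hypothetical illustration of a clickwrap gate (the view, property names, and terms source are all our own inventions, not ByteDance’s actual flow):

```swift
import SwiftUI

// A minimal "clickwrap" sketch: the terms are displayed in-line and the user
// must take an affirmative, recorded action before proceeding, which is the
// opposite of passive, browsewrap-style assent.
struct ConsentGate: View {
    @State private var agreed = false
    let termsText: String  // hypothetical: supplied by your legal team

    var body: some View {
        VStack(spacing: 16) {
            ScrollView {
                Text(termsText).padding()  // terms actually shown, not merely linked
            }
            Toggle("I have read and agree to the Terms of Service and Privacy Policy",
                   isOn: $agreed)
                .padding(.horizontal)
            Button("Agree and continue") {
                // Record a timestamped consent event before unlocking the app,
                // preserving evidence of when and how this user assented.
                print("Consent recorded at \(Date())")
            }
            .disabled(!agreed)  // no assent, no access
        }
    }
}
```

The logging step matters as much as the button: the evidentiary gap in this case was precisely that ByteDance could not show when and how these particular plaintiffs agreed to these particular terms.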
The Court also highlighted another crucial factual gap. Plaintiffs asserted that the attached policies were only three of the ten (or more) versions of the policy that existed over time. In Patterson v. Respondus, Inc., 593 F. Supp. 3d 783, 805 (N.D. Ill. 2022), the Court declined to dismiss claims based on policies because the case “may involve factual questions about what [defendant’s] policies looked like at different moments in time.”
The Court’s skepticism should serve as a wake-up call. The era of concealing invasive data practices within complicated legal documents and merely asserting user consent may be coming to an end. Companies that are truly dedicated to privacy must go beyond minimal compliance and strive for actual transparency and meaningful choices for users.
For the Computer Fraud and Abuse Act (“CFAA”) claim (Count I), the Court undertook an analysis of the “access without authorization” element. The CFAA, which was originally enacted to combat hacking, imposes liability on anyone who intentionally accesses a computer without authorization or exceeds authorized access to obtain information. While finding that Plaintiffs’ allegations were insufficient, Judge Alexakis specifically distinguished this matter from Brodsky v. Apple Inc., No. 19-CV-00712-LHK, 2019 WL 4141936 (N.D. Cal. Aug. 30, 2019), where the plaintiffs had “concede[d] that [they] voluntarily installed the software update,” which unambiguously established authorization. Here, the Court emphasized that essential questions of fact exist about the scope of the authorization and the design of Plaintiffs’ operating systems, making dismissal based on implied authorization inappropriate at this stage. The Court noted that factual disputes, such as the scope of Defendants’ access, are not appropriately resolved on a motion to dismiss.
Particularly, the Court declined to follow cases like hiQ Labs, Inc. v. LinkedIn Corp., 31 F.4th 1180, 1197 (9th Cir. 2022), which held that scraping publicly available data does not constitute unauthorized access under the CFAA. Unlike hiQ Labs, where access restrictions were clear, Plaintiffs alleged that CapCut accessed files beyond what they knowingly permitted. However, the Court found that Plaintiffs failed to sufficiently allege that ByteDance exceeded authorized access under Van Buren v. United States, 593 U.S. 374 (2021), which clarified that merely misusing information one is entitled to access does not violate the CFAA.
In dismissing the Stored Communications Act (“SCA”) claims (Count IV), Judge Alexakis found particularly significant the timing mismatch in Plaintiffs’ allegations about data sharing with the Chinese Communist Party (“CCP”). The Court notably observed that even if ByteDance shared user communications with the CCP in 2018 (as alleged by a former employee cited in Plaintiffs’ Complaint), it is too much to presume based on the engineer’s statement that these activities were ongoing several years later when CapCut became available to users in the United States. This temporal gap and the lack of specificity about what data was shared rendered the allegations too speculative to survive dismissal.
The Court also found that Plaintiffs failed to allege that ByteDance qualified as a remote computing service (“RCS”) or electronic communications service (“ECS”) under the SCA. Under the SCA, an ECS is any service that provides users with the ability to send or receive wire or electronic communications, while an RCS is a service that provides computer storage or processing services to the public by means of an electronic communications system. In Garcia v. City of Laredo, 702 F.3d 788, 792 (5th Cir. 2012), the Court noted that for a company to be liable under the SCA, it must provide services that facilitate the transmission, storage, or processing of electronic communications on behalf of users—not merely collect and store user data for its own purposes. Because Plaintiffs did not establish that CapCut functioned as an ECS or RCS, their SCA claims failed as a matter of law.
Lastly, Judge Alexakis granted Plaintiffs until April 2, 2025, to file an amended complaint addressing deficiencies in their dismissed claims. So whether you’re a legal professional, casual content creator, or simply concerned about data privacy, as you should be, the ongoing developments in Rodriguez v. ByteDance merit your continued attention.
So, all in all, I’m particularly encouraged by how the Court emphasized consumer expectations throughout its analysis. This suggests a shift from formalistic legal reasoning toward how privacy functions in people’s lives. Most users have never heard of BIPA or CIPA, but they instinctively recognize when an app crosses a line and invades their privacy.
For everyday app users (myself included), this case is a reminder that seemingly innocuous tools like video editors may be far more invasive than they appear. The allegations that CapCut collects all photos and videos on a device—not just those edited—should give pause to anyone who casually grants broad permissions during app installation.
So be careful out there, folks. Next time you download that trending app, consider this before blindly agreeing to permission requests. That innocent-looking video editor might, allegedly, be analyzing your face, recording your voice, or scanning through years of personal photos—all while you’re just trying to add a filter to your weekend outing with family and friends. Scary stuff.
And yet, whether you have any real recourse if an app oversteps its bounds depends entirely on which law—if any—happens to apply. The fact that some claims survived in this matter while others failed underscores the fragmented and inconsistent nature of privacy law in the U.S. Right now, a company’s liability for invasive data collection often hinges on whether a lawsuit is filed under a state biometric law, a consumer protection statute, or federal wiretap regulations—each with different requirements and loopholes. This patchwork approach leaves consumers vulnerable and businesses uncertain about compliance.
Imagine if physical property rights varied so drastically between states—where some protected against trespassing while others only recognized theft if an item was taken. That’s essentially our current digital privacy landscape, and without unified standards, the gaps in protection will only widen.
As always,
Keep it legal, keep it smart, and stay ahead of the game.
Talk soon!

Why Financial Institutions Should Stay the Course

Introduction
Many regulated businesses believe that the only thing worse than strict regulations is a wholly uncertain regulatory environment. With many rule changes on hold and enforcement actions and investigations being terminated or limited, how do banks, payments program managers, processors, and fintechs move forward? Do they “take their gloves off” and take advantage of a possible enforcement void to maximize profits, or do they stay the course given that there are 50-year-old laws on the books that still apply and probably are not going anywhere? 
We say, continue to innovate with the expectation that certain fundamental laws and rules are unlikely to change and that consumers still want and need financial services and products.
The Resilience of Statutes and Regulations
Most financial institutions, payments companies, and fintechs have always designed their products and services for compliance. When new rules and orders come out, they often do not have to make changes because they already have robust compliance programs in place and have been using best practices. Similarly, they are not quick to take advantage of a “bad” ruling, knowing instinctively that a new statute, order, or ruling will soon restore the status quo.
Even in a time of regulatory uncertainty, the primary federal consumer protection rules that have existed since the late 1960s and 1970s are likely to stay in place. These include the following: 

The Truth in Lending Act (TILA) and its Regulation Z, which, among other things, require loan disclosures, periodic statements for open-end credit, and prepaid account disclosures, and provide consumers with protections from unauthorized credit card transactions. 
The Electronic Fund Transfers Act (EFTA) and its Regulation E, requiring initial disclosures, regulating electronic fund transfer (EFT) arrangements, and providing significant consumer protections from unauthorized EFTs.
The Equal Credit Opportunity Act (ECOA) and its Regulation B, prohibiting impermissible forms of credit discrimination and requiring “adverse action” notices or other notifications regarding credit applications and existing extensions of credit. While the scope of the impermissible discrimination rules may change from time to time, including as a result of court decisions, the basic credit notification requirements are unlikely to change.
The Truth in Savings Act and its Regulation DD, which requires initial disclosures for consumer deposit accounts and, if statements are provided, requires specific information to be included in such statements.
The Real Estate Settlement Procedures Act (RESPA) and its Regulation X. In addition to requiring certain mortgage loan disclosures, Section 8 of RESPA prohibits referral fee and kickback arrangements involving “settlement services.” Here is one area for which the rules might be relaxed. For many years, the ability to enter into marketing services agreements and similar arrangements has been severely limited due to the Section 8 interpretations and enforcement actions of the Consumer Financial Protection Bureau (CFPB). With the CFPB being under new leadership and its future uncertain, marketing arrangements that survived Section 8 scrutiny prior to the CFPB might again be viable. 

For all of the above, while enforcement by federal regulators might be reduced, enforcement by plaintiffs’ lawyers likely will not. This seems particularly likely for those laws such as TILA, the EFTA, and ECOA that provide for class-action liability. 
State laws governing credit interest rates, loan and other product and service fees, and consumer disclosures also are likely to stay in place. Those laws might shift in some states, particularly where requirements were made more burdensome in recent years, but they are unlikely to go away entirely.
States May Fill the Void
All of the federal laws listed above are “federal consumer financial laws” under the Dodd-Frank Act, and state attorneys general and state regulators are empowered by that act to bring a civil action to enforce any of these laws. The main exception is that a state attorney general or regulator generally may not bring such civil actions against a national bank or federal savings association.
Conclusion
Although there may be some regulatory uncertainty, some things remain constant. Lawyers will be lawyers and lawsuits will be brought, and state attorneys general and regulators can enforce the federal consumer financial laws against most banks and nonbank businesses. 
It is just a question of complying with the existing laws, applying common sense rules, and developing attractive consumer options. We are not without regulatory guardrails, but old-fashioned banking with modern innovations still provides routes to develop and market consumer products and services and build customer relationships. Those businesses that continue to innovate can take the lead.

FCC Seeks Comment on Quiet Hours and Marketing Messages

We recently published a blog about a slew of class action complaints alleging that marketing text messages cannot be sent between the hours of 9:00 pm and 8:00 am (“Quiet Hours”) unless the recipient provides prior express invitation or permission to receive such messages during Quiet Hours (“Quiet Hour Claims”). As noted, based on the plain language of the Telephone Consumer Protection Act (“TCPA”), we disagree with this argument because marketing text messages already require prior express written consent from the called party. The Ecommerce Innovation Alliance (EIA) and others filed a petition for declaratory ruling (“Petition”) with the Federal Communications Commission (“FCC”) to address this application of Quiet Hours to marketing messages.
On March 11, 2025, the FCC released a Public Notice asking for comment on the Petition. The FCC, through its Consumer and Governmental Affairs Bureau, has thus moved quickly to seek public comment on the questions raised by the petitioners.
Initial comments are due by April 10, with reply comments due by April 25. The FCC will then consider the record in contemplating a decision. There is no requirement or specific deadline for the agency to take action on the Petition. However, the plethora of Quiet Hour Claims being filed could encourage relatively prompt FCC action to clarify the rules.
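As a practical matter, while the Petition is pending, many senders simply gate outbound marketing texts on the recipient’s local time. The Swift sketch below is a minimal illustration under one large assumption, namely that the sender reliably knows each recipient’s time zone, which the pending litigation suggests is itself a contested point:

```swift
import Foundation

// A minimal sketch, assuming the sender already knows the recipient's time
// zone: suppress sends that would land between 9:00 pm and 8:00 am local time.
func isWithinQuietHours(at date: Date = Date(), timeZone: TimeZone) -> Bool {
    var calendar = Calendar(identifier: .gregorian)
    calendar.timeZone = timeZone
    let hour = calendar.component(.hour, from: date)
    return hour >= 21 || hour < 8  // 9pm to midnight, or midnight to 8am
}

// Usage: hold the message when the check trips.
let recipientZone = TimeZone(identifier: "America/Chicago")!
if isWithinQuietHours(timeZone: recipientZone) {
    print("Queue the text for the next permissible window")
} else {
    print("OK to send now")
}
```

Whether such a check is legally required for messages sent with prior express written consent is, of course, exactly the question the Petition asks the FCC to answer.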

ANOTHER FCC TURN OF EVENTS: Commissioner Starks Resigns

As we still await the approval of Olivia Trusty to fill the seat vacated when Chairwoman Rosenworcel stepped down and Commissioner Carr entered the role of Chairman, there is a new development as of today: Commissioner Starks has stated that he plans to resign his seat in the spring. While there is no exact timeline, Starks did mention fulfilling his duties over the “next few weeks”.
Starks said of his time at the FCC: “Serving the American people as a Commissioner on the Federal Communications Commission has been the honor of my life. With my extraordinary fellow Commissioners and the incredible career staff at the agency, we have worked hard to connect all Americans, promote innovation, protect consumers, and ensure national security. I have learned so much from my time in this position, particularly when I have heard directly from Americans on the issues that matter to them. I have been inspired by the passion, engagement and commitment I have seen from colleagues, advocates, and industry.”
Chairman Carr followed up with his own notice and shared the following in response to Starks’ resignation: “Commissioner Starks led many of the FCC’s national security initiatives, and I welcomed the chance to work closely with him on important matters, including promoting new innovations, protecting consumers, and bringing families across the digital divide. Commissioner Starks put in the work and leaves an impressive legacy of accomplishments in public service. I always learned a lot from him and benefited from the many events we held together.”
Does this now leave the FCC with only one Democrat? We will soon find out but since the role is appointed by the President, I have a feeling it will be heavily weighted with Republicans.

Arkansas Attorney General Sues GM and OnStar Over Alleged Privacy Violations

On February 26, 2025, the Attorney General of Arkansas filed a lawsuit against General Motors Co. (“GM”) and its subsidiary, OnStar LLC (“OnStar”), alleging deceptive trade practices related to the collection and sale of drivers’ data. The complaint alleges that GM and OnStar gathered detailed driving data (including precise geolocation data, GM app usage data, and information about consumers’ driving behavior (e.g., start time, end time, vehicle speed, high-speed driving percentage, late-night driving percentage, acceleration data, braking data, and distance driven)) from over 100,000 Arkansas residents without their consent and sold it to third-party data brokers. The data brokers then allegedly sold the data to insurance companies, which used the data to deny coverage or increase insurance rates for consumers. The complaint asserts that GM and OnStar collected and sold the consumer data to generate additional revenue for the companies. The Arkansas Attorney General is seeking monetary damages, injunctive relief, and attorneys’ fees and expenses.
This lawsuit follows actions by the FTC and the Texas Attorney General over similar data-sharing allegations, and is part of a larger trend of state regulators examining the privacy practices of connected vehicle manufacturers.

Virginia Legislature Passes Bill Restricting Minors’ Use Of Social Media to One Hour Per Day

On March 11, 2025, the Virginia legislature passed a bill that would amend the Virginia Consumer Data Protection Act (“VCDPA”) to impose significant restrictions on minors’ use of social media. The bill is pending signature by Virginia Governor Glenn Youngkin, who has until March 24, 2025 to sign it into law. The bill comes on the heels of recent children’s privacy amendments to the VCDPA that took effect on January 1, 2025.
If signed into law, the bill would amend the VCDPA to require social media platform operators to (1) use commercially reasonable methods (such as a neutral age screen) to determine whether a user is a minor under the age of 16 and (2) limit a minor’s use of the social media platform to one hour per day, unless a parent consents to increase the limit. The bill would prohibit social media platform operators from altering the quality or price of any social media service due to the law’s time use restrictions.
If signed into law, the amendments to the VCDPA would take effect on January 1, 2026.

Gender-Affirming Care Protections Eroded by Recent HHS Guidance and White House Executive Orders

On February 20, 2025, the U.S. Department of Health and Human Services (HHS) Office for Civil Rights (OCR) announced the rescission of “HHS Notice and Guidance on Gender Affirming Care, Civil Rights, and Patient Privacy” (the “Rescinded 2022 Guidance”) pursuant to recent Executive Order (“EO”) 14187 (“Protecting Children from Chemical and Surgical Mutilation”) and EO 14168 (“Defending Women from Gender Ideology Extremism and Restoring Biological Truth to the Federal Government”), issued under the current Trump administration.
These executive orders directed HHS to revoke policies promoting gender-affirming care and reconsider its interpretation of civil rights protections and health information privacy laws as they relate to such care.
Background on the Rescinded 2022 Guidance
The Rescinded 2022 Guidance, originally issued on March 2, 2022 under the Biden administration, and which we previously discussed here, established a framework for applying federal civil rights protections and patient privacy laws to gender-affirming care in three key ways:

Section 1557 of the Affordable Care Act (ACA): The Rescinded 2022 Guidance asserted that federally funded entities restricting access to gender-affirming care could be in violation of Section 1557, which prohibits discrimination based on sex, including gender identity.
Section 504 of the Rehabilitation Act and the Americans with Disabilities Act (ADA): The Rescinded 2022 Guidance took the position that gender dysphoria could qualify as a disability, meaning that restricting access to care based on gender dysphoria could constitute unlawful discrimination.
Health Insurance Portability and Accountability Act of 1996 (HIPAA): The Rescinded 2022 Guidance interpreted HIPAA’s Privacy Rule to prohibit the disclosure of protected health information (PHI) related to gender-affirming care without the patient’s authorization, except in limited circumstances when explicitly required by law.

HHS Bases for the Rescission
OCR Acting Director Anthony Archeval stated that the “rescission is a significant step to align civil rights and health information privacy enforcement with a core Administrative policy that recognizes that there are only two sexes: male and female.” The HHS Office on Women’s Health also issued guidance expanding on the sex-based definitions set forth in EO 14168. This HHS guidance contained the following definitions:

Sex: A person’s immutable biological classification as either male or female.
Female: A person of the sex characterized by a reproductive system with the biological function of producing eggs (ova). We note that EO 14168 defines female in a slightly different manner to mean “a person belonging, at conception, to the sex that produces the large reproductive cell.”
Male: A person of the sex characterized by a reproductive system with the biological function of producing sperm. We note that EO 14168 defines male in a slightly different manner to mean “a person belonging, at conception, to the sex that produces the small reproductive cell.”

In its February 20, 2025 press release, HHS further stated that “[t]his rescission supports Administration policy in Executive Order 14187 that HHS will not promote, assist, or support the so-called ‘transition’ of a child from one sex to another, and it will rigorously enforce all laws that prohibit or limit these destructive and life-altering procedures.”
Further, OCR issued a formal rescission letter dated February 20, 2025, outlining several reasons for rescinding the 2022 Guidance:

ACA (Section 1557): HHS cited recent federal cases, Texas v. EEOC and Bostock v. Clayton County, as calling into question the legal basis for extending Section 1557 protections to gender identity. But see Kadel v. Folwell, 2024 WL 1846802 (4th Cir. 2024) (On May 8, 2024, the Fourth Circuit affirmed the trial court rulings that the exclusion of coverage for gender affirming care by state health plans in West Virginia and North Carolina violated the nondiscrimination protections of the Affordable Care Act (ACA) Section 1557).
Rehabilitation Act and ADA: HHS argued that gender dysphoria does not meet the statutory definition of a disability, as the law explicitly excludes gender identity-related conditions unless resulting from a physical impairment. However, the Fourth Circuit, in Williams v. Kincaid, 45 F.4th 759, 770 (4th Cir. 2022), concluded that gender dysphoria is a disability protected under the ADA and does not fall within the ADA’s exclusion for “gender identity disorders not resulting from physical impairments.” See also Blatt v. Cabela’s Retail, Inc., 2017 WL 2178123 (E.D. Pa. May 18, 2017) (Plaintiff’s gender dysphoria, which substantially limits her major life activities of interacting with others, reproducing, and social and occupational functioning, is not excluded from ADA protection.)
HIPAA: HHS stated that the Rescinded 2022 Guidance lacked a legal foundation for restricting PHI disclosures beyond HIPAA’s established exceptions. However, we note that current established exceptions already allow disclosures without patient authorization in certain circumstances, including when required by law. Interestingly, the new reproductive health amendments to HIPAA, which became effective on December 23, 2024, may, if interpreted broadly, provide additional privacy protections to information related to gender affirming care. 

In addition to the rescission, HHS also announced the launch of the HHS Office on Women’s Health website, which we reference above, to promote these policies.
Impact on HIPAA and Patient Privacy
In the wake of the Rescinded 2022 Guidance and associated OCR statements, it remains unclear how OCR will now handle complaints related to the use and disclosure of PHI concerning gender-affirming care. Accordingly, entities that handle such data should carefully review their internal policies to ensure compliance with evolving interpretations of HIPAA’s Privacy Rule.
However, entities should also consider the HIPAA Privacy Rule to Support Reproductive Health Care Privacy, finalized in April 2024, which broadly defines “reproductive health care.” Gender-affirming care often falls within this definition, meaning that certain privacy protections may still apply under this rule despite the Rescinded 2022 Guidance. While HHS’s recent actions suggest a lack of intent to defend this interpretation, the 2024 reproductive health rule remains in effect despite ongoing litigation in Texas challenging these amendments. On September 8, 2024, the Texas Attorney General, in litigation pending in the Northern District of Texas, claimed that the new rule harms the AG’s ability to investigate medical care, lacks statutory authority, and is arbitrary and capricious. This litigation is still pending.
Compliance and Legal Considerations

Federal vs. State Law Conflicts: Entities must navigate the potential conflicts between state laws and the rescission of the Rescinded 2022 Guidance. For instance, Colorado and California have laws explicitly protecting access to gender-affirming care, which could create legal complexities for providers and insurers operating under multiple jurisdictions.
Litigation and Injunctions: On March 4, 2025, a federal judge in Maryland issued a preliminary injunction enjoining federal agencies from issuing regulations or guidance or otherwise implementing mandates of EO 14187. This injunction applies nationwide. In a more limited fashion, a judge in Washington issued a preliminary injunction which applies only to Washington, Colorado, Minnesota, and Oregon. As the Maryland court is still deciding the merits of the case before it, entities should monitor these legal developments to understand their go-forward compliance obligations under both federal and state regulations.
Potential Whistleblower Protections: EO 14187 also directs HHS, in consultation with the Attorney General, to “issue new guidance protecting whistleblowers who take action related to ensuring compliance with this order.” Accordingly, it is possible that under such contemplated guidance, an increase in whistleblower-initiated compliance investigations may ensue. Yet such an increase in whistleblowing as an avenue to evaluate compliance would not resolve the potential friction with the requirements of the HIPAA Privacy Rule to Support Reproductive Health Care Privacy.
Threats to Funding: On March 5, 2025, numerous health care providers enrolled in the Medicare and Medicaid programs received a letter from CMS stating that “CMS may begin taking steps in the future to align policy, including CMS-regulated provider requirements and agreements, with the highest-quality medical evidence in the treatment of the nation’s children” as it relates to gender affirming care. The following day, on March 6, 2025, SAMHSA and HRSA sent similar letters to Hospital Administrators and Grant Recipients referencing the March 5, 2025 CMS letter and threatening examination of current grants and the “re-scoping, delaying or potentially cancelling new grants in the future” depending upon the nature of the work being performed by the providers and/or grant recipients as it relates to gender affirming care for minors.

Key Takeaways
The rescission of the 2022 “HHS Notice and Guidance on Gender Affirming Care, Civil Rights, and Patient Privacy” seeks to align HHS’s policies with the Trump administration’s stance on gender-affirming care. The rescission introduces financial and compliance challenges for entities regulated by HHS. However, the rescission does not eliminate all HIPAA provisions related to reproductive health, and other state-level protections may still provide certain privacy and anti-discrimination safeguards for individuals seeking gender-affirming care. Given this uncertainty, organizations should revisit their policies and procedures, closely monitor the evolving regulatory landscape, and keep a close eye on litigation outcomes to ensure continued compliance.

The Latest Attack on Consumer Arbitration Agreements

The war against arbitration agreements continues apace. The latest volley comes from the U.S. Court of Appeals for the Fourth Circuit in Johnson v. Continental Finance Company, LLC, No. 23-2047 (4th Cir. Mar. 11, 2025). In Johnson, the court considered whether a change-in-terms provision in a cardholder agreement rendered arbitration and delegation clauses illusory under Maryland law. In a 2-1 decision featuring opinions by all three panel members, the court said “yes” and found the arbitration and delegation clauses unenforceable.
Plaintiffs filed putative class-action complaints against Continental Finance Company, LLC and Continental Purchasing, LLC. Continental moved to compel arbitration pursuant to the arbitration provision contained in the cardholder agreement Plaintiffs received upon account opening. Plaintiffs opposed, arguing the cardholder agreement lacked consideration because the agreement’s change in terms provision permitted Continental to unilaterally amend the agreement at its “sole discretion”:
We can change any term of this Agreement, including the rate at which or manner in which INTEREST CHARGES, Fees, and Other Charges are calculated, in our sole discretion, upon such notice to you as is required by law. At our option, any change will apply both to your new activity and to your outstanding balance when the change is effective as permitted by law.
Affirming the district court, a majority of the panel agreed that the arbitration clause was illusory because the change-in-terms provision allowed Continental to “change any term of [the] Agreement in [its] sole discretion, upon such notice to [Plaintiffs] as is required by law.” Citing a decision by the Supreme Court of Maryland (Cheek v. United Healthcare), the majority said such provisions “are so one-sided and vague” under Maryland law that they “allow[] a party to escape all of its contractual obligations at will,” including the obligation to arbitrate. Based on this, the majority held that the arbitration and delegation clauses were unenforceable.
Judge Wilkinson’s lead opinion raises a difficult question: If the change-in-terms provision renders the arbitration clause illusory, then why doesn’t it render the entire cardholder agreement illusory? To be sure, the plaintiffs limited their argument to the arbitration and delegation clauses, and the majority affirmed that these were the only provisions its judgment disturbed. The lead opinion doesn’t answer this question. We see no limiting principle that would prevent the same argument from taking down the entire cardholder agreement. What’s good for the goose is good for the gander: Arbitration agreements are to be treated just like every other contract under state law. If the change-in-terms provision nullifies the formation of the arbitration agreement, the same should be true for every other term in the contract. Such a drastic outcome would jeopardize the formation of countless consumer contracts. As the dissent (authored by Judge Niemeyer) points out, the change-in-terms language here is “legal and widespread.” All that is required is sufficient notice of the change. If consumers don’t like the change, they can negotiate with their wallets and take their business elsewhere.
Perhaps sensing this gap in the lead opinion, Judge Wynn addresses it in his decisive concurrence. But in doing so, he frankly raises more troubling questions. He points to another Maryland Supreme Court case (Holmes v. Coverall N.A., Inc.) stating that “an arbitration provision contained within a broader contract is a separate agreement that requires separate consideration in order to be legally formed.” This strand of Maryland law strikes us as potentially preempted by the Federal Arbitration Act. Again, arbitration agreements must be treated on the same footing as every other contract under state law. No one disagrees that every other provision in Continental’s contract can be negotiated collectively and supported by the same pot of consideration. So why do arbitration agreements require something different and more rigorous under Maryland law? Though we’re obviously Monday morning quarterbacking this case, our answer is: They shouldn’t.
As noted at the top, Johnson is part of a larger legal war waged by plaintiffs’ lawyers and consumer advocacy groups against consumer arbitration, one that we expect to grow in ferocity given the Trump administration’s recent defanging (and defunding) of the CFPB. Several courts have limited the enforcement of arbitration provisions in consumer contracts where plaintiffs have argued that the unilateral modification of such contracts to include arbitration provisions was illusory or did not comply with the implied covenant of good faith and fair dealing. See Canteen v. Charlotte Metro Credit Union, 900 S.E.2d 890 (N.C. 2024); Decker v. Star Fin. Grp., Inc., 204 N.E.3d 918 (Ind. 2023); Badie v. Bank of Am., 67 Cal. App. 4th 779 (1998). And prior to the recent changes in Washington, the CFPB had proposed a rule that would make one-sided change-in-terms provisions illegal and unenforceable.
We note, however, that several courts have gone the other way. See, e.g., SouthTrust Bank v. Williams, 775 So. 2d 184 (Ala. 2000). And the cases that have refused to enforce arbitration provisions have indicated that such provisions may be enforceable where the change-in-terms clause expressly requires a detailed description of changes before they become effective (Johnson), or where the contract previously contained a governing-law provision specifying the forum for the resolution of disputes (Canteen).
Companies that have arbitration provisions or are considering adding them to their consumer contracts should stay apprised of the developing law in this area, particularly in the states in which they are located. Please talk to a lawyer before you draft or promulgate an arbitration clause—an ounce of prevention is worth a pound of cure.

Proskauer on Privacy: 2024 Reflections & 2025 Predictions

2024 marked another significant year for privacy law, with new state legislation and high-stakes litigation reshaping the landscape. Legal battles over tracking technologies, biometric data, and children’s privacy intensified, while federal agencies, including the Federal Trade Commission (“FTC”) and the U.S. Department of Health and Human Services Office for Civil Rights (“HHS OCR”), ramped up their efforts through major enforcement actions and high-profile settlements, marking a new era of increased accountability.
Federal Privacy Law Gridlock
Attempts to pass comprehensive federal privacy legislation fell short once again in 2024, leaving the United States without a national data privacy standard. Despite bipartisan support, the American Privacy Rights Act (“APRA”), designed to unify privacy laws, preempt conflicting state regulations, introduce a private right of action, and enforce opt-out mechanisms, did not pass the 118th Congress. Still, the last Congress passed, as part of a larger appropriations bill, the “Protecting Americans’ Data from Foreign Adversaries Act of 2024” (15 U.S.C. § 9901), which makes it unlawful for a data broker “to sell, license, rent, trade, transfer, release, disclose, provide access to, or otherwise make available personally identifiable sensitive data of a United States individual to (1) any foreign adversary country; or (2) any entity that is controlled by a foreign adversary.” In the absence of a comprehensive federal law, states continued to fill the void by passing their own privacy laws, each with independent and distinct requirements, leading to burdensome compliance efforts, higher operational costs, and increased legal risks for businesses.
FTC Rulemaking and Enforcement Intensifies
In 2024, the FTC prioritized safeguarding sensitive data, focusing on location tracking, health data, children’s privacy, and cybersecurity. The agency secured key settlements banning the sale of sensitive location data without consent or deidentification, investigated health data misuse, and filed a Children’s Online Privacy Protection Act (“COPPA”) action against TikTok. On children’s privacy, it is also notable that, at the close of the Biden administration, the FTC finalized changes to the COPPA Rule, setting new requirements for the collection, use, and disclosure of children’s personal information, including requiring covered websites and online service operators to obtain opt-in consent from parents for targeted advertising and other disclosures to third parties.
One notable FTC settlement prohibited a data broker from selling or sharing sensitive location data after it was collected and distributed without adequate safeguards. Another targeted a cybersecurity company accused of unlawfully selling browser data and engaging in deceptive practices. The FTC also filed complaints and secured proposed settlements with an alcohol addiction treatment service and a mental health telehealth company, alleging they illegally shared users’ health information for advertising purposes through third-party tracking tools.
The agency also intensified its focus on deceptive and fraudulent claims surrounding AI products and services, urging companies using AI-driven platforms to take “necessary steps to prevent harm before and after deploying [an AI] product” to ensure fairness, minimize bias, and comply with evolving regulatory standards. As the FTC expanded enforcement in this area, businesses faced growing pressure to proactively mitigate risks and implement safeguards to avoid costly investigations and penalties.
HIPAA Enforcement and Judicial Constraints
In 2024, the HHS OCR focused heavily on enforcing the Health Insurance Portability and Accountability Act (“HIPAA”), concluding over 22 enforcement actions. However, the landmark ruling in American Hospital Association v. Becerra curtailed HHS’s authority over online tracking liability under HIPAA, holding that HHS could only regulate information that both identifies an individual and directly relates to their health.
Following the ruling, HHS voluntarily withdrew its appeal, signaling a shift in its approach to online tracking and privacy enforcement. The decision marked a critical limitation on HHS’s ability to regulate digital health technologies and underscored the ongoing tension between evolving digital practices and traditional privacy regulations.
Litigation Trends: Old Laws, Modern Issues
With no federal privacy law in place, plaintiffs in 2024 relied heavily on older electronic privacy statutes for class action lawsuits, including the Video Privacy Protection Act of 1988 (“VPPA”), the Electronic Communications Privacy Act of 1986 (“ECPA”), and numerous state laws, such as California’s Invasion of Privacy Act of 1967 (“CIPA”) and Song-Beverly Credit Card Act of 1971 (“SCCA”), to address modern online privacy concerns.
While the VPPA was designed to prevent video rental stores (e.g., Blockbuster) from sharing customers’ personal data, and the ECPA and CIPA to prevent eavesdropping and traditional wiretapping, plaintiffs have recently repurposed these laws to target alleged misuse of internet technologies such as cookies, pixels, chatbots, and session replay technology, a trend that continued to gain traction throughout 2024. Plaintiffs have also attacked the use of these technologies under the SCCA, a statute that restricts businesses from collecting unnecessary personal identification information during credit card transactions. Though the statute was originally aimed at brick-and-mortar retailers, plaintiffs are now extending its application to digital commerce, seeking to limit how businesses can request and store consumer data during online purchases.
Class action lawsuits over data breaches and mishandled opt-out requests also continued to surge, fueled by regulatory developments and high-profile breaches. Data subject requests for deletion, access, and opt-outs increased by 246% between 2021 and 2023, highlighting the demand for transparency and control. Meanwhile, a 2024 audit found that 75% of businesses failed to honor opt-out requests, underscoring the practical challenges of data privacy compliance.
To mitigate their legal privacy risks, companies will need to consider refining consent mechanisms, implementing robust consent management platforms, and exploring alternatives to cookie-based or pixel tracking. Compliance with all of these laws is critical to ensuring proper disclosures, limiting personal data requests, and reinforcing consumer trust.
Comprehensive State Privacy Laws
In 2024, seven states enacted comprehensive privacy laws, raising the total number of comprehensive state privacy laws to 20. Meanwhile, the Florida, Montana, Oregon, and Texas laws went into effect in 2024; the Nebraska, New Hampshire, Delaware, Iowa, and New Jersey laws went into effect at the beginning of 2025; the Tennessee and Minnesota laws take effect in July 2025 and the Maryland law in October 2025; and the Kentucky, Rhode Island, and Indiana laws are scheduled to take effect in 2026.
State-level enforcement also intensified, with California, Texas, and New Hampshire leading major efforts. For example, California reached a settlement with DoorDash in February 2024 after the company purportedly sold its California customers’ personal information without providing notice or an opportunity to opt out, in violation of the California Consumer Privacy Act (“CCPA”) and CalOPPA. In June 2024, the state reached another settlement, with Tilting Point Media, over alleged violations of the CCPA and COPPA arising from Tilting Point’s collection and sharing of children’s data without parental consent.
In addition, Texas reached several major settlements, including two involving Meta’s purported violations of biometric privacy laws and a first-of-its-kind settlement with a Dallas-based artificial intelligence healthcare tech company over alleged deceptive generative AI practices. The state also sued General Motors in August 2024 over the allegedly unlawful sale of driving data, and announced an investigation into fifteen companies for potential violations of Texas’ Securing Children Online through Parental Empowerment Act and Data Privacy and Security Act.
2025 Privacy Predictions
2025 is expected to be another defining year for privacy regulation, with key trends from recent years continuing to evolve and present new challenges for businesses. The fragmentation of state-level privacy laws, increased enforcement, and the rapid evolution of rules governing biometric data and AI technologies are expected to intensify.
Businesses can expect heightened scrutiny of algorithmic transparency and biometric protections. Generative AI is also expected to draw significant regulatory attention as the technology matures and states continue to consider additional legislation or regulations, whether related to marketing claims, employment, transparency, AI deepfakes, or publicity rights. Companies in health, finance, and technology, in particular, should remain vigilant as regulators push for stricter accountability. While compliance challenges and rising operational costs are likely, organizations that proactively audit data-sharing practices, update privacy policies, and ensure AI compliance will be equipped to navigate the evolving regulatory landscape and reduce overall legal risks.
Federal Legislative Efforts Still Struggle
Despite a growing appetite for a unified privacy framework, progress remains slow heading into 2025. The inability to advance the APRA in 2024 underscores the challenge of balancing state autonomy with uniform, national standards. These challenges are only further compounded by the Trump administration’s emphasis on deregulation and a heavily divided Congress. Businesses will likely continue operating without a comprehensive federal privacy law for the foreseeable future. However, renewed lobbying efforts, Congressional hearings, and mounting industry pressure suggest that the core concepts undergirding the APRA could reemerge with modifications. Moreover, it is conceivable Congress could pass legislation strengthening children’s privacy: the Senate overwhelmingly passed, by a 91-3 vote, legislation that included the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act (the latter commonly known as “COPPA 2.0”); the legislation later died in the House, but it will likely be taken up again in the current session of Congress.
In the absence of clear federal guidance, businesses should expect to rely on recognized industry standards in the interim. While these standards are instructive, strict adherence to them may not ensure compliance with the complex web of multi-state regulations. Companies operating across multiple jurisdictions should consult legal counsel as they navigate the current patchwork of privacy laws to reduce their legal risk.
More States Join the Privacy Landscape. With More to Come?
Several state privacy laws have recently gone into effect in 2025, and more are set to take effect later in the year, including the Delaware, Iowa, Maryland, Minnesota, Nebraska, New Hampshire, New Jersey, and Tennessee laws. These comprehensive privacy laws significantly expand state-level data protection regulation, bringing the total number of states with comprehensive privacy laws to 20. In addition, other states have borrowed from this template and are debating comprehensive bills of their own in 2025 (e.g., New York S365B), as well as bills addressing consumer health privacy (e.g., the New York Health Information Privacy Act, awaiting the governor’s signature), social media restrictions, and other data privacy issues.
With compliance becoming more complex, investments in automated tools to monitor regional legal variations are expected to grow, as businesses recognize them as critical for long-term regulatory resilience in an ever-changing environment.
Litigation Trends: Internet Tracking Technologies & Healthcare Data
Regulators and plaintiffs continue to focus on cases involving internet tracking technologies, particularly under statutes including VPPA, ECPA (and state wiretapping laws), and CIPA, as well as laws governing the general collection of website user information, such as the SCCA. These cases increasingly scrutinize how companies track, collect, and use consumer data, particularly in sensitive contexts such as healthcare and wellness.
Against this backdrop, Washington’s My Health My Data Act (“MHMDA”), which went into effect in 2024, imposes strict privacy protections on consumer health data, extending beyond traditional healthcare providers to include wellness apps, online health services, and companies handling health-related consumer information. The law requires businesses to obtain explicit consent before collecting or sharing health data, maintain transparent privacy policies, and enforce stringent security measures to prevent unauthorized access or misuse.
Notably, the first lawsuit under MHMDA was recently filed against Amazon, marking a significant test case for the law’s enforcement. Given the evolving regulatory landscape, businesses should closely monitor litigation and compliance developments in this space.
Continued Momentum for AI, Biometric and Neural Data
Neural data has become a significant privacy concern with the rapid growth of wearable devices and brain-computer interfaces. In 2024, California and Colorado amended their privacy laws to extend protections to neural data, sparking broader regulatory interest and prompting advocacy groups to push for ethical standards and stricter consent requirements. Companies developing neural data technologies, including VR applications, brainwave monitoring devices, and other wearables, are investing in advanced encryption, secure storage, and anonymization methods to safeguard this highly sensitive information.
AI also remains a key driver of both cybersecurity advancements and emerging risks in 2025. In response to privacy violations linked to AI-powered tracking in 2024, businesses are increasingly deploying AI tools to improve threat detection, monitor compliance, and secure sensitive data. Cybercriminals have also embraced AI, using it to execute more targeted and complex attacks, such as deepfake impersonation, advanced phishing schemes, automated network breaches, and large-scale data theft.
As AI adoption grows, companies face rising legal and regulatory risks. To address these challenges, businesses should consider comprehensive AI governance frameworks, including regular algorithm audits, bias detection systems, and accountability structures, to meet regulatory standards and maintain consumer trust.
Conclusion
The transition from 2024 to 2025 marks another important moment in the privacy landscape, with escalating state regulatory demands and stricter enforcement reshaping business practices. Companies must embed privacy into their core operations. By investing in privacy-by-design frameworks, adaptive compliance systems, and monitoring of emerging risks, businesses can stay ahead of shifting regulations. Those that anticipate change, take decisive action, and prioritize reasonable data protection as a competitive advantage will not only reduce risks but position themselves as leaders in an era where privacy drives both trust and innovation.

When Does Venting Become a Complaint?

Imagine this all-too-familiar scenario:

A company makes the difficult decision to terminate an employee’s employment due to poor performance. This should come as no surprise to the employee, who has been counseled and disciplined on numerous occasions. Yet the employee expresses shock and outrage, asserting during the termination meeting that he is being terminated not because of his (well-documented) performance issues, but in retaliation for having made a “complaint.” Panic ensues.
The company’s human resources (HR) manager has never heard about any complaints lodged by this employee, and there is nothing in the personnel file or HR records reflecting any complaint. Baffled, the HR manager keeps digging and eventually learns that, a few weeks earlier, in the midst of a casual chat with his supervisor, the employee mentioned that he works more hours than other employees but that his compensation doesn’t seem to reflect the additional time he puts in.
This raises the age-old question — when does a general “vent session” become a protected complaint that the company must investigate? The answer may surprise some employers. According to the Department of Labor, employees cannot be retaliated against for inquiring about their pay, hours of work, or other rights. (Additionally, the National Labor Relations Act provides protections to covered workers who engage in concerted activity — which includes raising concerns about terms and conditions of employment.) Therefore, in a general sense, complaints of this nature should always be addressed and possibly investigated. 
However, it is often hard to know when a comment or vent session rises to the level of a protected complaint. As a rule of thumb, if an employee raises the topic of wages or hours of work (as the employee in our scenario did), employers should err on the side of caution and investigate the issue.
At a minimum, supervisors should be instructed to bring comments of this nature to HR or in-house or outside counsel for a determination of whether an investigation is warranted. While in some instances this may feel like overkill, it is a “better safe than sorry” approach that will protect the company in the end.