Navigating DORA Compliance: Recent Developments

The EU Digital Operational Resilience Act (DORA) took effect on 17 January 2025 after a two-year implementation period. DORA sets out new requirements for financial entities (FEs) and their information and communication technology (ICT) third-party service providers (TPPs). This note highlights recent developments in the EU’s efforts to facilitate in-scope firms’ compliance with DORA and authorities’ attempts to avoid duplication of operational resilience requirements.
Further information regarding DORA developments can be found in our previous articles (available here, here, here and here).
EBA Amends Guidelines on ICT and Security Risk Management 
On 11 February 2025, the European Banking Authority (EBA) amended its existing 2019 guidelines on ICT and security risk management measures (Guidelines) to align them with DORA.
The EBA has narrowed the scope of its Guidelines to cover:

only FEs subject to DORA, including credit institutions, payment institutions, account information service providers, exempted payment institutions and exempted e-money institutions; and
relationship management of the payment service users in relation to the provision of payment services.

The EBA’s aim is to simplify the ICT risk management framework and provide legal clarity for the industry, by avoiding duplication of requirements and ensuring consistency across the EU single market.
However, other types of payment service providers (PSPs), such as post-office giro institutions and credit unions, which are not covered by DORA, will still have to comply with the security and operational risk management requirements under the revised Payment Services Directive (PSD2), which has been in force since March 2018. In addition, PSPs that remain subject to the PSD2 security and operational risk management requirements can potentially be subject to additional national requirements.
The Guidelines will apply within two months of the publication of the translated versions.
The Guidelines and accompanying press release are available here and here, respectively.
Commission Adopts Delegated Regulation on Threat-led Penetration Testing under DORA
On 11 February 2025, the European Central Bank (ECB) published an updated version of its framework for threat intelligence-based ethical red teaming (TIBER-EU Framework) that aligns with the DORA regulatory technical standards on threat-led penetration testing (TLPT RTS). This follows the ECB’s publication of a paper considering the TIBER-EU Framework in the context of DORA. Further information on this earlier paper can be found in our previous article (available here).
DORA mandates the European Supervisory Authorities (ESAs), together with the ECB, to develop draft RTS in accordance with the TIBER-EU Framework to specify the following:

the criteria to identify FEs required to perform TLPT;
the requirements regarding test scope, testing methodology and results of TLPT;
the requirements and standards governing the use of internal testers; and
the rules on supervisory and other co-operation needed for the implementation of TLPT and for mutual recognition of testing. 

On 13 February 2025, the European Commission (Commission) adopted a delegated regulation (Delegated Regulation), with accompanying annexes 1-8, supplementing DORA in relation to the TLPT RTS. The Delegated Regulation will enter into force and apply 20 days after its publication in the Official Journal of the European Union.
The Delegated Regulation and updated version of the TIBER-EU Framework are available here and here, respectively. 
ESAs Publish Roadmap on the Designation of CTPPs under DORA
On 18 February 2025, the ESAs published a roadmap (Roadmap) for the designation of critical ICT TPPs (CTPPs), which will be subject to direct EU supervision under DORA.
Notably, the Roadmap sets out four steps to designation of CTPPs in 2025:

by 30 April 2025, the ESAs will collect the registers of information on ICT third-party arrangements submitted by FEs to their national competent authorities;
by the end of July 2025, the ESAs will perform the criticality assessments mandated by DORA and notify ICT TPPs if they are classified as critical; 
by mid-September 2025, there will be a six-week hearing period where TPPs can object to the assessment, with a reasoned statement and supporting information; and
by the end of 2025, the ESAs will have designated and published a list of CTPPs and commenced oversight engagement. 

The accompanying press release notes that TPPs that are not designated as critical can voluntarily request to be designated once the list of CTPPs is published, with details on how to raise such a request to be provided soon. 
The ESAs expect to organise an online workshop with TPPs in Q2 2025 to provide further clarity on preparatory activities, the designation process and the ESAs’ oversight approach.
The Roadmap and press release are available here and here, respectively.
Delegated and Implementing Regulations on Major ICT-Related Incidents and Cyber Threats Under DORA Published
On 20 February 2025, Delegated and Implementing Regulations (together, the Regulations) supplementing DORA were published in the Official Journal of the European Union, setting out the detailed requirements and procedures for reporting and notifying ICT-related incidents and cyber threats. The Commission adopted the Regulations in October 2024.

The Delegated Regulation specifies the content and time limits for the initial notification of, and intermediate and final report on, major ICT-related incidents by FEs, and the content of the voluntary notification for significant cyber threats. 
The Implementing Regulation sets out the standard forms, templates and procedures for FEs to report a major ICT-related incident and to notify a significant cyber threat.

Both Regulations will enter into force on 12 March 2025.
The Regulations are available here and here, respectively.
ESAs Publish Opinion on Commission’s Rejection of Draft RTS on Sub-contracting ICT Services Supporting Critical or Important Functions
On 7 March 2025, the ESAs published an opinion (Opinion) on the Commission’s rejection of the ESAs’ draft RTS on the elements an FE needs to determine and assess when sub-contracting ICT services supporting critical or important functions. 
The Commission notified the ESAs that it had rejected the draft RTS in January 2025 on the basis that certain requirements introduced by the draft RTS went beyond the mandate given to the ESAs under DORA. The Commission noted that Article 5 of the draft RTS, and the related recital 5, should be removed from the draft RTS. The Commission then stated it would adopt the RTS once the ESAs had made the necessary modifications.
In the Opinion, the ESAs acknowledge that the Commission’s amendments will ensure that the draft RTS are fully in line with their mandate under DORA. The ESAs do not recommend changes to the Commission’s proposed amendments. They note that FEs are expected to adhere to the provisions on subcontractors as set out in Article 29(2) of DORA and Article 3(6) of the implementing technical standards on the register of information.
The Opinion is available here.

Trending in Telehealth: February 2025

Trending in Telehealth highlights monthly state legislative and regulatory developments that impact the healthcare providers, telehealth and digital health companies, pharmacists and technology companies that deliver and facilitate the delivery of virtual care.
Trending in February:

Interstate compacts
Telepharmacy services
Veterinary services
Telehealth practice standards

A CLOSER LOOK
Proposed Legislation & Rulemaking:

North Dakota proposed amendments to the North Dakota Century Code related to optometrist licensure and standards for providing tele-optometry. The amendments delineate the circumstances under which a licensed optometrist may use telemedicine to provide care. Proposed practice standards include requirements to establish a proper provider-patient relationship and requirements related to informed consent.
In Indiana, Senate Bill 473 proposed amendments that would allow providers to prescribe certain agonist opioids through telemedicine technologies for the treatment or management of opioid dependence. Current law only allows partial agonist opioids to be prescribed virtually.

Finalized Legislation & Rulemaking Activity:

Ohio enacted Senate Bill 95, authorizing the operation of remote dispensing pharmacies, defined as pharmacies where the dispensing of drugs, patient counseling, and other pharmacist care is provided and monitored through telepharmacy systems.
The Texas Health and Human Services Commission adopted an amendment to the Texas Government Code, requiring that providers be reimbursed for teledentistry services. The amendment allows flexibility for a dentist to use synchronous audiovisual technologies to conduct an oral evaluation of an established client. This change makes oral evaluations more accessible and prevents unnecessary travel for clients in the Texas Health Steps Program.
The Arkansas governor signed Senate Bill 61 into law, authorizing the practice of veterinary telemedicine in the state. The bill includes practice standards for veterinary telemedicine and provision of emergency veterinary care.
Also in Arkansas, House Bill 1427 enacted the Healthy Moms, Healthy Babies Act. The act amends Arkansas law to improve maternal health and establish reimbursement procedures for remote ultrasounds.

Compact Activity:

Several states have advanced licensure compacts. These compacts enable certain categories of licensed practitioners to practice across state lines, whether in person or via telemedicine. The following states have introduced bills to enact these compacts:

Dietitian Licensure Compact: Mississippi, Kansas, and North Dakota.
Social Work Licensure Compact: Mississippi, Maryland, and North Dakota.
Occupational Therapy Licensure Compact: North Dakota and New Mexico.
Audiology and Speech Language Pathology Compact: New Mexico.

Why it matters:

States continue to expand practitioners’ ability to provide telehealth services across state lines. While telemedicine is often seen as an alternative method for care delivery, it can sometimes be the most effective and efficient option. Expanding interstate licensure compacts improves access to qualified practitioners, particularly in underserved and rural areas. These compacts also enhance career opportunities and reduce the burdens associated with obtaining multiple state licenses.
States continue to apply telehealth practice standards to various professions. Legislative and regulatory trends reflect recognition that telehealth can be used in a variety of specialty practices, including veterinary medicine, dentistry, and optometry.

Telehealth is an important development in care delivery, but the regulatory patchwork is complicated. 

BEHIND THE FILTERS: CapCut And TikTok Are Making The Cut And Maybe Your Personal Data Too

Greetings CIPAWorld!
Let’s get techy with it. Ever edited a TikTok or Instagram Reel using CapCut? It turns out that you might have handed over more than just your creativity. The Northern District of Illinois has delivered a mixed but consequential ruling in Rodriguez v. ByteDance, Inc., about how video editing apps collect and utilize our personal data. See Rodriguez v. ByteDance, Inc., No. 23 CV 4953, 2025 U.S. Dist. LEXIS 37355 (N.D. Ill. Mar. 3, 2025). You guessed it. TikTok is at issue here. If you’ve ever used CapCut to perfect a TikTok video or Instagram reel, this decision deserves your attention!
Yikes. Imagine editing a quick vacation video only to discover the app might be scanning every photo in your gallery and capturing your facial features! Those are precisely the kinds of privacy implications at the center of this case. Make sure to always check your app permissions!
The Opinion in ByteDance, Inc. offers a nuanced examination of modern privacy law. It allows several significant claims to proceed while dismissing others. So, let’s get into a brief background first.
CapCut, developed and operated by Chinese technology giant ByteDance (which also owns TikTok), has exploded in popularity since its 2020 U.S. launch. Now, it’s one of the most downloaded apps globally, with over 200 million monthly active users! CapCut allows users to create, edit, and customize videos using templates, filters, and visual effects. Everyone wants to look good, right? While predominantly free, users can access premium features through subscription models. It’s remarkable how quickly CapCut became essential for content creators, when in actuality this case forces us to confront the reality that the most user-friendly tools might also be the most invasive.
However, according to the Plaintiffs, this seemingly innocent video editor allegedly harbors a more problematic function—collecting vast amounts of user data without proper authorization. The Complaint alleges that CapCut collects everything from registration information and social network contacts to location data, photos, videos, and even biometric identifiers like face geometry scans and voiceprints. Yes, you read that right… Biometric identifiers.
First, the Court’s reasoning behind allowing the California constitutional and common law privacy claims to proceed reveals evolving judicial thinking about digital privacy. Judge Alexakis emphasized that privacy violations depend not only on the sensitivity of the content collected but also on the manner of collection. See Davis v. Facebook, Inc. (In re Facebook Inc. Internet Tracking Litig.), 956 F.3d 589, 603 (9th Cir. 2020).
Many of us miss this critical distinction in our everyday tech interactions. We often focus on what data apps collect rather than how they collect it. The Court’s analysis suggests that even innocuous data could trigger privacy concerns if gathered through deceptive or overly invasive methods—a crucial lesson for developers and users alike.
Here, the Judge found the allegations that CapCut accesses and collects all the videos and photos stored on users’ devices, not just those voluntarily uploaded to the CapCut app, particularly troubling. If proven true, this broad data collection practice would violate reasonable user expectations. Ringing any bells here? Drawing parallels to Riley v. California, 573 U.S. 373, 397-99 (2014), which recognized that individuals have a reasonable expectation of privacy in the contents of their cell phones, Judge Alexakis noted that a reasonable CapCut user would not expect the app to access and collect all the photos and videos on their devices, regardless of whether they use those photos and videos to create content within the app. Makes perfect sense, right?
Let that sink in for a minute. An app potentially scans your entire photo library when you only intend to edit a single clip! This broad access would be like handing a stranger your family photo album when they only ask to see one vacation picture. The Court rightly recognized how this violates our intuitive sense of privacy.
For the California constitutional and common-law privacy claims regarding user identifiers and registration information, the Court specifically relied on United States v. Soybel, 13 F.4th 584, 590-91 (7th Cir. 2021) for the principle that a person “has no legitimate expectation of privacy in information he voluntarily turns over to third parties,” which was central to dismissing claims based on this type of information.
Next, the California Invasion of Privacy Act (“CIPA”) claims represented a significant but ultimately unsuccessful component of Plaintiffs’ case. As we know, CIPA protects individuals against unauthorized electronic interception of communications. Section 631(a) prohibits any person from using electronic means to “learn the contents or meaning” of any “communication” without consent or in an “unauthorized manner.” Critically, neither CIPA nor the federal Electronic Communications Privacy Act (“ECPA”) imposes liability on a party to the communication. As Judge Alexakis noted, Warden v. Kahn, 99 Cal. App. 3d 805, 811, 160 Cal. Rptr. 471 (1979), held that section 631 “has been held to apply only to eavesdropping by a third party and not to recording by a participant to a conversation.”
Here’s where Plaintiffs ran into a fascinating legal hurdle. When you voluntarily use an app, the law often treats that app as a communication “participant” rather than an eavesdropper. Think about how different this is from our intuitive understanding… Few of us would consider a video editor an equal “participant” in our creative process, yet that’s essentially the legal fiction applied here.
Plaintiffs attempted to circumvent this limitation by asserting that Defendants effectively intercepted their data by “redirecting” communications to unauthorized third parties, including the Chinese Communist Party. They relied on legal authorities like Davis, 956 F.3d at 596, 607-08, where Facebook used plugins to track browsing histories even after users logged out.
However, Judge Alexakis found two factual flaws in this theory. First, Plaintiffs failed to plausibly allege that any communications were intercepted during transmission rather than merely shared after collection. Though the Court acknowledged that the Seventh Circuit hadn’t definitively ruled on whether interception must be contemporaneous with transmission, it noted that every court of appeals to consider the issue had required contemporaneity. See Peters v. Mundelein Consol. High Sch. Dist. No. 120, No. 21 C 0336, 2022 WL 393572, at *11 (N.D. Ill. Feb. 9, 2022). Second, Plaintiffs’ allegations fell short of the specific software-tracking mechanisms proven sufficient in cases like Facebook Tracking. They identified no particular mechanism by which ByteDance contemporaneously redirected communications to third parties.
Let’s dig a little deeper so this makes sense. There’s a (legal) difference between an app intercepting your data in transit (like wiretapping a phone call) and an app collecting your data at the endpoint and sharing it later. Most individuals would see little practical difference in the outcome. Your private data ends up in unexpected hands either way, yet courts maintain this technical distinction that significantly impacts your legal protections.
Next, the Court rejected claims under Section 632 of CIPA, which imposes liability on parties who use an electronic amplifying or recording device to eavesdrop upon or record a confidential communication. Beyond the conclusory assertions that ByteDance intercepted and recorded videos without consent, Plaintiffs failed to allege that Defendants used any electronic amplifying or recording device to eavesdrop on conversations.
Let’s switch it up now. We are going to talk about something a little different. Perhaps the most meaningful survival in the decision concerns claims under Illinois’ Biometric Information Privacy Act (“BIPA”). The Court rejected ByteDance’s argument that BIPA only applies when companies use biometric data to identify individuals. Looking at the statute’s plain language, which defines “biometric identifier” to include “voiceprint[s]” and “scan[s] of… face geometry,” the Court found no requirement that the data be used for identification purposes.
This is genuinely interesting to me. Illinois lawmakers created one of the strongest biometric protection laws in the country, and the Court’s ruling reinforces just how far those protections extend. The practical effect here is enormous. Companies can’t just escape liability by claiming they collected your facial geometry or voiceprints for purposes other than identification. The mere collection without proper consent is enough to trigger liability. This reasoning aligns with an emerging consensus in the Northern District of Illinois. In Konow v. Brink’s, Inc., 721 F. Supp. 3d 752, 755 (N.D. Ill. 2024), the Court held that a defendant may violate BIPA without using technology to identify an individual; instead, BIPA bars the collection of biometric data that could be used to identify a plaintiff.
The Court also found persuasive Plaintiffs’ detailed allegations that ByteDance employs engineers specializing in “computer vision, convolutional neural network, and machine learning, all of which are used to generate the face geometry scans that Defendants derive from the videos of CapCut users.” These technical specifications helped elevate the claims beyond mere conclusory allegations.
What’s particularly impressive here is how Plaintiffs connected the dots between ByteDance’s engineering talent, their patent applications for voiceprint technology, and the actual functions of CapCut. This level of technical detail is increasingly necessary in privacy litigation. In turn, vague claims of data collection often fail without demonstrating the underlying mechanisms involved.
For BIPA’s Section 15(c) claim, the Court relied on the statutory interpretation principle of ejusdem generis in interpreting “otherwise profit.” In Circuit City Stores v. Adams, 532 U.S. 105, 114-15, 121 S. Ct. 1302, 149 L. Ed. 2d 234 (2001), the Supreme Court explained that when general words follow specific words, they “embrace only objects similar in nature to those objects enumerated by the preceding specific words.” This interpretive principle was key to the Court’s narrower reading of the statute, limiting “otherwise profit” to commercial transactions similar to selling, leasing, or trading data. Accordingly, the Court rejected the argument that internal use of biometric data—such as improving CapCut’s editing features—constitutes “otherwise profiting” under BIPA. It is fascinating how this determination narrows the scope of liability under Section 15(c), signaling that plaintiffs must show an actual external transaction involving biometric data to succeed on these claims.
Next, the Court analyzed various consumer protection claims that failed because Plaintiffs couldn’t demonstrate economic injury. For their claims under California’s Unfair Competition Law (“UCL”) and False Advertising Law (“FAL”), Judge Alexakis emphasized that, unlike the broader Article III standing requirements, these statutes demand a showing that plaintiffs lost money or property. This highlights one of the most frustrating aspects of privacy litigation for consumers: proving financial harm from privacy violations is extraordinarily difficult. We intuitively understand that our personal data has value (why else would companies collect it so aggressively?). Yet, courts often struggle to quantify it or recognize its loss as economic injury. It’s like recognizing theft only when something tangible is taken.
The Court was particularly unpersuaded by theories based on the diminished value of personal data, noting Plaintiffs hadn’t alleged they attempted to sell their data or received less than market value. Davis, 956 F.3d at 599, rejected a similar argument that a loss of control over personal data constituted economic harm. The Court also referenced Cahen v. Toyota Motor Corp., 717 F. App’x 720, 723 (9th Cir. 2017), which held that speculative claims about diminished data value, without concrete evidence of lost economic opportunity, are insufficient to establish standing under consumer protection statutes.
Moreover, like in Griffith v. TikTok, Inc., No. 5:23-cv-00964-SB-E, 2023 U.S. Dist. LEXIS 223098, at *6 (C.D. Cal. Dec. 13, 2023), the Court observed that Plaintiffs failed to show they attempted or intended to participate in the market for their data. Additionally, the Court noted that Plaintiffs failed to allege any direct financial loss tied to CapCut’s data practices, distinguishing their claims from cases where courts recognized economic harm due to specific monetary expenditures, such as fraudulent charges or paid services rendered worthless by deceptive conduct.
Next, the Court turned to the core issue underlying many of Plaintiffs’ claims—what ByteDance actually did with the data it collected and whether users had truly consented to these practices. One of the most fascinating aspects of the Opinion is the battle over consent. ByteDance mounted an aggressive defense centered on its Terms of Service and Privacy Policies, arguing that users effectively waived their rights by agreeing to these documents. The company submitted three distinct versions of its Privacy Policy from 2020, 2022, and 2023, each making various disclosures about data collection practices.
Let’s be honest… When was the last time any of us actually read a privacy policy before clicking “agree”? Side note: you should. ByteDance, like many tech companies, is staking its legal protection on documents it knows full well most users never read. What’s remarkable is that courts are increasingly skeptical of this fiction, recognizing the reality of how users actually interact with digital products.
Judge Alexakis’s detailed analysis here offers insights for app developers and users alike. She recognized that the policies could potentially be incorporated by reference since Plaintiffs mentioned them in their Complaint, but she refused to dismiss the case based on consent at this early stage. The Court emphasized that dismissing claims based on affirmative defenses like waiver is only appropriate if “the allegations of the complaint itself set forth everything necessary to satisfy the affirmative defense.” See United States v. Lewis, 411 F.3d 838, 842 (7th Cir. 2005).
Here, there are several critical factual questions that prevented the Court from accepting ByteDance’s consent defense. Most notably, there was no conclusive evidence about exactly when and how the plaintiffs agreed to the terms. While ByteDance simply asserted that “[u]sers expressly or impliedly consent to the policy upon downloading and using the app,” Plaintiffs countered that they “were able to access the CapCut platform without having to scroll through and read such policies before they were allowed to sign up for the services.”
The Court was particularly skeptical of ByteDance’s reliance on what appeared to be a browsewrap agreement—where terms of service are presented passively and users are presumed to agree simply by using the service. Judge Alexakis emphasized that browsewrap agreements are only enforceable if users have actual or constructive notice of the terms. See Specht v. Netscape Commc’ns Corp., 306 F.3d 17, 30-31 (2d Cir. 2002). This means that merely linking to a privacy policy at the bottom of a webpage or app interface is insufficient to establish consent. As such, actual consent requires more than theoretical access to terms.
Additionally, the Court noted that the placement and formatting of ByteDance’s consent prompts were unclear, raising doubts about whether the plaintiffs were ever explicitly informed of the policy’s existence before using CapCut. This aligns with precedent from Nguyen v. Barnes & Noble Inc., 763 F.3d 1171, 1177 (9th Cir. 2014), where the court declined to enforce an arbitration clause hidden in inconspicuous terms of service.
This dispute highlights a pervasive problem I’m recognizing in digital consent. There is a gap between technical legal compliance and actual user understanding. Despite ByteDance presenting screenshots showing a prompt requiring users to click “Agree and continue,” Judge Alexakis noted this evidence couldn’t establish whether these particular plaintiffs had seen and agreed to these specific terms, especially because they explicitly alleged they never read any privacy policy or terms of use.
The Court also highlighted another crucial factual gap. Plaintiffs asserted that the attached policies were only three of the ten (or more) versions of the policy that existed over time. In Patterson v. Respondus, Inc., 593 F. Supp. 3d 783, 805 (N.D. Ill. 2022), the Court declined to dismiss claims based on policies because the case “may involve factual questions about what [defendant’s] policies looked like at different moments in time.”
The Court’s skepticism should serve as a wake-up call. The era of concealing invasive data practices within complicated legal documents and merely asserting user consent may be coming to an end. Companies that are truly dedicated to privacy must go beyond minimal compliance and strive for actual transparency and meaningful choices for users.
For the Computer Fraud and Abuse Act (“CFAA”) claim (Count I), the Court undertook an analysis of the access-without-authorization element. The CFAA, which was originally enacted to combat hacking, imposes liability on anyone who intentionally accesses a computer without authorization or exceeds authorized access to obtain information. While finding that Plaintiffs’ allegations were insufficient, Judge Alexakis specifically distinguished this matter from Brodsky v. Apple Inc., No. 19-CV-00712-LHK, 2019 WL 4141936 (N.D. Cal. Aug. 30, 2019), where the plaintiffs had “concede[d] that [they] voluntarily installed the software update,” which unambiguously established authorization. Here, the Court emphasized that essential questions of fact exist about the scope of the authorization and the design of Plaintiffs’ operating systems, making dismissal based on implied authorization inappropriate at this stage. The Court noted that factual disputes, such as the scope of Defendants’ access, are not appropriately resolved on a motion to dismiss.
Particularly, the Court declined to follow cases like hiQ Labs, Inc. v. LinkedIn Corp., 31 F.4th 1180, 1197 (9th Cir. 2022), which held that scraping publicly available data does not constitute unauthorized access under the CFAA. Unlike hiQ Labs, where access restrictions were clear, Plaintiffs alleged that CapCut accessed files beyond what they knowingly permitted. However, the Court found that Plaintiffs failed to sufficiently allege that ByteDance exceeded authorized access under Van Buren v. United States, 593 U.S. 374 (2021), which clarified that merely misusing information one is entitled to access does not violate the CFAA.
In dismissing the Stored Communications Act (“SCA”) claims (Count IV), Judge Alexakis found particularly significant the timing mismatch in Plaintiffs’ allegations about data sharing with the Chinese Communist Party (“CCP”). The Court notably observed that even if ByteDance shared user communications with the CCP in 2018 (as alleged by a former employee cited in Plaintiffs’ Complaint), it is too much to presume based on the engineer’s statement that these activities were ongoing several years later when CapCut became available to users in the United States. This temporal gap and the lack of specificity about what data was shared rendered the allegations too speculative to survive dismissal.
The Court also found that Plaintiffs failed to allege that ByteDance qualified as a remote computing service (“RCS”) or electronic communications service (“ECS”) under the SCA. Under the SCA, an ECS is any service that provides users with the ability to send or receive wire or electronic communications, while an RCS is a service that provides computer storage or processing services to the public by means of an electronic communications system. In Garcia v. City of Laredo, 702 F.3d 788, 792 (5th Cir. 2012), the Court noted that for a company to be liable under the SCA, it must provide services that facilitate the transmission, storage, or processing of electronic communications on behalf of users—not merely collect and store user data for its own purposes. Because Plaintiffs did not establish that CapCut functioned as an ECS or RCS, their SCA claims failed as a matter of law.
Lastly, Judge Alexakis granted Plaintiffs until April 2, 2025, to file an amended complaint addressing deficiencies in their dismissed claims. So whether you’re a legal professional, casual content creator, or simply concerned about data privacy, as you should be, the ongoing developments in Rodriguez v. ByteDance merit your continued attention.
So, all in all, I’m particularly encouraged by how the Court emphasized consumer expectations throughout its analysis. This suggests a shift from formalistic legal reasoning toward how privacy functions in people’s lives. Most users have never heard of BIPA or CIPA, but they instinctively recognize when an app crosses a line and invades their privacy.
For everyday app users (myself included), this case is a reminder that seemingly innocuous tools like video editors may be far more invasive than they appear. The allegations that CapCut collects all photos and videos on a device—not just those edited—should give pause to anyone who casually grants broad permissions during app installation.
So be careful out there, folks. Next time you download that trending app, consider this before blindly agreeing to permission requests. That innocent-looking video editor might, as alleged here, be analyzing your face, recording your voice, or scanning through years of personal photos—all while you’re just trying to add a filter to your weekend outing with family and friends. Scary stuff.
And yet, whether you have any real recourse if an app oversteps its bounds depends entirely on which law—if any—happens to apply. The fact that some claims survived in this matter while others failed underscores the fragmented and inconsistent nature of privacy law in the U.S. Right now, a company’s liability for invasive data collection often hinges on whether a lawsuit is filed under a state biometric law, a consumer protection statute, or federal wiretap regulations—each with different requirements and loopholes. This patchwork approach leaves consumers vulnerable and businesses uncertain about compliance.
Imagine if physical property rights varied so drastically between states—where some protected against trespassing while others only recognized theft if an item was taken. That’s essentially our current digital privacy landscape, and without unified standards, the gaps in protection will only widen.
As always,
Keep it legal, keep it smart, and stay ahead of the game.
Talk soon!

Why Financial Institutions Should Stay the Course

Introduction
Many regulated businesses believe that the only thing worse than strict regulations is a wholly uncertain regulatory environment. With many rule changes on hold and enforcement actions and investigations being terminated or limited, how do banks, payments program managers, processors, and fintechs move forward? Do they “take their gloves off” and take advantage of a possible enforcement void to maximize profits, or do they stay the course given that there are 50-year-old laws on the books that still apply and probably are not going anywhere? 
We say, continue to innovate with the expectation that certain fundamental laws and rules are unlikely to change and that consumers still want and need financial services and products.
The Resilience of Statutes and Regulations
Most financial institutions, payments companies, and fintechs have always designed their products and services for compliance. When new rules and orders come out, they often do not have to make changes because they have a robust compliance program in place and are already using best practices. Similarly, they are not quick to take advantage of a “bad” ruling, knowing instinctively that a new statute, order, or ruling will soon restore the status quo.
Even in a time of regulatory uncertainty, the primary federal consumer protection rules that have existed since the late 1960s and 1970s are likely to stay in place. These include the following: 

The Truth in Lending Act (TILA) and its Regulation Z, which, among other things, require loan disclosures, periodic statements for open-end credit, and prepaid account disclosures, and provide consumers with protections from unauthorized credit card transactions. 
The Electronic Fund Transfers Act (EFTA) and its Regulation E, requiring initial disclosures, regulating electronic fund transfer (EFT) arrangements, and providing significant consumer protections from unauthorized EFTs.
The Equal Credit Opportunity Act (ECOA) and its Regulation B, prohibiting impermissible forms of credit discrimination and requiring “adverse action” notices or other notifications regarding credit applications and existing extensions of credit. While the scope of the impermissible discrimination rules may change from time to time, including as a result of court decisions, the basic credit notification requirements are unlikely to change.
The Truth in Savings Act and its Regulation DD, which requires initial disclosures for consumer deposit accounts and, if statements are provided, requires specific information to be included in such statements.
The Real Estate Settlement Procedures Act (RESPA) and its Regulation X. In addition to requiring certain mortgage loan disclosures, Section 8 of RESPA prohibits referral fee and kickback arrangements involving “settlement services.” Here is one area for which the rules might be relaxed. For many years, the ability to enter into marketing services agreements and similar arrangements has been severely limited due to the Section 8 interpretations and enforcement actions of the Consumer Financial Protection Bureau (CFPB). With the CFPB being under new leadership and its future uncertain, marketing arrangements that survived Section 8 scrutiny prior to the CFPB might again be viable. 

For all of the above, while enforcement by federal regulators might be reduced, enforcement by plaintiffs’ lawyers likely will not. This seems particularly likely for those laws such as TILA, the EFTA, and ECOA that provide for class-action liability. 
State laws governing credit interest rates, loan and other product and service fees, and consumer disclosures also are likely to stay in place. Those laws might shift in some states, particularly those laws that were made more burdensome in recent years, but they are unlikely to go away entirely. 
States May Fill the Void
All of the federal laws listed above are “federal consumer financial laws” under the Dodd-Frank Act, and state attorneys general and state regulators are empowered by that act to bring a civil action to enforce any of these laws. The main exception is that a state attorney general or regulator generally may not bring such civil actions against a national bank or federal savings association.
Conclusion
Although there may be some regulatory uncertainty, some things remain constant. Lawyers will be lawyers and lawsuits will be brought, and state attorneys general and regulators can enforce the federal consumer financial laws against most banks and nonbank businesses. 
It is just a question of complying with the existing laws, applying common sense rules, and developing attractive consumer options. We are not without regulatory guardrails, but old-fashioned banking with modern innovations still provides routes to develop and market consumer products and services and build customer relationships. Those businesses that continue to innovate can take the lead.

FCC Seeks Comment on Quiet Hours and Marketing Messages

We recently published a blog about a slew of class action complaints alleging that marketing text messages cannot be sent between the hours of 9:00 pm and 8:00 am (“Quiet Hours”) unless the recipient provides prior express invitation or permission to receive such messages during Quiet Hours (“Quiet Hour Claims”). As noted, based on the plain language of the Telephone Consumer Protection Act (“TCPA”), we disagree with this argument because marketing text messages already require prior express written consent from the called party. The Ecommerce Innovation Alliance (EIA) and others filed a petition for declaratory ruling (“Petition”) with the Federal Communications Commission (“FCC”) to address this application of Quiet Hours to marketing messages.
On March 11, 2025, the FCC released a Public Notice asking for comment on the Petition. So the FCC and its Consumer and Governmental Affairs Bureau have moved quickly to seek public comment on the questions raised by the petitioners.
Initial comments are due by April 10, with reply comments due by April 25. The FCC will then consider the record in contemplating a decision. There is no requirement or specific deadline for the agency to take action on the Petition. However, the plethora of Quiet Hour Claims being filed could encourage relatively prompt FCC action to clarify the rules.

ANOTHER FCC TURN OF EVENTS: As Commissioner Starks Resigns

As we still await the approval of Olivia Trusty to fill the seat vacated when Chairwoman Rosenworcel stepped down and Commissioner Carr entered the role of Chairman, there is a new development as of today: Commissioner Starks has announced that he plans to resign his seat in the spring. While there is no exact timeline, Starks did mention fulfilling his duties over the “next few weeks”.
Starks said of his time at the FCC: “Serving the American people as a Commissioner on the Federal Communications Commission has been the honor of my life. With my extraordinary fellow Commissioners and the incredible career staff at the agency, we have worked hard to connect all Americans, promote innovation, protect consumers, and ensure national security. I have learned so much from my time in this position, particularly when I have heard directly from Americans on the issues that matter to them. I have been inspired by the passion, engagement and commitment I have seen from colleagues, advocates, and industry.”
Chairman Carr followed up with his own notice and shared the following in response to Starks’ resignation: “Commissioner Starks led many of the FCC’s national security initiatives, and I welcomed the chance to work closely with him on important matters, including promoting new innovations, protecting consumers, and bringing families across the digital divide. Commissioner Starks put in the work and leaves an impressive legacy of accomplishments in public service. I always learned a lot from him and benefited from the many events we held together.”
Does this now leave the FCC with only one Democrat? We will soon find out, but since the seat is filled by presidential appointment, I have a feeling the Commission will be heavily weighted with Republicans.

Arkansas Attorney General Sues GM and OnStar Over Alleged Privacy Violations

On February 26, 2025, the Attorney General of Arkansas filed a lawsuit against General Motors Co. (“GM”) and its subsidiary, OnStar LLC (“OnStar”), alleging deceptive trade practices related to the collection and sale of drivers’ data. The complaint alleges that GM and OnStar gathered detailed driving data (including precise geolocation data, GM app usage data, and information about consumers’ driving behavior (e.g., start time, end time, vehicle speed, high-speed driving percentage, late-night driving percentage, acceleration data, braking data, and distance driven)) from over 100,000 Arkansas residents without their consent and sold it to third-party data brokers. The data brokers then allegedly sold the data to insurance companies, which used the data to deny coverage or increase insurance rates for consumers. The complaint asserts that GM and OnStar collected and sold the consumer data to generate additional revenue for the companies. The Arkansas Attorney General is seeking monetary damages, injunctive relief, and attorneys’ fees and expenses.
This lawsuit follows actions by the FTC and the Texas Attorney General over similar data-sharing allegations, and is part of a larger trend of state regulators examining the privacy practices of connected vehicle manufacturers.

Virginia Legislature Passes Bill Restricting Minors’ Use Of Social Media to One Hour Per Day

On March 11, 2025, the Virginia legislature passed a bill that would amend the Virginia Consumer Data Protection Act (“VCDPA”) to impose significant restrictions on minor users’ use of social media. The bill is pending signature by Virginia Governor Glenn Youngkin, who has until March 24, 2025 to sign it into law. The bill comes on the heels of recent children’s privacy amendments to the VCDPA that took effect on January 1, 2025.
If signed into law, the bill would amend the VCDPA to require social media platform operators to (1) use commercially reasonable methods (such as a neutral age screen) to determine whether a user is a minor under the age of 16 and (2) limit a minor’s use of the social media platform to one hour per day, unless a parent consents to increase the limit. The bill would prohibit social media platform operators from altering the quality or price of any social media service due to the law’s time use restrictions.
If signed into law, the amendments to the VCDPA would take effect on January 1, 2026.

Medicare Telehealth Gets Another Temporary Lifeline – Will Congress Make it Permanent?

On March 15, 2025, President Trump signed a continuing resolution to avert a government shutdown, which included a critical six-month extension of Medicare telehealth flexibilities through September 30, 2025. This six-month extension provides a temporary reprieve from the looming expiration of telehealth waivers that have been in place since the COVID-19 Public Health Emergency (PHE). While this is a positive development, it underscores the ongoing uncertainty surrounding Medicare’s long-term telehealth policy—an issue that Congress must address with a more permanent solution. The healthcare industry has increasingly emphasized the need for regulatory certainty to support long-term planning, investment in telehealth infrastructure and sustained access to care for Medicare beneficiaries.
What the Extension Means for Providers
Medicare providers will continue to operate under the existing telehealth flexibilities for an additional six months. This means:

No Geographic or Site Restrictions – Medicare beneficiaries can receive telehealth services regardless of their location, including from their homes.
Expanded Practitioner Eligibility – A broader range of healthcare providers, including physical therapists, occupational therapists and speech-language pathologists, can continue furnishing telehealth services.
Coverage for Audio-Only Services – Medicare will maintain reimbursement for certain audio-only visits, which have been critical for reaching patients without reliable broadband access.
Hospital and Facility-Based Telehealth – Flexibilities allowing hospitals and health systems to use telehealth for certain hospital-at-home and outpatient services remain in place.
FQHC and RHC Participation – Federally Qualified Health Centers and Rural Health Clinics can continue to offer telehealth services, ensuring access in underserved areas. 
Mental Health Flexibilities – The in-person evaluation requirement for mental health services delivered via telehealth has been deferred, allowing patients to continue receiving mental health care via telehealth.

For hospitals, health systems and provider groups that have invested heavily in telehealth infrastructure, this extension offers short-term stability. However, the uncertainty beyond September 2025 remains a pressing concern.
Industry Perspective on the Need for Regulatory Certainty
Since the expanded use of telehealth under Medicare, healthcare providers, hospitals and technology developers have adapted their care delivery models and made significant investments in telehealth infrastructure. Many industry stakeholders have highlighted the following considerations as Congress continues evaluating the long-term future of Medicare telehealth policy:

Regulatory Stability for Long-Term Decision-Making – Healthcare organizations make strategic decisions—ranging from workforce planning to technology investments—based on long-term regulatory and reimbursement expectations. Without a definitive, long-term Medicare telehealth policy, providers must plan within an uncertain framework, creating challenges in making sustainable investments.
Access to Care for Underserved and Rural Populations – Telehealth has played a key role in expanding access to care, particularly for rural and underserved populations who may face geographic, transportation or mobility barriers. Healthcare providers serving these communities have emphasized the importance of telehealth in maintaining access to primary care, specialty services and mental health treatment. Given the growing reliance on telehealth among Medicare beneficiaries, there is industry interest in ensuring continued access to these services beyond temporary extensions.
Innovation and Growth in Digital Health – The expansion of telehealth has supported technological innovation across the healthcare industry, from remote patient monitoring to AI-driven clinical documentation tools. Industry stakeholders have noted that uncertainty around Medicare’s long-term telehealth policy can impact investment in emerging digital health solutions, as healthcare organizations and technology developers assess future regulatory and reimbursement environments.

What’s Next? The Push for Permanent Reform
With the clock now ticking toward the new September 30, 2025, deadline, major healthcare organizations are advocating for permanent legislative action. The American Telemedicine Association (ATA) and American Hospital Association (AHA) continue to urge Congress to cement telehealth’s place in modern healthcare, emphasizing its role in expanding access, improving outcomes and addressing provider shortages. Similarly, several bipartisan efforts have been initiated to establish permanent telehealth policies:
1. Telehealth Modernization Act of 2024 (H.R. 7623): This bill seeks to permanently extend certain telehealth flexibilities that were initially authorized during the COVID-19 public health emergency.
2. Creating Opportunities Now for Necessary and Effective Care Technologies (CONNECT) for Health Act of 2023 (H.R. 4189; S. 2016): This bill proposes to expand coverage of telehealth services under Medicare, aiming to remove geographic restrictions and expand originating sites, including to allow patients to receive telehealth services in their homes.
3. Preserving Telehealth, Hospital, and Ambulance Access Act (H.R. 8261): This bill aims to extend key telehealth flexibilities through 2026, including provisions for hospital-at-home programs and ambulance services.
While there appears to be bipartisan support recognizing telehealth as a vital component of modern healthcare delivery, a long-term solution is critical to ensuring that telehealth remains a viable and effective care delivery option for Medicare beneficiaries well beyond 2025. Providers should take advantage of the additional time to solidify their telehealth strategies while remaining engaged in advocacy efforts.
Stakeholders—including hospitals, health systems, provider groups and digital health technology companies—must continue urging Congress to pass permanent telehealth legislation that preserves access, ensures fair reimbursement and provides regulatory clarity.

Medicare Telehealth Flexibilities Extended through September 30, 2025

On March 14, 2025, as part of a spending bill to avert a federal government shutdown, Congress extended COVID-era telehealth “waivers” applicable to Medicare until September 30, 2025. These waivers were originally scheduled to end on March 31, 2025.
This is welcome news for health care organizations that have relied on the flexibility offered by these waivers to extend access to telehealth services for Medicare beneficiaries and other patients nationwide since the COVID-19 pandemic. However, this represents another short-term extension by the government and raises questions about whether all or some of the telehealth flexibilities will be codified into law.
As a reminder, a set of key waivers to Medicare telehealth payment restrictions were enacted under the Social Security Act temporarily in connection with COVID-19 pandemic measures. These statutory waivers have now been extended by act of Congress multiple times, and this latest extension will have the following impacts related to telehealth:

Telehealth at Home: Medicare patients will continue to be able to receive telehealth services in their homes and in any other location in the country through at least September 30, 2025.

In the absence of this extension, Medicare beneficiaries would have only been permitted to receive telehealth services in certain approved health care facilities in rural locations (outside of metropolitan statistical areas) as of April 1, 2025.
Note that the Social Security Act does include a narrow exception that permits telehealth services in the home (or other locations) for patients in specific circumstances approved by law or regulation, including patients being treated for acute stroke symptoms, patients with a substance use disorder diagnosis, patients with a mental health disorder (but see the additional in-person requirement for mental health telehealth treatment noted below), and patients on home dialysis for related clinical assessments.

Audio-Only Telehealth: Telehealth services can continue to be provided via audio-only communications systems.

Without the extension, telehealth services would no longer have been available via audio-only systems as of April 1, 2025, and reimbursement would have required the use of approved interactive telecommunications systems only (defined generally to refer to audio/video equipment allowing for two-way, real-time interactive communications between the patient and provider, with narrow exceptions for store-and-forward technology under telemedicine demonstration programs).

Telehealth Providers: Medicare patients can continue to receive telehealth services from all types of approved Medicare-enrolled providers (the waiver permits qualified occupational therapists, physical therapists, speech-language pathologists, and audiologists to furnish services via telehealth and be paid by Medicare for doing so).
FQHC/RHC Telehealth: Federally qualified health centers (FQHCs) and rural health clinics (RHCs) can continue to provide telehealth services to patients in other locations.

Additionally, the legislation extends until October 1, 2025, the effective date of an in-person visit requirement that applies to Medicare reimbursement of telehealth services furnished to a beneficiary for purposes of diagnosis, evaluation, or treatment of a mental health disorder, under which:

the provider must have furnished a Medicare-covered item or service to the beneficiary in-person (without the use of telehealth) within the 6 months before furnishing such telehealth services, and
the provider must continue to furnish Medicare-covered items or services in-person (without the use of telehealth) to the beneficiary at least once a year following each subsequent telehealth service.

The annual in-person follow-up is not required if the provider and beneficiary agree the risks of an in-person service outweigh the benefits.

Once required, the foregoing in-person visit requirement could also be fulfilled by another provider of the same specialty in the same group as the provider furnishing the telehealth service if the telehealth provider is not available to do so.
Despite this temporary reprieve to sustain current telehealth waivers through September 30, 2025, health care organizations should start preparing now for the potential end of the waivers and additional restrictions on telehealth services as soon as October 1, 2025. Moreover, health care organizations should also be aware that additional flexibilities and waivers tied to the COVID-19 era remain in place but are scheduled to expire at the end of 2025, including DEA tele-prescribing flexibilities previously discussed here.
Seth Orkand contributed to this article.

The Symbiotic Future of Quantum Computing and AI

Quantum computing has the potential to revolutionize various fields, but practical deployments capable of solving real-world problems face significant headwinds due to the fragile nature of quantum systems. Qubits, the fundamental units of quantum information, are inherently unstable and susceptible to decoherence—a process by which interactions with the environment cause them to lose their quantum properties. External noise from thermal fluctuations, vibrations, or electromagnetic fields exacerbates this instability, necessitating extreme isolation and control, often achieved by maintaining qubits at ultra-low temperatures. Preserving quantum coherence long enough to perform meaningful computations remains one of the most formidable obstacles, particularly as systems scale.
Another major challenge is ensuring the accuracy and reliability of quantum operations, or “gates.” Quantum gates must manipulate qubits with extraordinary precision, yet hardware imperfections introduce errors that accumulate over time, jeopardizing the integrity of computations. While quantum error correction techniques offer potential solutions, they demand enormous computational resources, dramatically increasing hardware requirements. These physical and technical limitations present fundamental hurdles to building scalable, practical quantum computers.
The Intersection With Neural Networks
One promising approach to mitigating these issues lies in the unexpected ability of classical neural networks to approximate quantum states. As discussed in When Can Classical Neural Networks Represent Quantum States? (Yang et al., 2024), certain neural network architectures, such as recurrent neural networks (RNNs) and convolutional neural networks (CNNs), can be trained to represent particular classes of quantum states. This insight suggests that instead of relying entirely on fragile physical qubits, classical neural networks could serve as an intermediary computational layer, learning and simulating quantum behaviors in ways that reduce the burden on quantum processors. Yang et al. further propose that classical deep learning models may be able to efficiently learn and encode quantum correlations, allowing them to predict and correct errors dynamically, thereby improving fault tolerance without the need for excessive physical qubits.
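For intuition, the sketch below shows one common form of "neural quantum state": a small restricted-Boltzmann-machine-style network whose weights parameterize a wavefunction over a few spins. The network sizes and random weights are assumptions chosen for illustration, and this is a generic construction rather than the specific architectures analyzed by Yang et al.

```python
# Minimal illustrative sketch of a neural quantum state: an RBM-style
# network assigns an amplitude to every basis configuration of n spins,
# so the network's weights parameterize the wavefunction.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden = 4, 8                  # 4 spins, 8 hidden units (assumed)
a = rng.normal(scale=0.1, size=n_visible)   # visible biases
b = rng.normal(scale=0.1, size=n_hidden)    # hidden biases
W = rng.normal(scale=0.1, size=(n_hidden, n_visible))

def amplitude(s: np.ndarray) -> float:
    # RBM ansatz: psi(s) = exp(a . s) * prod_j 2*cosh(b_j + W_j . s)
    return np.exp(a @ s) * np.prod(2.0 * np.cosh(b + W @ s))

# Evaluate unnormalized amplitudes over the full 2^4 basis, then normalize.
basis = [np.array(s) for s in itertools.product([-1, 1], repeat=n_visible)]
psi = np.array([amplitude(s) for s in basis])
psi /= np.linalg.norm(psi)
print("norm check:", np.sum(psi**2))        # -> 1.0
# Training would adjust a, b, W so psi approximates a target state; the
# point here is only that a classical network can encode a full quantum
# state in a compact set of parameters.
```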
Neural networks capable of representing quantum states could also enable new forms of hybrid computing. Instead of viewing artificial intelligence (AI) and quantum computing as separate domains, recent research suggests a future where they complement one another. Classical AI models could handle optimization, control, and data preprocessing, while quantum systems tackle computationally intractable problems.
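This division of labor is already the structure of today's variational quantum algorithms. The sketch below is illustrative only: the single-qubit objective is an assumption chosen to keep the example self-contained, and the "quantum" step is simulated with ordinary linear algebra. It shows the hybrid loop in which a classical optimizer proposes parameters, a quantum routine returns a measurement, and the classical side updates the parameters.

```python
# Minimal illustrative sketch of a hybrid classical-quantum loop, in the
# spirit of variational algorithms such as VQE. All parameters (starting
# angle, learning rate, step count) are arbitrary assumptions.
import numpy as np

Z = np.array([[1.0, 0.0], [0.0, -1.0]])     # observable to minimize

def quantum_expectation(theta: float) -> float:
    # Simulated quantum step: prepare Ry(theta)|0>, measure <Z>.
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return state @ Z @ state                # equals cos(theta)

theta, lr, eps = 0.3, 0.2, 1e-3             # assumed start and step sizes
for step in range(200):
    # Classical step: finite-difference gradient descent on the reading.
    grad = (quantum_expectation(theta + eps)
            - quantum_expectation(theta - eps)) / (2 * eps)
    theta -= lr * grad

print(f"theta -> {theta:.3f} (pi = {np.pi:.3f})")
print(f"<Z>   -> {quantum_expectation(theta):.3f} (minimum is -1)")
```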
Ultimately, the interplay between quantum mechanics and AI will most likely reshape our approach to computation. While quantum computers remain in their infancy, AI could provide a bridge to unlock their potential. By harnessing classical neural networks to mimic quantum properties, the scientific community may overcome the current limitations of quantum hardware and accelerate the development of practical, scalable quantum systems. The boundary between classical and quantum computation may not be as rigid as once thought.

Proskauer on Privacy: 2024 Reflections & 2025 Predictions

2024 marked another significant year for privacy law, with new state legislation and high-stakes litigation reshaping the landscape. Legal battles over tracking technologies, biometric data, and children’s privacy intensified, while federal agencies, including the Federal Trade Commission (“FTC”) and the U.S. Department of Health and Human Services Office for Civil Rights (“HHS OCR”), ramped up their efforts through major enforcement actions and high-profile settlements, marking a new era of increased accountability.
Federal Privacy Law Gridlock
Attempts to pass comprehensive federal privacy legislation fell short once again in 2024, leaving the United States without a national data privacy standard. Despite bipartisan support, the American Privacy Rights Act (“APRA”), designed to unify privacy laws, preempt conflicting state regulations, introduce a private right of action, and enforce opt-out mechanisms, did not pass the 118th Congress. Still, the last Congress passed, as part of a larger appropriations bill, the “Protecting Americans’ Data from Foreign Adversaries Act of 2024” (15 U.S.C. § 9901), which makes it unlawful for a data broker “to sell, license, rent, trade, transfer, release, disclose, provide access to, or otherwise make available personally identifiable sensitive data of a United States individual to (1) any foreign adversary country; or (2) any entity that is controlled by a foreign adversary.” In the absence of a comprehensive federal law, states continued to fill the void by passing their own privacy statutes, each with independent and distinct requirements, leading to burdensome compliance efforts, higher operational costs, and increased legal risks for businesses.
FTC Rulemaking and Enforcement Intensifies
In 2024, the FTC prioritized safeguarding sensitive data, focusing on location tracking, health data, children’s privacy, and cybersecurity. The agency secured key settlements banning the sale of sensitive location data without consent or deidentification, investigated health data misuse, and filed a Children’s Online Privacy Protection Act (“COPPA”) action against TikTok. On children’s privacy specifically, it is also notable that at the close of the Biden administration, the FTC finalized changes to the COPPA Rule, setting new requirements for the collection, use, and disclosure of children’s personal information, including a requirement that covered websites and online service operators obtain opt-in consent from parents for targeted advertising and other disclosures to third parties.
One notable FTC settlement prohibited a data broker from selling or sharing sensitive location data after it was collected and distributed without adequate safeguards. Another targeted a cybersecurity company accused of unlawfully selling browser data and engaging in deceptive practices. The FTC also filed complaints and secured proposed settlements with an alcohol addiction treatment service and a mental health telehealth company, alleging they illegally shared users’ health information for advertising purposes through third-party tracking tools.
The agency also intensified its focus on deceptive and fraudulent claims surrounding AI products and services. Companies using AI-driven platforms were urged to take “necessary steps to prevent harm before and after deploying [an AI] product” to ensure fairness, minimize bias, and comply with evolving regulatory standards. As the FTC expanded enforcement in this area, businesses faced growing pressure to proactively mitigate risks and implement safeguards to avoid costly investigations and penalties.
HIPAA Enforcement and Judicial Constraints
In 2024, the HHS OCR focused heavily on enforcing the Health Insurance Portability and Accountability Act (“HIPAA”), concluding over 22 enforcement actions. However, the landmark ruling in American Hospital Association v. Becerra curtailed HHS’s authority over online tracking liability under HIPAA, holding that HHS could only regulate information that both identifies an individual and directly relates to their health.
Following the ruling, HHS voluntarily withdrew its appeal, signaling a shift in its approach to online tracking and privacy enforcement. The decision marked a critical limitation on HHS’s ability to regulate digital health technologies and underscored the ongoing tension between evolving digital practices and traditional privacy regulations.
Litigation Trends: Old Laws, Modern Issues
With no federal privacy law in place, plaintiffs in 2024 relied heavily on older electronic privacy statutes for class action lawsuits, including the Video Privacy Protection Act of 1988 (“VPPA”), the Electronic Communications Privacy Act of 1986 (“ECPA”), and numerous state laws, such as California’s Invasion of Privacy Act of 1967 (“CIPA”) and Song-Beverly Credit Card Act of 1971 (“SCCA”), to address modern online privacy concerns.
While the VPPA was designed to prevent video rental stores (e.g., Blockbuster) from sharing customers’ personal data, and the ECPA and CIPA to prevent eavesdropping and traditional wiretapping, plaintiffs have recently repurposed these laws to target alleged misuse of internet technologies such as cookies, pixels, chatbots, and session replay technology, a trend that continued to gain traction throughout 2024. Plaintiffs have also attacked the use of these technologies under the SCCA, a statute that restricts businesses from collecting unnecessary personal identification information during credit card transactions. While the SCCA was originally intended for brick-and-mortar retailers, plaintiffs are now extending its application to digital commerce, seeking to limit how businesses can request and store consumer data during online purchases.
Class action lawsuits over data breaches and mishandled opt-out requests also continued to surge, fueled by regulatory developments and high-profile breaches. Data subject requests for deletion, access, and opt-outs increased by 246% between 2021 and 2023, highlighting the demand for transparency and control. A 2024 audit found that 75% of businesses failed to honor opt-out requests, underscoring the practical challenges of data privacy compliance.
To mitigate their legal privacy risks, companies will need to consider refining consent mechanisms, implementing robust consent management platforms, and exploring alternatives to cookie-based or pixel tracking. Compliance with all of these laws is critical to ensure proper disclosures, limit personal data requests, and reinforce consumer trust.
Comprehensive State Privacy Laws
In 2024, seven states enacted comprehensive privacy laws, raising the total number of comprehensive state privacy laws to 20. Laws in Florida, Montana, Oregon, and Texas went into effect in 2024; laws in Nebraska, New Hampshire, Delaware, Iowa, and New Jersey went into effect at the beginning of 2025; and the Minnesota, Tennessee, and Maryland laws will go into effect later in the year (in July and October 2025). The Kentucky, Rhode Island, and Indiana laws are scheduled to go into effect in 2026.
State-level enforcement also intensified, with California, Texas, and New Hampshire leading major efforts. For example, California reached a settlement with DoorDash in February 2024 after the company purportedly sold its California customers’ personal information without providing notice or an opportunity to opt out, in violation of the California Consumer Privacy Act (“CCPA”) and CalOPPA. In June 2024, the state reached another settlement with Tilting Point Media for violations of the CCPA and COPPA based on Tilting Point’s alleged collection and sharing of children’s data without parental consent.
In addition, Texas reached several major settlements, two of which involved Meta and the company’s purported violations of biometric privacy laws, and a first-of-its-kind settlement with a Dallas-based artificial intelligence healthcare technology company over alleged deceptive generative AI practices. The state also filed a new suit against General Motors in August 2024 alleging the unlawful sale of driving data, and announced an investigation into fifteen companies for potential violations of Texas’ Securing Children Online through Parental Empowerment Act and Data Privacy and Security Act.
2025 Privacy Predictions
2025 is expected to be another defining year for privacy regulation, with key trends from recent years continuing to evolve and present new challenges for businesses. The fragmentation of state-level privacy laws, increased enforcement, and the rapid evolution of rules governing biometric data and AI technologies are expected to intensify.
Businesses can expect heightened scrutiny of algorithmic transparency and biometric protections. Generative AI is also expected to draw significant regulatory attention as the technology matures and states continue to consider additional legislation or regulations, whether related to marketing claims, employment, transparency, AI deepfakes, or publicity rights. Companies in the health, finance, and technology sectors specifically should remain vigilant as regulators push for stricter accountability. While compliance challenges and rising operational costs are likely, organizations that proactively audit data-sharing practices, update privacy policies, and ensure AI compliance will be equipped to navigate the evolving regulatory landscape and reduce overall legal risks.
Federal Legislative Efforts Still Struggle
Despite a growing appetite for a unified privacy framework, progress remains slow heading into 2025. The inability to advance the APRA in 2024 underscores the challenge of balancing state autonomy with uniform, national standards. These challenges are only further compounded by the Trump administration’s emphasis on deregulation and a heavily divided Congress. Businesses will likely continue operating without a comprehensive federal privacy law for the foreseeable future. However, renewed lobbying efforts, Congressional hearings, and mounting industry pressure suggest that the core concepts undergirding the APRA could reemerge with modifications. Moreover, it is conceivable Congress could pass legislation strengthening children’s privacy, given that the Senate overwhelmingly, with a 91-3 vote, passed legislation that included the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act (the latter known as COPPA 2.0); the legislation later died in the House, but it will likely be taken up again in the current session of Congress.
In the absence of clear federal guidance, businesses should expect to rely on recognized industry standards in the interim. While these standards are instructive, businesses should note that strict adherence to them may not ensure compliance with the complex web of multi-state regulations. Companies operating across multiple jurisdictions should be sure to consult legal counsel as they navigate the current patchwork of privacy laws to reduce their legal risk.
More States Join the Privacy Landscape. With More to Come?
In 2025, several state privacy laws have recently gone into effect and more are set to take effect later in the year, including laws in Delaware, Iowa, Maryland, Minnesota, Nebraska, New Hampshire, New Jersey, and Tennessee. These comprehensive privacy laws significantly expand state-level data protection, bringing the total number of states with privacy laws to 20. In addition, other states have borrowed from the existing state privacy law template and are debating similar bills of their own in 2025 (e.g., New York S365B), and have debated other bills related to consumer health privacy (e.g., the New York Health Information Privacy Act, awaiting the governor’s signature), social media restrictions, and other data privacy issues.
With compliance becoming more complex, investments in automated tools to monitor regional legal variations are expected to grow, as businesses recognize them as critical for long-term regulatory resilience in an ever-changing environment.
Litigation Trends: Internet Tracking Technologies & Healthcare Data
Regulators and plaintiffs continue to focus on cases involving internet tracking technologies, particularly under statutes including VPPA, ECPA (and state wiretapping laws), and CIPA, as well as laws governing the general collection of website user information, such as the SCCA. These cases increasingly scrutinize how companies track, collect, and use consumer data, particularly in sensitive contexts such as healthcare and wellness.
Against this backdrop, Washington’s My Health My Data Act (“MHMDA”), which went into effect in 2024, imposes strict privacy protections on consumer health data, extending beyond traditional healthcare providers to include wellness apps, online health services, and companies handling health-related consumer information. The law requires businesses to obtain explicit consent before collecting or sharing health data, maintain transparent privacy policies, and enforce stringent security measures to prevent unauthorized access or misuse.
Notably, the first lawsuit under MHMDA was recently filed against Amazon, marking a significant test case for the law’s enforcement. Given the evolving regulatory landscape, businesses should closely monitor litigation and compliance developments in this space.
Continued Momentum for AI, Biometric and Neural Data
Neural data has become a significant privacy concern with the rapid growth of wearable devices and brain-computer interfaces. In 2024, California and Colorado amended their privacy laws to extend protections to neural data, sparking broader regulatory interest and prompting advocacy groups to push for ethical standards and stricter consent requirements. Companies developing neural data technologies, including VR applications, brainwave monitoring devices, and other wearables, are investing in advanced encryption, secure storage, and anonymization methods to safeguard this highly sensitive information.
AI also remains a key driver of both cybersecurity advancements and emerging risks in 2025. In response to privacy violations linked to AI-powered tracking in 2024, businesses are increasingly deploying AI tools to improve threat detection, monitor compliance, and secure sensitive data. Cybercriminals have also embraced AI, using it to execute more targeted and complex attacks, such as deepfake impersonation, advanced phishing schemes, automated network breaches, and large-scale data theft.
As AI adoption grows, companies face rising legal and regulatory risks. To address these challenges, businesses should consider comprehensive AI governance frameworks, including regular algorithm audits, bias detection systems, and accountability structures, to meet regulatory standards, maintain consumer trust, and uphold a high standard of work.
Conclusion
The transition from 2024 to 2025 marks another important moment in the privacy landscape, with escalating state regulatory demands and stricter enforcement reshaping business practices. Companies must embed privacy into their core operations. By investing in privacy-by-design frameworks, adaptive compliance systems, and monitoring of emerging risks, businesses can stay ahead of shifting regulations. Those that anticipate change, take decisive action, and prioritize reasonable data protection as a competitive advantage will not only reduce risks but position themselves as leaders in an era where privacy drives both trust and innovation.