FTC Finalizes Updates to Children’s Privacy Rule…Again

After a period of regulatory review under Chairman Andrew Ferguson, on Tuesday, April 22, 2025, the U.S. Federal Trade Commission (FTC) published amendments to the Children’s Online Privacy Protection Act (COPPA) Rule (COPPA Rule or the Rule), which was last updated in 2013. As we reported earlier this year, the FTC finalized its most recent updates to the COPPA Rule on January 16, 2025. However, that version of the amended Rule was not published before President Trump took office on January 20, 2025, and ordered a freeze on “publishing any rule to the Office of the Federal Register until a new agency head appointed or designated by the President reviews and approves the rule.” Accordingly, the FTC, now under Chairman Ferguson, once again reviewed and approved amendments to the COPPA Rule with minor changes to the version approved during the Biden administration. The amended Rule will go into effect on June 23, 2025, although most of the substantive requirements are effective April 22, 2026.
The amended Rule published on April 22, 2025, remains substantively the same as the January 16, 2025, pre-publication version. The key revisions to the Rule we previously highlighted are unchanged. They include new parental notification requirements related to data shared with third-party vendors, a new definition for “mixed audience,” the addition of biometric and government identifiers to the list of “personal information,” more robust “reasonable security” provisions, and a requirement that operators adopt and provide notice of a data retention policy. Additionally, the published Rule retains new requirements for COPPA Safe Harbor programs.
In a concurring opinion supporting the January 16 version of the amended Rule, Chairman (then Commissioner) Ferguson identified a few areas where he felt clarification in the amended Rule language would be helpful, but those changes were not implemented. Issues he identified include:

The meaning of a “material” change requiring new parental consent remains undefined. Both the 2013 version of the Rule and the amended version require that operators obtain fresh parental consent for all “material” changes to privacy terms; however, “material” remains an undefined term in the amended Rule. This may raise several different compliance obstacles, but Ferguson took specific issue with the fact that the amended Rule also requires operators to disclose to parents the identities of third-party recipients of children’s data when obtaining parental consent. With no elaboration on what is meant by “material,” he speculated that all additions or changes to the identities of third-party vendors could require an operator to request new parental consent. This mandate would increase the costs of switching third-party vendors and thus discourage the use of upstart competitors, undermining business competition.
The meaning of “retained indefinitely” remains undefined. The 2013 version of the Rule made clear that children’s data should be retained only for “as long as is reasonably necessary to fulfill the purpose for which the information was collected.” The amended Rule retains this language with minor modifications but also specifies that “[p]ersonal information collected online from a child may not be retained indefinitely.” However, no time period is set for retention. In Ferguson’s concurring opinion, he stated that “it is unclear how the requirement is any different than the existing requirement to keep the information no longer than necessary to fulfill the purpose for which it was collected.”
Collection of personal information for age verification continues to require parental consent. In his concurring opinion, Ferguson asserted that operators of mixed-audience websites or online services “wanting to use more accurate age verification techniques than self-declaration” would need information “such as photographs or copies of government-issued IDs.” Ferguson argued that the amended Rule contains “many exceptions to the general prohibition on the unconsented collection of children’s data, and these amendments should have added an exception for the collection of children’s personal information for the sole purpose of age verification, along with a requirement that such information be promptly deleted once that purpose is fulfilled.”

As of this writing, neither the Chairman nor the other sitting commissioners have issued additional statements on the publication of the amended Rule. However, there are aspects of the Federal Register preamble and the Rule itself that appear to address the Chairman’s prior concerns. For example, the “material” change notification requirement is discussed in a footnote to the preamble to the Federal Register notice (and was also present in the January 16 pre-publication version), where the FTC explains that “the Commission is not likely to consider the addition of a new third-party to the already-disclosed category of third-party recipients to be a material change that requires new consent.” The amended Rule’s requirement for a written data retention policy requires reference to a time period, which appears to address Ferguson’s earlier concern that the amended Rule did not adopt a specific temporal limit on data retention so long as data is not retained indefinitely. In addition, it appears that the Commission considered the option of allowing personal information, including photographs and biometric identifiers, to be used for age verification, but ultimately determined that the potential benefits of using this type of information for age verification were outweighed by the risks of this data being misused.
Given that this most recent round of amendments was initiated in 2019, it seems unlikely that any further amendments will be made in the near term absent further Congressional action to amend COPPA itself. However, children’s privacy continues to be an area of focus for the FTC, which will hold a workshop entitled “The Attention Economy: How Big Tech Firms Exploit Children and Hurt Families” at 9:00 a.m. ET on June 4, 2025. This workshop will cover a variety of topics, including strategies to protect children online, such as through age verification and parental consent requirements. Members of the public can register to attend in-person, and a link to the livestream will be posted to FTC.gov the morning of the event.

Privacy Tip #441 – Identity Theft Statistics Increasing in 2025

Unfortunately, identity theft continues to increase, and according to Identitytheft.org, the statistics are going to get worse in 2025. Some of the statistics cited by Identitytheft.org include:

1.4 million complaints of identity theft were received by the Federal Trade Commission
Total fraud and identity theft cases have nearly tripled over the last decade
Cybercrime losses totaled $10.2 billion
The median loss to fraud victims is $500
There is an identity theft case every 22 seconds
33% of Americans have faced some form of identity theft at some point in their lives
Consumers aged 30-39 were the most victimized by identity theft
Georgia ranked #1 for identity theft and fraud cases

Identitytheft.org concludes:
“Identity theft has been a growing problem in the U.S. for the past few years. It is difficult for victims to deal with these issues because theft methods are becoming even more sophisticated with time. Citizens must safeguard their personal information by utilizing technology such as antivirus protection software, password managers, identity theft protection, and VPNs if they want to avoid identity theft scenarios in 2025.”
These are helpful tips to consider.

Clocked In: FCC Seeks Clarity on Call Time Rules

In our previous alert, Tick-Tock, Don’t Get Caught: Navigating TCPA’s Quiet Hours, we discussed a growing wave of lawsuits targeting businesses under the Telephone Consumer Protection Act (“TCPA”) for allegedly sending text messages outside the Federal Communications Commission’s (“FCC”) designated 8:00 AM to 9:00 PM window. These suits often involve texts sent just minutes before or after the hour. Worse yet, they frequently target businesses whose customers had voluntarily opted into Short Message Service (“SMS”) programs. As we explained, under the plain language of the TCPA and the FCC’s rules, such messages should not be considered “telephone solicitations” at all, because the recipients gave prior express consent. As a result, they should not be subject to the quiet hours rule either.
Subsequently, the FCC took an important step that could clarify this issue. Recently, the FCC released a Public Notice seeking comment on whether the TCPA’s quiet hours apply to text messages sent with the recipient’s prior express consent. The Public Notice has drawn strong responses from both sides of the debate, including plaintiffs’ attorneys, consumer advocacy groups, industry associations, and privacy-focused nonprofits. This development suggests that regulatory guidance could be forthcoming—but until then, litigation risk remains.
The FCC’s Public Notice
In its Public Notice, the FCC asked for input on whether its rules prohibit businesses from sending text messages for telemarketing purposes outside the 8:00 AM to 9:00 PM window, even if the message was sent with the consumer’s prior express consent. The FCC also invited comment on related issues in the alternative, including whether there is a non-rebuttable presumption that the NPA-NXX (i.e., the area code and local exchange of the called party) is indicative of the called party’s location when applied to wireless phone numbers.
Comments on the Petition
Several industry stakeholders filed comments supporting the petition. These commenters emphasize the plain language of the TCPA and the FCC’s own rules, which state a “telephone solicitation” does not include calls or messages made with the recipient’s prior express invitation or permission. Because the quiet hours restrictions apply only to “telephone solicitations,” the argument goes, messages sent with prior consent should be exempt by definition.
Supporters argue that requiring businesses to track the local time zone of each customer creates an unreasonable compliance hurdle, especially given the mobile nature of modern communications. Responsible Enterprises Against Consumer Harassment (“R.E.A.C.H.”) included data showing a month-over-month increase in the number of cases filed and warned that without FCC intervention, a growing number of class actions will exploit the ambiguity around this issue.
However, several organizations and law firms that regularly advocate for consumer protections filed comments opposing the petition. These commenters generally argue that the quiet hours rule applies to all marketing messages, regardless of consent, and that carving out an exception would undermine consumer privacy. For example, the Law Offices of Jibrael S. Hindi—a firm that has filed many of the recent lawsuits under this theory—opposed the petition. The firm claimed consumers may not expect to receive messages during early morning or late evening hours, even if they opted into a marketing program, and suggested consent should not extend to contact during quiet hours unless explicitly disclosed and agreed to. The National Consumer Law Center (“NCLC”) likewise encouraged the FCC to allow the issue to play out in the courts and argued against creating a presumption about quiet times based on the called party’s area code and local exchange.
What Comes Next
Now that the comment period has closed, the FCC will review the submissions and decide whether to issue a declaratory ruling. There is no set timeline for the Commission to act, and it may take several months (or longer) before any formal decision is issued. In the meantime, plaintiffs’ attorneys may continue filing lawsuits under the current ambiguity.
What Businesses Should Do in the Meantime
Until the FCC provides definitive guidance, businesses that engage in SMS marketing should consider the following steps to reduce legal exposure:

Respect the Quiet Hours: Even if you have a strong argument your messages are exempt, the safest course is to avoid sending marketing texts before 8:00 AM or after 9:00 PM in any time zone in the United States.
Audit Your Consent Process: Ensure your SMS terms clearly disclose the nature and timing of messages. Consider adding language that explains messages may be sent at any time, including early morning or evening hours.
Implement Time-Zone Safeguards: Use geolocation tools or other technology to manage time-zone targeting. Even if not legally required, this step can show a good-faith effort to comply.
Document Everything: Maintain records showing when and how a consumer provided consent, what disclosures were made, and when messages were sent.
Monitor FCC Activity: Stay informed on the status of the petition and any related FCC guidance. An eventual ruling could impact how these lawsuits are litigated and defended. Businesses in litigation may wish to consider filing motions to stay pending the outcome of the FCC petition.
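To make the time-zone safeguard above concrete, here is a minimal sketch (illustrative only, not legal advice) of a send-time gate. It assumes the sender has already resolved each recipient's IANA time zone name from its own geolocation or consent records; the function and constant names are hypothetical:

```python
from datetime import datetime, time
from zoneinfo import ZoneInfo

QUIET_END = time(8, 0)     # no marketing texts before 8:00 AM local time...
QUIET_START = time(21, 0)  # ...or at/after 9:00 PM local time

def within_allowed_window(recipient_tz: str, now_utc: datetime) -> bool:
    """Return True if the recipient's local time is inside the 8 AM-9 PM window.

    recipient_tz is an IANA zone name (e.g., "America/Chicago") that the
    sender would resolve from its own geolocation or consent data.
    """
    local = now_utc.astimezone(ZoneInfo(recipient_tz))
    return QUIET_END <= local.time() < QUIET_START
```

A conservative variant of the "respect the quiet hours" tip would run this check against every U.S. time zone and queue the message until all of them pass.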

Conclusion
The FCC’s request for comment is a welcome sign that regulatory clarification may be on the horizon. However, until that clarification arrives, businesses must continue to navigate the uncertain legal terrain. Stakeholders remain hopeful that the FCC will confirm what the statute and regulations already suggest: that messages sent with prior express consent are not “telephone solicitations” and are, therefore, not subject to the quiet hours rule. In the meantime, businesses should remain vigilant, stay within the safe harbor when possible, and consult legal counsel when designing or updating their marketing programs.

EU Data Act Preparedness – Last Minute Fire Drill Exercise!

In less than six months, on 12 September 2025, most provisions of EU Regulation no. 2023/2854 (the EU Data Act) will go into effect. Given the compliance effort required, from legal and contractual as well as operational and product development perspectives, affected companies should act soon to avoid liability and administrative fines and to update their contractual frameworks.
The checklist below provides initial guidance to assist companies in assessing their risk exposure and identifying mitigation steps.
While the EU Data Act covers many different data-related topics, those most relevant for private companies are the obligations regarding collection and use of Internet of Things (IoT) data (Section 1) and switching between cloud storage/service providers (Section 2).
For IoT Companies
Does Your Company Manufacture Connected Products (Connected Products)?
Definition: Connected Products are all categories of equipment that collect data about their use or vicinity and can transfer this data via an internet connection. They are commonly referred to as “smart devices” or “IoT devices” and include cars, televisions, refrigerators, cleaning or lawn-mowing robots, kitchen tools, etc. (Source: Art. 2(5) EU Data Act)
Does Your Company Offer Related Services (Related Services) in Connection With Connected Products?
Definition: Related Services include any digital service (usually provided via an app) essential for the intended use of a Connected Product or adding additional functionalities to a Connected Product. (Source: Art. 2(6) EU Data Act)
Who Are Your Users?
Definition: If your Connected Products or Related Services are offered to customers in the European Union, the EU Data Act will apply to such product or service, regardless of whether your customers are consumers (B2C) or commercial (B2B) customers. (Source: Art. 1(3) EU Data Act)
Does Your Connected Product or Related Service Allow Users to Access Collected Product Data (Product Data)?
Definition: Product Data is all information collected by a Connected Product or Related Service in connection with using such product or service or its environment, regardless of whether such data is considered “personal data” under GDPR or not.
Action: Users have a right to have access to Product Data in real time, either directly in the IoT device or related app, or at least separately in a machine-readable format. Technical measures for enabling this access need to be implemented.
Action: Users must be provided with core information when purchasing a Connected Product or Related Service, e.g., regarding the types and amount of usage data collected, for what purposes the data will be used, where and how long the data is stored, and how the data can be accessed and stored. Information documents need to be prepared.
Action: Users may also request that third parties be granted access to their Product Data. It is recommended to assess upfront under which conditions, if any, such disclosure may be rejected and on what grounds.
Does Your Company Use the Product Data for Its Own Purposes?
Action: Any use of this data for your company’s own purposes (e.g., analytics, business intelligence, or advertising) is permitted only with the permission of the user of the Connected Product or Related Service, given in a contract that includes detailed provisions on the use of the data and protective mechanisms. These contracts must follow a strict agenda and must contain certain mandatory terms. New customer agreements will need to be updated accordingly by 12 September 2025, and existing agreements by 12 September 2027 (for contracts executed prior to 12 September 2025 that are (i) of indefinite duration or (ii) due to expire after 11 January 2034).
Does Your Company Share Any of the Product Data With Third Parties?
Action: Product Data may be shared by your company with third parties only on the basis of a contract between the third party and the user, in addition to your company’s contractual relationship with the user. These contracts must follow a strict agenda and must contain certain mandatory terms. Agreements need to be put in place with users and with third parties receiving usage data.
Does Your Company Currently Have Contracts in Place With Customers or Third Parties Entitling or Requiring Your Company to Access or Share Product Data?
Action: Existing agreements need to be reviewed for clauses regarding access to Product Data and, if such clauses exist, need to be updated to meet the above data sharing requirements.
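The machine-readable access obligation above can be illustrated with a short sketch. The EU Data Act does not prescribe a particular format, so this example simply assumes JSON qualifies as machine-readable; the record fields and function name are hypothetical:

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class ProductDataRecord:
    device_id: str    # identifier of the Connected Product (hypothetical field)
    recorded_at: str  # ISO-8601 timestamp of the reading
    metric: str       # e.g., "runtime_minutes", "ambient_temp_c"
    value: float

def export_product_data(records: list[ProductDataRecord]) -> str:
    """Serialize collected usage data into a machine-readable JSON document."""
    payload = {
        "exported_at": datetime.now(timezone.utc).isoformat(),
        "records": [asdict(r) for r in records],
    }
    return json.dumps(payload, indent=2)
```

In practice the export would sit behind an authenticated user-facing endpoint, so that users can retrieve their own Product Data on request.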
For Cloud Storage/Service Providers:
Does Your Company Offer Cloud Services?
Definition: These are usually services enabling customers to upload their data to cloud servers. Coverage is not limited to cloud infrastructure providers; any provider offering services around data hosting is covered, even if the cloud infrastructure is owned by another service provider.
Who Are Your Customers?
Definition: If your cloud services are offered to customers in the European Union, the EU Data Act will apply to your service, regardless of whether your customers are consumers (B2C) or commercial (B2B) customers.
Does Your Company Enable Customers To Migrate to Another Service Provider or Replace the Service With an On-Premises Solution?
Action: The EU Data Act obliges cloud service providers to remove obstacles that could deter customers from switching to another provider or an on-premises solution, regardless of the nature of the obstacle and including in particular contractual and technical obstacles. Service providers must assess if their service setup may raise such obstacles and, if necessary, remove these.
Action: Customer contracts must provide wording regarding the procedures, rights, and obligations of the parties for switching to another service provider, including termination and migration rights.
Action: Customer data must be maintained in a file format that can easily be transferred.
Does Your Company Charge Fees for Migrating Customer Data to Another Service Provider or an On-Premises Solution?
Action: From 12 January 2027 onward, cloud service providers must not charge any fees if the customer decides to migrate to another service provider or an on-premises solution. Until then, fees may not exceed the internal costs the service provider incurs in direct connection with the migration.

Google Announces Next Steps for Privacy Sandbox and Tracking Protections in Chrome Browser

On April 22, 2025, Google announced that it will continue to offer third-party cookies in its Chrome browser and will not roll out a new standalone prompt for third-party cookie preferences. Chrome users must continue to make third-party cookie choices through Chrome’s existing Privacy and Security Settings. This development follows Google’s July 2024 announcement that it was scrapping its previously declared plan to phase out the use of third-party cookies in its Chrome browser.
According to Google, this latest development is a result of the company’s engagement with stakeholders, including publishers, developers, regulators and the ad industry, which Google notes demonstrated that there remain “divergent perspectives on making changes that could impact the availability of third-party cookies.” Google also cited other factors which it had taken into account, such as the accelerated adoption of privacy-enhancing technologies and the emergence of new opportunities to safeguard and secure users’ browsing experiences with artificial intelligence (“AI”).
Google’s announcement indicates that the company intends to continue to enhance existing tracking protections and invest in technologies, such as built-in password protections and AI-powered security protections. Google also noted that in light of its update, it understands that the Privacy Sandbox APIs may have a different role to play in supporting the ad ecosystem and indicated it would share an updated roadmap for these technologies in the coming months.

EPA Announces Updates to MyPest, a Pesticide Registration Tracking App for Companies

On April 18, 2025, the U.S. Environmental Protection Agency (EPA) announced updates to its pesticide registration tracking app, MyPest. EPA states that MyPest allows registrants of pesticide products to monitor the status of their pesticide registration submissions in real time. The latest version of MyPest includes an enhanced dashboard page with information about the registrant’s cases and products, the ability to view detailed information about each application, and the capability to communicate with EPA staff directly within the application page.
According to EPA, MyPest gives pesticide registrants greater insight into the registration process and provides an easier way for them to communicate with EPA on registration packages under review. EPA believes this update will be “a significant step forward in making the regulatory process more efficient and transparent.” This work is part of EPA’s overall digital transformation strategy and process streamlining that will improve the timeliness of pesticide registration decisions.
EPA states over 1,200 registrants have already signed up for MyPest. Additional updates planned for later this year include further enhancements to the user experience and detailed information on the progress of registration review cases and data call-ins.
Additional information from EPA on pesticide registration is available on EPA’s website and on our blog.

GIVE UP THE NAME!: TCPA Defendant Ordered to Identify BPO Involved in Allegedly Illegal Calls Despite Ongoing Criminal Proceedings

Every once in a while I am asked by a client to “keep so and so out of the case.”
The rules that bind attorneys in civil litigation–especially in federal court–lean quite heavily in favor of discovery of known and relevant facts. And whereas a Defendant CERTAINLY has rights to avoid burdensome or needlessly intrusive discovery, simple questions like “who made the calls” or “where did the leads come from” are almost always going to result in a court requiring an answer (no matter how great and powerful your attorney might be).
In MARGO SIMMONS v. WP LIGHTHOUSE LLC, No. 1:24-cv-01602-SEB-MKK (S.D. Ind. April 22, 2025), for instance, a Defendant refused to identify a BPO that may have made the calls at issue on its behalf.
As the story goes, the BPO provider “is subject to ongoing criminal proceedings” and the Defendant did not want to identify the BPO for fear it would incriminate itself. That is, if the calls the BPO is under investigation for were actually made at the behest of WP Lighthouse it fears being included in the criminal proceeding.
Pause.
Does WP Lighthouse really think the BPO isn’t going to give them up to the feds/state anyway?
Unpause.
The Court in Simmons made short work of the Fifth Amendment argument here. Businesses have no Fifth Amendment rights–which is odd since they certainly seem to have other constitutional rights–so the court rejected the refusal to answer just that simply. It held the Defendant must identify the BPO and provide additional information related to its relationship with the BPO.
The defendant also refused to provide information regarding its dialing platform–RingCentral–so the Court also ordered it to provide copies of contracts, communications and other records.
Pretty clear lesson here– TCPA defendants can and should fight to protect themselves against needless and burdensome discovery, but simple stuff like the names of other companies involved with phone calls are almost always going to be ordered.
As if to drive home that point, the Court in Simmons is going to issue SANCTIONS against the defendant. The Court found the Defendant’s position was not substantially justified and, as a result, intends to award Plaintiff’s counsel–the Wolf, Anthony Paronich–the attorneys’ fees incurred in having to bring the motion to compel.
Eesh. Terrible.
But so it goes.
One last note here: INDIVIDUALS who are sued personally in TCPA cases DO have a Fifth Amendment privilege because the TCPA does contain criminal penalties. So whereas the Defendant in Simmons could not raise the privilege, if you find yourself named personally in a TCPA lawsuit, be sure to discuss the issue of privilege with your counsel.

OCR Reaches Settlements with Northeast Radiology and Guam Memorial Hospital Over HIPAA Security Rule Violations

The Department of Health and Human Services’ Office for Civil Rights (“OCR”) recently announced two HIPAA enforcement actions involving failures to safeguard electronic protected health information (“ePHI”) in violation of the HIPAA Security Rule. Both cases stem from investigations into incidents that exposed sensitive health data, underscoring ongoing federal scrutiny of entities that fail to incorporate core compliance measures, such as HIPAA risk analyses, system activity reviews, and workforce access controls, into their security programs.
Northeast Radiology, P.C. (“NERAD”) agreed to a $350,000 settlement after OCR launched an investigation into the company’s use of a medical imaging storage system (“PACS”) that lacked proper access controls. The investigation stemmed from a March 2020 breach report in which NERAD disclosed that, between April 2019 and January 2020, unauthorized individuals had accessed radiology images stored on its PACS server containing unsecured ePHI, gaining access to the ePHI of nearly 300,000 individuals. OCR found that NERAD had not conducted a comprehensive HIPAA risk analysis, failed to implement procedures to monitor access to ePHI, and lacked adequate policies to safeguard sensitive data. 
In addition to the monetary settlement, NERAD agreed to a two-year corrective action plan that requires it to conduct a thorough HIPAA risk analysis to assess potential threats to the confidentiality, integrity, and availability of ePHI; implement a risk management plan to address identified security vulnerabilities; establish a process for regularly reviewing system activity, including audit logs and access reports; maintain and update written HIPAA policies and procedures; and enhance its HIPAA and security training program for all workforce members with access to PHI.
Guam Memorial Hospital Authority (“GMHA”) reached a $25,000 settlement following OCR’s investigation into two separate security incidents: a ransomware attack in December 2019 and a 2023 breach involving hackers who retained access to ePHI. Through its investigation, OCR determined that GMHA had failed to conduct an accurate and thorough HIPAA risk analysis to determine the potential risks and vulnerabilities to ePHI held in its systems. 
As part of a three-year corrective action plan, GMHA is required to conduct a comprehensive HIPAA risk analysis to identify risks to the confidentiality, integrity and availability of its ePHI; implement a risk management plan to mitigate those risks; develop a process for regularly reviewing system activity, such as audit logs and access reports; and adopt written policies and procedures to comply with the HIPAA Privacy, Security and Breach Notification Rules. GMHA also must strengthen its HIPAA training program, review and manage access credentials to ePHI, conduct breach risk assessments, and provide supporting documentation to OCR.
Together, these enforcement actions reinforce OCR’s expectation that covered entities and business associates adopt and maintain robust, enterprise-wide security programs capable of preventing, detecting and responding to threats that compromise ePHI.

Financial Industry Concerns Cause FCC to Delay Implementation of Broad Consent Revocation Requirement under TCPA

On April 11, 2025, a controversial new rule by the Federal Communications Commission (FCC) was set to take effect to modify consent revocation requirements under the Telephone Consumer Protection Act (TCPA). But not all of the rule’s mandates, as codified at 47 CFR § 64.1200(a)(10), went into effect on that date. Just four days before, the FCC issued an Order delaying the rule’s requirement that callers must “treat a request to revoke consent made by a called party in response to one type of message as applicable to all future robocalls and robotexts . . . on unrelated matters.” See FCC Order, Apr. 7, 2025 (emphasis added).
The plain language of the rule is generally broad. It states that consumers may use “any reasonable method” to revoke consent to autodialed or prerecorded calls and texts, and that such requests must be honored “within a reasonable time not to exceed ten business days.” The rule then goes on to delineate certain “per se” reasonable methods by which consumers may revoke consent. For example, if a consumer responds to a text message with the words “stop,” “quit,” “end,” “revoke,” “opt out,” “cancel,” or “unsubscribe,” then the consumer’s consent is “definitively revoked” and the sender is thereafter barred from sending any “additional robocalls and robotexts.”
Many industry participants—especially the banking industry—have been critical of the rule. One major concern is its sprawling effect. For example, under the rule, if a consumer were to respond to a marketing communication with the word “unsubscribe” or the like, then the sender and all of its business units may be forced to cease unrelated forms of communication on issues such as the provision of account notices or other informational matters. 
The banking industry has taken issue with the burdens imposed by the rule as well. These include concerns about “numerous challenges” financial institutions face in attempting to modify existing call platforms to comply with the rule, with “substantial work” being required by “larger institutions with many business units with separate caller systems.” See FCC Order ¶ 6. The bank industry has also raised challenges faced by financial institutions in “designing a system that allows the institution . . . [to] not apply a customer’s revocation to a broader category of messages than the customer intended.” See FCC Order ¶ 9.
The banking industry’s concerns ultimately appear to be what persuaded the FCC to stay the implementation of Section 64.1200(a)(10) in part earlier this month. The new rule is now set to not go fully into effect until April 11, 2026. For the time being, that means banks and other companies receiving a consent revocation request from a consumer in response to one type of message may not necessarily be prohibited from communicating with the consumer using “robocalls and robotexts from that caller on unrelated matters.” The FCC nonetheless suggests—albeit vaguely—that it will enforce any additional obligations required under the new Section 64.1200(a)(10), so companies engaging in TCPA-regulated communication practices should take heed accordingly. 

MORE IS REQUIRED: Senior Life Insurance Company Out of TCPA Class Action For Too Thin Allegations

Quick one for you this a.m., TCPAWorld.
Senior Life Insurance Company—which has the unfortunate acronym of SLIC—was sued in a TCPA class action in Virginia recently. It moved to dismiss, arguing the complaint did not actually state FACTS to show it made the calls at issue.
Earlier this week the Court agreed in Matthews v. Senior Life Insurance, 2025 WL 1181789 (E.D. Va. Apr. 22, 2025).
Interestingly, the complaint actually did allege the calls were “from” SLIC and that a caller was an “employee” of SLIC. Indeed, Plaintiff even alleged that during one of the calls he was asked “regarding qualifying for SLIC life insurance.”
Still, the court found these allegations too conclusory to state a claim. Unstated here is the assumption that someone else might have been making calls on SLIC’s behalf—which shows a pretty sophisticated court that understands SLIC’s business model likely does not include a bunch of captive W-2 agents calling out to sell policies.
Pretty interesting one that TCPA defendants should keep in mind.

Florida Appellate Court Calls Audible: Agency Principles Bind Sports Spectator to Arbitration Agreement in Electronic Ticket She Never Saw

On April 9, 2025, a Florida appellate court addressed whether a football game spectator had to arbitrate her claims under the terms of a ticket she did not buy or possess. Applying traditional agency principles, the court held she did.
In Miami Dolphins, Ltd. v. Engwiller, __ So. 3d __, 2025 WL 1064381, Florida’s Third District Court of Appeal addressed whether a football game spectator who gained access to the stadium when her mother displayed the electronic tickets for both of them was required to arbitrate her negligence claims against stadium management and the football team pursuant to the ticket terms. Following the U.S. Courts of Appeals for the Fourth Circuit (applying Maryland law) and Fifth Circuit (applying Texas law), the court applied traditional agency principles in reasoning that the spectator, a non-signatory, was bound to arbitrate.
Plaintiff filed a negligence action against the Miami Dolphins and stadium management for injuries she sustained at the Hard Rock Stadium in late 2022 after a fight broke out at a Miami Dolphins-Pittsburgh Steelers game. She gained access to the stadium with electronic tickets her mother accepted from her employer. To accept the tickets, her mother logged into the Dolphins Account Manager website, which displayed the following notice between the user log-in fields and the “Sign In” button: “By continuing past this page, you agree to the Terms of Use . . . .” The phrase “Terms of Use” was hyperlinked, bold, and in a contrasting aqua ink. That hyperlink directed users to the “2022-2023 Hard Rock Stadium Ticketback Terms,” which explained that the ticket was a revocable license to enter the stadium for the event, subject to the described terms that included a mandatory arbitration provision. 
Pursuant to this provision, the team and the stadium owner moved to compel arbitration. The trial court denied the motion. The appellate court reversed in favor of arbitration. It first considered whether the mother had accepted the arbitration agreement and (1) determined that the phrase “Terms of Use” was conspicuous enough to put a reasonable user on notice and that Plaintiff’s mother assented to these terms when she claimed the tickets; and (2) rejected the Plaintiff’s assertion that the terms were merely “exemplars” and that a party seeking enforcement of an electronic contract must produce a screenshot from the same device used by the other contracting party.
Turning to the Plaintiff/daughter, the court determined that, although she never accessed or possessed the tickets, once Plaintiff allowed her mother to present the ticket on her behalf to enter the stadium, her mother acted as her agent. The court explained that all entrants to the stadium were required to agree to the conditions of the single-use license set forth in the terms and that finding otherwise would allow a guest to accept the benefits of that license without the related conditions.
Event attendees often purchase tickets on behalf of family and friends. In so doing, they accept the applicable terms. Traditional agency principles bind non-signatories to those terms, including arbitration provisions, when they use that ticket regardless of whether they access or possess it so long as the purchaser received notice of the terms.

TOO CLASSY FOR THIS SUIT: What Two Google Rulings Say About How Not To Define A Class

Greetings CIPAWorld!
Here are some exciting case updates involving Google. What started as a headline-making copyright case against Google just became required reading for anyone litigating under CIPA. So, you may be asking, what do copyright and CIPA have in common? Don’t worry… the connection will become clear as we explore these cases. In In re Google Generative AI Copyright Litig., No. 23-cv-03440-EKL, 2025 U.S. Dist. LEXIS 75740 (N.D. Cal. Apr. 21, 2025), a class of creators claimed that Google scraped their copyrighted works without permission to train its AI models. It was pitched as a massive data appropriation lawsuit. Still, the case stumbled temporarily because, as litigators know all too well, the Plaintiffs proposed an improperly defined class.
The Plaintiffs, a group of authors, illustrators, and content creators, accused Google of using their copyrighted materials to train its generative AI models without permission. I find this fascinating! While in law school, I wrote a white paper on this topic, examining the copyright implications of using creative works to train AI systems. The intersection of copyright law and emerging technologies presents novel legal challenges.
In this case, it was a sweeping theory of unauthorized data use, but the case ran into trouble the moment plaintiffs defined their class. They limited membership to individuals “whose exclusive rights under 17 U.S.C. § 106 in their registered works were infringed upon.” In re Google Generative AI Copyright Litig., 2025 U.S. Dist. LEXIS 75740, at *6. In other words, you were only in the class if Google violated your copyright.
It may seem straightforward to target affected individuals, but the Court immediately identified the problem. The class only included those who would ultimately prevail on the merits. As such, the Court couldn’t determine who was in the class without deciding if Google was liable to each potential class member. News flash… that’s what courts call a “fail-safe” class.
Judge Lee explained that “the Court cannot determine who is a member of the class without deciding the merits of each potential class member’s claim, including whether the potential class member has a valid copyright registration, whether Google infringed the class member’s work(s), and whether Google has a valid defense based on fair use or license.” Id. at *10.
As the Ninth Circuit explained in Kamar v. Radio Shack Corp., 375 F. App’x 734, 736 (9th Cir. 2010), a fail-safe class is impermissible because membership is conditioned on a legal finding. It’s circular. Because Plaintiffs’ proposed class was tied to the elements of infringement, the Court struck the class allegations under Fed. R. Civ. P. 12(f). See Google Generative AI Copyright Litig., 2025 U.S. Dist. LEXIS 75740, at *11. Judge Lee didn’t dismiss the case outright, but she gave Plaintiffs fourteen days to amend their definition.
The Court also offered a suggestion: reframe the class based on factual criteria. The revised definition proposed by Plaintiffs, “all persons or entities domiciled in the United States who owned a United States copyright in any work used by Google to train Google’s Generative AI Models during the Class Period,” was precisely that. Id. Judge Lee acknowledged that this revised definition “would not require an upfront determination by the Court that each potential class member will prevail on the merits of an infringement claim.” Id. This makes perfect sense. That’s the difference between a procedural dead-end and a viable class.
This issue isn’t unique to copyright litigation. I mean, this is CIPAWorld, right!? Plaintiffs continue to define classes as people “whose communications were intercepted” or “whose data was unlawfully shared.” These definitions don’t identify a group of people based on facts. They identify a group based on whether they’ve already proven their claim. That’s precisely what courts are rejecting.
I saw a nearly identical issue in In re Google RTB Consumer Priv. Litig., No.: 4:21-cv-2155-YGR, 2024 U.S. Dist. LEXIS 119157 (N.D. Cal. Apr. 4, 2024) a few weeks prior. That case focused on Google’s Real-Time Bidding (“RTB”) platform. Plaintiffs alleged that the system shared sensitive user data with advertisers through real-time ad auctions. The class was defined as Google account holders “whose personal information was sold or shared.” Sounds familiar, right?
Judge Yvonne Gonzalez Rogers found the class definition flawed. Like Judge Lee, she concluded that the definition was “fail safe” because it required resolving the merits—specifically, whether Google “impermissibly shared” personal information—just to identify who belonged in the class. The Court stated: “The Court agrees with Google that, as written, the class definition is fail safe. The question on which this suit hinges is whether Google impermissibly shared its account holders’ personal information through RTB.” Id. at *18.
But that wasn’t the only issue the court addressed. Judge Rogers also cautioned that removing the contested phrases from the class definition might broaden the class so much that it would include users who weren’t harmed. The Court stated, “Defining a class so as to avoid, on one hand, being over-inclusive and, on the other hand, the fail-safe problem is more of an art than a science.” Id. at *17.
CIPA litigators should take note. These rulings aren’t just about definitions; they’re about strategy. If the class can’t be defined in a way tethered to objective facts, plaintiffs won’t make it to the merits. Courts aren’t guessing anymore. They’re asking: Can we identify class members without deciding if the law was broken? If the answer is no, certification won’t happen.
The RTB case also surfaced another common problem in CIPA litigation: individualized consent. Judge Rogers denied certification of a Rule 23(b)(3) damages class because determining who saw disclosures and who didn’t would require user-by-user analysis. That inquiry would overwhelm common issues.
Still, the Court acknowledged that a Rule 23(b)(2) injunctive class could be appropriate. While Plaintiffs could not satisfy the predominance requirement for a damages class, the Court noted that prospective injunctive relief might proceed under a different analysis. A forward-looking injunction targets company practices going forward and doesn’t require resolving individualized consent issues for each user. But even injunctive claims must be grounded in a well-defined, objectively ascertainable class.
Despite presenting expert evidence involving millions of RTB bid requests, Plaintiffs faced one more obstacle. The Court was not persuaded that the data set reliably reflected the experience of the proposed class as a whole. Plaintiffs alleged that advertisers could determine what content users viewed and even infer their locations. But the Court held that this wasn’t enough. The data needed to be representative of the entire class experience, and the plaintiffs hadn’t met that burden. See In re Google RTB Consumer Priv. Litig., 2024 U.S. Dist. LEXIS 119157, at *32-33.
For defense counsel, the takeaway is that challenges to class definitions and evidentiary gaps remain powerful early tools. Whether the issue is consent variability, class overbreadth, or sampling deficiencies, these rulings reinforce that procedural missteps can, and often do, derail class actions before the merits stage. As CIPA litigation continues to sweep across California, these two Google rulings illustrate where cases are getting stuck. Defining your class around legal conclusions, relying on non-representative data, or ignoring consent variations are no longer mere technical errors. They are strategic liabilities.
Whether you’re responding to claims involving chat features, embedded scripts, or real-time data flows, the foundational question remains the same: who’s in your class, and how do you know? If answering that requires proving liability, the case may never reach certification.
As always,
Keep it legal, keep it smart, and stay ahead of the game.
Talk soon!