FCC Seeks Comment on Quiet Hours and Marketing Messages

We recently published a blog post about a slew of class action complaints alleging that marketing text messages cannot be sent between the hours of 9:00 pm and 8:00 am (“Quiet Hours”) unless the recipient provides prior express invitation or permission to receive such messages during Quiet Hours (“Quiet Hour Claims”). As noted, we disagree with this argument based on the plain language of the Telephone Consumer Protection Act (“TCPA”), because marketing text messages already require prior express written consent from the called party. The Ecommerce Innovation Alliance (EIA) and others filed a petition for declaratory ruling (“Petition”) with the Federal Communications Commission (“FCC”) to address this application of Quiet Hours to marketing messages.
On March 11, 2025, the FCC released a Public Notice asking for comment on the Petition. The FCC, through its Consumer and Governmental Affairs Bureau, has thus moved quickly to put the questions raised by the petitioners out for public comment.
Initial comments are due by April 10, with reply comments due by April 25. The FCC will then consider the record in contemplating a decision. There is no requirement or specific deadline for the agency to act on the Petition. However, the plethora of Quiet Hour Claims being filed could encourage relatively prompt FCC action to clarify the rules.

Another FCC Turn of Events: Commissioner Starks Resigns

As we await the confirmation of Olivia Trusty to fill the seat vacated when Chairwoman Rosenworcel stepped down and Commissioner Carr became Chairman, there is a new development as of today: Commissioner Starks has announced that he plans to resign his seat this spring. While he did not give an exact timeline, Starks did mention fulfilling his duties over the “next few weeks.”
Starks said of his time at the FCC: “Serving the American people as a Commissioner on the Federal Communications Commission has been the honor of my life. With my extraordinary fellow Commissioners and the incredible career staff at the agency, we have worked hard to connect all Americans, promote innovation, protect consumers, and ensure national security. I have learned so much from my time in this position, particularly when I have heard directly from Americans on the issues that matter to them. I have been inspired by the passion, engagement and commitment I have seen from colleagues, advocates, and industry.”
Chairman Carr followed up with his own notice and shared the following in response to Starks’ resignation: “Commissioner Starks led many of the FCC’s national security initiatives, and I welcomed the chance to work closely with him on important matters, including promoting new innovations, protecting consumers, and bringing families across the digital divide. Commissioner Starks put in the work and leaves an impressive legacy of accomplishments in public service. I always learned a lot from him and benefited from the many events we held together.”
Does this now leave the FCC with only one Democrat? We will soon find out, but since commissioners are appointed by the President, I have a feeling the agency will be heavily weighted with Republicans.

Arkansas Attorney General Sues GM and OnStar Over Alleged Privacy Violations

On February 26, 2025, the Attorney General of Arkansas filed a lawsuit against General Motors Co. (“GM”) and its subsidiary, OnStar LLC (“OnStar”), alleging deceptive trade practices related to the collection and sale of drivers’ data. The complaint alleges that GM and OnStar gathered detailed driving data from over 100,000 Arkansas residents without their consent and sold it to third-party data brokers. The data allegedly included precise geolocation data, GM app usage data, and information about consumers’ driving behavior (e.g., start time, end time, vehicle speed, high-speed driving percentage, late-night driving percentage, acceleration data, braking data, and distance driven). The data brokers then allegedly sold the data to insurance companies, which used it to deny coverage or increase insurance rates for consumers. The complaint asserts that GM and OnStar collected and sold the consumer data to generate additional revenue. The Arkansas Attorney General is seeking monetary damages, injunctive relief, and attorneys’ fees and expenses.
This lawsuit follows actions by the FTC and the Texas Attorney General over similar data-sharing allegations, and is part of a larger trend of state regulators examining the privacy practices of connected vehicle manufacturers.

Virginia Legislature Passes Bill Restricting Minors’ Use of Social Media to One Hour Per Day

On March 11, 2025, the Virginia legislature passed a bill that would amend the Virginia Consumer Data Protection Act (“VCDPA”) to impose significant restrictions on minor users’ use of social media. The bill is pending signature by Virginia Governor Glenn Youngkin, who has until March 24, 2025 to sign it into law. The bill comes on the heels of recent children’s privacy amendments to the VCDPA that took effect on January 1, 2025.
If signed into law, the bill would amend the VCDPA to require social media platform operators to (1) use commercially reasonable methods (such as a neutral age screen) to determine whether a user is a minor under the age of 16 and (2) limit a minor’s use of the social media platform to one hour per day, unless a parent consents to increase the limit. The bill would prohibit social media platform operators from altering the quality or price of any social media service due to the law’s time use restrictions.
If signed into law, the amendments to the VCDPA would take effect on January 1, 2026.

Gender-Affirming Care Protections Eroded by Recent HHS Guidance and White House Executive Orders

On February 20, 2025, the U.S. Department of Health and Human Services (HHS) Office for Civil Rights (OCR) announced the rescission of “HHS Notice and Guidance on Gender Affirming Care, Civil Rights, and Patient Privacy” (the “Rescinded 2022 Guidance”) pursuant to recent Executive Order (“EO”) 14187 (“Protecting Children from Chemical and Surgical Mutilation”) and EO 14168 (“Defending Women from Gender Ideology Extremism and Restoring Biological Truth to the Federal Government”), issued under the current Trump administration.
These executive orders directed HHS to revoke policies promoting gender-affirming care and reconsider its interpretation of civil rights protections and health information privacy laws as they relate to such care.
Background on the Rescinded 2022 Guidance
The Rescinded 2022 Guidance, originally issued on March 2, 2022 under the Biden administration and previously discussed here, established a framework for applying federal civil rights protections and patient privacy laws to gender-affirming care in three key ways:

Section 1557 of the Affordable Care Act (ACA): The Rescinded 2022 Guidance asserted that federally funded entities restricting access to gender-affirming care could be in violation of Section 1557, which prohibits discrimination based on sex, including gender identity.
Section 504 of the Rehabilitation Act and the Americans with Disabilities Act (ADA): The Rescinded 2022 Guidance took the position that gender dysphoria could qualify as a disability, meaning that restricting access to care based on gender dysphoria could constitute unlawful discrimination.
Health Insurance Portability and Accountability Act of 1996 (HIPAA): The Rescinded 2022 Guidance interpreted HIPAA’s Privacy Rule to prohibit the disclosure of protected health information (PHI) related to gender-affirming care without the patient’s authorization, except in limited circumstances when explicitly required by law.

HHS Bases for the Rescission
OCR Acting Director Anthony Archeval stated that the “rescission is a significant step to align civil rights and health information privacy enforcement with a core Administration policy that recognizes that there are only two sexes: male and female.” The HHS Office on Women’s Health also issued guidance expanding on the sex-based definitions set forth in EO 14168. This HHS guidance contained the following definitions:

Sex: A person’s immutable biological classification as either male or female.
Female: A person of the sex characterized by a reproductive system with the biological function of producing eggs (ova). We note that EO 14168 defines “female” slightly differently, to mean “a person belonging, at conception, to the sex that produces the large reproductive cell.”
Male: A person of the sex characterized by a reproductive system with the biological function of producing sperm. We note that EO 14168 defines “male” slightly differently, to mean “a person belonging, at conception, to the sex that produces the small reproductive cell.”

In its February 20, 2025 press release, HHS further stated that “[t]his rescission supports Administration policy in Executive Order 14187 that HHS will not promote, assist, or support the so-called ‘transition’ of a child from one sex to another, and it will rigorously enforce all laws that prohibit or limit these destructive and life-altering procedures.”
Further, OCR’s formal rescission letter, dated February 20, 2025, outlined several reasons for rescinding the 2022 Guidance:

ACA (Section 1557): HHS cited recent federal cases, Texas v. EEOC and Bostock v. Clayton County, as calling into question the legal basis for extending Section 1557 protections to gender identity. But see Kadel v. Folwell, 2024 WL 1846802 (4th Cir. 2024) (affirming, on May 8, 2024, trial court rulings that the exclusion of coverage for gender-affirming care by state health plans in West Virginia and North Carolina violated the nondiscrimination protections of ACA Section 1557).
Rehabilitation Act and ADA: HHS argued that gender dysphoria does not meet the statutory definition of a disability, as the law explicitly excludes gender identity-related conditions unless resulting from a physical impairment. However, the Fourth Circuit, in Williams v. Kincaid, 45 F. 4th 759, 770 (4th Cir. 2022), concluded that gender dysphoria is a disability protected under the ADA and does not fall within the ADA’s exclusion for “gender identity disorders not resulting from physical impairments.” See also Blatt v. Cabela’s Retail, Inc., 2017 WL 2178123 (E.D. Pa. May. 18, 2017) (Plaintiff’s gender dysphoria, which substantially limits her major life activities of interacting with others, reproducing, and social and occupational functioning, is not excluded from ADA protection.)
HIPAA: HHS stated that the Rescinded 2022 Guidance lacked a legal foundation for restricting PHI disclosures beyond HIPAA’s established exceptions. However, we note that the current established exceptions already allow disclosures without patient authorization in certain circumstances, including when required by law. Interestingly, the new reproductive health amendments to HIPAA, which became effective on December 23, 2024, may, if interpreted broadly, provide additional privacy protections to information related to gender-affirming care.

In addition to the rescission, HHS announced the launch of its Office on Women’s Health website, referenced above, to promote these policies.
Impact on HIPAA and Patient Privacy
In the wake of the Rescinded 2022 Guidance and associated OCR statements, it remains unclear how OCR will now handle complaints related to the use and disclosure of PHI concerning gender-affirming care. Accordingly, entities that handle such data should carefully review their internal policies to ensure compliance with evolving interpretations of HIPAA’s Privacy Rule.
However, entities should also consider the HIPAA Privacy Rule to Support Reproductive Health Care Privacy, finalized in April 2024, which broadly defines “reproductive health care.” Gender-affirming care often falls within this definition, meaning that certain privacy protections may still apply under this rule despite the Rescinded 2022 Guidance. While HHS’s recent actions suggest a lack of intent to defend this interpretation, the 2024 reproductive health rule remains in effect despite ongoing litigation in Texas challenging these amendments. On September 8, 2024, the Texas Attorney General, in litigation pending in the Northern District of Texas, claimed that the new rule harms the AG’s ability to investigate medical care, lacks statutory authority, and is arbitrary and capricious. This litigation is still pending.
Compliance and Legal Considerations

Federal vs. State Law Conflicts: Entities must navigate the potential conflicts between state laws and the rescission of the Rescinded 2022 Guidance. For instance, Colorado and California have laws explicitly protecting access to gender-affirming care, which could create legal complexities for providers and insurers operating under multiple jurisdictions.
Litigation and Injunctions: On March 4, 2025, a federal judge in Maryland issued a preliminary injunction enjoining federal agencies from issuing regulations or guidance or otherwise implementing mandates of EO 14187. This injunction applies nationwide. In a more limited fashion, a judge in Washington issued a preliminary injunction that applies only to Washington, Colorado, Minnesota, and Oregon. As the Maryland court has yet to rule on the merits of the case before it, entities should monitor these legal developments to understand their go-forward compliance obligations under both federal and state regulations.
Potential Whistleblower Protections: EO 14187 also directs HHS, in consultation with the Attorney General, to “issue new guidance protecting whistleblowers who take action related to ensuring compliance with this order.” Accordingly, it is possible that, under such contemplated guidance, an increase in whistleblower-initiated compliance investigations may ensue. Yet an increase in whistleblowing as an avenue for evaluating compliance would not resolve the potential friction with the requirements of the HIPAA Privacy Rule to Support Reproductive Health Care Privacy.
Threats to Funding: On March 5, 2025, numerous health care providers enrolled in the Medicare and Medicaid programs received a letter from CMS stating that “CMS may begin taking steps in the future to align policy, including CMS-regulated provider requirements and agreements, with the highest-quality medical evidence in the treatment of the nation’s children” as it relates to gender-affirming care. The following day, on March 6, 2025, SAMHSA and HRSA sent similar letters to hospital administrators and grant recipients referencing the March 5, 2025 CMS letter and threatening examination of current grants and the “re-scoping, delaying or potentially cancelling new grants in the future,” depending upon the nature of the work being performed by the providers and/or grant recipients as it relates to gender-affirming care for minors.

Key Takeaways
The rescission of the 2022 “HHS Notice and Guidance on Gender Affirming Care, Civil Rights, and Patient Privacy” seeks to align HHS’s policies with the Trump administration’s stance on gender-affirming care. The rescission introduces financial and compliance challenges for entities regulated by HHS. However, it does not eliminate all HIPAA provisions related to reproductive health, and state-level protections may still provide certain privacy and anti-discrimination safeguards for individuals seeking gender-affirming care. Given this uncertainty, organizations should revisit their policies and procedures, closely monitor the evolving regulatory landscape, and keep a close eye on litigation outcomes to ensure continued compliance.

The Latest Attack on Consumer Arbitration Agreements

The war against arbitration agreements continues apace. The latest volley comes from the U.S. Court of Appeals for the Fourth Circuit in Johnson v. Continental Finance Company, LLC, No. 23-2047 (4th Cir. Mar. 11, 2025). In Johnson, the court considered whether a change-in-terms provision in a cardholder agreement rendered arbitration and delegation clauses illusory under Maryland law. In a 2-1 decision featuring opinions by all three panel members, the court said “yes” and found the arbitration and delegation clauses unenforceable.
Plaintiffs filed putative class-action complaints against Continental Finance Company, LLC and Continental Purchasing, LLC. Continental moved to compel arbitration pursuant to the arbitration provision contained in the cardholder agreement Plaintiffs received upon account opening. Plaintiffs opposed, arguing the cardholder agreement lacked consideration because the agreement’s change in terms provision permitted Continental to unilaterally amend the agreement at its “sole discretion”:
We can change any term of this Agreement, including the rate at which or manner in which INTEREST CHARGES, Fees, and Other Charges are calculated, in our sole discretion, upon such notice to you as is required by law. At our option, any change will apply both to your new activity and to your outstanding balance when the change is effective as permitted by law.
Affirming the district court, a majority of the panel agreed that the arbitration clause was illusory because the change-in-terms provision allowed Continental to “change any term of [the] Agreement in [its] sole discretion, upon such notice to [Plaintiffs] as is required by law.” Citing a decision by the Supreme Court of Maryland (Cheek v. United Healthcare), the majority said such provisions “are so one-sided and vague” under Maryland law that they “allow[] a party to escape all of its contractual obligations at will,” including the obligation to arbitrate. Based on this, the majority held that the arbitration and delegation clauses were unenforceable.
Judge Wilkinson’s lead opinion raises a difficult question: If the change-in-terms provision renders the arbitration clause illusory, then why doesn’t it render the entire cardholder agreement illusory? To be sure, the plaintiffs limited their argument to the arbitration and delegation clauses, and the majority affirmed that these were the only provisions its judgment disturbed. The lead opinion doesn’t answer this question. We see no limiting principle that would prevent the same argument from taking down the entire cardholder agreement. What’s good for the goose is good for the gander: Arbitration agreements are to be treated just like every other contract under state law. If the change-in-terms provision nullifies the formation of the arbitration agreement, the same should be true for every other term in the contract. Such a drastic outcome would jeopardize the formation of countless consumer contracts. As the dissent (authored by Judge Niemeyer) points out, the change-in-terms language here is “legal and widespread.” All that is required is sufficient notice of the change. If consumers don’t like the change, they can vote with their wallets and take their business elsewhere.
Perhaps sensing this gap in the lead opinion, Judge Wynn addresses it in his decisive concurrence. But in doing so, he frankly raises more troubling questions. He points to another Maryland Supreme Court case (Holmes v. Coverall N.A., Inc.) stating that an arbitration provision contained within a broader contract is a separate agreement that requires separate consideration in order to be legally formed. This strand of Maryland law strikes us as potentially preempted by the Federal Arbitration Act. Again, arbitration agreements must be treated on the same footing as every other contract under state law. No one disagrees that every other provision in Continental’s contract can be negotiated collectively and supported by the same pot of consideration. So why do arbitration agreements require something different and more rigorous under Maryland law? Though we’re obviously Monday morning quarterbacking this case, our answer is: They shouldn’t.
As noted at the top, Johnson is part of a larger judicial war by plaintiffs’ lawyers and consumer advocacy groups against consumer arbitration—one that we expect to grow in ferocity given the Trump administration’s recent defanging (and defunding) of the CFPB. Several courts have limited the enforcement of arbitration provisions in consumer contracts where plaintiffs have argued that the unilateral modification of such contracts to include arbitration provisions was illusory or did not comply with the implied covenant of good faith and fair dealing. See Canteen v. Charlotte Metro Credit Union, 900 S.E.2d 890 (N.C. 2024); Decker v. Star Fin. Grp., Inc., 204 N.E.3d 918 (Ind. 2023); Badie v. Bank of Am., 67 Cal. App. 4th 779 (1998). And prior to the recent changes in Washington, the CFPB had proposed a rule making one-sided “change-in-terms” provisions illegal and unenforceable.
We note, however, that several courts have gone the other way, see, e.g., SouthTrust Bank v. Williams, 775 So. 2d 184 (Ala. 2000). Even the cases that have refused to enforce arbitration provisions have indicated that such provisions may be enforceable where the change-in-terms clause expressly requires a detailed description of changes before they become effective (Johnson) or where the contract previously had a governing-law provision specifying the forum for the resolution of disputes (Canteen).
Companies that have arbitration provisions or are considering adding them to their consumer contracts should stay apprised of the developing law in this area, particularly in the states in which they are located. Please talk to a lawyer before you draft or promulgate an arbitration clause—an ounce of prevention is worth a pound of cure.

Proskauer on Privacy: 2024 Reflections & 2025 Predictions

2024 marked another significant year for privacy law, with new state legislation and high-stakes litigation reshaping the landscape. Legal battles over tracking technologies, biometric data, and children’s privacy intensified, while federal agencies, including the Federal Trade Commission (“FTC”) and the U.S. Department of Health and Human Services Office for Civil Rights (“HHS OCR”), ramped up their efforts through major enforcement actions and high-profile settlements, marking a new era of increased accountability.
Federal Privacy Law Gridlock
Attempts to pass comprehensive federal privacy legislation fell short once again in 2024, leaving the U.S. without a national data privacy standard. Despite bipartisan support, the American Privacy Rights Act (“APRA”), designed to unify privacy laws, preempt conflicting state regulations, introduce a private right of action, and enforce opt-out mechanisms, did not pass the 118th Congress. Still, the last Congress passed, as part of a larger appropriations bill, the “Protecting Americans’ Data from Foreign Adversaries Act of 2024” (15 U.S.C. § 9901), which makes it unlawful for a data broker “to sell, license, rent, trade, transfer, release, disclose, provide access to, or otherwise make available personally identifiable sensitive data of a United States individual to (1) any foreign adversary country; or (2) any entity that is controlled by a foreign adversary.” Without a comprehensive federal privacy law, states filled the void by passing their own laws, each with independent and distinct requirements, leading to burdensome compliance efforts, higher operational costs, and increased legal risks for businesses.
FTC Rulemaking and Enforcement Intensifies
In 2024, the FTC prioritized safeguarding sensitive data, focusing on location tracking, health data, children’s privacy, and cybersecurity. The agency secured key settlements, banning the sale of sensitive location data without consent or deidentification, investigating health data misuse, and filing a Children’s Online Privacy Protection Act (“COPPA”) action against TikTok. In terms of children’s privacy, it should also be noted that at the close of the Biden administration, the FTC finalized changes to the COPPA Rule to set new requirements surrounding the collection, use and disclosure of children’s personal information, including requiring covered websites and online service operators to obtain opt-in consent from parents for targeted advertising and other disclosures to third parties.
One notable FTC settlement prohibited a data broker from selling or sharing sensitive location data after it was collected and distributed without adequate safeguards. Another targeted a cybersecurity company accused of unlawfully selling browser data and engaging in deceptive practices. The FTC also filed complaints and secured proposed settlements with an alcohol addiction treatment service and a mental health telehealth company, alleging they illegally shared users’ health information for advertising purposes through third-party tracking tools.
The agency also intensified its focus on deceptive and fraudulent claims surrounding AI products and services. Companies using AI-driven platforms were also urged to take “necessary steps to prevent harm before and after deploying [an AI] product” to ensure fairness, minimize bias, and comply with evolving regulatory standards. As the FTC expanded enforcement in this area, businesses faced growing pressure to proactively mitigate risks and implement safeguards to avoid costly investigations and penalties.
HIPAA Enforcement and Judicial Constraints
In 2024, the HHS OCR focused heavily on enforcing the Health Insurance Portability and Accountability Act (“HIPAA”), concluding over 22 enforcement actions. However, the landmark ruling in American Hospital Association v. Becerra curtailed HHS’s authority over online tracking liability under HIPAA, holding that HHS could only regulate information that both identifies an individual and directly relates to their health.
Following the ruling, HHS voluntarily withdrew its appeal, signaling a shift in its approach to online tracking and privacy enforcement. The decision marked a critical limitation on HHS’s ability to regulate digital health technologies and underscored the ongoing tension between evolving digital practices and traditional privacy regulations.
Litigation Trends: Old Laws, Modern Issues
With no federal privacy law in place, plaintiffs in 2024 relied heavily on older electronic privacy statutes for class action lawsuits, including the Video Privacy Protection Act of 1988 (“VPPA”) and the Electronic Communications Privacy Act of 1986 (“ECPA”), as well as numerous state laws, such as the California Invasion of Privacy Act of 1967 (“CIPA”) and the Song-Beverly Credit Card Act of 1971 (“SCCA”), to address modern online privacy concerns.
While the VPPA was designed to prevent video rental stores (e.g., Blockbuster) from sharing customers’ personal data, and the ECPA and CIPA to prevent eavesdropping and traditional wiretapping, plaintiffs have recently repurposed these laws to target the alleged misuse of internet technologies such as cookies, pixels, chatbots, and session replay technology, a trend that continued to gain traction throughout 2024. Plaintiffs have also attacked the use of these technologies under the SCCA, a statute that restricts businesses from collecting unnecessary personal identification information during credit card transactions. Although the SCCA was originally aimed at brick-and-mortar retailers, plaintiffs are now extending its application to digital commerce, seeking to limit how businesses request and store consumer data during online purchases.
Class action lawsuits over data breaches and mishandled opt-out requests also continued to surge, fueled by regulatory developments and high-profile breaches. Data subject requests for deletion, access, and opt-outs increased by 246% between 2021 and 2023, highlighting the demand for transparency and control. A 2024 audit found 75% of businesses failed to honor opt-out requests, highlighting the practical challenges of data privacy compliance.
To mitigate their legal privacy risks, companies will need to consider refining consent mechanisms, implementing robust consent management platforms, and exploring alternatives to cookie-based or pixel tracking. Compliance with these laws is critical to ensure proper disclosures, limit personal data requests, and reinforce consumer trust.
Comprehensive State Privacy Laws
In 2024, seven states enacted comprehensive privacy laws, raising the total number of comprehensive state privacy laws to 20. Laws in Florida, Montana, Oregon, and Texas went into effect in 2024; those in Nebraska, New Hampshire, Delaware, Iowa, and New Jersey went into effect at the beginning of 2025; and those in Minnesota, Tennessee, and Maryland will go into effect later in the year (in July 2025 and October 2025, respectively). The Kentucky, Rhode Island, and Indiana laws are scheduled to go into effect in 2026.
State-level enforcement also intensified, with California, Texas, and New Hampshire leading major efforts. For example, California reached a settlement with DoorDash in February 2024 after the company purportedly sold its California customers’ personal information without providing notice or an opportunity to opt out, in violation of the California Consumer Privacy Act (“CCPA”) and CalOPPA. In June 2024, the state reached another settlement, with Tilting Point Media, for violations of the CCPA and COPPA based on Tilting Point’s alleged collection and sharing of children’s data without parental consent.
In addition, Texas reached several major settlements, two of which involved Meta and the company’s purported violations of biometric privacy laws, and a first-of-its-kind settlement with a Dallas-based artificial intelligence healthcare tech company over alleged deceptive generative AI practices. The state also initiated a new suit against General Motors in August 2024 for the unlawful sale of driving data, and announced an investigation into fifteen companies for potential violations of Texas’ Securing Children Online through Parental Empowerment Act and Data Privacy and Security Act.
2025 Privacy Predictions
2025 is expected to be another defining year for privacy regulation, with key trends from recent years continuing to evolve and present new challenges for businesses. The fragmentation of state-level privacy laws, increased enforcement, and the rapid evolution of rules governing biometric data and AI technologies are expected to intensify.
Businesses can expect heightened scrutiny of algorithmic transparency and biometric protections. Generative AI is also expected to draw significant regulatory attention as the technology matures and states continue to consider additional legislation or regulations, whether related to marketing claims, employment, transparency, AI deepfakes, or publicity rights. Companies in health, finance, and technology, specifically, should remain vigilant as regulators push for stricter accountability. While compliance challenges and rising operational costs are likely, organizations that proactively audit data-sharing practices, update privacy policies, and ensure AI compliance will be equipped to navigate the evolving regulatory landscape and reduce overall legal risks.
Federal Legislative Efforts Still Struggle
Despite a growing appetite for a unified privacy framework, progress remains slow heading into 2025. The inability to advance the APRA in 2024 underscores the challenge of balancing state autonomy with uniform, national standards. These challenges are only further compounded by the Trump administration’s emphasis on deregulation and a heavily divided Congress. Businesses will likely continue operating without a comprehensive federal privacy law for the foreseeable future. However, renewed lobbying efforts, Congressional hearings, and mounting industry pressure suggest that the core concepts undergirding the APRA could reemerge with modifications. Moreover, it is conceivable Congress could pass legislation strengthening children’s privacy, given that the Senate overwhelmingly, by a 91-3 vote, passed legislation that included the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act (the latter known as COPPA 2.0); the legislation later died in the House, but it will likely be taken up again in the current session of Congress.
In the absence of clear federal guidance, businesses should expect to rely on recognized industry standards in the interim. While these standards are instructive, businesses should note that strict adherence to them may not ensure compliance with the complex web of multi-state regulations. Companies operating across multiple jurisdictions should be sure to consult legal counsel as they navigate the current patchwork of privacy laws to reduce their legal risk.
More States Join the Privacy Landscape. With More to Come?
In 2025, several state privacy laws have recently gone into effect and more are set to take effect later in the year, including laws in Delaware, Iowa, Maryland, Minnesota, Nebraska, New Hampshire, New Jersey, and Tennessee. These comprehensive privacy laws significantly expand state-level data protection regulations, bringing the total number of states with privacy laws to 20. In addition, other states have borrowed from this comprehensive privacy law template and are debating similar bills of their own in 2025 (e.g., New York S365B), as well as other bills addressing consumer health privacy (e.g., the New York Health Information Privacy Act, awaiting the governor’s signature), social media restrictions, and other data privacy issues.
With compliance becoming more complex, investments in automated tools to monitor regional legal variations are expected to grow, as businesses recognize them as critical for long-term regulatory resilience in an ever-changing environment.
Litigation Trends: Internet Tracking Technologies & Healthcare Data
Regulators and plaintiffs continue to focus on cases involving internet tracking technologies, particularly under statutes including the Video Privacy Protection Act (VPPA), the Electronic Communications Privacy Act (ECPA) (and state wiretapping laws), and the California Invasion of Privacy Act (CIPA), as well as laws governing the general collection of website user information, such as the SCCA. These cases increasingly scrutinize how companies track, collect, and use consumer data, particularly in sensitive contexts such as healthcare and wellness.
Against this backdrop, Washington’s My Health My Data Act (“MHMDA”), which went into effect in 2024, imposes strict privacy protections on consumer health data, extending beyond traditional healthcare providers to include wellness apps, online health services, and companies handling health-related consumer information. The law requires businesses to obtain explicit consent before collecting or sharing health data, maintain transparent privacy policies, and enforce stringent security measures to prevent unauthorized access or misuse.
Notably, the first lawsuit under MHMDA was recently filed against Amazon, marking a significant test case for the law’s enforcement. Given the evolving regulatory landscape, businesses should closely monitor litigation and compliance developments in this space.
Continued Momentum for AI, Biometric and Neural Data
Neural data has become a significant privacy concern with the rapid growth of wearable devices and brain-computer interfaces. In 2024, California and Colorado amended their privacy laws to extend protections to neural data, sparking broader regulatory interest and prompting advocacy groups to push for ethical standards and stricter consent requirements. Companies developing neural data technologies, including VR applications, brainwave monitoring devices, and other wearables, are investing in advanced encryption, secure storage, and anonymization methods to safeguard this highly sensitive information.
AI also remains a key driver of both cybersecurity advancements and emerging risks in 2025. In response to privacy violations linked to AI-powered tracking in 2024, businesses are increasingly deploying AI tools to improve threat detection, monitor compliance, and secure sensitive data. Cybercriminals have also embraced AI, using it to execute more targeted and complex attacks, such as deepfake impersonation, advanced phishing schemes, automated network breaches, and large-scale data theft.
As AI adoption grows, companies face rising legal and regulatory risks. To address these challenges, businesses should consider comprehensive AI governance frameworks, including regular algorithm audits, bias detection systems, and accountability structures, to meet regulatory standards and maintain consumer trust.
Conclusion
The transition from 2024 to 2025 marks another important moment in the privacy landscape, with escalating state regulatory demands and stricter enforcement reshaping business practices. Companies must embed privacy into their core operations. By investing in privacy-by-design frameworks, adaptive compliance systems, and monitoring of emerging risks, businesses can stay ahead of shifting regulations. Those that anticipate change, take decisive action, and prioritize reasonable data protection as a competitive advantage will not only reduce risks but position themselves as leaders in an era where privacy drives both trust and innovation.

When Does Venting Become a Complaint?

Imagine this all too-familiar scenario:

A company makes the difficult decision to terminate an employee’s employment due to poor performance. This should come as no surprise to the employee, who has been counselled and disciplined on numerous occasions. Yet, the employee expresses shock and outrage. During the termination meeting, the employee claims that the termination is not really about the (well-documented) performance issues, but is retaliation for having made a “complaint.” Panic ensues.
The company’s human resources (HR) manager has never heard about any complaints lodged by this employee, and there is nothing in the personnel file or HR records reflecting any complaint. Baffled, the HR manager keeps digging and eventually learns that, a few weeks ago, in the midst of a casual chat with his supervisor, this employee mentioned that he works more hours than other employees, but his compensation doesn’t seem to reflect the additional time he puts in. 
This raises the age-old question — when does a general “vent session” become a protected complaint that the company must investigate? The answer may surprise some employers. According to the Department of Labor, employees cannot be retaliated against for inquiring about their pay, hours of work, or other rights. (Additionally, the National Labor Relations Act provides protections to covered workers who engage in concerted activity — which includes raising concerns about terms and conditions of employment.) Therefore, in a general sense, complaints of this nature should always be addressed and possibly investigated. 
However, it is often hard to know when a comment or vent session rises to the level of a protected complaint. As a rule of thumb, if an employee raises the topic of wages or hours of work (as the employee in our scenario did), employers should err on the side of caution and investigate the issue.
At a minimum, supervisors should be instructed to bring comments of this nature to HR or in-house or outside counsel for a determination of whether an investigation is warranted. While in some instances this may feel like overkill, it is a “better safe than sorry” approach that will protect the company in the end.

My Health, My Dollar: Amazon’s Health Data Troubles in Washington

Amazon faces allegations of unauthorized data collection in violation of federal and state privacy laws, including a first-of-its-kind claim under Washington’s My Health My Data Act (“MHMDA”).
The MHMDA restricts businesses from collecting, sharing, or selling any health-related information about a consumer without their consent or “valid authorization,” going beyond the typical protections provided by the Health Insurance Portability and Accountability Act (“HIPAA”).
The case against Amazon brings into focus the potential repercussions for companies dealing in health-related data and using modern internet tracking technologies for the operation of their websites.
Businesses—especially those dealing in health-related data—must scrutinize their data privacy practices to ensure alignment with an ever-evolving legal landscape.

* * *
Privacy and health law experts no longer need to hold their breath: the first major lawsuit under Washington’s recently enacted MHMDA was filed against Amazon. (Maxwell v. Amazon.com, Inc., No. 2:25-cv-00261 (W.D. Wash. filed Feb. 10, 2025)). In broad terms, the Western District of Washington lawsuit alleges that Amazon violated federal wiretapping laws and Washington state privacy and consumer protection rules by gathering location data via its software development kits (“SDKs”), which it then used for targeted advertising and third-party data sales, all without affirmative user consent or valid authorization.
At the heart of Maxwell is the alleged violation of the MHMDA. Under the MHMDA, a violation is deemed an unfair or deceptive act under the Washington state consumer protection statute (the “Washington CPA”). The case underscores the growing risks companies engaging with consumer health information face in the modern privacy era.
Washington’s My Health My Data Act
Enacted in April 2023 and effective March 2024, the MHMDA (HB 1155) represents a significant stride toward enhancing privacy protections related to health data within Washington. Emerging from growing concerns surrounding the misuse of reproductive health data, the Act aims to safeguard personal health information from unauthorized collection, storage, or sale, except where explicit consent is given by individuals.
Specifically, the MHMDA states that a regulated entity or a “small business” may not collect or share any consumer health data except “with consent from the consumer for such collection for a specified purpose” or “to the extent necessary to provide a product or service that the consumer to whom such consumer health data relates has requested from such regulated entity or small business.” The Act also applies to a wider range of consumer health data than what is typically covered under HIPAA, obliging entities falling under its scope to meticulously manage health-related data practices and paving the way for increased scrutiny over the efficacy of those practices in protecting sensitive consumer information.
Notably, the MHMDA grants a private right of action to impacted plaintiffs, with remedies that include actual damages and attorney’s fees (plus the potential for an additional award of trebled damages) under the Washington CPA.
Maxwell v. Amazon
The Maxwell case marks the first exercise of the MHMDA’s private right of action. The putative class action complaint alleges that Amazon improperly accessed and monetized user data obtained through certain location-based apps (e.g., OfferUp and the Weather Channel) equipped with its SDKs, taking advantage of the geolocation functions inherent in them. According to the lawsuit, these apps transmitted sensitive information, including biometric and precise location data, which might reflect individuals’ engagements with health services or attempts to acquire or receive health services or supplies—a direct breach of the MHMDA’s stringent privacy mandate.
In addition, the complaint alleges that, beyond not obtaining consumer consent, Amazon failed to make certain MHMDA-required disclosures prior to the data collection, namely to “clearly and conspicuously disclose the categories of consumer health data collected or shared; the purpose of the collection or sharing of consumer health data; the categories of entities with whom the consumer health data is shared; and how the consumer can withdraw consent from future collection.”
According to the plaintiff, Amazon defies the prohibitions outlined by both federal statutes and the MHMDA because users were unaware of—and thus did not consent to—Amazon’s full data access when using those apps. The complaint asserts that when a mobile app using Amazon’s SDK requests location data access, users are “not provided with an opportunity to grant or deny access to Amazon as well.” The suit seeks not only injunctive relief to halt data practices lacking user consent but also damages for the purported privacy violations.
While the outcome remains uncertain, this first-of-its-kind case will serve as a critical data point in evaluating the MHMDA’s strength and scope in litigation, drawing parallels to prior claims under California’s privacy laws.
Key Takeaways

Businesses navigating this novel territory will want to pay close attention to the Maxwell case.
More importantly, those businesses should make regular assessments of their privacy policies and tracking technology functionalities routine, to ensure compliance with the MHMDA and the broader patchwork of state privacy laws across the country.
Legal counsel should guide companies involved in the data-driven market in tailoring strategies to mitigate privacy risks, avoiding hefty fines and legal disputes.

What Honda’s CCPA Penalty Means for Your Privacy Compliance

The California Privacy Protection Agency (CPPA) has reached a settlement with American Honda Motor Co., Inc. (Honda), as outlined in its Order of Decision. The Order is the CPPA’s first public enforcement action involving a significant monetary penalty, $632,500, arising from its investigation into the privacy practices of connected vehicle manufacturers that began in July 2023.
The CPPA asserted that Honda violated the California Consumer Privacy Act (CCPA) by requiring consumers to undergo an extensive identity verification process, including for requests where verification is not permitted under the CCPA. Honda’s process for accepting data subject requests through authorized agents also included unnecessary and non-permitted steps.
Additionally, the CPPA asserted that Honda’s cookie management platform violated the CCPA, as it required a two-step process for opting out of advertising cookies and tracking technologies while consenting (or reconsenting) to cookies required just a single click, making it more burdensome to opt out of, rather than consent to, such data processing. Honda was also unable to produce any of its contracts with third-party advertising vendors to show that they were implementing the required contractual provisions under the CCPA.
To resolve the CPPA’s allegations, Honda has agreed to pay $632,500 in monetary penalties and revise its privacy practices, including implementing a simpler process for consumers to exercise their privacy rights, minimizing data collection for verification purposes and modifying its contract management and tracking processes.  
The CPPA’s Order signals an intent to hold businesses accountable for their data subject request processes. Below are some steps you can take to ensure compliance and mitigate the risk of similar penalties:

Revisit your process for responding to data subject requests and ensure that your verification process is appropriately tailored.
Review (or implement) a process for receiving, verifying and responding to data subject requests.
Review your contracts with vendors to confirm they include the required provisions.
Assess (or implement) your cookie management platform to ensure opt-out processes are simple and symmetrical.

New York Attorney General Reaches $650,000 Settlement with Student Social Networking App Developer Over Privacy Violations

On March 7, 2025, New York Attorney General (“NY AG”) Letitia James announced a $650,000 settlement with Saturn Technologies Inc. (“Saturn”), the developer of the Saturn App, a social networking app geared towards high school students and built around customized school calendars.
In its action against Saturn, the NY AG alleged that the company promised at various times between 2018 and August 2023 to verify users’ school email credentials to ensure (1) that the Saturn App did not allow non-students to join and (2) only users from the same school could interact with each other on the app. The NY AG alleged that, in contrast to these promises, Saturn stopped authenticating high school email credentials in 2021, thereby permitting users from different high schools to message each other and allowing “unverified” non-students to join with almost complete access to all Saturn App features. The NY AG alleged that these practices violated New York Executive Law § 63(12), which prohibits engaging in repeated fraudulent acts in the carrying on, conducting, or transaction of business. The NY AG also alleged that Saturn engaged in deceptive trade practices, violating both New York General Business Law § 349 and Section 5 of the FTC Act.
The AG’s investigation also determined that Saturn:

Did not screen new users by birth date to confirm they were high-school aged until August 2023, and still does not screen out fraudulent users based on location.
Copied users’ contact books (with names, personal phone numbers, and other contact information) and continued using the information even when users updated their settings to deny the Saturn App access to their contacts.
Implemented a “friendship verification” process with security vulnerabilities, which enabled unverified users to continue to access certain personal information of verified Saturn App users.
Promoted the Saturn App through other high school students (“Student Ambassadors”) without disclosing that those students received compensation for completing assigned marketing tasks.
Failed to keep sufficient records regarding data privacy, data permissions, user verification, and user privacy.

Under the terms of the settlement, Saturn must pay $650,000 in penalties and costs, provide users under the age of 18 with enhanced privacy options (including hiding social media links from non-friends for all new users under the age of 18 by default), document all changes related to user privacy policies and procedures, submit its user interface for NY AG approval, and develop a marketing training program.
The settlement agreement also requires Saturn to:

Notify users regarding app verification changes and provide them with options to modify privacy settings.
Prompt all users under 18 to review their privacy settings every six months.
Refrain from making future claims about user safety or verification unless the company has a reasonable basis for making the claim based on competent and reliable scientific evidence.
Limit the information about non-Saturn App-users that can be entered into the App by Saturn App users (i.e., the non-Saturn App user’s class enrollment or event attendance).
Allow teachers to block student names, initials or other personal identifiers from appearing in the Saturn App’s class schedule feature.
Delete retained copies of the phone contact books of certain users.
Hide the personal information of current users under 18 until Saturn Technologies obtains informed consent to the new Saturn App terms.

ONCE A BUSINESS NUMBER, NOT ALWAYS A BUSINESS NUMBER: Court Finds Shelton Can Sue For B2B Calls to Number That He Used to Use for Business Purposes But Not Anymore

One of the most commonly asked questions I receive is whether B2B calls made to numbers on the DNC list are legal.
It is a bit of a tricky answer. I fully explore it here.
Quickly: the DNC prevents calls to residential numbers, so the purpose of the call does not matter, only the use of the number called. And when a number is used both for business and for residential purposes it is considered a “mixed-use” number and counts as a residential line.
In essence, therefore, B2B calls to a cellular phone on the DNC list are simply not safe to make without EBR or permission because there’s generally no way to know if the number is a business or residential line.
Just to drive that point home, imagine a situation where a cellular phone was actually found, by a court, to be a business line, and a resulting TCPA suit was dismissed as a result of calls to that exact number.
Such a number would be safe to call as a business line right?
Wrong.
Check this out.
In Shelton v. Pro Source 2025 WL 817485 (E.D. Pa March 14, 2025) the Plaintiff–the famous James Shelton–brought suit for allegedly unsolicited marketing calls to his cell phone.
Now this was the very same cell phone number that was previously found to be a business number that was not protected by the DNC rules in Shelton v. Target Advance LLC, No. 18-2070, 2019 WL 1641353 (E.D. Pa. Apr. 16, 2019). There the judge held that because Plaintiff “held his phone number out to the world as a business phone number” he lacked standing under the TCPA on that claim.
Five years later a lady named Brittney Wilson, an employee of Pro Source Lending Group LLC, called that very same cell phone number apparently in an effort to offer Mr. Shelton a business loan.
Shelton sued Brittney personally–as well as her employer–arguing that he had since STOPPED using the phone for business purposes and that it is now just his residential phone.
Brittney and co. moved to dismiss and guess what? The Court sided with Shelton and found that because he had stopped using the phone for business purposes five years earlier, his phone was, once again, a residential number.
The Court was also unmoved by the fact that Shelton had filed so many TCPA suits and had hired a lawyer, Andrew Perrong, who himself was previously a serial litigant. Indeed, the Court pointed out this “dynamic duo” had joined forces to bring this suit:
In this case, James Shelton, a prolific plaintiff, and his counsel, Andrew Roman Perrong, equally prolific as a litigator under the TCPA, have joined forces to file a class complaint against Defendants. 
But the court determined that Shelton’s volume of litigation alone did not bar him from bringing suit.
Last, the Court held that Ms. Wilson can be sued personally for the calls at issue. The Court followed the majority of cases that have found that an employee, agent, or officer of a company engaged in conduct that violates the TCPA can be personally liable for that conduct. The fact that she made the calls for her employer is irrelevant.
Because she allegedly made the calls at issue from her cell phone she can be personally liable to Plaintiff–and potentially the class.
Eesh.
Takeaways:

B2B cold calls are extremely dangerous; and
Personal liability under the TCPA lurks everywhere. This lady picked up her cell phone to make the calls at issue and was still sued personally. Don’t hang your employees out to dry! Get good counsel and protect yourself (and them).

Chat soon.