WhatsApp? A Legally Binding Contract…
In the recent case of Jaevee Homes Limited v. Mr Steven Fincham, the English High Court handed down judgment holding that an exchange of WhatsApp messages between the parties formed a basic but legally binding contract, providing a reminder to parties involved in pre-contract discussions to exercise caution.
Background facts
The case centred on a contractual dispute between a property developer, Jaevee Homes Limited (“Claimant”), and a demolition contractor, Steve Fincham, trading as Fincham Demolition (“Defendant”), whom the Claimant had hired to undertake demolition works. The parties exchanged WhatsApp messages in April-May 2023 regarding the work, with the Claimant confirming the job via WhatsApp on 17 May 2023. On 26 May 2023, a formal subcontract and purchase order were emailed on behalf of the Claimant to the Defendant; however, they were never signed or acknowledged.
The key issue in dispute was determining the exact terms of the contract between the parties, particularly in relation to the payment terms. The Claimant argued that the terms of the written subcontract, which incorporated its standard terms and was sent to the Defendant on 26 May 2023, were binding. On the other hand, the Defendant believed that a basic contract had been formed as a result of the WhatsApp messages exchanged on 17 May 2023.
Outcome
The basic criteria for concluding most types of legally binding contract under English law are well established: one party makes an offer which the other accepts, and some money (or something else of at least nominal value) must pass between the parties. Importantly though, in most cases there is no requirement for a contract to be reduced to writing and signed by the parties, nor is there any requirement for acceptance to be formally communicated, with acceptance by conduct or implication being very common.
In reaching a decision in this case, the Judge applied these principles, taking a common-sense and contextual approach when reviewing the WhatsApp messages which had passed between the parties. As a result, the Judge confirmed that the messages “whilst informal, evidenced and constituted a concluded contract” as opposed to pre-contractual negotiations. Although the messages did not agree all aspects concerning payment, they did confirm the relevant fees, scope of work and final date of payment.
In particular, the Judge focused on a relatively informal exchange of messages between the parties on 17 May 2023 in which the Defendant asked “Are we saying it’s my job mate so I can start getting organised mate” to which the Claimant responded “Yes”, holding that this meant a legally binding contract had come into force on that date.
The Judge went on to note that at this point “there was no express indication that the final terms of the agreement between the Parties depended upon agreement as to any other matter such as incorporation of the Claimant’s standard terms”. Therefore, the subcontract and purchase order issued to the Defendant on 26 May, which had never been signed or acknowledged by the Defendant, were irrelevant, as a legally binding contract had already come into force nine days earlier.
Impact of the ruling
This judgment is a helpful reminder that under English contract law, it is easy to (inadvertently) create a legally binding contract and that caution should be exercised when engaging in informal pre-contract discussions. In particular, if a party’s position is that any award of work is made subject to its standard terms or the conclusion of a formal written contract, that should be clearly stated in discussions, and work should not be allowed to commence until a formal written contract has been concluded. Any written contract that is concluded should contain an “entire agreement” clause excluding pre-contract discussions, providing certainty that neither party will be able to rely on statements or representations made during discussions which are not reflected in the final written contract.
Second Circuit Sets Precedent Limiting VPPA in Facebook Pixel Cases
Over the past several years, class action litigants have flooded federal dockets with Video Privacy Protection Act (VPPA) cases against companies that embed Facebook’s Pixel tool on their websites. The plaintiffs have generally claimed that Pixel improperly tracks and relays information about consumers’ video watching habits and that the transfer of such data to Facebook violates VPPA’s data privacy protections.
The Second Circuit just turned the tide this month on that category of VPPA claims. In Solomon v. Flipps Media, Inc., the court held that an online video provider does not violate VPPA simply by disclosing personally identifiable information (“PII”) if an “ordinary person” cannot readily identify a specific individual’s video-watching behavior from the data disclosed. In other words, coded or encrypted data disclosures that are indecipherable by the average person—who lacks the same sophistication as the tech companies collecting the data—do not run afoul of the statute. By way of example, the mere disclosure of coded URLs and Facebook ID numbers, without a more direct link to a consumer’s personal identity, is insufficient.
In adopting the “ordinary person” standard above, the Second Circuit notably joins the Third and Ninth Circuits (which apply a similar standard), while rejecting a more liberal “reasonable foreseeability” test used in the First Circuit.
The Solomon decision is a significant shift in favor of defendants in that the decision reinforces a more restrained interpretation of VPPA, after courts had previously expanded the statute’s application to digital mediums that did not exist when the statute was first enacted. Congress had passed VPPA in 1988 to address limited privacy concerns (and really political concerns) after a newspaper disclosed one Supreme Court nominee’s rental history from a local video store. The statute’s enactment had nothing to do with online video content. The plain language of the statute itself focuses on the privacy of consumers who rent or purchase “video cassette tapes” or “similar audio visual materials” through a “video tape service provider.”
The Second Circuit thus provides a much-needed course correction, grounding the statute closer to its original purpose while acknowledging the realities of digital technology. Moving forward, future plaintiffs seeking to bring a VPPA action within the Second Circuit will face material hurdles if they cannot articulate how the defendant may have disclosed video-watching data that a lay person can understand.
Quiet Hours Violation: Dick’s Sporting Goods Allegedly Caught Off Guard Over After-Hours Texts
Hey TCPAWorld,
It’s heating up in South Florida and I am not talking about our sunny weather. Late-night marketing texts are fueling serious litigation, and this time, Dick’s Sporting Goods finds itself in some (legal) heat!
A new class action lawsuit was filed in the Southern District of Florida, and it is another reminder that the “Quiet Hours” complaints are keeping folks up at night, literally! Under the TCPA and its implementing regulation, telephone solicitations to any residential telephone subscriber are prohibited before 8 a.m. or after 9 p.m. local time. 47 C.F.R. § 64.1200(c)(1). And a familiar South Florida law firm is making a name for itself with these lawsuits.
In Melinda Tindol v. Dick’s Sporting Goods, the plaintiff is suing Dick’s for allegedly sending marketing text messages in the middle of the night. According to the complaint, between July 11, 2024, and February 9, 2025, Dick’s Sporting Goods initiated multiple telephone solicitations to Tindol’s cell phone. Specifically, the messages were sent at 10:32 PM, 1:03 AM, 1:47 AM, 1:51 AM, 3:02 AM, 3:05 AM, and even 4:40 AM—all in the plaintiff’s local time zone. Tindol alleges that these late-night texts were sent to advertise, promote, or market Dick’s property, goods, or services in violation of the TCPA.
Tindol seeks to represent a class of: All persons in the United States who from four years prior to the filing of this action through the date of class certification (1) Defendant, or anyone on Defendant’s behalf, (2) placed more than one marketing text message within any 12-month period; (3) where such marketing text messages were initiated before the hour of 8 a.m. or after 9 p.m. (local time at the called party’s location).
This lawsuit follows the typical playbook: screenshots of the texts, precise time stamps, and a proposed class definition designed to pull in thousands.
While these lawsuits continue to mount, R.E.A.C.H. isn’t staying on the sidelines. Last month, R.E.A.C.H. asked the FCC to help end these abusive lawsuits. R.E.A.C.H.’s position is simple: the TCPA’s time restrictions apply only to unsolicited messages, and texts sent with a consumer’s prior express invitation or permission fall outside those limits. If someone opts in, the timing restrictions shouldn’t apply. The full comment is available here: REACH – Comments on ‘Quiet Hours’ Petition -04102025
CFPB Withdraws Medical Debt Rule After Legal Challenge from Industry Groups
On May 1, the CFPB filed a joint motion with two financial trade groups to vacate a Biden-era rule barring most medical debt from appearing on consumer credit reports. The motion comes after lender groups filed a lawsuit in January, arguing that the rule unlawfully exceeded the CFPB’s statutory authority under the Fair Credit Reporting Act (FCRA).
The vacated rule would have removed an estimated $49 billion in medical debt from credit reports. The rule would have also barred creditors from considering medical debt in credit decisions and prohibited consumer reporting agencies (CRAs) from furnishing coded medical debt data for such purposes.
According to the joint filing, the CFPB’s rule contradicted the FCRA’s express statutory permission for CRAs to furnish, and creditors to use, medical debt data that is coded to conceal health details. In addition, the rule allegedly rested on outdated and limited evidence, would have imposed significant compliance costs, and would have degraded the utility of consumer credit reports by suppressing accurate, non-identifying information about borrowers’ obligations.
Putting It Into Practice: The CFPB’s request to vacate its own medical debt reporting rule marks another example of the Bureau narrowing its regulatory focus under new leadership (previously discussed here and here). Credit reporting agencies should continue to monitor CFPB-related developments and assess whether compliance updates are needed.
Charges Dropped Against Early Cryptocurrency Exchange Operator
Go-To Guide:
In early 2025, a federal judge dismissed charges against an Indiana businessman for failure to register his cryptocurrency exchange platform with the Financial Crimes Enforcement Network (FinCEN).
In doing so, the court noted that FinCEN had not put the industry on notice that cryptocurrency businesses could be subject to the registration requirement until late 2013, after the core time period at issue in the indictment.
Although the Department of Justice (DOJ) immediately appealed the decision, it later withdrew its appeal and voluntarily dismissed all proceedings against the defendant on April 23.
This dismissal of charges against an early cryptocurrency exchange operator follows DOJ’s recent announcement of a nationwide shift in its approach to digital asset enforcement.
In late April, at the government’s request, an Indiana federal judge put a final end to the prosecution of an Indiana man for allegations that he engaged in unlicensed money transmission (and related tax offenses) in connection with his operation of a virtual currency exchange from 2009 to 2013.1 The case represents a relatively rare instance in which a court granted a pretrial motion to dismiss charges related to unlicensed money transmission, although the impact of the decision may be limited to cases from 2013 and earlier—the year that FinCEN issued key guidance on the topic. The case has also attracted attention for what it may signal about DOJ’s digital asset enforcement priorities.
United States v. Pilipis
In early 2024, federal prosecutors in Indiana charged Maximiliano Pilipis with money laundering and willful failure to file tax returns, based on allegations that, from 2009 to 2013, Pilipis operated the cryptocurrency exchange platform AurumXchange, which was required to, but did not, register as a money transmitting business.
The Bank Secrecy Act requires money transmitters to register with FinCEN, and 18 U.S.C. § 1960 makes it a crime, among other things, to knowingly operate an unregistered money transmitting business. In 2013 and again in 2019, FinCEN issued guidance to clarify that the definition of money transmitter includes those who make a business of accepting, exchanging, and transmitting virtual currencies such as Bitcoin. And Congress, in the Anti-Money Laundering Act of 2020, codified the extension of money transmission to any transfers of “value that substitutes for currency.”2 Notably, however, the conduct at issue in the Pilipis indictment predated that guidance and legislation.
In a motion to dismiss the indictment, Pilipis argued that the government already investigated AurumXchange 14 years ago and did not identify any wrongdoing. He also argued that at the time his business was operational, the legal framework surrounding virtual currencies was ambiguous, and prior to March 2013, it was unclear whether entities like AurumXchange were even required to register with FinCEN.
In February 2025, the court dismissed the money laundering counts to the extent they were predicated on a violation of § 1960 prior to the issuance of the 2013 FinCEN guidance, concluding that AurumXchange had no obligation to register with FinCEN prior to that guidance. The court allowed the tax charges to proceed and also indicated that there was a fact issue about whether any of the alleged money laundering conduct post-dated the 2013 FinCEN guidance and might therefore state a viable offense.
DOJ appealed the dismissal order to the Seventh Circuit in late February 2025, but later withdrew its appeal and moved to dismiss both the criminal case and a related civil forfeiture case on April 23, 2025. Judge Magnus-Stinson granted that motion and dismissed the criminal and civil cases with prejudice the same day.
The move follows an April 7, 2025, memorandum issued by U.S. Deputy Attorney General Todd Blanche, which announced that DOJ will “no longer pursue litigation or enforcement actions that have the effect of superimposing regulatory frameworks on digital assets while President Trump’s actual regulators do this work outside the punitive criminal justice framework.” Among other things, the memorandum directed prosecutors not to charge “regulatory violations in cases involving digital assets,” including “unlicensed money transmitting under 18 U.S.C. § 1960(b)(1)(A) and (B)… unless there is evidence that the defendant knew of the licensing or registration requirement at issue and violated such a requirement willfully.”
Takeaways
Virtual currency businesses and other early adopters of emerging technologies have been subject to a certain degree of legal uncertainty for some time, as laws and regulations struggle to keep up with the pace of innovation. In this instance, the district court declined to apply regulatory guidance retroactively, and the DOJ abandoned its enforcement efforts.
Members of the digital assets community should continue to monitor developments in this space in light of the administration’s approach to digital asset regulation and enforcement. However, even if DOJ limits the types of enforcement cases it will bring against digital asset firms, it may continue to prosecute cases that reflect the administration’s priorities (e.g., fraud, money laundering, and sanctions violations). In addition, state enforcement activity is continuing and may increase.
1 United States v. Pilipis, Case No. 1:24-cr-00009-JMS-MKK.
2 Pub. L. No. 116–283 § 6201(d), codified at 31 U.S.C. § 5330(d)(1)(A) (eff. Jan. 1, 2021).
The BR Privacy & Security Download: May 2025
Welcome to this month’s issue of The BR Privacy & Security Download, the digital newsletter of Blank Rome’s Privacy, Security, & Data Protection practice.
STATE & LOCAL LAWS & REGULATIONS
State Regulators Form Bipartisan Consortium for Privacy Issues: The California Privacy Protection Agency and the Attorneys General of California, Colorado, Connecticut, Delaware, Indiana, New Jersey, and Oregon have created the Consortium of Privacy Regulators (the “Consortium”), a bipartisan consortium, to collaborate on various privacy issues. The seven states all have comprehensive privacy laws that are currently or will be in effect, and the Consortium will collaborate on the implementation and enforcement of their respective state laws. The Consortium will hold regular meetings not only to share expertise and resources, but also to coordinate efforts to investigate potential violations of applicable laws.
CPPA Issues Updated ADMT Proposed Rules and Opens Comment Period for Data Broker Deletion Mechanism Proposed Rules; California Governor Urges CPPA to Not Enact ADMT Proposed Rules: The California Privacy Protection Agency (“CPPA”), the regulatory authority charged with enforcing the California Consumer Privacy Act, as amended by the California Privacy Rights Act (“CCPA”), has released a revised version of its proposed regulations on cybersecurity audits, risk assessments, and automated decision-making technology (“ADMT”). Among the notable modifications offered by the CPPA were to narrow the definition of ADMT, remove behavioral advertising from ADMT and risk assessment requirements, and reduce the kinds of evaluations that businesses would have to undertake when using ADMT. California’s Governor, Gavin Newsom, sent a letter to the CPPA, urging the agency not to enact the proposed regulations on ADMT, stating that the regulations “could create significant unintended consequences and impose substantial costs that threaten California’s enduring dominance in technological innovation.” In addition to the proposed ADMT regulations, the CPPA has progressed its rulemaking under the California Delete Act. The CPPA has opened the formal public comment period on its proposed regulations for the Delete Request and Opt-Out Platform. The Delete Act requires the CPPA to establish an accessible deletion mechanism to allow consumers to request the deletion of personal information from all registered data brokers through a single deletion request to the CPPA. The comment period will remain open until June 10, 2025.
Bill Introduced to Stop California CIPA Claims: The California Senate introduced S.B. 690, which aims to stop lawsuits for violations of the California Invasion of Privacy Act (“CIPA”) based on the use of cookies and other online tracking technologies. There has been a recent trend of class actions under CIPA, where plaintiffs claim that the use of cookies and tracking technologies on websites violates CIPA because such technologies facilitate wiretapping and constitute illegal pen registers or trap and trace devices. Not even businesses compliant with the CCPA that provide consumers with the ability to opt out of the sharing of personal information with providers of tracking technologies are immune from CIPA class actions. S.B. 690 would exempt online technologies used for a “commercial business purpose” from wiretapping and pen register or trap-and-trace liability. “Commercial business purpose” is defined as the processing of personal information in a manner permitted by the CCPA.
Arkansas’ Social Media Safety Act Struck Down; Arkansas Legislature Passes Amendments in Response: The U.S. District Court for the Western District of Arkansas held that Arkansas’ Social Media Safety Act (“SMSA”), a law limiting minors’ access to social media platforms, was unconstitutional and granted a permanent injunction blocking SMSA from taking effect. The District Court held that SMSA violated the First Amendment because it did not meet the requisite standard of strict scrutiny. The District Court held that SMSA’s age verification requirements blocking minors’ access to social media platforms were not narrowly tailored to prevent minors from interacting online with predators and other harmful content. The District Court also found that SMSA was unconstitutionally vague, as it is not clear which of NetChoice’s members are subject to SMSA’s requirements; while SMSA regulates companies like Facebook and Instagram, it specifically exempts Google, WhatsApp, and Snapchat. In response to the District Court’s ruling, the Arkansas Legislature passed a new bill, S.B. 611, to amend SMSA to broaden the scope and applicability of SMSA to include additional online platforms, narrow the age of applicability to users under 16 (rather than 18), strengthen privacy protections for minor users, and add a private right of action for parents of minor users.
Connecticut Attorney General Issues Annual Report on Connecticut Data Privacy Act Enforcement: The Connecticut Attorney General released a new report detailing the actions it has taken to enforce the Connecticut Data Privacy Act (“CTDPA”). The report provides updates on: (1) the Connecticut Attorney General’s broader privacy and data security efforts; (2) consumer complaints received under the CTDPA to date; (3) several enforcement efforts highlighted in the Connecticut Attorney General’s initial report; (4) expanded enforcement priorities; and (5) recommendations for strengthening the CTDPA’s protections. While the Connecticut Attorney General seems to remain focused on enforcing the CTDPA’s transparency requirements (i.e., disclosures to be included in privacy notices) and requirements to obtain opt-in consent to process sensitive data, it seems to also have broadened its efforts to address opt-out practices and dark patterns. The Connecticut Attorney General’s priorities have further expanded as the CTDPA’s universal opt-out provisions became effective and new legislation related to minors’ privacy and consumer health data took effect.
Oregon Attorney General Reports Spike in Complaints on Use of Personal Data by Government Entities: The Oregon Department of Justice’s (“ODOJ”) Privacy Unit reported a significant spike in complaints about the Department of Government Efficiency (“DOGE”) in the first three months of 2025. As of March 31, 2025, the Privacy Unit reports it received more than 250 complaints about DOGE. In addition to the DOGE complaints, the Privacy Unit received 47 complaints between January and March of this year relating to the Oregon Consumer Privacy Act (“OCPA”). In addition, ODOJ announced the publication of a 2025 Quarter 1 Enforcement Report, which addresses outreach and enforcement efforts of the OCPA from January 1 to March 31, 2025, and identifies broad privacy trends in Oregon. ODOJ previously issued a Six-Month Enforcement Report, which addressed enforcement efforts for the first six months of the OCPA. ODOJ plans to continue to issue these reports quarterly, with a longer report published every six months.
Ohio’s Age Verification Law Struck Down: The U.S. District Court for the Southern District of Ohio struck down Ohio’s Social Media Parental Notification Act, which required social media companies to verify user age and obtain parental consent for users under 16. NetChoice, a technology industry trade group that has challenged a number of recently enacted social media laws around the country on constitutional grounds, including Arkansas’ SMSA, alleged that the act violated the First Amendment. The District Court agreed and held that the law’s age verification requirement blocking minors’ access to social media is not narrowly tailored to protect children from the harms of social media. The District Court also held that the law’s definitions for which websites had to comply with the law were a content-based restriction because they favored some forms of engagement with certain topics to the exclusion of others.
California Attorney General Appeals Age-Appropriate Design Code Act Decision: As previously reported, NetChoice obtained a second preliminary injunction temporarily blocking the enforcement of the California Age-Appropriate Design Code Act (“AADC”). The California Attorney General has appealed this decision, stating that it is “deeply concerned about further delay in implementing protections for children online.” The AADC would place extensive new requirements on websites and online services that are “likely to be accessed by children” under the age of 18. NetChoice won its first preliminary injunction in September 2023 on the grounds that the AADC would likely violate the First Amendment. In April 2025, NetChoice’s motion for preliminary injunction was again granted on the grounds that the AADC regulates protected speech, triggering a strict scrutiny review, and while California has a compelling interest in protecting the privacy and well-being of children, this interest alone is not sufficient to satisfy a strict scrutiny standard.
FEDERAL LAWS & REGULATIONS
DOJ Issues Data Security Program Compliance Guide and FAQ; Provides 90 Day Limited Enforcement Policy: The National Security Division of the U.S. Department of Justice (“DOJ NSD”) released a compliance guide and FAQ as part of its implementation of its final rule on protecting Americans’ sensitive data from foreign adversaries (the “Final Rule”). The compliance guide is intended to provide general information to assist individuals and entities in complying with the Final Rule’s legal requirements and to facilitate an understanding of the scope and purposes of the Final Rule. The FAQ answers 108 questions regarding Final Rule topics such as the definition of sensitive personal data, prohibited and restricted transactions, and the scope of the Final Rule’s application to certain corporate group transactions, among other topics. Concurrently, the DOJ NSD issued a limited enforcement policy through July 8, 2025. Under the limited enforcement policy, the DOJ NSD stated that it will not prioritize civil enforcement actions against any person for violations of the Data Security Program (“DSP”) that occur from April 8 through July 8, 2025, so long as the person is engaging in good faith efforts to comply with or come into compliance with the DSP during that time. NSD stated that it will pursue penalties and other enforcement actions as appropriate for egregious, willful violations, and that it is not limited in pursuing civil enforcement where good faith compliance efforts, such as reviewing data flows, conducting data inventories, renegotiating vendor agreements, transferring services to new vendors, and conducting diligence on new vendors, are not undertaken.
FTC Sends Letter to Office of U.S. Trustee Regarding 23andMe Bankruptcy: Federal Trade Commission (“FTC”) Chairman Andrew N. Ferguson issued a letter to the U.S. Trustee regarding the 23andMe bankruptcy proceeding, expressing the concerns consumers have with the potential sale or transfer of their 23andMe data. The letter emphasizes the fact that the data 23andMe collects and processes is extremely sensitive, and highlights some of the public-facing privacy and data security-related representations the company has made. Chairman Ferguson urges the U.S. Trustee to ensure that any bankruptcy-related sale or transfer involving 23andMe users’ personal information and biological samples will be subject to the representations the company has made to users about both privacy and data security.
OMB Issues Memoranda on Federal Government Purchase and Use of AI: The U.S. Office of Management and Budget (“OMB”) issued memoranda providing guidance on federal agency use of AI and purchase of AI systems. The guidance in the memoranda builds on Executive Order 14179, Removing Barriers to American Leadership in Artificial Intelligence, signed by President Trump in January. The memorandum fact sheet states, “The Executive Branch is shifting to a forward-leaning, pro-innovation and pro-competition mindset rather than pursuing the risk-averse approach of the previous administration.” Notwithstanding that characterization, the guidance does share many risk management and performance tracking concepts included in Biden administration directives. The guidance describes how to manage “high-impact” AI, which is defined as AI where the output serves as a principal basis for decisions or actions that have a legal, material, binding, or significant effect on rights or safety. There are several examples of high-impact AI in the guidance, including enforcement of trade policies, safety functions for critical infrastructure, transporting chemical agents, certain law enforcement activities, and when protected speech is removed. Environmental impacts and algorithmic bias are not mentioned. However, the guidance directs agencies to use AI in a way that improves public services while maintaining strong safeguards for civil rights, civil liberties, and privacy.
States’ Attorneys General Challenge the Firing of FTC Commissioners: A coalition of 21 Attorneys General (the “Coalition”) supported two FTC Commissioners in challenging the decision by President Trump to fire them without cause. Led by the Colorado Attorney General, the Coalition filed an amicus brief in Slaughter v. Trump, emphasizing the important role the FTC has played in consumer protection and antitrust. The Coalition stated that the strong track record of the FTC is due in large part to the bipartisan structure of the FTC’s leadership and that “[a]llowing the president to have at-will removal authority would ruin the FTC’s independence by allowing the commission to become a partisan agency subject to the political whims of the president.”
NIST Releases Initial Draft of New Version of Incident Response Recommendations: The U.S. Department of Commerce National Institute of Standards and Technology (“NIST”) released the initial public draft of Special Publication 800-61 Rev. 3 (“SP 800-61”) for public comment. SP 800-61 is designed to assist organizations in incorporating cybersecurity incident response considerations throughout NIST Cybersecurity Framework 2.0 risk management activities to improve the efficiency and effectiveness of their incident detection, response, and recovery activities. The public comment period is open through May 20, 2025.
NIST Releases Initial Public Draft of Privacy Framework 1.1: NIST released a draft update to the NIST Privacy Framework (“PFW”). Updates include targeted changes to the content and structure of the NIST PFW to enable organizations to better use it in conjunction with the NIST Cybersecurity Framework, which was updated to version 2.0 in 2024 (“CSF 2.0”). The PFW’s draft update makes targeted changes to align with CSF 2.0, with a focus on the Govern Function (i.e., risk management strategy and policies) and the Protect Function (i.e., privacy and cybersecurity safeguards). The new draft also includes changes responsive to stakeholder feedback since the initial release of the PFW five years ago. The draft PFW also includes a new section on AI and privacy risk management and moves PFW use guidelines online. NIST is accepting comments on the draft through June 13, 2025.
FCC Delays Part of TCPA Rule Amendments: The Federal Communications Commission (“FCC”) announced that it was extending the effective date of one part of the amendments to the Telephone Consumer Protection Act (“TCPA”) rules the FCC released last year. The delayed amendments were initially set to become effective April 11, 2025, and relate to consumers’ revocation of consent. Amendments to 47 C.F.R. § 64.1200(a)(10) were designed to make it easier for consumers to revoke consent under the TCPA by requiring callers to apply a revocation request received for one type of message to all future calls and texts. However, in response to industry comments, the FCC extended the effective date of 47 C.F.R. § 64.1200(a)(10) until April 11, 2026, “to the extent that it requires callers to apply a request to revoke consent made in response to one type of message to all future robocalls and robotexts from that caller on unrelated matters.” The remaining portions of the amended rule went into effect on April 11, 2025.
U.S. LITIGATION
Fifth Circuit Vacates FCC Telecommunications Provider Fine: The Fifth Circuit vacated the $57 million fine imposed on AT&T by the FCC in 2024, which was part of a number of FCC enforcement actions issued concurrently by the FCC against major carriers related to the sale of geolocation data to third parties. All carriers have appealed the fines. AT&T argued that the penalty should be vacated in part because the FCC imposed sanctions without proving the allegations in court, relying on the U.S. Supreme Court’s decision in U.S. Securities and Exchange Commission v. Jarkesy, in which the Supreme Court limited the use of agency tribunals and held that, when the Securities and Exchange Commission seeks civil penalties against a defendant for securities fraud, the Seventh Amendment entitles the defendant to a jury trial. The FCC argued that its enforcement action was rooted in Section 222 of the Telecommunications Act, which does not have roots in common law, and that, therefore, the Seventh Amendment right to a jury trial is inapplicable. However, the Fifth Circuit determined that Section 222’s requirement to use reasonable measures to protect consumer data is analogous to common law negligence. The Court stated that it was not denying the FCC’s right to enforce laws to protect customer data, but that the FCC must do so consistent with constitutional guarantees of a jury trial.
Illinois Federal Judge Reverses Prior Ruling on Retroactive Application of BIPA Amendments: In two cases before U.S. District Court Judge Elaine Bucklo, Judge Bucklo vacated her prior rulings that Illinois’ Biometric Information Privacy Act (“BIPA”) amendments passed by the Illinois legislature applied retroactively, stating that upon her reexamination of the issue she concluded that the “better interpretation of the amendment is that it changed the law” rather than clarified the initial intent of the legislature when it first passed BIPA. The Illinois legislature amended BIPA in 2024 to provide that a company that collects a person’s biometric information multiple times in the same manner has committed only one violation of the law. Previously, the Illinois Supreme Court held that each instance of collection constituted a violation supporting a claim for damages, resulting in potentially extreme liability for companies using biometric systems for business purposes such as timekeeping, where employees might clock in and out by scanning biometric identifiers multiple times per day. Judge Bucklo’s new ruling aligns with those of two other Illinois federal district courts. The plaintiffs will now be permitted to pursue their claims under the statute as it existed at the time of the alleged violations.
Pennsylvania District Court Holds Online Privacy Terms Sufficient for Implied Consent Under State Wiretapping Law: The U.S. District Court for the Western District of Pennsylvania held that disclosure of third-party data collection in online privacy statements that can be seen by a reasonably prudent person is sufficient to obtain implied consent to that disclosure. Pennsylvania’s wiretapping statute prohibits any person from intercepting a wire, electronic, or oral communication unless all parties have provided consent to interception. The website in question, operated by Harriet Carter Gifts, disclosed that the business tracked and shared website visitors’ activity with third parties. The privacy statement was available via a link at the bottom of each page of the website. According to the Court, the description of sharing data with third parties in the privacy statement combined with the reasonable availability of the privacy statement provided the plaintiff with constructive notice of the practice of sharing data with third parties and resulted in the plaintiff providing implied consent to such sharing, despite the fact that the plaintiff testified she had never read the privacy statement.
Sixth Circuit Holds Newsletter Subscribers Are Not Consumers Under VPPA: The Sixth Circuit affirmed the dismissal of a proposed class action brought by a plaintiff who had subscribed to a digital newsletter from Paramount Global’s 24/7 Sports. The plaintiff alleged that the subscription qualified him as a “consumer” under the Video Privacy Protection Act (“VPPA”) because the newsletter contains links to video content, making the newsletter “audiovisual materials” subject to the VPPA. The Court rejected this argument, stating that the complaint suggests that the linked video content was available to anyone with or without a newsletter subscription and that the plaintiff did not plausibly allege that the newsletter itself was “audiovisual material.” The Court noted that its reading of the VPPA differed from the Second and Seventh Circuits, which have held that the term “consumer” under the statute should encompass any purchaser or subscriber of goods or services, whether audiovisual or not. U.S. Circuit Judge Rachel S. Bloomekatz dissented, stating that the plaintiff is a “consumer” under the VPPA because he is a subscriber of Paramount, which is a “videotape service provider.”
Ninth Circuit Rules VPPA Not Applicable to Movie Theaters: The Ninth Circuit affirmed a District Court’s dismissal of an action against Landmark Theaters (“Landmark”), holding that the Video Privacy Protection Act (“VPPA”) does not apply to in-theater movie businesses. The plaintiff had purchased a ticket on Landmark’s website. As part of that purchase, the plaintiff alleged that Landmark shared the name of the film, the location of the showing, and the plaintiff’s unique Facebook identification number with Facebook. The VPPA prohibits “video tape service providers” from knowingly disclosing personally identifiable information of a consumer without consent. “Video tape service provider” is defined under the VPPA as “any person, engaged in the business … of rental, sale, or delivery of prerecorded video cassette tapes or similar audiovisual materials.” The Court held that the plain language of the statute and the law’s statutory history did not support a finding that selling tickets to an in-theater movie-going experience is a business subject to the VPPA.
U.S. ENFORCEMENT
Defense Contractor Settles FCA Allegations Related to Cybersecurity Compliance: The U.S. Department of Justice (“DOJ”) announced a settlement with defense contractor Morsecorp Inc. (“Morse”) resolving allegations that Morse violated the False Claims Act (“FCA”) by failing to comply with cybersecurity requirements in its contracts with the Army and Air Force. The DOJ alleged that Morse failed to comply with contract requirements by, among other things: using a third party to host Morse emails without requiring or ensuring that the third party met the Federal Risk and Authorization Management Program Moderate baseline and complied with the Department of Defense’s cybersecurity requirements; failing to implement all cybersecurity controls in NIST Special Publication 800-171 (“SP 800-171”); failing to have a consolidated written plan for each of its covered information systems describing system boundaries, system environments of operation, how security requirements are implemented, and the relationships with or connections to other systems; and failing to update its self-reported score for implementation of the requisite NIST controls following receipt of an updated score from a third-party assessor. Morse has agreed to pay $4.6 million to resolve the allegations.
New York Attorney General Fines Auto Insurance Company over Data Breach: The Office of New York Attorney General Letitia James announced that it had fined auto insurance company Root $975,000 for failing to protect personal information following a breach that affected 45,000 New York residents. Root allows consumers to obtain a price quote for insurance through its website. After a consumer entered limited personal information, the online quote tool prefilled other personal information, such as driver’s license numbers. The Attorney General alleges that Root exposed plaintext driver’s license numbers in a PDF generated at the end of the quote process and that Root had failed to perform adequate risk assessments on its public-facing web applications, did not identify the plaintext exposure of consumer personal information, and employed insufficient controls to thwart automated attacks. In addition to the fine, the settlement requires Root to enhance its data security controls by maintaining a comprehensive information security program that uses reasonable authentication procedures for access to private information and maintains logging and monitoring systems, among other things.
New Jersey Attorney General Sues Messaging App for Failing to Protect Kids: New Jersey Attorney General Matthew J. Platkin and the Division of Consumer Affairs announced that they had filed a lawsuit against messaging app provider Discord, Inc. (“Discord”), alleging that Discord engaged in “deceptive and unconscionable business practices that misled parents about the efficacy of its safety controls and obscured the risks children faced when using the application.” According to the complaint, Discord violated the New Jersey Consumer Fraud Act by misleading parents and kids about its safety settings for direct messages. For example, Discord allegedly represented that certain user settings related to its safe direct messaging setting would cause the app to scan, detect, and delete direct messages for explicit media content. According to the Attorney General, Discord knew that not all explicit content was being detected or deleted. The complaint also alleges that Discord misrepresented its policy of not permitting users under the age of 13 because of its inadequate age verification processes.
HHS Enters Settlement with Healthcare Network over Phishing Attack that Exposed PHI: The U.S. Department of Health and Human Services (“HHS”), Office for Civil Rights (“OCR”) announced a settlement with PIH Health, Inc. (“PIH Health”), a California healthcare network, relating to alleged violations of the Health Insurance Portability and Accountability Act (“HIPAA”) arising from a phishing attack that exposed protected health information. The phishing attack compromised 45 PIH Health employee email accounts, which resulted in the breach of 189,763 individuals’ protected health information, including names, addresses, dates of birth, driver’s license numbers, Social Security numbers, diagnoses, lab results, medications, treatment and claims information, and financial information. OCR alleges that PIH Health failed to conduct an accurate and thorough risk analysis of the potential risks and vulnerabilities to the confidentiality, integrity, and availability of ePHI held by PIH Health, and failed to provide timely notification of the breach. Under the terms of the settlement, PIH Health will implement a corrective action plan that will be monitored by OCR for two years and pay a $600,000 fine.
INTERNATIONAL LAWS & REGULATIONS
Cyberspace Administration of China Publishes Q&A on Cross-Border Data Transfers: The Cyberspace Administration of China (“CAC”) published a Q&A on cross-border data transfer policies and requirements for organizations. The Q&A is intended to provide guidance on government administrative policies. China’s regulations on cross-border data transfer require one of three mechanisms to be used if personal data or important data is transferred. Those mechanisms are a regulator-led security assessment, standard contractual clauses, and certification. The Q&A lists several common types of low-risk data transfers that are not required to comply with one of the transfer mechanisms, including transfers related to international trade, cross-border transportation, academic collaborations, and cross-border manufacturing and sales, provided no important data or personal information is involved, as well as transfers by non-critical information infrastructure operators of nonsensitive personal information covering fewer than 100,000 individuals cumulatively since January 1 of the current year. The Q&A also provides additional detail on assessing the necessity of personal data transfer and describes administrative processes available for obtaining clearance for data transfers on a company group basis, among other things.
ICO Releases Anonymization Guidance: The United Kingdom Information Commissioner’s Office (“ICO”) released new guidance on anonymizing personal data to assist organizations in identifying issues that should be considered to use anonymization techniques effectively. The guidance discusses what is meant by anonymization and pseudonymization and how such techniques affect data protection obligations, provides advice on good practices for anonymizing personal data, and describes technical and organizational measures to mitigate risks to individuals when organizations anonymize data. Among other things, the guidance explains that anonymization is about reducing the likelihood of a person being identified or identifiable to a sufficiently remote level, and that organizations should undertake identifiability risk assessments to determine the likelihood of identification when undertaking anonymization efforts, among other recommended accountability and governance measures. The guidance also includes case studies to assist users in understanding the guidance concepts.
Office of the Privacy Commissioner of Canada Releases Guidance on Risk Assessment in Data Breach; Canada Announces First Phase of Cybersecurity Certification Program: The Office of the Privacy Commissioner of Canada (“Privacy Commissioner”) released an online tool to assist organizations in conducting a breach risk self-assessment. The tool guides users through a series of questions about the details of the breach to assess whether the circumstances create a real risk of significant harm and whether the breach must be reported. Separately, the Government of Canada announced the first phase in the implementation of the Canadian Program for Cyber Security Certification (“CPCSC”). The CPCSC will establish a cyber security standard for companies that handle sensitive unclassified government information in defense contracting. The Canadian government stated that the CPCSC will be released in phases, with the first phase involving the release of a new Canadian industrial cyber security standard, opening the accreditation process, and introducing a self-assessment tool for level 1 certification to help businesses better understand the program before a wider rollout of the program in successive phases.
NOYB Files Complaint Against ChatGPT over Defamatory Hallucinations: Privacy advocacy organization NOYB has filed a complaint against ChatGPT stemming from false information about an individual provided by ChatGPT in response to a query. Specifically, the complaint alleges that when Norwegian user Arve Hjalmar Holmen queried ChatGPT to determine if it had any information about him, ChatGPT presented the complainant as a convicted criminal who murdered two of his children and attempted to murder his third son. NOYB further alleges that the fake story included real elements of his personal life, including the actual number and the gender of his children and the name of his hometown. The NOYB complaint alleges that the output is not an isolated incident and violates the EU General Data Protection Regulation, including Article 5(1)(d), which requires organizations to ensure the personal data they produce about individuals is accurate.
ICO Fines Company for Lax Cybersecurity Following Ransomware Attack: The ICO announced it has fined Advanced Computer Software Group Ltd. (“Advanced”) £3.07 million for cybersecurity failures relating to a ransomware incident in August 2022. Advanced provides information technology services to businesses, including in the healthcare industry. Hackers had gained access to Advanced systems via a customer account that did not have multi-factor authentication, leading to the disruption of UK National Health Service (“NHS”) operations. The personal information of 79,404 people was exfiltrated in the attack, including details of how to enter the homes of 809 individuals receiving home care. The ICO investigation concluded that Advanced did not have appropriate technical and organizational measures in place to protect personal data prior to the incident. The ICO noted that it reduced the initially proposed fine due to Advanced’s proactive engagement with law enforcement, the NHS, and other steps taken by Advanced to mitigate the risk to impacted individuals.
Daniel R. Saeedi, Rachel L. Schaller, Gabrielle N. Ganze, Ana Tagvoryan, P. Gavin Eastgate, Timothy W. Dickens, Jason C. Hirsch, Adam J. Landy, Amanda M. Noonan, and Karen H. Shin also contributed to this article.
DEA Proposed Rule for Special Registrations for Telemedicine and Limited State Telemedicine Registrations
On January 17, 2025, the Drug Enforcement Administration (DEA) released the proposed rule, “Special Registrations for Telemedicine and Limited State Telemedicine Registrations.” The proposed rule marks a significant first step in the DEA’s functional establishment of the Special Registration set forth in the Ryan Haight Online Pharmacy Consumer Protection Act of 2008 (RHA). The DEA’s goal in proposing the Special Registration’s framework is to “ensure patient access to care, while maintaining sufficient safeguards to prevent and detect diversion of controlled substances.”
The Special Registration History
The RHA requires that all controlled substances dispensed by means of the internet be issued via a valid prescription, which generally requires an in-person medical evaluation, and that the prescription be issued for a legitimate medical purpose in the usual course of professional practice. The RHA provides distinct circumstances in which the practice of telemedicine is permitted, and in turn, the in-person evaluation is not required in order to properly prescribe a controlled substance. The Special Registration is one of these circumstances, and while the Special Registration exception to the in-person requirement was included in the RHA in 2008, it was not developed beyond the text of the RHA until this year, with the promulgation of the proposed rule. The Special Registration was a common topic in the DEA Listening Sessions in April 2024, and its development has been called for by a variety of stakeholders in the telemedicine industry.
The Special Registration Framework
The proposed rule is organized by category of Special Registration and conditions for the registration’s maintenance. Specifically, the proposed rule conceptualizes a unique type of practitioner – a “covered online telemedicine platform.” A covered online telemedicine platform means an entity that facilitates connections between patients and clinician practitioners, via an audio-video telecommunications system, for the diagnosis and treatment of patients that may result in the prescription of controlled substances and meets one of four enumerated criteria.
If met, the four criteria reflect that the platform is an “integral intermediary in the remote dispensing of controlled substances.” The criteria address platform advertising, associated pharmacy ownership, prescribing guidelines, and handling of medical records. In this way, the proposed rule sets detailed standards, highlights issues that were actively discussed during the DEA Listening Sessions, and defines factors the DEA identifies as indicative of potential diversion or unsafe prescribing. Notably, hospitals, clinics, local in-person medical practices, and insurance providers are excepted from the definition.
Categories of Special Registration
The proposed rule delineates between “clinician practitioners” and “platform practitioners” and sets forth four categories of registration for which a practitioner may apply. Clinician practitioner refers to properly registered physicians and mid-level practitioners. Platform practitioner means a covered online telemedicine platform that dispenses controlled substances by virtue of its central involvement as an intermediary in the remote prescribing of controlled substances by an individual practitioner. Platform practitioners are subject to the requirements imposed upon non-pharmacist practitioners under the Controlled Substances Act, 21 U.S.C. 801-904, and its regulations. The four categories of registration are:
The Telemedicine Prescribing Registration would authorize the prescribing of Schedule III through V controlled substances by clinician practitioners.
The Advanced Telemedicine Prescribing Registration would authorize certain specialized clinician practitioners (i.e., certain categories of “specialized” clinicians defined by the rule) to prescribe Schedule II controlled substances in addition to Schedule III through V controlled substances.
The Telemedicine Platform Registration would authorize covered online telemedicine platforms to dispense Schedule II through V controlled substances through a clinician practitioner possessing either category of clinician Special Registration above.
The State Telemedicine Registrations, which would be required to prescribe across states, would allow practitioners issued any of the three categories of registration above to obtain a state registration for every state in which patients receiving special registration prescriptions are located, with certain exceptions.
The proposed rule sets forth eligibility criteria and requirements for each category of Special Registration, although notably an application for a given registration does not guarantee that it will be granted, as each requirement for Special Registration is subject to agency discretion. Ultimately, the DEA Administrator will issue a Special Registration to an applicant when the applicant meets all eligibility requirements set forth in the proposed rule, which includes a practitioner presenting “legitimate need” for the Special Registration, and the Administrator determines that the Special Registration is consistent with the public interest factors stipulated in 21 U.S.C. 823(g)(1) (i.e., the public interest factors considered for conventional practitioner DEA registrations).
Altogether, to engage in the practice of telemedicine under the proposed rule, a practitioner must possess a conventional DEA registration under 21 U.S.C. 823(g), one of the three types of Special Registration for practitioners, and a State Telemedicine Registration for each state in which a patient prescribed a controlled substance is located.
Other Requirements of the Proposed Rule
The proposed rule also requires certain operational standards for special registrants, including:
Disclosure of a Special Registered Location. Special Registration applicants must designate a location as the physical address of the Special Registration.
Certain Disclosures. Platform practitioners, in their application for Telemedicine Platform Registration, must disclose all employment, contractual relationships, or professional affiliations with any clinician special registrant and online pharmacy.
Certain Attestations. Special Registration applicants must attest that they have a legitimate need for a Special Registration for Telemedicine and to the facts and circumstances that form the basis for their “legitimate need” for the Special Registration.
Timely Updates. Special Registration information must be updated within 14 business days of any change.
Patient Verification. The proposed rule provides patient identity verification standards, including patient verification requirements for the first telemedicine encounter and a practitioner’s storage of patient identification information.
Special Registration Prescription Data Reporting. Special registrants must report to DEA on an annual basis the total number of new patients in each state where at least one special registration prescription has been issued and the total number of special registration prescriptions issued by a registrant across states, among other information.
Telecommunications Standards. The proposed rule requires all prescriptions issued via a Special Registration to be through the use of an audio-video telecommunications system, with only limited circumstances allowing for the issuance of a special registration prescription with audio-only technology.
State Law. The proposed rule explicitly requires compliance with state laws and regulations where the patient is located during the telemedicine encounter, the state where the special registrant is located during the telemedicine encounter, and any state in which the special registrant holds a DEA registration.
PDMP Check. Prior to issuing a special registration prescription, a special registrant must check the prescription drug monitoring programs (“PDMPs”) of the state where the patient is located, the state where the registrant is located, and any state with reciprocity agreements with those states.
Prescription Requirements. Special registration prescriptions must contain certain information specific to the Special Registration number, with liability imposed on a pharmacist who fills a special registration prescription that is missing required information.
Moving Forward
The proposed rule acknowledges the expansive nature of telemedicine following the COVID-19 public health emergency, addresses key players in the telemedicine industry – from popular telemedicine platforms to local practitioners and pharmacists – and attempts to wrangle these diverse interests into a workable Special Registration. Further, it proposes comprehensive application and reporting requirements to facilitate the DEA’s tracking of special registration prescription information and telemedicine activities, which had traditionally been confined to the state level.
Certain aspects of the proposed Special Registration appear clear, such as the form that will be required for the Special Registration and the patient verification requirements. Other elements, such as the practical threshold for “legitimate need” and the DEA’s discretion to grant or deny the Special Registration itself, render logistical aspects of the Special Registration framework unpredictable. Clinician practitioners, platform practitioners, and other telemedicine participants should continue to monitor the development of the proposed rule as, if finalized, a Special Registration will become a key component required for continued telemedicine operations involving the prescription of controlled substances. The written comment period for the proposed rule ended March 18, 2025, and comments are currently under consideration.
FUELED BY LITIGATION: ExxonMobil Skids Into TCPA Quiet Hours Lawsuit
Greetings TCPAWorld!
I’m back with the latest. Yesterday, ExxonMobil Corporation was named in a new TCPA class action in the Northern District of California. See Yates v. Exxon Mobil Corp., No. 3:25-cv-03984 (N.D. Cal. May 7, 2025). The allegations are already fueling controversy (pun intended). And if you’ve been tracking the ongoing Quiet Hours saga, this one’s straight textbook.
Filed May 7, 2025, the Complaint alleges that Exxon sent Plaintiff five marketing text messages between 6:06 a.m. and 7:46 a.m. local time—well before the 8 a.m. safe harbor set by 47 C.F.R. § 64.1200(c)(1). The messages advertised Exxon’s rewards program with lines like: “Exxon Mobil Rewards+: Earn 3pts/gal each time you fill up” and “Complete your profile and earn $1 in bonus points.”
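For teams auditing their own send logs against the quiet-hours rule, the test is mechanical: convert each send timestamp to the recipient's local time and check it against the 8 a.m. to 9 p.m. window. Below is a minimal Python sketch of that check; the function name, timestamps, and time zone are hypothetical illustrations, not data from the Complaint, and real compliance turns on how the recipient's location is determined (an issue the quiet-hours cases themselves dispute).

```python
from datetime import datetime, time
from zoneinfo import ZoneInfo

# 47 C.F.R. § 64.1200(c)(1) permits telephone solicitations only between
# 8 a.m. and 9 p.m. local time at the called party's location.
QUIET_END = time(8, 0)     # no solicitations before 8 a.m. local time
QUIET_START = time(21, 0)  # or after 9 p.m. local time

def within_permitted_hours(send_utc: datetime, recipient_tz: str) -> bool:
    """Return True if a solicitation sent at send_utc (an aware UTC datetime)
    lands inside the 8 a.m.-9 p.m. window in the recipient's time zone."""
    local = send_utc.astimezone(ZoneInfo(recipient_tz))
    return QUIET_END <= local.time() < QUIET_START

# Example: a 6:06 a.m. Pacific-time send, like the messages alleged here.
send = datetime(2024, 3, 1, 14, 6, tzinfo=ZoneInfo("UTC"))  # 6:06 a.m. PST
print(within_permitted_hours(send, "America/Los_Angeles"))  # False
```

Note that the sketch keys off an IANA time zone name; in practice senders often infer the zone from the phone number's area code, which, as discussed below, can diverge from where the recipient actually is.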
Sound familiar? It should. This playbook mirrors the MASSIVE surge of Quiet Hours litigation that R.E.A.C.H. (Responsible Enterprises Against Consumer Harassment) has been aggressively fighting in its recent FCC comments.
R.E.A.C.H.’s position is simple: messages sent with “prior express invitation or permission” don’t count as “telephone solicitations,” and therefore fall outside the timing restrictions altogether. If the consumer opts in, the timing shouldn’t matter.
But let’s pump the brakes and think practically. A recipient receiving these messages could be on vacation, visiting family, or just traveling out of their home time zone. In this context, timing isn’t always something a sender can cleanly control—but the litigation risk? That’s very real.
Here, the Complaint is the classic example we are seeing repeatedly. It’s filled with screenshots, timestamps, and a class definition likely to sweep in thousands of Exxon Rewards+ members nationwide. And of course, the stakes here are massive.
R.E.A.C.H. recently submitted data to the FCC analyzing 184 Quiet Hours cases filed by a single South Florida law firm through March 31, 2025. The Exxon Complaint fits the mold. Like 77% of those cases, it avoids listing a complete phone number, stating that “Plaintiff’s telephone number has an area code that specifically coincides with locations in California.” Also notable: the Complaint was filed more than a year after the first alleged message, which falls squarely within the filing delays R.E.A.C.H. flagged in over 20% of these cases. So, we see not only familiar fact patterns but also familiar procedural timing that keeps these cases running.
As always,
Keep it legal, keep it smart, and stay ahead of the game.
Talk soon!
Employers Beware: Blanket Policies Prohibiting Workplace Recordings May Violate the NLRA
In the past, employees recording audio or images in the workplace might resort to use of a bulky tape recorder or a hidden “wire” or camera. Now that smart phones with professional-grade audio and video capabilities are an integral part of our society, clandestine (or blatant) workplace recordings are much more easily accomplished.
With this increased ease of access to reliable and compact recording equipment has come a heightened employer sensitivity to workplace recordings. As a result, many employers are tempted to implement blanket policies prohibiting workplace recordings, or otherwise require management consent to make any workplace recordings.
While some limited prohibitions on workplace recordings are permissible—for instance, to protect confidential business information or private health information—in recent years, the National Labor Relations Board (“NLRB” or the “Board”) has criticized blanket policies prohibiting such activities. The NLRB reasons that policies against workplace recordings may discourage employees from participating in concerted activity with other employees that safeguards their labor rights. In other words, such policies may “chill” employees’ ability to act in concert, and some courts have agreed.
Section 7 of the National Labor Relations Act (“NLRA” or the “Act”) ensures employees’ “right to self-organization, to form, join, or assist labor organizations, to bargain collectively through representatives of their own choosing, and to engage in other concerted activities for the purpose of collective bargaining or other mutual aid or protection,” and the right “to refrain from any or all such activities.” Section 8(a)(1) of the Act makes it an unfair labor practice for an employer “to interfere with, restrain, or coerce employees in the exercise of the rights guaranteed in Section 7” of the Act.
The Board has noted that workplace video and audio recording is protected if employees are “acting in concert for their mutual aid and protection” and the employer does not have an “overriding interest” in restricting the recording. As noted above, protection of confidential company information or personal health information can help an employer to demonstrate an overriding interest in restricting recordings. As noted by the Board, a few examples of recordings made in concert for mutual protection that may outweigh an employer’s interest in any restrictions are recordings made to capture:
unsafe working conditions;
evidence of discrimination;
“townhall” meetings with anti-union sentiment; and
conversations about terms and conditions of employment.
Regardless of any state laws that may require two-party consent for recording conversations, the NLRB has held that the NLRA preempts state law, and that protection of employees’ rights under the NLRA overrides concerns about state law recording consent violations.
Thus, at a time when recording capabilities are packed into an ordinary, everyday device carried by nearly every employee in every workplace, employers who still wish to have a policy limiting workplace recordings should ensure that the policy lists valid reasons for its implementation, includes a carve-out for protected concerted activities under the NLRA, and does not require management approval for recordings that constitute protected concerted activities. Taking such measures can help to ensure the Board will not find an employer’s no-recording policy in violation of the NLRA.
SERIOUS QUESTION: Is it Against the FTC’s Influencer Rules for a #BigLaw Partner To Recommend This…
So I just did an article about ANOTHER company getting crushed in a TCPA class action because they trusted #biglaw to defend them, and it got me thinking.
Why does this keep happening?
And at its root it’s because of the #biglaw model. Partners get paid under the table– kickbacks if you will– when they recommend clients/companies to use other partners at the firm. It’s like an inside sales gig.
But here’s the problem.
This is PLAINLY a conflict of interest, and 99% of the time the lawyer making the “recommendation” never reveals to the client/company that they will be getting a cut of the compensation. So they’re steering and influencing the company/client to use somebody that– in many instances– they know is not as good as others in the field, and they’re getting a secret profit for it without even telling the client.
This is so icky and unethical but also seems like it should be illegal.
The FTC has made it clear that social media influencers cannot promote products without disclosing that they’re being compensated to do so. Attorneys are influencers with 100 times the power– why shouldn’t they be required to reveal their secret profits?
Hmmmm.
Deserve to win folks.
HIPAA Compliance for AI in Digital Health: What Privacy Officers Need to Know
Artificial intelligence (AI) is rapidly reshaping the digital health sector, driving advances in patient engagement, diagnostics, and operational efficiency. However, for Privacy Officers, AI’s integration into digital health platforms raises critical concerns around compliance with the Health Insurance Portability and Accountability Act and its implementing regulations (HIPAA). As AI tools process vast amounts of protected health information (PHI), digital health companies must carefully navigate privacy, security, and regulatory obligations.
The HIPAA Framework and Digital Health AI
HIPAA sets national standards for safeguarding PHI. Digital health platforms—whether offering AI-driven telehealth, remote monitoring, or patient portals—are often HIPAA covered entities, business associates, or both. Accordingly, AI systems that process PHI must be able to do so in compliance with the HIPAA Privacy Rule and Security Rule, making it vital for Privacy Officers to understand:
Permissible Purposes: AI tools can only access, use, and disclose PHI as permitted by HIPAA. The introduction of AI does not change the traditional HIPAA rules on permissible uses and disclosures of PHI.
Minimum Necessary Standard: AI tools must be designed to access and use only the PHI strictly necessary for their purpose, even though AI models often seek comprehensive datasets to optimize performance.
De-identification: AI models frequently rely on de-identified data, but digital health companies must ensure that de-identification meets HIPAA’s Safe Harbor or Expert Determination standards—and guard against re-identification risks when datasets are combined.
BAAs with AI Vendors: Any AI vendor processing PHI must be under a robust Business Associate Agreement (BAA) that outlines permissible data use and safeguards—such contractual terms will be key to digital health partnerships.
AI Privacy Challenges in Digital Health
AI’s transformative capabilities introduce specific risks:
Generative AI Risks: Tools like chatbots or virtual assistants may collect PHI in ways that raise unauthorized disclosure concerns, especially if the tools were not designed to safeguard PHI in compliance with HIPAA.
Black Box Models: Digital health AI often lacks transparency, complicating audits and making it difficult for Privacy Officers to validate how PHI is used.
Bias and Health Equity: AI may perpetuate existing biases in health care data, leading to inequitable care—a growing compliance focus for regulators.
Actionable Best Practices
To stay compliant, Privacy Officers should:
Conduct AI-Specific Risk Analyses: Tailor risk analyses to address AI’s dynamic data flows, training processes, and access points.
Enhance Vendor Oversight: Regularly audit AI vendors for HIPAA compliance and consider including AI-specific clauses in BAAs where appropriate.
Build Transparency: Push for explainability in AI outputs and maintain detailed records of data handling and AI logic.
Train Staff: Educate teams on which AI models may be used in the organization, as well as the privacy implications of AI, especially around generative tools and patient-facing technologies.
Monitor Regulatory Trends: Track OCR guidance, FTC actions, and rapidly evolving state privacy laws relevant to AI in digital health.
Looking Ahead
As digital health innovation accelerates, regulators are signaling greater scrutiny of AI’s role in health care privacy. While HIPAA’s core rules remain unchanged, Privacy Officers should expect new guidance and evolving enforcement priorities. Proactively embedding privacy by design into AI solutions—and fostering a culture of continuous compliance—will position digital health companies to innovate responsibly while maintaining patient trust.
AI is a powerful enabler in digital health, but it amplifies privacy challenges. By aligning AI practices with HIPAA, conducting vigilant oversight, and anticipating regulatory developments, Privacy Officers can safeguard sensitive information and promote compliance and innovation in the next era of digital health. Health care data privacy continues to rapidly evolve, and thus HIPAA-regulated entities should closely monitor any new developments and continue to take necessary steps towards compliance.
Blockchain+ Bi-Weekly; Highlights of the Last Two Weeks in Web3 Law: May 8, 2025
Senate Moves Forward with “GENIUS” Stablecoin Bill: May 2, 2025
Background: A revised version of the Senate’s bipartisan stablecoin bill — the “GENIUS Act” — has been introduced, with a floor vote expected before the Memorial Day recess. Key changes include a prohibition on stablecoin issuers offering “a payment of yield or interest” on their issued payment stablecoins, along with enhanced illicit finance provisions. The bill also bars the sale of stablecoins in the U.S. by non-U.S. entities and allows for issuance under state regimes, provided the regime “meets or exceeds” federal standards, as determined by a three-member review panel consisting of the Treasury Secretary, Federal Reserve Chair and FDIC Chair. Changes aimed at addressing concerns about DeFi were also included, though they appeared only in an unpublished draft. Possibly in response to those revisions or other outstanding concerns, a group of nine Democrats — generally considered supportive of crypto — sent a letter indicating they could not support the bill in its current form.
Analysis: The GENIUS Act represents the closest Congress has come to passing meaningful legislation on crypto in the U.S. However, challenges remain. One potential obstacle is the push by some lawmakers to link the stablecoin bill to broader market structure legislation, which is advancing in Congress but is not as far along. Industry advocates have pushed back on this proposed combination, warning that tying the two together could stall momentum — and, given the limited window for congressional action this session, could result in no bill being passed at all. Another hurdle is the apparent erosion of support among key Democrats. With 60 votes needed in the Senate to overcome procedural hurdles, bipartisan support is essential. A delay — or worse, the failure — of even this relatively “vanilla” legislation risks letting political dysfunction once again derail progress in the digital asset space.
Coinbase Files Amicus to SCOTUS Over IRS John Doe Subpoenas: April 30, 2025
Background: Coinbase has filed an amicus brief in support of a petition challenging the IRS’s use of John Doe summonses — which compel platforms to disclose user data without individualized suspicion. The case was brought by a Coinbase customer over the IRS seeking to compel Coinbase to turn over a broad swath of “John Doe” customer information without any probable cause that any particular user broke the law. This follows a similar brief filed earlier by the DeFi Education Fund. If the Court agrees to hear the case, it could have broad implications for financial privacy — not just in digital assets — and may lead the Court to revisit the scope of the Third-Party Doctrine.
Analysis: In the digital age, sharing financial or location data with a third party is often not voluntary, but required for basic participation in modern life. The Third-Party Doctrine, a legal rule that allows the government to access data you’ve shared with third parties without a warrant, was developed in an era before modern financial technology and many argue it no longer fits how people transact today. With a more privacy-sensitive court, this case presents a real opportunity to revisit the boundaries of government surveillance over financial data.
Briefly Noted:
Richard Heart SEC Matter Over: The SEC has announced it will not be amending its complaint against Hex founder, Richard Heart, after the case was previously dismissed on jurisdictional grounds. Regardless of views on the project, there should be broad agreement that giving a podcast interview in the U.S. and using open-source code developed here are not sufficient grounds for asserting global regulatory jurisdiction.
Federal Reserve Retracts Supervisory Guidance: The Federal Reserve Board has retracted guidance that required banks to obtain their approval before implementing any activity that involved crypto, including basic or low-risk use cases. If stablecoin legislation passes, banks are expected to become more active in digital asset custody, providing safer options for customers, which should be in everyone’s best interest.
FTC Goes After “Crypto Trading” Venture: The FTC is going after a series of multi-level-marketing businesses that sold “crypto-trading” courses. Fraud of this type has always fallen more appropriately within the FTC’s domain, in contrast to what we’ve seen over the last few years with the SEC attempting to broaden its jurisdiction by classifying crypto assets as securities simply to bring them under the purview of its anti-fraud powers.
Stablecoin Updates: A number of relatively minor stablecoin-related developments surfaced last week in addition to the Senate updates discussed above, including SoFi exploring its own issuance, Tether posting $1 billion in Q1 profits (with a U.S. expansion in the works), an expected vote in the Senate on the GENIUS Act before Memorial Day, and Visa working with Bridge for a stablecoin-backed payment card. Although each of these updates may seem incremental on their own, collectively they underscore the central role stablecoins now play in the digital asset ecosystem and the growing attention they’re receiving from both industry and regulators.
Treasury Presentation on Digital Money: Buried on page 98 of the Department of Treasury’s update to the Treasury Borrowing Advisory Committee was a surprisingly thoughtful primer on stablecoins and their potential impact on traditional banking. The timing is notable, as this update comes on the heels of Tornado Cash securing at least a partial victory, with a federal court rejecting Treasury’s attempt to dismiss the Tornado Cash lawsuit on the grounds that the case was moot following revisions to the sanctions made after the lawsuit was filed. On this topic, it’s worth listening to this Michael Mosier chat about how Tornado wasn’t a complete victory.
Solana Policy SEC Submission: One of the first big published projects from the Solana Policy Institute is its recent submission to the SEC, “Proposing the Open Platform for Equity Networks” which is worth a read. Also recommended is this industry submission to the SEC regarding staking.
SEC Chair’s First Public Remarks on Crypto: In his first public comments since taking over, Chair Atkins emphasized the need for “practical, durable” rules and a more constructive relationship with the digital asset industry. While delivered at a roundtable hosted by the SEC’s Crypto Task Force, the remarks mark a notable shift in tone from the agency’s prior enforcement-first approach.
Galaxy Digital Moves for Public Listing: Galaxy Digital has confirmed plans to go public on Nasdaq, marking a major step for the firm, which originally filed an S-1 back in 2022. The move signals renewed confidence in both the regulatory environment for digital assets and broader public market conditions.
Digital Chamber Initial SEC Submission in Response to Request for Information: As previously discussed, the SEC’s Crypto Task Force has requested industry feedback on a wide range of questions related to the regulation of digital assets. The Digital Chamber of Commerce is coordinating a major response effort in partnership with leading law firms to provide detailed answers to each question. Polsinelli Blockchain+ attorneys are involved in several of these responses. The first response, led by Sidley Austin, was published last week.
Updated FIT21 Market Structure Bill Released: House Financial Services and Agriculture Committees have published an updated discussion draft of the crypto market structure bill, previously known as the Financial Innovation and Technology for the 21st Century Act (FIT21). We will have a larger update on the proposed legislation and a failed attempt at a joint hearing on digital assets in the House in our next Bi-Weekly update.
Conclusion:
The last two weeks suggest that while momentum is building toward a more structured regulatory environment for digital assets, there’s still a real risk that this historic opportunity could be squandered. We’ll be watching closely as these developments unfold and continuing to engage where it matters. We look forward to seeing many of you at Consensus.