California AG Again Enjoined from Implementing California Age Appropriate Design Code Act

On March 13, 2025, the U.S. District Court for the Northern District of California granted a second motion for preliminary injunction in favor of the technology trade group NetChoice. The injunction once again enjoins the California Attorney General from enforcing the California Age Appropriate Design Code Act (the “AADC” or “Code”), which was originally intended to take effect on July 1, 2024. The District Court determined that NetChoice is likely to succeed on claims raised in its amended complaint that the AADC is facially invalid under the First Amendment guarantee of free speech. As a result, the California AG is immediately enjoined from enforcing the Code during the pendency of the litigation.
The claims of free speech infringement stem primarily from the Code’s requirement that covered businesses perform a data protection impact assessment (“DPIA”) to identify material risks to children under the age of 18, document and mitigate those risks before such children access an online service, product or feature, and provide the DPIA to the California Attorney General upon written request. NetChoice asserts that, on this basis, the Code violates the expressive rights of NetChoice and its members and is void for vagueness under the First Amendment.
An injunction previously granted by the District Court in respect of the Act’s 2023 implementation was partially upheld by a Ninth Circuit panel in August 2024, with respect to the DPIA requirement and provisions of the Code not grammatically severable from the DPIA requirement, including notice and cure provisions with respect to non-compliance. The Ninth Circuit vacated the rest of the District Court’s first ruling and remanded the case to assess other provisions of the Code in more detail and consider whether the law’s unconstitutional provisions are severable from the remainder of the law.
The District Court determined that the AADC is not sufficiently narrowly tailored (under the strict scrutiny standard) to achieve its interest in protecting children online. Because NetChoice has a colorable First Amendment claim, the court found that it would suffer irreparable harm if the Code were to take effect. The District Court also found that the enjoined DPIA provisions are not volitionally severable from the remainder of the AADC, though they are functionally severable.
The District Court determined, on the other hand, that NetChoice had not shown that it is likely to succeed on certain other claims, such as that the AADC was pre-empted by the federal Communications Decency Act or by the Children’s Online Privacy Protection Act.

UK-Based Graffiti Artists Sue Vivienne Westwood in California for Misuse of Their Tags

“In a culture where association with philistines is a death knell,” UK-based graffiti and street artists Cole Smith, Reece Deardon and Harry Matthews have brought a lawsuit against Vivienne Westwood and retailers of the brand for the fashion house’s allegedly unauthorized use of their tags “to lend credibility and an air of urban cool” to its apparel. See Smith v. Vivienne Westwood, Inc., Case No. 2:25-cv-01221 (C.D. Cal. Filed 02/12/25). The artists, known professionally as DISA, SNOK and RENNEE, respectively, argue that their tags are, like their name or signature, “deeply personal and determinative of their identity.” In turn, they claim that Vivienne Westwood’s use of their tags falsely represents their endorsement of the fashion house to the consumer and causes “the world to think that they are corporate sellouts, willing to trade their artistic independence, legacy and credibility for a quick buck.”
According to allegations in this and a long string of similar lawsuits by street artists against fashion brands like Moschino, Roberto Cavalli, Guess?, North Face and Puma, the use of graffiti artists’ tags on apparel purportedly generates “huge revenues” for brands based on their supposed affiliation with the artists. Those familiar with the legacy of Vivienne Westwood’s eponymous founder as a punk icon (far from a philistine) might agree that her brand illustrates the profitability of incorporating urban counterculture into retail fashion. 
Yet, the extent to which DISA, SNOK and RENNEE may recover their alleged damages as UK-based artists before the US District Court for the Central District of California remains an open question. While these artists may pursue their copyright infringement claims under the Berne Convention without having registered their tags in the United States Copyright Office, they probably are not entitled to recover either statutory damages or attorneys’ fees without US registrations. Additionally, although they may have a viable claim that their tags are copyright management information subject to the Digital Millennium Copyright Act (17 U.S.C. § 1202) — as other courts in the Central District ruled in the cases against Moschino and Roberto Cavalli — their claims under California’s right of publicity statute (Cal. Civ. Code § 3344) may be somewhat less certain. There is a dearth of precedent for extending the protections of California’s right of publicity statute to out-of-state residents, even if the court, as in the case against Moschino, finds that a graffiti artist’s tag is a name in a literal sense.
Therefore, this case has the potential to better define the legal landscape faced by foreign street artists pursuing copyright infringement claims in the United States and right of publicity claims in California. Still, the lawsuit is in its infancy and, similar to the cases against other retailers, may settle before being fully litigated on its merits. We will continue to monitor this case and provide updates as it develops.

Navigating Trump’s Semiconductor Strategy

As President Donald Trump’s second term continues, the government’s approach to the semiconductor industry is undergoing a significant shift. Industry stakeholders should anticipate changes in key areas, including the “CHIPS and Science Act,” tariff implementations, export controls, and regulatory frameworks.

Reassessment of the CHIPS and Science Act
Enacted in 2022, the “CHIPS and Science Act” allocated substantial funding to bolster domestic semiconductor manufacturing and research. Despite its bipartisan support, President Trump has criticized the act, describing it as unnecessary subsidization.
“Your CHIPS Act is a horrible, horrible thing. We give hundreds of billions of dollars and it doesn’t mean a thing. They take our money and they don’t spend it… You should get rid of the CHIPS Act and whatever is left over, Mr. Speaker, you should use it to reduce debt.”

Remarks by President Trump in Joint Address to Congress, March 4, 2025
Reports suggest that the Administration is considering repealing or modifying the law, favoring broader tax reductions and elevated tariffs as mechanisms to stimulate a “manufacturing renaissance.” Such a policy shift will inevitably impact ongoing and future semiconductor projects within the United States.
At a minimum, the Trump Administration will likely review and look for opportunities to modify contracts and grants issued under the Biden Administration, including trying to remove provisions related to diversity, equity, and inclusion. Companies that participated in the “CHIPS Act” programs should expect increased scrutiny from federal departments, Inspectors General, and Congress looking to prove that the Biden Administration wasted taxpayer funds in carrying out the “CHIPS Act.”
Elevated Tariffs, Export Controls, and Technology Restrictions
Consistent with his “America First” trade philosophy, President Trump has moved quickly to impose significant tariffs on imports, including a universal 20% tariff on Chinese goods and a 25% tariff on all products from Canada and Mexico – with notable exceptions for those covered by the United States–Mexico–Canada Agreement (USMCA). With additional measures under consideration, these moves are anticipated to disrupt global supply chains, particularly affecting the semiconductor industry, which relies heavily on international collaboration. The imposition of these tariffs could lead to increased costs for consumer electronics and potential retaliatory actions from trade partners.
During the final months of the Biden Administration, significant export controls were introduced to limit China’s access to advanced U.S. semiconductor technology, citing national security concerns. These measures included restrictions on advanced AI chips, cloud access, and model weights. The implementation of these controls now falls under the purview of the Trump Administration.
That said, while President Trump has historically advocated for stringent measures against China, certain post-election actions suggest a pragmatic moderation. In a notable example, President Trump delayed the shutdown of TikTok to facilitate a sale of the app, indicating that the Administration may reassess existing export controls to balance national security concerns with economic interests. However, any effort to significantly relax export restrictions on advanced chips to China will run into bipartisan opposition from Congress as well as China hawks within the Administration.
Deregulation and Industry Incentives
In alignment with its broader deregulatory agenda, there is an expectation that the Trump Administration will relax regulations across the technology sector. Notably, President Trump revoked an executive order on artificial intelligence signed by former President Biden, suggesting an intention to foster innovation and reduce compliance burdens for technology companies. This policy shift is likely to create a more favorable environment for domestic semiconductor manufacturers and encourage increased investment and production within the United States, along the lines of the recently announced US$500 billion Project Stargate.
A second “CHIPS Act” is unlikely to gain traction in Washington. Many Republican members of Congress have committed to making federal spending cuts in exchange for a US$4 trillion increase in the debt ceiling, a reauthorization of President Trump’s Tax Cuts and Jobs Act (TCJA), new tax breaks, and additional funds for border security and the military. As it stands now, there is simply no appetite among congressional Republicans for another large spending bill. TCJA reauthorization does present some opportunities for chip industry stakeholders, as bipartisan provisions being discussed include reinstating immediate R&D expensing.
Geopolitical Implications
The Administration’s policies are poised to reshape the global semiconductor landscape significantly. By implementing protectionist measures and reassessing existing trade agreements, the Trump Administration aims to strengthen the U.S.’s position in the semiconductor sector. However, there is a real risk that these actions lead to heightened geopolitical tensions, particularly with China and Europe, and could result in retaliatory measures affecting other industries. The potential for a more fragmented global market poses challenges for corporations operating within the semiconductor supply chain.
Conclusion
In sum, President Trump’s Administration is adopting a more protectionist and assertive approach to the semiconductor industry, focusing on reshoring manufacturing through a combination of export controls, deregulation, favorable tax policy, and tariffs. While these policies aim to bolster U.S. competitiveness, they also introduce uncertainties and potential challenges within the global semiconductor landscape.

Navigating Divorce: Key Evidence Strategies for Family Law Cases

Getting Your Story to a Judge
Divorce and family law proceedings can be emotionally charged and legally complex, particularly when disputes arise over issues such as property division, child custody, spousal support, or allegations of misconduct. Litigants have been living their story for years, but a judge knows nothing about the situation and will be hearing two sides for the first time.
Evidence plays a crucial role in influencing the court’s decisions, and understanding the potential challenges surrounding evidence is key to effectively navigating these cases. Below are some of the primary evidence-related issues that arise in divorce cases, along with strategies to address them so that your judge can hear the important facts of your story.
Admissibility of Evidence
Courts typically have strict rules about what evidence is admissible. For instance, hearsay — out-of-court statements offered to prove the truth of the matter asserted — is generally inadmissible unless it falls under an exception. In other words, you cannot say, “my best friend saw my spouse gambling large sums of money at the casino.” The friend who actually observed the spouse must testify as to what was seen. Documents must be authenticated so that a judge is satisfied that the information they contain is genuine.
Similarly, evidence must be relevant to the issues at hand. For example, information about a spouse’s personal habits may not be admissible unless it directly impacts child custody or marital finances. So, if the spouse has been engaged in an extramarital affair, this may not be relevant to the issue of whether the parent is capable of caring for a child.
Tips for Avoiding Admissibility Issues:
Ensure all evidence is directly related to the claims or defenses in your case. For every statement, position, and piece of information you want to offer in support of your position, make sure that your evidence is accurate and can be verified. Provide your attorney with the information as soon as possible so that there is time to get what may be needed.
For instance, if a spouse has taken large sums of money from an account, the attorney will need time to get certified copies of bank statements by way of subpoena. This can take time, particularly if the bank is out of state.
Work with your attorney to verify that the evidence complies with the local rules of evidence.
Digital Evidence
In today’s digital age, emails, text messages, social media posts, and even GPS data are commonly presented as evidence. However, authenticity and privacy concerns can complicate their use. Courts may require proof that digital evidence has not been tampered with or taken out of context.
Under the “completeness doctrine,” a single text message may not suffice, and the entire thread may be necessary. Moreover, a screenshot may not be enough, and an attorney can evaluate whether other steps should be taken to get the evidence before the judge. This often includes video evidence, such as videos taken with a smartphone or police body camera footage.
Tips for Avoiding Digital Evidence Issues that can Prevent Your Proofs from Being Admitted
Preserve original digital files with metadata intact.
Avoid accessing or presenting information obtained through illegal means, such as hacking into a spouse’s email account.
Be cautious about your own online activity during divorce proceedings.
Spoliation of Evidence
Spoliation refers to the destruction or alteration of evidence that is relevant to a legal case. In divorce cases, this might involve deleting incriminating text messages or destroying financial records. Courts take spoliation seriously and may impose sanctions, including drawing adverse inferences or awarding legal fees to the other party.
Tips for Avoiding Spoliation of Evidence Issues:
Avoid deleting, altering, or destroying any potential evidence, even if you believe it may harm your case. Give the evidence to your attorney and let them help you determine the best way to address the issue. The other side likely has the same information, and if relevant, will ask that it be considered.
If you suspect your spouse is engaging in spoliation, notify your attorney immediately and consider seeking a court order to preserve evidence.
Financial Evidence
Financial disputes are a central issue in many divorces, and accurate financial evidence is critical. Hidden assets, underreported income, or discrepancies in financial disclosures can lead to significant legal challenges. Common forms of financial evidence include tax returns, bank statements, credit card records, and property appraisals.
Tips for Avoiding Issues with Financial Evidence
Be thorough and honest in disclosing your financial situation.
When possible, obtain statements and records directly from financial institutions. They will most likely be accompanied by a certification of the accuracy and authenticity of the records, which is often admissible.
If you do not have tax returns, the IRS can provide a transcript of the entries on the returns, which can be helpful.
Use forensic accountants or financial experts to uncover hidden assets or evaluate complex financial arrangements when necessary.
Expert Testimony
In cases involving contested child custody, property valuation, or allegations of abuse, expert testimony can be crucial. Psychologists, appraisers, and other professionals can provide opinions that carry significant weight in court. However, opposing parties may challenge the qualifications or conclusions of your experts.
Tips for Avoiding Issues with Expert Testimony
Choose experts with strong credentials and experience in family law cases.
Ensure your expert’s testimony is backed by solid evidence and methodology.
Privileged Communications
Certain communications are protected by legal privilege and cannot be used as evidence. Examples include conversations with your attorney or therapist. However, privilege can be waived if confidentiality is breached, such as by discussing the communication in public or sharing it with a third party.
Tips for Avoiding Issues with Privileged Communications
Keep privileged communications confidential. It is tempting to speak to your closest confidants about your case, but this is dangerous if it is something that you do not want disclosed.
Avoid discussing legal strategies or sensitive topics in public or online forums. This is an excellent way to anger a judge.
Bias and Credibility Issues
The credibility of witnesses and evidence can significantly impact a case. A history of dishonesty or bias may lead the court to question the reliability of a person’s testimony or evidence.
Tips for Avoiding Bias and Credibility Issues
Present your case with honesty and transparency – the good, the bad, and the ugly. It will likely come out anyway, so make sure it is with your narrative.
Avoid exaggerating claims or presenting questionable evidence, as this can undermine your credibility.
Make sure that no witness who is testifying on your behalf has skeletons in their closet that could have a negative impact on your case.
Open and honest communication with your lawyer is key to being able to give the judge your story in the way you want it told.

CIPL Submits Response to India’s Draft Digital Personal Data Protection Rules

Earlier this month, the Centre for Information Policy Leadership at Hunton submitted a response (the “Response”) to India’s Ministry of Electronics and Information Technology (“MeitY”) regarding the Draft Digital Personal Data Protection Rules 2025 (the “Draft Rules”), which were published on January 3, 2025. The Draft Rules provide greater detail on a number of statutory provisions of India’s Digital Personal Data Protection Act 2023 (the “Act”).
As detailed further in the Response, it is CIPL’s view that given the complexities involved for certain operational and technical requirements of the Draft Rules, MeitY should consider a staggered or phased implementation period, particularly with respect to Rule 10 (which addresses verifiable consent) and Rule 13 (which addresses consent managers).
CIPL included the following comments in its Response, among others:

Rule 3 (notice): the notice requirements as drafted could be interpreted as requiring unwieldy and long notices that do not benefit the relevant individuals.
Rule 4 (consent managers): the rule fails to address the interoperability of platforms maintained by different consent managers and to what extent such platforms must be interoperable with systems used by data fiduciaries.
Rule 6 (security): the rule should be amended to provide organizations with a degree of flexibility to employ context-specific security safeguards, as opposed to setting a “minimum” requirement.
Rule 7 (incident notification): the rule should require notification of a personal data breach only where the breach is material, i.e., where it is likely to result in significant harm to individuals.
Rule 8 (retention and deletion): the rule should adopt accountability-based safeguards for data fiduciaries, such as risk assessments and privacy enhancing measures, to determine appropriate retention and deletion practices based on context.
Rule 10 (verifiable consent): the rule requires further clarification on key terms, such as “identity” and “age,” and whether data fiduciaries may meet their compliance obligations based on self-declarations and supporting documents provided by individuals claiming guardianship.
Rule 11 (children’s data exemptions): exemptions for processing children’s data should be broadened to include the personalization of services that do not otherwise have detrimental effects on children.
Rule 12 (significant data fiduciary): MeitY should provide guidance establishing a clear threshold for an entity’s designation as a “Significant Data Fiduciary,” and modify the rule to either delete the reference to algorithmic software, or limit its coverage to address situations that pose significant risk.
Rule 14 (international transfers): the rule should be amended to explicitly recognize lawful data transfer mechanisms that align with global standards—such as standard data protection clauses, binding corporate rules, certification mechanisms, or binding schemes such as Global Cross Border Privacy Rules—thereby ensuring that personal data remains protected while enabling India to remain an active participant in the global digital economy.

View CIPL’s full comments.

CPPA Advances Proposed Regulations for Data Broker Deletion Mechanism

On March 7, 2025, the California Privacy Protection Agency (“CPPA”) voted to authorize the agency to advance proposed data broker regulations concerning the Delete Request and Opt-Out Platform (“DROP”) to formal rulemaking.
The CPPA’s proposed DROP regulations are part of the agency’s efforts to implement California’s Delete Act. The Delete Act requires the CPPA to establish an accessible deletion mechanism to allow consumers to request from registered data brokers the deletion of all non-exempt personal information related to the consumer through a single deletion request to the CPPA. The proposed DROP regulations coincide with the California AG’s enforcement sweep targeting the location data industry and recent enforcement activity by the CPPA against data brokers.
The accessible deletion mechanism provisions of the Delete Act apply to a data broker, meaning a business that knowingly collects and sells to third parties the personal information of a consumer with whom the business does not have a direct relationship. Notably, the proposed regulations would change the definition of “direct relationship” to clarify that a business does not have a direct relationship with a consumer simply because it collects personal information directly from the consumer. Instead, to have a direct relationship with the business, the consumer must intend to interact with the business. Therefore, the revision would bring within the scope of covered data brokers those businesses that collect and sell to third parties the personal information of consumers who did not intend to interact with the business.
It is anticipated that DROP will be accessible to consumers by January 1, 2026, and to data brokers by August 1, 2026.

Navigating DORA Compliance: Recent Developments

The EU Digital Operational Resilience Act (DORA) took effect on 17 January 2025 after a two-year implementation period. DORA sets out new requirements for financial entities (FEs) and their information and communication technology (ICT) third-party service providers (TPPs). This note highlights recent developments in the EU’s efforts to facilitate in-scope firms’ compliance with DORA and authorities’ attempts to avoid duplication of operational resilience requirements.
Further information regarding DORA developments can be found in our previous articles (available here, here, here and here).
EBA Amends Guidelines on ICT and Security Risk Management 
On 11 February 2025, the European Banking Authority (EBA) amended its existing 2019 guidelines on ICT and security risk management measures (Guidelines) to align them with DORA.
The EBA has narrowed the scope of its Guidelines to cover:

only FEs subject to DORA, including credit institutions, payment institutions, account information service providers, exempted payment institutions and exempted e-money institutions; and
relationship management of the payment service users in relation to the provision of payment services.

The EBA’s aim is to simplify the ICT risk management framework and provide legal clarity for the industry, by avoiding duplication of requirements and ensuring consistency across the EU single market.
However, other types of payment service providers (PSPs), such as post-office giro institutions and credit unions, who are not covered by DORA, will still have to comply with the security and operational risk management requirements under the revised Payment Services Directive (PSD2), which has been in force since March 2018. In addition, PSPs that remain subject to the PSD2 security and operational risk management requirements can potentially be subject to additional national requirements.
The Guidelines will apply within two months of the publication of the translated versions.
The Guidelines and accompanying press release are available here and here, respectively.
Commission Adopts Delegated Regulation on Threat-led Penetration Testing under DORA
On 11 February 2025, the European Central Bank (ECB) published an updated version of its framework for threat intelligence-based ethical red teaming (TIBER-EU Framework) that aligns with the DORA regulatory technical standards on threat-led penetration testing (TLPT RTS). This follows the ECB’s publication of a paper considering the TIBER-EU Framework in the context of DORA. Further information on this earlier paper can be found in our previous article (available here).
DORA mandates the European Supervisory Authorities (ESAs), together with the ECB, to develop draft RTS in accordance with the TIBER-EU Framework to specify the following:

the criteria to identify FEs required to perform TLPT;
the requirements regarding test scope, testing methodology and results of TLPT;
the requirements and standards governing the use of internal testers; and
the rules on supervisory and other co-operation needed for the implementation of TLPT and for mutual recognition of testing. 

On 13 February 2025, the European Commission (Commission) adopted a delegated regulation (Delegated Regulation), with accompanying annexes 1-8, supplementing DORA in relation to the TLPT RTS. The Delegated Regulation shall enter into force and apply 20 days after its publication in the Official Journal of the European Union. 
The Delegated Regulation and updated version of the TIBER-EU Framework are available here and here, respectively. 
ESAs Publish Roadmap on the Designation of CTPPs under DORA
On 18 February 2025, the ESAs published a roadmap (Roadmap) for the designation of critical ICT TPPs (CTPPs), which will be subject to direct EU supervision under DORA.
Notably, the Roadmap sets out four steps to designation of CTPPs in 2025:

by 30 April 2025, the ESAs will collect the registers of information on ICT third-party arrangements submitted by FEs to their national competent authorities;
by the end of July 2025, the ESAs will perform the criticality assessments mandated by DORA and notify ICT TPPs if they are classified as critical; 
by mid-September 2025, there will be a six-week hearing period where TPPs can object to the assessment, with a reasoned statement and supporting information; and
by the end of 2025, the ESAs will have designated and published a list of CTPPs and commenced oversight engagement. 

The accompanying press release notes that TPPs that are not designated as critical can voluntarily request to be designated once the list of CTPPs is published, with details on how to raise such a request to be provided soon. 
The ESAs expect to organise an online workshop with TPPs in Q2 2025 to provide further clarity on preparatory activities, the designation process and the ESAs’ oversight approach.
The Roadmap and press release are available here and here, respectively.
Delegated and Implementing Regulations on Major ICT-Related Incidents and Cyber Threats Under DORA Published
On 20 February 2025, Delegated and Implementing Regulations (together, the Regulations) supplementing DORA were published in the Official Journal of the European Union, setting out the detailed requirements and procedures for reporting and notifying ICT-related incidents and cyber threats. The Commission adopted the Regulations in October 2024.

The Delegated Regulation specifies the content and time limits for the initial notification of, and intermediate and final report on, major ICT-related incidents by FEs, and the content of the voluntary notification for significant cyber threats. 
The Implementing Regulation sets out the standard forms, templates and procedures for FEs to report a major ICT-related incident and to notify a significant cyber threat.

Both Regulations will enter into force on 12 March 2025.
The Regulations are available here and here, respectively.
ESAs Publish Opinion on Commission’s Rejection of Draft RTS on Sub-contracting ICT Services Supporting Critical or Important Functions
On 7 March 2025, the ESAs published an opinion (Opinion) on the Commission’s rejection of its draft RTS on the elements an FE needs to determine and assess when sub-contracting ICT services supporting critical or important functions. 
The Commission notified the ESAs that it had rejected the draft RTS in January 2025 on the basis that certain requirements introduced by the draft RTS went beyond the mandate given to the ESAs under DORA. The Commission noted that Article 5 of the draft RTS, and the related recital 5, should be removed from the draft RTS. The Commission then stated it would adopt the RTS once the ESAs had made the necessary modifications.
In the Opinion, the ESAs acknowledge that the Commission’s amendments will ensure that the draft RTS are fully in line with its mandate under DORA. The ESAs do not recommend changes to the Commission’s proposed amendments. They note that FEs are expected to adhere to the provisions on subcontractors as set out in Article 29(2) of DORA and Article 3(6) of the implementing technical standards on the register of information.
The Opinion is available here.

Trending in Telehealth: February 2025

Trending in Telehealth highlights monthly state legislative and regulatory developments that impact the healthcare providers, telehealth and digital health companies, pharmacists and technology companies that deliver and facilitate the delivery of virtual care.
Trending in February:

Interstate compacts
Telepharmacy services
Veterinary services
Telehealth practice standards

A CLOSER LOOK
Proposed Legislation & Rulemaking:

North Dakota proposed amendments to the North Dakota Century Code related to optometrist licensure and standards for providing tele-optometry. The amendments delineate the circumstances under which a licensed optometrist may use telemedicine to provide care. Proposed practice standards include requirements to establish a proper provider-patient relationship and requirements related to informed consent.
In Indiana, Senate Bill 473 proposed amendments that would allow providers to prescribe certain agonist opioids through telemedicine technologies for the treatment or management of opioid dependence. Current law only allows partial agonist opioids to be prescribed virtually.

Finalized Legislation & Rulemaking Activity:

Ohio enacted Senate Bill 95, authorizing the operation of remote dispensing pharmacies, defined as pharmacies where the dispensing of drugs, patient counseling, and other pharmacist care is provided and monitored through telepharmacy systems.
The Texas Health and Human Services Commission adopted an amendment to the Texas Government Code, requiring that providers be reimbursed for teledentistry services. The amendment allows flexibility for a dentist to use synchronous audiovisual technologies to conduct an oral evaluation of an established client. This change makes oral evaluations more accessible and prevents unnecessary travel for clients in the Texas Health Steps Program.
The Arkansas governor signed Senate Bill 61 into law, authorizing the practice of veterinary telemedicine in the state. The bill includes practice standards for veterinary telemedicine and provision of emergency veterinary care.
Also in Arkansas, House Bill 1427 enacted the Healthy Moms, Healthy Babies Act. The act amends Arkansas law to improve maternal health and establish reimbursement procedures for remote ultrasounds.

Compact Activity:

Several states have advanced licensure compacts. These compacts enable certain categories of licensed practitioners to practice across state lines, whether in person or via telemedicine. The following states have introduced bills to enact these compacts:

Dietitian Licensure Compact: Mississippi, Kansas, and North Dakota.
Social Work Licensure Compact: Mississippi, Maryland, and North Dakota.
Occupational Therapy Licensure Compact: North Dakota and New Mexico.
Audiology and Speech Language Pathology Compact: New Mexico.

Why it matters:

States continue to expand practitioners’ ability to provide telehealth services across state lines. While telemedicine is often seen as an alternative method for care delivery, it can sometimes be the most effective and efficient option. Expanding interstate licensure compacts improves access to qualified practitioners, particularly in underserved and rural areas. These compacts also enhance career opportunities and reduce the burdens associated with obtaining multiple state licenses.
States continue to apply telehealth practice standards to various professions. Legislative and regulatory trends reflect recognition that telehealth can be used in a variety of specialty practices, including veterinary medicine, dentistry, and optometry.

Telehealth is an important development in care delivery, but the regulatory patchwork is complicated. 

BEHIND THE FILTERS: CapCut And TikTok Are Making The Cut And Maybe Your Personal Data Too

Greetings CIPAWorld!
Let’s get techy with it. Ever edited a TikTok or Instagram Reel using CapCut? It turns out that you might have handed over more than just your creativity. The Northern District of Illinois has delivered a mixed but consequential ruling in Rodriguez v. ByteDance, Inc., about how video editing apps collect and utilize our personal data. See Rodriguez v. ByteDance, Inc., No. 23 CV 4953, 2025 U.S. Dist. LEXIS 37355 (N.D. Ill. Mar. 3, 2025). You guessed it. TikTok is at issue here. If you’ve ever used CapCut to perfect a TikTok video or Instagram reel, this decision deserves your attention!
Yikes. Imagine editing a quick vacation video only to discover the app might be scanning every photo in your gallery and capturing your facial features! That’s precisely the kind of privacy implications at the center of this case. Make sure to always check your app permissions!
The Opinion in ByteDance, Inc. offers a nuanced examination of modern privacy law. It allows several significant claims to proceed while dismissing others. So, let’s get into a brief background first.
CapCut, developed and operated by Chinese technology giant ByteDance (which also owns TikTok), has exploded in popularity since its 2020 U.S. launch. Now, it’s one of the most downloaded apps globally, with over 200 million monthly active users! CapCut allows users to create, edit, and customize videos using templates, filters, and visual effects. Everyone wants to look good, right? While predominantly free, users can access premium features through subscription models. It’s remarkable how quickly CapCut became essential for content creators, when in actuality this case forces us to confront the reality that the most user-friendly tools might also be the most invasive.
However, according to the Plaintiffs, this seemingly innocent video editor allegedly harbors a more problematic function—collecting vast amounts of user data without proper authorization. The Complaint alleges that CapCut collects everything from registration information and social network contacts to location data, photos, videos, and even biometric identifiers like face geometry scans and voiceprints. Yes, you read that right… Biometric identifiers.
First, the Court’s reasoning behind allowing the California constitutional and common law privacy claims to proceed reveals evolving judicial thinking about digital privacy. Judge Alexakis emphasized that privacy violations don’t depend solely on the sensitivity of the content collected but also on the manner of collection. See Davis v. Facebook, Inc. (In re Facebook Inc. Internet Tracking Litig.), 956 F.3d 589, 603 (9th Cir. 2020).
Many of us miss this critical distinction in our everyday tech interactions. We often focus on what data apps collect rather than how they collect it. The Court’s analysis suggests that even innocuous data could trigger privacy concerns if gathered through deceptive or overly invasive methods—a crucial lesson for developers and users alike.
Here, the Judge found particularly troubling the allegations that CapCut accesses and collects all the videos and photos stored on users’ devices, not just those they voluntarily uploaded to the CapCut app. If proven true, this broad data collection practice would violate reasonable user expectations. Ringing any bells here? Drawing parallels to Riley v. California, 573 U.S. 373, 397-99 (2014), which recognized that individuals have a reasonable expectation of privacy in the contents of their cell phones, Judge Alexakis noted that a reasonable CapCut user would not expect the app to access and collect all the photos and videos on their devices, regardless of whether they use those photos and videos to create content within the app. Makes perfect sense, right?
Let that sink in for a minute. An app potentially scans your entire photo library when you only intend to edit a single clip! This broad access would be like handing a stranger your family photo album when they only ask to see one vacation picture. The Court rightly recognized how this violates our intuitive sense of privacy.
For the California constitutional and common-law privacy claims regarding user identifiers and registration information, the Court specifically relied on United States v. Soybel, 13 F.4th 584, 590-91 (7th Cir. 2021) for the principle that a person “has no legitimate expectation of privacy in information he voluntarily turns over to third parties,” which was central to dismissing claims based on this type of information.
Next, the California Invasion of Privacy Act (“CIPA”) claims represented a significant but ultimately unsuccessful component of Plaintiffs’ case. As we know, under CIPA, individuals are protected against unauthorized electronic interception of communications. Section 631(a) prohibits any person from using electronic means to “learn the contents or meaning” of any “communication” without consent or in an “unauthorized manner.” Critically, neither CIPA nor the federal Electronic Communications Privacy Act (“ECPA”) imposes liability on a party to the communication. As Judge Alexakis noted, Warden v. Kahn, 99 Cal. App. 3d 805, 811, 160 Cal. Rptr. 471 (1979), held that section 631 “has been held to apply only to eavesdropping by a third party and not to recording by a participant to a conversation.”
Here’s where Plaintiffs ran into a fascinating legal hurdle. When you voluntarily use an app, the law often treats that app as a communication “participant” rather than an eavesdropper. Think about how different this is from our intuitive understanding… Few of us would consider a video editor an equal “participant” in our creative process, yet that’s essentially the legal fiction applied here.
Plaintiffs attempted to circumvent this limitation by asserting that Defendants effectively intercepted their data by “redirecting” communications to unauthorized third parties, including the Chinese Communist Party. They relied on legal authorities like Davis, 956 F.3d at 596, 607-08, where Facebook used plugins to track browsing histories even after users logged out.
However, Judge Alexakis found two factual flaws in this theory. First, Plaintiffs failed to plausibly allege that any communications were intercepted during transmission rather than merely shared after collection. Though the Court acknowledged that the Seventh Circuit hadn’t definitively ruled whether interception must be contemporaneous with transmission, it noted that every court of appeals to consider the issue had reached this conclusion. See Peters v. Mundelein Consol. High Sch. Dist. No. 120, No. 21 C 0336, 2022 WL 393572, at *11 (N.D. Ill. Feb. 9, 2022). Second, Plaintiffs’ allegations fell short of the specific software-tracking mechanisms proven sufficient in cases like Facebook Tracking. They identified no particular mechanism by which ByteDance contemporaneously redirected communications to third parties.
Let’s dig a little deeper so this makes sense. There’s a (legal) difference between an app intercepting your data in transit (like wiretapping a phone call) versus collecting it at the endpoint and sharing it later. Most individuals would see little practical difference in the outcome. Your private data ends up in unexpected hands either way, yet courts maintain this technical distinction that significantly impacts your legal protections.
Next, the Court rejected claims under Section 632 of CIPA, which imposes liability on parties who use an electronic amplifying or recording device to eavesdrop upon or record a confidential communication. Beyond the conclusory assertions that ByteDance intercepted and recorded videos without consent, Plaintiffs failed to allege that Defendants used any electronic amplifying or recording device to eavesdrop on conversations.
Let’s switch it up now. We are going to talk about something a little different. Perhaps the most meaningful survival in the decision concerns claims under Illinois’ Biometric Information Privacy Act (“BIPA”). The Court rejected ByteDance’s argument that BIPA only applies when companies use biometric data to identify individuals. Looking at the statute’s plain language, which defines “biometric identifier” to include “voiceprint[s]” and “scan[s] of… face geometry,” the Court found no requirement that the data be used for identification purposes.
This is genuinely interesting to me. Illinois lawmakers created one of the strongest biometric protection laws in the country, and the Court’s ruling reinforces just how far those protections extend. The practical effect here is enormous. Companies can’t just escape liability by claiming they collected your facial geometry or voiceprints for purposes other than identification. The mere collection without proper consent is enough to trigger liability. This reasoning aligns with an emerging consensus in the Northern District of Illinois. In Konow v. Brink’s, Inc., 721 F. Supp. 3d 752, 755 (N.D. Ill. 2024), the Court held that a defendant may violate BIPA without using technology to identify an individual; instead, BIPA bars the collection of biometric data that could be used to identify a plaintiff.
The Court also found persuasive Plaintiffs’ detailed allegations that ByteDance employs engineers specializing in “computer vision, convolutional neural network, and machine learning, all of which are used to generate the face geometry scans that Defendants derive from the videos of CapCut users.” These technical specifications helped elevate the claims beyond mere conclusory allegations.
What’s particularly impressive here is how Plaintiffs connected the dots between ByteDance’s engineering talent, their patent applications for voiceprint technology, and the actual functions of CapCut. This level of technical detail is increasingly necessary in privacy litigation. In turn, vague claims of data collection often fail without demonstrating the underlying mechanisms involved.
For BIPA’s Section 15(c) claim, the Court relied on the statutory interpretation principle of ejusdem generis in interpreting “otherwise profit.” In Circuit City Stores v. Adams, 532 U.S. 105, 114-15, 121 S. Ct. 1302, 149 L. Ed. 2d 234 (2001), the Court noted that when general words follow specific words, they “embrace only objects similar in nature to those objects enumerated by the preceding specific words.” This interpretive principle was key to the Court’s narrower reading of the statute, limiting “otherwise profit” to commercial transactions similar to selling, leasing, or trading data. Consequently, the Court rejected ByteDance’s argument that internal use of biometric data—such as improving CapCut’s editing features—constitutes “otherwise profiting” under BIPA. It is fascinating how this determination narrows the scope of liability under Section 15(c), signaling that plaintiffs must show an actual external transaction involving biometric data to succeed on these claims.
Next, the Court analyzed various consumer protection claims that failed because Plaintiffs couldn’t demonstrate economic injury. For their claims under California’s Unfair Competition Law (“UCL”) and False Advertising Law (“FAL”), Judge Alexakis emphasized that, unlike the broader Article III standing requirements, these statutes demand a showing that plaintiffs lost money or property. This highlights one of the most frustrating aspects of privacy litigation for consumers: proving financial harm from privacy violations is extraordinarily difficult. We intuitively understand that our personal data has value (why else would companies collect it so aggressively?). Yet, courts often struggle to quantify it or recognize its loss as economic injury. It’s like recognizing theft only when something tangible is taken.
The Court was particularly unpersuaded by theories based on the diminished value of personal data, noting Plaintiffs hadn’t alleged they attempted to sell their data or received less than market value. Davis, 956 F.3d at 599, rejected a similar argument that a loss of control over personal data constituted economic harm. The Court also referenced Cahen v. Toyota Motor Corp., 717 F. App’x 720, 723 (9th Cir. 2017), which held that speculative claims about diminished data value, without concrete evidence of lost economic opportunity, are insufficient to establish standing under consumer protection statutes.
Moreover, like in Griffith v. TikTok, Inc., No. 5:23-cv-00964-SB-E, 2023 U.S. Dist. LEXIS 223098, at *6 (C.D. Cal. Dec. 13, 2023), the Court observed that Plaintiffs failed to show they attempted or intended to participate in the market for their data. Additionally, the Court noted that Plaintiffs failed to allege any direct financial loss tied to CapCut’s data practices, distinguishing their claims from cases where courts recognized economic harm due to specific monetary expenditures, such as fraudulent charges or paid services rendered worthless by deceptive conduct.
Next, the Court turned to the core issue underlying many of Plaintiffs’ claims—what ByteDance actually did with the data it collected and whether users had truly consented to these practices. One of the most fascinating aspects of the Opinion is the battle over consent. ByteDance mounted an aggressive defense centered on its Terms of Service and Privacy Policies, arguing that users effectively waived their rights by agreeing to these documents. The company submitted three distinct versions of its Privacy Policy from 2020, 2022, and 2023, each making various disclosures about data collection practices.
Let’s be honest… When was the last time any of us actually read a privacy policy before clicking “agree”? Side note: you should. ByteDance, like many tech companies, stakes its legal protection on documents it knows full well most users never read. What’s remarkable is that courts are increasingly skeptical of this fiction, recognizing the reality of how users actually interact with digital products.
Judge Alexakis’s detailed analysis here offers insights for both app developers and users alike. She recognized that while the policies could potentially be incorporated by reference since Plaintiffs mentioned them in their Complaint, she refused to dismiss the case based on consent at this early stage. The Court emphasized that dismissing claims based on affirmative defenses like waiver is only appropriate if “the allegations of the complaint itself set forth everything necessary to satisfy the affirmative defense.” See United States v. Lewis, 411 F.3d 838, 842 (7th Cir. 2005).
Here, there are several critical factual questions that prevented the Court from accepting ByteDance’s consent defense. Most notably, there was no conclusive evidence about exactly when and how the plaintiffs agreed to the terms. While ByteDance simply asserted that “[u]sers expressly or impliedly consent to the policy upon downloading and using the app,” Plaintiffs countered that they “were able to access the CapCut platform without having to scroll through and read such policies before they were allowed to sign up for the services.”
The Court was particularly skeptical of ByteDance’s reliance on what appeared to be a browsewrap agreement—where terms of service are presented passively and users are presumed to agree simply by using the service. Judge Alexakis emphasized that browsewrap agreements are only enforceable if users have actual or constructive notice of the terms. See Specht v. Netscape Commc’ns Corp., 306 F.3d 17, 30-31 (2d Cir. 2002). This means that merely linking to a privacy policy at the bottom of a webpage or app interface is insufficient to establish consent. As such, actual consent requires more than theoretical access to terms.
Additionally, the Court noted that the placement and formatting of ByteDance’s consent prompts were unclear, raising doubts about whether the plaintiffs were ever explicitly informed of the policy’s existence before using CapCut. This aligns with precedent from Nguyen v. Barnes & Noble Inc., 763 F.3d 1171, 1177 (9th Cir. 2014), where courts declined to enforce arbitration clauses hidden in inconspicuous terms of service.
This dispute highlights a pervasive problem I’m recognizing in digital consent. There is a gap between technical legal compliance and actual user understanding. Despite ByteDance presenting screenshots showing a prompt requiring users to click “Agree and continue,” Judge Alexakis noted this evidence couldn’t establish whether these particular plaintiffs had seen and agreed to these specific terms, especially because they explicitly alleged they never read any privacy policy or terms of use.
The Court also highlighted another crucial factual gap. Plaintiffs asserted that the attached policies were only three of the ten (or more) versions of the policy that existed over time. In Patterson v. Respondus, Inc., 593 F. Supp. 3d 783, 805 (N.D. Ill. 2022), the Court declined to dismiss claims based on policies because the case “may involve factual questions about what [defendant’s] policies looked like at different moments in time.”
The Court’s skepticism should serve as a wake-up call. Concealing invasive data practices within complicated legal documents and merely asserting user consent may be coming to an end. Companies that are truly dedicated to privacy must go beyond minimal compliance and strive for actual transparency and meaningful choices for users.
For the Computer Fraud and Abuse Act (“CFAA”) claim (Count I), the Court undertook a detailed analysis of the access-without-authorization element. The CFAA, which was originally enacted to combat hacking, imposes liability on anyone who intentionally accesses a computer without authorization or exceeds authorized access to obtain information. While finding that Plaintiffs’ allegations were insufficient, Judge Alexakis specifically distinguished this matter from Brodsky v. Apple Inc., No. 19-CV-00712-LHK, 2019 WL 4141936 (N.D. Cal. Aug. 30, 2019), where the plaintiffs had “concede[d] that [they] voluntarily installed the software update,” which unambiguously established authorization. Here, the Court emphasized that essential questions of fact exist about the scope of the authorization and the design of Plaintiffs’ operating systems, making dismissal based on implied authorization inappropriate at this stage. The Court noted that factual disputes, such as the scope of Defendants’ access, are not appropriately resolved on a motion to dismiss.
Notably, the Court declined to follow cases like hiQ Labs, Inc. v. LinkedIn Corp., 31 F.4th 1180, 1197 (9th Cir. 2022), which held that scraping publicly available data does not constitute unauthorized access under the CFAA. Unlike hiQ Labs, where access restrictions were clear, Plaintiffs alleged that CapCut accessed files beyond what they knowingly permitted. However, the Court found that Plaintiffs failed to sufficiently allege that ByteDance exceeded authorized access under Van Buren v. United States, 593 U.S. 374 (2021), which clarified that merely misusing information one is entitled to access does not violate the CFAA.
In dismissing the Stored Communications Act (“SCA”) claims (Count IV), Judge Alexakis found particularly significant the timing mismatch in Plaintiffs’ allegations about data sharing with the Chinese Communist Party (“CCP”). The Court notably observed that even if ByteDance shared user communications with the CCP in 2018 (as alleged by a former employee cited in Plaintiffs’ Complaint), it is too much to presume based on the engineer’s statement that these activities were ongoing several years later when CapCut became available to users in the United States. This temporal gap and the lack of specificity about what data was shared rendered the allegations too speculative to survive dismissal.
The Court also found that Plaintiffs failed to allege that ByteDance qualified as a remote computing service (“RCS”) or electronic communications service (“ECS”) under the SCA. Under the SCA, ECS is any service that provides users with the ability to send or receive wire or electronic communications. At the same time, RCS is a service that provides computer storage or processing services to the public utilizing an electronic communications system. In Garcia v. City of Laredo, 702 F.3d 788, 792 (5th Cir. 2012), the Court noted that for a company to be liable under the SCA, it must provide services that facilitate the transmission, storage, or processing of electronic communications on behalf of users—not merely collect and store user data for its purposes. Because Plaintiffs did not establish that CapCut functioned as an ECS or RCS, their SCA claims failed as a matter of law.
Lastly, Judge Alexakis granted Plaintiffs until April 2, 2025, to file an amended complaint addressing deficiencies in their dismissed claims. So whether you’re a legal professional, casual content creator, or simply concerned about data privacy, as you should be, the ongoing developments in Rodriguez v. ByteDance merit your continued attention.
So, all in all, I’m particularly encouraged by how the Court emphasized consumer expectations throughout its analysis. This suggests a shift from formalistic legal reasoning toward how privacy functions in people’s lives. Most users have never heard of BIPA or CIPA, but they instinctively recognize when an app crosses a line and invades their privacy.
For everyday app users (myself included), this case is a reminder that seemingly innocuous tools like video editors may be far more invasive than they appear. The allegations that CapCut collects all photos and videos on a device—not just those edited—should give pause to anyone who casually grants broad permissions during app installation.
So be careful out there, folks. Next time you download that trending app, consider this before blindly agreeing to permission requests. That innocent-looking video editor might allegedly be analyzing your face, recording your voice, or scanning through years of personal photos—all while you’re just trying to add a filter to your weekend outing with family and friends. Scary stuff.
And yet, whether you have any real recourse if an app oversteps its bounds depends entirely on which law—if any—happens to apply. The fact that some claims survived in this matter while others failed underscores the fragmented and inconsistent nature of privacy law in the U.S. Right now, a company’s liability for invasive data collection often hinges on whether a lawsuit is filed under a state biometric law, a consumer protection statute, or federal wiretap regulations—each with different requirements and loopholes. This patchwork approach leaves consumers vulnerable and businesses uncertain about compliance.
Imagine if physical property rights varied so drastically between states—where some protected against trespassing while others only recognized theft if an item was taken. That’s essentially our current digital privacy landscape, and without unified standards, the gaps in protection will only widen.
As always,
Keep it legal, keep it smart, and stay ahead of the game.
Talk soon!

Why Financial Institutions Should Stay the Course

Introduction
Many regulated businesses believe that the only thing worse than strict regulations is a wholly uncertain regulatory environment. With many rule changes on hold and enforcement actions and investigations being terminated or limited, how do banks, payments program managers, processors, and fintechs move forward? Do they “take their gloves off” and take advantage of a possible enforcement void to maximize profits, or do they stay the course given that there are 50-year-old laws on the books that still apply and probably are not going anywhere? 
We say, continue to innovate with the expectation that certain fundamental laws and rules are unlikely to change and that consumers still want and need financial services and products.
The Resilience of Statutes and Regulations
Most financial institutions, payments companies, and fintechs have always designed their products and services for compliance. When new rules and orders come out, they often do not have to make changes because they had a robust compliance program in place and had already been using best practices. Similarly, they are not quick to take advantage of a “bad” ruling, knowing instinctively that a new statute, order, or ruling will soon restore the status quo.
Even in a time of regulatory uncertainty, the primary federal consumer protection rules that have existed since the late 1960s and 1970s are likely to stay in place. These include the following: 

The Truth in Lending Act (TILA) and its Regulation Z, which, among other things, require loan disclosures, periodic statements for open-end credit, and prepaid account disclosures, and provide consumers with protections from unauthorized credit card transactions. 
The Electronic Fund Transfers Act (EFTA) and its Regulation E, requiring initial disclosures, regulating electronic fund transfer (EFT) arrangements, and providing significant consumer protections from unauthorized EFTs.
The Equal Credit Opportunity Act (ECOA) and its Regulation B, prohibiting impermissible forms of credit discrimination and requiring “adverse action” notices or other notifications regarding credit applications and existing extensions of credit. While the scope of the impermissible discrimination rules may change from time to time, including as a result of court decisions, the basic credit notification requirements are unlikely to change.
The Truth in Savings Act and its Regulation DD, which requires initial disclosures for consumer deposit accounts and, if statements are provided, requires specific information to be included in such statements.
The Real Estate Settlement Procedures Act (RESPA) and its Regulation X. In addition to requiring certain mortgage loan disclosures, Section 8 of RESPA prohibits referral fee and kickback arrangements involving “settlement services.” Here is one area for which the rules might be relaxed. For many years, the ability to enter into marketing services agreements and similar arrangements has been severely limited due to the Section 8 interpretations and enforcement actions of the Consumer Financial Protection Bureau (CFPB). With the CFPB being under new leadership and its future uncertain, marketing arrangements that survived Section 8 scrutiny prior to the CFPB might again be viable. 

For all of the above, while enforcement by federal regulators might be reduced, enforcement by plaintiffs’ lawyers likely will not. This seems particularly likely for those laws such as TILA, the EFTA, and ECOA that provide for class-action liability. 
State laws governing credit interest rates, loan and other product and service fees, and consumer disclosures also are likely to stay in place. Those laws might shift in some states, particularly those laws that were made more burdensome in recent years, but they are unlikely to go away entirely. 
States May Fill the Void
All of the federal laws listed above are “federal consumer financial laws” under the Dodd-Frank Act, and state attorneys general and state regulators are empowered by that act to bring a civil action to enforce any of these laws. The main exception is that a state attorney general or regulator generally may not bring such civil actions against a national bank or federal savings association.
Conclusion
Although there may be some regulatory uncertainty, some things remain constant. Lawyers will be lawyers and lawsuits will be brought, and state attorneys general and regulators can enforce the federal consumer financial laws against most banks and nonbank businesses. 
It is just a question of complying with the existing laws, applying common sense rules, and developing attractive consumer options. We are not without regulatory guardrails, but old-fashioned banking with modern innovations still provides routes to develop and market consumer products and services and build customer relationships. Those businesses that continue to innovate can take the lead.

FCC Seeks Comment on Quiet Hours and Marketing Messages

We recently published a blog about a slew of class action complaints alleging that marketing text messages cannot be sent between the hours of 9:00 pm and 8:00 am (“Quiet Hours”) unless the recipient provides prior express invitation or permission to receive such messages during Quiet Hours (“Quiet Hour Claims”). As noted, based on the plain language of the Telephone Consumer Protection Act (“TCPA”), we disagree with this argument because marketing text messages already require prior express written consent from the called party. The Ecommerce Innovation Alliance (EIA) and others filed a petition for declaratory ruling (“Petition”) with the Federal Communications Commission (“FCC”) to address this application of Quiet Hours to marketing messages.
On March 11, 2025, the FCC released a Public Notice asking for comment on the Petition. So, the FCC, and its Consumer and Governmental Affairs Bureau, have moved quickly to seek public comment on the questions raised by the petitioners.
Initial comments are due by April 10, with reply comments due by April 25. The FCC will then consider the record in contemplating a decision. There is no requirement or specific deadline for the agency to take action on the Petition. However, the plethora of Quiet Hour Claims being filed could encourage relatively prompt FCC action to clarify the rules.

Another FCC Turn of Events: Commissioner Starks to Resign

As we still await the approval of Olivia Trusty to fill the seat vacated when Chairwoman Rosenworcel stepped down and Commissioner Carr became Chairman, there is a new development as of today: Commissioner Starks announced that he plans to resign his seat this spring. While he did not give an exact timeline, Starks did mention fulfilling his duties over the “next few weeks.”
Starks said of his time at the FCC: “Serving the American people as a Commissioner on the Federal Communications Commission has been the honor of my life. With my extraordinary fellow Commissioners and the incredible career staff at the agency, we have worked hard to connect all Americans, promote innovation, protect consumers, and ensure national security. I have learned so much from my time in this position, particularly when I have heard directly from Americans on the issues that matter to them. I have been inspired by the passion, engagement and commitment I have seen from colleagues, advocates, and industry.”
Chairman Carr followed up with his own notice and shared the following in response to Starks’ resignation: “Commissioner Starks led many of the FCC’s national security initiatives, and I welcomed the chance to work closely with him on important matters, including promoting new innovations, protecting consumers, and bringing families across the digital divide. Commissioner Starks put in the work and leaves an impressive legacy of accomplishments in public service. I always learned a lot from him and benefited from the many events we held together.”
Does this now leave the FCC with only one Democrat? We will soon find out, but since commissioners are appointed by the President, I have a feeling the agency will be heavily weighted toward Republicans.