Insider Threats: The Overlooked Risks of Departing Employees and Sensitive Data Theft
Insider threats continue to present a significant challenge for organizations of all sizes. One particularly concerning scenario involves employees who leave an organization and impermissibly take or download sensitive company data. These situations can severely impact a business, especially when departing employees abscond with confidential business information or trade secrets. Focusing on how the theft of such information could cripple a business’s operations and competitive advantage is warranted. It is critical, however, not to overlook other legal and regulatory implications stemming from the theft of certain data, including potential data breach notification obligations.
The Importance of Safeguarding Trade Secrets
Trade secrets generally refer to information that has commercial value because it is kept secret. Examples include formulas, patterns, programs, devices, methods, and other valuable business data. Such data are often the lifeblood of a company’s competitive edge. These secrets must be safeguarded to retain their value and legal protections under the Uniform Trade Secrets Act (UTSA), which has been adopted by most states. Businesses must demonstrate that they took reasonable measures to protect their trade secrets.
Reasonable safeguards under the UTSA can include:
Implementing access controls to restrict employees’ ability to download or share sensitive information.
Requiring employees to sign confidentiality agreements and restrictive covenants.
Regularly training employees on the importance of data security and confidentiality.
Using monitoring tools to detect unusual access or downloads of sensitive data.
Failing to adopt such safeguards can jeopardize a company’s ability to claim protection for trade secrets and pursue legal remedies if those secrets are stolen. Companies should consult with trusted IT and legal advisors to ensure they have adequate safeguards.
Beyond Trade Secrets: Data Breach Concerns
While the theft of confidential business and trade secret information rightly garners attention, focusing exclusively on this aspect may cause companies to miss another critical risk: the theft of personal information. As part of their efforts to remove company information, departing employees may inadvertently or intentionally take personal information, such as employee or customer data, which could trigger significant legal obligations, particularly if accessed or acquired without authorization.
Contrary to common assumptions, data breach notification laws do not solely apply to stolen Social Security numbers. Most state data breach laws define “personal information” broadly to include elements such as:
Financial account information, including debit or credit card numbers.
Driver’s license or state identification numbers.
Health insurance and medical information.
Dates of birth.
Online account credentials, such as usernames and passwords.
Biometric data, such as fingerprints or facial recognition profiles.
The unauthorized access or acquisition of these data elements together with the individual’s name can constitute a data breach, requiring timely notification to affected individuals and, in some cases, regulatory authorities.
Broader Regulatory and Contractual Implications
In addition to state breach notification laws that seek to protect personal information, companies must consider other regulatory and contractual obligations when sensitive data is stolen. For example:
Publicly traded companies: Theft of critical business information by a departing employee may require disclosure under U.S. Securities and Exchange Commission (SEC) regulations if the theft is deemed material. If a company determines the materiality threshold has been reached, it generally has four business days to report to the public.
Critical infrastructure businesses: Companies providing services in regulated industries, such as energy or healthcare, may have reporting obligations to regulatory authorities if sensitive confidential business data is compromised.
Contractual obligations: Many businesses enter into agreements with business customers that require notification if confidential business information or personal data is compromised.
Ignoring these obligations could expose organizations to fines, lawsuits, and reputational harm, compounding the difficulties already created by the theft of an organization’s confidential business information.
Taking a Comprehensive Approach to Data Theft
The theft of confidential business information by a departing employee can be devastating for a business. However, focusing solely on restrictive covenants, trade secrets, or business information risks overlooking the full scope of legal and regulatory obligations. To effectively respond to such incidents, companies should:
Identify the nature of the stolen data: Assess whether the data includes personal information, trade secrets, or other sensitive information that could trigger specific legal obligations.
Evaluate legal and regulatory obligations: Determine whether notification is required under state breach laws, SEC or other regulations (if applicable), industry-specific rules, or contractual agreements.
Leverage restrictive covenant agreements: Assess appropriate legal or contractual remedies, including under restrictive covenant, confidentiality, and other agreements, as part of a broader strategy to address the theft.
Implement safeguards: Strengthen data protection measures to mitigate the risk of future incidents, including employee training, enhanced monitoring, and robust exit procedures.
While dealing with insider threats is undoubtedly challenging, taking a comprehensive and proactive approach can help businesses protect their interests and minimize legal exposure. In today’s interconnected and highly regulated world, understanding the full scope of risks and obligations tied to data theft is essential for any business.
UNWANTED TEXTS, UNWANTED TROUBLE: LG’s Labor Day Discounts Come with a Price
Greetings, TCPA World!
Don’t change the channel. LG Electronics U.S.A. is central to a federal class action lawsuit over its 2024 Labor Day promotional campaign. See McGonigle v. LG Elecs. U.S.A., Inc., No. 1:25-cv-51 (E.D. Va. Jan. 11, 2025). Filed on January 11, 2025, in the U.S. District Court for the Eastern District of Virginia, the lawsuit alleges that LG violated the TCPA by sending unsolicited marketing texts to consumers whose numbers were listed on the National Do-Not-Call (“DNC”) Registry.
This isn’t a picture-perfect scenario… unless LG calls Troutman Amin, of course. Plaintiff alleges LG’s Labor Day promotional bombardment interrupted their programming with unsolicited texts. These messages touted eye-catching deals, including up to $900 off OLED TVs and 30-50% appliance savings. With professional graphics and branded URLs, the campaign was as polished as a high-resolution display.
Adding to the concerns, the Plaintiff alleges that the texts were intended for someone else entirely, raising questions about how LG managed its customer contact database. One possibility that comes to my mind is that the Plaintiff’s number was reassigned from a previous user who may have consented to LG’s messages. Under FCC guidelines, businesses must avoid contacting reassigned numbers and implement systems to detect and remove them from marketing lists. Whether LG followed these protocols will likely be a focal point here.
This isn’t Plaintiff’s first venture into TCPA litigation. In November 2024, Plaintiff filed a similar class action lawsuit against the Home Shopping Network (“HSN”), alleging the company sent promotional text messages to numbers on the DNC Registry without consent. Check out our blog here. The repeat nature of these lawsuits raises questions about how Courts may view Plaintiff’s experience and credibility in navigating these cases.
What is more, a critical issue in the lawsuit is the timeline of the Plaintiff’s DNC registration, which the Complaint presents with conflicting dates. Paragraph 11 states that the Plaintiff’s number “has been on the Do Not Call Registry since 2014” but lists the registration date as “August 5, 2024.” Further complicating matters, Paragraph 20 asserts that LG “knew or should have known” about the registration “on and after April 18, 2023.” These inconsistencies could play a pivotal role in determining the scope of LG’s liability.
The upcoming FCC 1:1 consent rule, which goes into effect on January 27, 2025, adds to the regulatory landscape. This rule requires businesses to obtain separate written consent for each entity sending marketing texts. Consent must be tied directly to the specific interaction that generated it, and disclosures must be clear and conspicuous. While the one-to-one rule wasn’t in effect during LG’s Labor Day campaign, it highlights evolving consumer privacy and consent expectations.
It’s essential to keep up to date at TCPA World. Things are constantly changing.
Late last night, Responsible Enterprises Against Consumer Harassment (“R.E.A.C.H.”) filed an emergency petition with the FCC seeking a temporary 60-day stay of the rule’s implementation. You can check out the full details of R.E.A.C.H.’s filing here. Citing the recent executive order signed by President Trump, which advises federal agencies to postpone the effective dates of rules not yet in effect to allow time for further review, R.E.A.C.H. has requested that the FCC delay the one-to-one consent rule until March 18, 2025, and reopen the comment period to evaluate potential issues with the rule, particularly its impact on small businesses. Stay tuned.
As always,
Keep it legal, keep it smart, and stay ahead of the game.
Online judicial sales – Illinois Mortgage Foreclosure Law Enters the Digital Age
Purchasers of distressed real estate in mortgage foreclosure proceedings can now work remotely, too. Governor JB Pritzker signed into law Public Act 103-930 S.B. 2919, which became effective January 1, 2025, and allows the “sheriff or other person” to conduct a judicial sale “in person, online or both.” 735 ILCS 5/15-1507(b)(2). From the birth of the Illinois Mortgage Foreclosure Law (IMFL) in 1987 through 2024, foreclosure auctions could only be conducted in person.
The primary purpose of this amendment to the IMFL is to maximize the value of real estate sold at auction by expanding the reach to bidders unbound by geography. So, rather than needing to appear in person in the lobby of a sheriff’s office or in a room of a court-approved selling officer, bidders can now log in to an online platform like Ten-X through their computer or mobile device, perhaps in their sweats, and bid up the price of the collateral being sold. The higher the sale price, the better the chance the first mortgage holder gets paid in full, a junior lien holder makes a recovery and the mortgagor collects a surplus.
The rules applicable to online judicial sales are set forth in section 15-1507.2 of the IMFL. The highlights are as follows. The sheriff or other person:
May conduct an online sale or engage a third-party online sale provider and charge an additional fee for associated costs to be paid by the seller.
Must demonstrate to the court’s satisfaction the processes and procedures for conducting online auctions and adequate record-keeping.
Shall require bidders to complete a registration process that includes providing information relevant to identify the buyer, contact the buyer, and complete the sale of the property.
Shall verify the identity of the bidder through an independent verification process.
Importantly, the person conducting the online sale and the third-party online sale provider may “promote and market the sale to encourage and facilitate bidding.”
House Bipartisan Task Force on Artificial Intelligence Report
In February 2024, the House of Representatives launched a bipartisan Task Force on Artificial Intelligence (AI). The group was tasked with studying and providing guidance on ways the United States can continue to lead in AI and fully capitalize on the benefits it offers while mitigating the risks associated with this exciting yet still-emerging technology. On 17 December 2024, after nearly a year of holding hearings and meeting with industry leaders and experts, the group released the long-awaited Bipartisan House Task Force Report on Artificial Intelligence. This robust report touches on how this technology impacts almost every industry, ranging from rural agricultural communities to energy and the financial sector, to name just a few. It is clear that the AI policy and regulatory space will continue to evolve while remaining front and center for both Congress and the new administration as lawmakers, regulators, and businesses continue to grapple with this exciting new technology.
The 274-page report highlights “America’s leadership in its approach to responsible AI innovation while considering guardrails that may be appropriate to safeguard the nation against current and emerging threats.” Specifically, it outlines the Task Force’s key findings and recommendations for Congress to legislate in over a dozen different sectors. The Task Force co-chairs, Representative Jay Obernolte (R-CA) and Representative Ted Lieu (D-CA), called the report a “roadmap for Congress to follow to both safeguard consumers and foster continued US investment and innovation in AI,” and a “starting point to tackle pressing issues involving artificial intelligence.”
There was a high level of bipartisan work on AI in the 118th Congress, and although most of the legislation in this area did not end up becoming law, the working group report provides insight into what legislators may do this year and which industries may be of particular focus. Our team continues to monitor legislation, Congressional hearings, and the latest developments writ large in these industries as we transition into the 119th Congress. See below for a sector-by-sector breakdown of a number of findings and recommendations from the report.
Data Privacy
The report’s section on data privacy discusses advanced AI systems’ need to collect huge amounts of data, the significant risks this creates for the unauthorized use of consumers’ personal data, the current state of US consumer privacy protection laws, and recommendations to address these issues.
It begins with a discussion of AI systems’ need for “large quantities of data from multiple diverse sources” to perform at an optimal level. Companies collect and license this data in a variety of ways, including collecting data from their own users, scraping data from the internet, or some combination of these and other methods. Further, some companies collect, package, and sell scraped data “while others release open-source data sets.” These collection methods raise their own set of issues. For example, according to the report, many websites following “a voluntary standard” state that their websites should not be scraped, but their requests are ignored and litigation ensues. It also notes that some companies “are updating their privacy policies in order to permit the use of user data to train AI models” but not otherwise informing users that their data is being used for this purpose. The European Union and Federal Trade Commission have challenged this practice. It notes that in response, “some companies are turning to privacy-enhanced technologies, which seek to protect the privacy and confidentiality of data when sharing it.” They also are looking at “synthetic data.”
In turn, the report discusses the types of harms that consumers frequently experience when their personal and sensitive data is shared intentionally or unintentionally without their authorization. The list includes physical, economic, emotional, reputational, discrimination, and autonomy harms.
The report follows with a discussion of the current state of US consumer privacy protection laws. It kicks off with a familiar tune: “Currently, there is no comprehensive US federal data privacy and security law.” It notes that there are several sector-specific federal privacy laws, such as those intended to protect health, financial, and children’s data, but, as has become clear from this year’s Congressional debate, even these laws need to be updated. It also notes that 19 states have adopted state privacy laws, but that their standards vary. As in the case of state data breach laws, the result is that they have “created a patchwork of rules and regulations with many drawbacks.” This has caused confusion among consumers and resulted in increased costs and lawsuits for businesses. It concludes with the statement that “[f]ederal legislation that preempts state data privacy laws has advantages and disadvantages.” The report outlines three Key Findings: (1) “AI has the potential to exacerbate privacy harms;” (2) “Americans have limited recourse for many privacy harms;” and (3) “Federal privacy laws could potentially augment state laws.”
Based on its findings, the report recommends that Congress should: (1) help “in facilitating access to representative data sets in privacy-enhanced ways” and “support partnerships to improve the design of AI systems” and (2) ensure that US privacy laws are “technology neutral” and “can address the most salient privacy concerns with respect to the training and use of advanced AI systems.”
National Security
The report highlights both the potential benefits of emerging technologies to US defense capabilities, as well as the risks, especially if the United States is outpaced by its adversaries in development. The report discusses the status and successes of current AI programs at the Department of Defense (DOD), the Army, and the Navy. The report categorizes issues facing development of AI in the national security arena into technical and nontechnical impediments. The technical impediments include increased data usage, infrastructure/compute power, attacks on algorithms and models, and talent acquisition, especially when competing with the private sector in the workforce. The report also identifies perceived institutional challenges facing DOD, saying “acquisition professionals, senior leaders, and warfighters often hesitate to adopt new, innovative technologies and their associated risk of failure. DOD must shift this mindset to one more accepting of failure when testing and integrating AI and other innovative technologies.” The nontechnical challenges identified in the report revolved around third-party development of AI and the inability of the United States to control systems it does not create. The report notes that advancements in AI are driven primarily by the private sector and encourages DOD to capitalize on that innovation, including through more timely procurement of AI solutions at scale with nontraditional defense contractors.
Chief among the report’s findings and recommendations is a call to Congress to explore ways that the US national security apparatus can “safely adopt and harness the benefits of AI” and to use its oversight powers to home in on AI activities for national security. Other findings focus on the need for advanced cloud access, the value of AI in contested environments, and the ability of AI to manage DOD business processes. The additional recommendations were to expand AI training at DOD, continue oversight of autonomous weapons policies, and support international cooperation on AI through the Political Declaration on Responsible Military Use of AI. The report indicates that Congress will be paying much more attention to the development and deployment of AI in the national security arena going forward, and now is the time for impacted stakeholders to engage on this issue.
Education and the Workforce
The report also highlights the role of AI technologies in education and the promise and challenges they could pose to the workforce. The report recognizes that despite the worldwide demand for science, technology, engineering, and mathematics (STEM) workers, the United States has a significant gap in the talent needed to research, develop, and deploy AI technologies. As a result, the report found that training and educating US learners on AI topics will be critical to continuing US leadership in AI technology. The report notes that training future generations of talent in AI-related fields needs to start with AI and STEM education. Digital literacy has extended to new literacies, such as media, computer, data, and now AI literacy, and a key challenge is providing adequate resources for AI literacy.
US leadership in AI will require growing the pool of trained AI practitioners, including people with skills in researching, developing, and incorporating AI techniques. The report notes that this will likely require expanding workforce pathways beyond the traditional educational routes and a new understanding of the AI workforce, including its demographic makeup, changes in the workforce over time, employment gaps, and the penetration of AI-related jobs across sectors. A critical aspect to understanding the AI workforce will be having good data. US leadership in AI will also require public-private partnerships as a means to bolster the AI workforce. This includes collaborations between educational institutions, government, and industries with market needs and emerging technologies.
While the automation of human jobs is not new, using AI to automate tasks across industries has the potential to displace jobs that involve repetitive or predictable tasks. In this regard, the report notes that while AI may displace some jobs, it will augment existing jobs and create new ones. Such new jobs will inevitably require more advanced skills, such as AI system design, maintenance, and oversight. Other jobs, however, may require less advanced skills. The report adds that harnessing the benefits of AI systems will require a workforce capable of integrating these systems into their daily jobs. It also highlights several existing programs for workforce development, which could be updated to address some of these challenges.
Overall, the report found that AI is increasingly used in the workplace by both employers and employees. US AI leadership would be strengthened by a more skilled technical workforce. Fostering domestic AI talent and continued US leadership will require significant improvements in basic STEM education and training. AI adoption requires AI literacy and resources for educators.
Based on the above, the report recommends the following:
Invest in K-12 STEM and AI education and broaden participation.
Bolster US AI skills by providing needed AI resources.
Develop a full understanding of the AI workforce in the United States.
Facilitate public-private partnerships to bolster the AI workforce.
Develop regional expertise when supporting government-university-industry partnerships.
Broaden pathways to the AI workforce for all Americans.
Support the standardization of work roles, job categories, tasks, skill sets, and competencies for AI-related jobs.
Evaluate existing workforce development programs.
Promote AI literacy across the United States.
Empower US educators with AI training and resources.
Support National Science Foundation curricula development.
Monitor the interaction of labor laws and worker protections with AI adoption.
Energy Usage and Data Centers
AI has the power to modernize our energy sector, strengthen our economy, and bolster our national security, but only if the grid can support it. As the report details, electrical demand is predicted to grow over the next five years as data centers—among other major energy users—continue to come online. When demand from these technologies outpaces new power capacity, it can “cause supply constraints and raise energy prices, creating challenges for electrical grid reliability and affordable electricity.” While data centers take only a few years to construct, new sources of power, such as power plants and transmission infrastructure, can take a decade or more to complete. To meet growing electrical demand and support US leadership in AI, the report recommends the following:
Support and increase federal investments in scientific research that enables innovations in AI hardware, algorithmic efficiency, energy technology development, and energy infrastructure.
Strengthen efforts to track and project AI data center power usage.
Create new standards, metrics, and a taxonomy of definitions for communicating relevant energy use and efficiency metrics.
Ensure that AI and the energy grid are a part of broader discussions about grid modernization and security.
Ensure that the costs of new infrastructure are borne primarily by those customers who receive the associated benefits.
Promote broader adoption of AI to enhance energy infrastructure, energy production, and energy efficiency.
Health Care
The report highlights that AI technologies have the potential to improve multiple aspects of health care research, diagnosis, and care delivery. The report provides an overview of AI’s use to date and its promise in the health care system, including with regard to drug, medical device, and software development, as well as in diagnostics and biomedical research, clinical decision-making, population health management, and health care administration. The report also highlights the use of AI by payers of health care services, both for the coverage of AI-provided services and devices and for the use of AI tools in the health insurance industry.
The report notes that the evolution of AI in health care has raised new policy issues and challenges. This includes issues involving data availability, utility, and quality as the data required to train AI systems must exist, be of high quality, and be able to be transferred and combined. It also involves issues concerning interoperability and transparency. AI-enabled tools must be able to integrate with health care systems, including EHR systems, and they need to be transparent for providers and other users to understand how an AI model makes decisions. Data-related risks also include the potential for bias, which can be found during development or as the system is deployed. Finally, there is the lack of legal and ethical guidance regarding accountability when AI produces incorrect diagnoses or recommendations.
Overall, the report found that AI’s use in health care can potentially reduce administrative burdens and speed up drug development and clinical diagnosis. When used appropriately, these uses of AI could lead to increased efficiency, better patient care, and improved health outcomes. The report also found that the lack of standards for medical data and algorithms impedes system interoperability and data sharing. The report notes that if AI tools cannot easily connect with all relevant medical systems, their adoption and use could be impeded.
Based on the above, the report recommends the following:
Encourage the practices needed to ensure AI in health care is safe, transparent, and effective.
Maintain robust support for health care research related to AI.
Create incentives and guidance to encourage risk management of AI technologies in health care across various deployment conditions to support AI adoption and improve privacy, enhance security, and prevent disparate health outcomes.
Support the development of standards for liability related to AI issues.
Support appropriate payment mechanisms without stifling innovation.
Financial Services
With respect to financial services, the report emphasizes that AI has been used within the financial services system for decades, by industry and financial regulators alike. Key examples of use cases include fraud detection, underwriting, debt collection, customer onboarding, real estate, investment research, property management, customer service, and regulatory compliance, among others.
The report also notes that AI presents both significant risks and opportunities to the financial system, so it is critical to be thoughtful when considering and crafting regulatory and legislative frameworks in order to protect consumers and the integrity of the financial system while not stifling technological innovation. As such, the report states that lawmakers should adopt a principles-based approach that is agnostic to technological advances, rather than a technology-based approach, in order to preserve the longevity of the regulatory ecosystem as technology evolves, particularly given the rapid rate at which AI is advancing.
Importantly, the report notes that small financial institutions may be at a significant disadvantage with respect to AI adoption, given a lack of sufficient resources to leverage AI at scale, and states that regulators and lawmakers must ensure that policies do not inadvertently favor larger financial institutions or limit the ability of smaller institutions to compete or enter the market. Moreover, the report stresses the need to maintain relevant consumer and investor protections as AI is utilized, particularly with respect to data privacy, discrimination, and predatory practices.
A Multi-Branch Approach to AI/Next Steps
The Task Force recognizes that AI policy will not fall strictly under the purview of Congress. Co-chair Obernolte shared that he has met with David Sacks, President Trump’s “AI Czar,” as well as members of the transition team to discuss what is in the report.
We will be closely following how both the administration and Congress act on AI in 2025, and we are confident that no industry will be left untouched.
Vivian K. Bridges, Lauren E. Hamma, Abby Dinegar contributed to this article.
CJEU Rules on Excessive Requests
On January 9, 2025, the Court of Justice of the European Union (“CJEU”) issued its judgment in Österreichische Datenschutzbehörde (C‑416/23). In this case, the CJEU was asked to determine when a request to a supervisory authority may be considered excessive, in particular because of its repetitive character, allowing the supervisory authority to reject the request or charge a fee to process it.
Background
The case arose from the refusal by the Austrian Data Protection Authority (“DSB”) to act on an individual’s complaint due to its excessive nature. The DSB’s refusal was based on the fact that the concerned individual had sent 77 similar complaints to the DSB directed against different controllers within a period of approximately 20 months. The individual contested the DSB’s decision and the matter was eventually brought before the Austrian Supreme Administrative Court. The Court requested the following clarifications from the CJEU:
Whether the exception allowing supervisory authorities to charge a reasonable fee based on administrative costs or to refuse to act on a request based on its excessive or unfounded nature also applies to complaints?
Whether for a request to be “excessive”, it is sufficient that a data subject has merely addressed a certain number of requests to a supervisory authority within a certain period of time, irrespective of whether the facts are different and/or whether the requests concern different controllers, or if an abusive intention on the part of the data subject is required in addition to the frequent repetition of requests?
Whether in the case of “manifestly unfounded” or “excessive” requests, the supervisory authority is free to choose between charging a reasonable fee based on administrative costs or refusing to act on the request?
The CJEU’s Decision
In its decision, the CJEU started by clarifying that the term “request” should be considered as including complaints lodged with supervisory authorities and, hence, should be subject to Article 57(4) of the GDPR.
With respect to the excessive nature of complaints, the CJEU took the view that setting an absolute numerical threshold, above which complaints are automatically classified as excessive, is not acceptable. According to the CJEU, this approach could undermine rights granted to individuals under the GDPR. Instead, the supervisory authority is required to establish that the concerned individual has an abusive intention. A large number of complaints made by the same individual may be an indication of an excessive request, provided that those complaints are not objectively justified by considerations relating to the protection of the data subject’s rights under the GDPR.
When faced with an excessive request, the CJEU considers that the supervisory authority may take a reasoned decision to charge a reasonable fee based on administrative costs or refuse to act on those requests, taking into account all relevant circumstances and satisfying itself that the chosen option is appropriate, necessary and proportionate. According to the CJEU, the supervisory authority may consider initially opting to charge a reasonable fee to bring an end to the abusive practice, as this option has lesser adverse effects on the GDPR rights of the individual. However, the supervisory authority is not required to always apply this measure first.
Read the CJEU’s decision.
Telecom Alert: FCC Nominations; Environmental Permitting Executive Order; Cybersecurity NPRM; 900 MHz Deployment NPRM; Enforcement Bureau Notice of Violation [Vol. XXII, Issue 3]
Trump Nominates Trusty as Commissioner; Appoints Carr as Chair
President Trump has nominated Olivia Trusty as the next FCC Commissioner. Ms. Trusty currently serves as a Legislative Aide to the U.S. Senate Commerce Committee and will fill the seat vacated by the departure of current FCC Chair Jessica Rosenworcel. The nomination of a GOP Commissioner will create a 3-2 GOP-controlled agency, led by incoming FCC Chair Brendan Carr, who has served as an FCC Commissioner since 2017. On Inauguration Day, Carr was officially designated as the next Chair, a designation that does not require Senate confirmation.
Trump Issues Executive Order to Expedite Environmental Permitting Requirements
President Trump issued an Executive Order on Monday targeting the federal environmental permitting process, ordering the Chairman of the Council on Environmental Quality (CEQ) to review the guidance and implementation of the National Environmental Policy Act (NEPA). The CEQ has 30 days from the Order to provide its findings to “expedite and simplify the permitting process.” Once the findings are provided, the CEQ will convene a working group to coordinate agency-wide revisions of NEPA and other federal regulations. According to the Order, the streamlining of the regulatory process will open doors for faster permitting for critical energy infrastructure and quicker review of NEPA applications.
Chairwoman Rosenworcel Announces New Cybersecurity Measures
In the wake of the Salt Typhoon incidents that targeted critical infrastructure across domestic networks, Chairwoman Rosenworcel announced that the FCC had adopted a Declaratory Ruling finding that the Communications Assistance for Law Enforcement Act (CALEA) requires all telecommunications carriers to secure their networks against unlawful intrusions. The Ruling is supported by a new Notice of Proposed Rulemaking, which requests comments from carriers regarding proposed annual certifications of risk management plans and other means to improve cyber resilience.
FCC Proposes Increased Deployment on the 900 MHz Band
The FCC released a Notice of Proposed Rulemaking that seeks to expand the existing negotiation-based process of transitioning the 900 MHz band for broadband use. This NPRM builds on the Commission’s 2019 rulemaking which froze the band and created a 3/3 MHz broadband allocation. The presumptive broadband licensee, Anterix, then began negotiating to relocate incumbents from the newly designated broadband portion of the band to the two narrowband segments, or to other bands. Anterix has requested, and the Commission now seeks comment on, the expansion of the broadband allocation from the existing 3/3 MHz to the entire 5/5 MHz block of the 900 MHz band. The FCC has requested comment on eligibility criteria, application requirements and procedures, licensing, and technical parameters for license operators. The proposal also aims to gauge interest in expanding the Band for private broadband networks, as well as maintaining narrowband operations. Comments and Reply Comments will be due 60 and 90 days, respectively, from the date of the NPRM’s publication in the Federal Register.
Enforcement Bureau Issues Notice of Violation for Faulty Antenna Structure
The FCC’s Enforcement Bureau issued a Notice of Violation for an antenna structure in Fort Worth, TX, citing violations of the FCC’s rules regarding the painting and lighting specifications for registered antenna structures. Specifically, the notice stated that agents found the owner had failed to repair or replace the top beacon light in accordance with 47 C.F.R. §§ 17.6(a) and 17.56, and had also failed to notify the FAA and FCC of the light’s extinguishment and any associated construction pursuant to §§ 17.48(a) and 17.57. The owner has been ordered to provide responses within 20 days of the release of the notice.
Additional Authors: Thomas B. Magee, Tracy P. Marshall, Sean A. Stokes, and Wesley K. Wright
NOYB Files Complaints For Unlawful Data Transfers To China
On January 16, 2025, the non-profit organization None Of Your Business (“NOYB”) filed six complaints against organizations with five European data protection authorities for the unlawful transfer of personal data to China. NOYB cited the absence of an adequacy decision for China and the inability of importers in China to guarantee the same level of data protection as in the European Union (“EU”).
Under the complaints, NOYB requests that the data protection authorities immediately order the suspension of data transfers to China on the basis that China does not provide an essentially equivalent level of data protection under the EU General Data Protection Regulation (“GDPR”), and requests that the organizations bring their processing into compliance with the GDPR. In addition, NOYB calls for administrative fines to be imposed on the organizations.
Read NOYB’s press release on the complaints.
U.S. Treasury Department’s Final Rule on Outbound Investment Takes Effect
On January 2, 2025, the U.S. Department of the Treasury’s Final Rule on outbound investment screening became effective. The Final Rule implements Executive Order 14105 issued by former President Biden on August 9, 2023, and aims to protect U.S. national security by restricting covered U.S. investments in certain advanced technology sectors in countries of concern. Covered transactions with a completion date on or after January 2, 2025, are subject to the Final Rule, including the prohibition and notification requirements, as applicable.
The Final Rule targets technologies and products in the semiconductor and microelectronics, quantum information technologies, and artificial intelligence (AI) sectors that may impact U.S. national security. It prohibits certain transactions and requires notification of certain other transactions in those technologies and products. The Final Rule has two primary components:
Notifiable Transactions: A requirement that notification of certain covered transactions involving both a U.S. person and a “covered foreign person” (including but not limited to a person of a country of concern engaged in “covered activities” related to certain technologies and products) be provided to the Treasury Department. A U.S. person subject to the notification requirement must file on Treasury’s Outbound Investment Security Program website by specified deadlines. The Final Rule specifies the detailed information and certification required in the notification and imposes a 10-year record retention period for the filing and supporting information.
Prohibited Transactions: A prohibition on certain U.S. person investments in a covered foreign person that is engaged in a more sensitive sub-set of activities involving identified technologies and products. A U.S. person is required to take all reasonable steps to prohibit and prevent its controlled foreign entity from undertaking a transaction that would be a prohibited transaction if undertaken by a U.S. person. The Final Rule contains a list of factors that the Treasury Department would consider in determining whether the relevant U.S. person took all reasonable steps.
The Final Rule focuses on investments in “countries of concern,” which currently include only the People’s Republic of China, including Hong Kong and Macau. The Final Rule targets U.S. investments in Chinese companies involved in three sensitive technology sectors: semiconductors and microelectronics, quantum information technologies, and artificial intelligence. The Final Rule sets forth prohibited and notifiable transactions in each of the three sectors:
Semiconductors and Microelectronics
Prohibited: Covered transactions relating to certain electronic design automation software, fabrication or advanced packaging tools, advanced packaging techniques, and the design and fabrication of certain advanced integrated circuits and supercomputers.
Notifiable: Covered transactions relating to the design, fabrication and packaging of integrated circuits not covered by the prohibited transactions.
Quantum Information Technologies
All Prohibited: Covered transactions involving the development of quantum computers and production of critical components, the development or production of certain quantum sensing platforms, and the development or production of quantum networking and quantum communication systems.
Artificial Intelligence (AI) Systems
Prohibited:
Covered transactions relating to AI systems designed exclusively for or intended to be used for military, government intelligence or mass surveillance end uses.
Covered transactions relating to development of any AI system that is trained using a quantity of computing power meeting certain technical specifications and/or using primarily biological sequence data.
Notifiable: Covered transactions involving AI systems designed or intended to be used for cybersecurity applications, digital forensics tools, penetration testing tools, or control of robotic systems, or that are trained using a quantity of computing power meeting certain technical specifications.
The Final Rule specifically defines the key terms “country of concern,” “U.S. person,” “controlled foreign entity,” “covered activity,” “covered foreign person,” “knowledge” and “covered transaction” and other related terms and sets forth the prohibitions and notification requirements in line with the national security objectives stated in the Executive Order. The Final Rule also provides a list of transactions that are excepted from such requirements.
U.S. investors intending to invest in China, particularly in the sensitive sectors set forth above, should carefully review the Final Rule and conduct robust due diligence to determine whether a proposed transaction would be covered by the Final Rule (either prohibited or notifiable) before undertaking any such transaction.
Any person subject to U.S. jurisdiction may face substantial civil and/or criminal penalties for violation or attempted violation of the Final Rule, including civil fines of up to $368,137 per violation (adjusted annually for inflation) or twice the amount of the transaction, whichever is greater, and/or criminal penalties up to $1 million or 20 years in prison for willful violations. In addition, the Secretary of the Treasury can take any authorized action to nullify, void, or otherwise require divestment of any prohibited transaction.
OR AG Issues Guidance Regarding OR State Laws and AI
On December 24, 2024, the Oregon Attorney General published AI guidance, “What you should know about how Oregon’s laws may affect your company’s use of Artificial Intelligence” (the “Guidance”), which clarifies how existing Oregon consumer protection, privacy and anti-discrimination laws apply to AI tools. Through various examples, the Guidance highlights key themes such as privacy, accountability and transparency, and provides insight into “core concerns,” including bias and discrimination.
Consumer Protection – Oregon’s Unlawful Trade Practice Act (“UTPA”)
The Guidance emphasizes that misrepresentations, even when they are not directly made to the consumer, may be actionable under the UTPA, and an AI developer or deployer may be “liable to downstream consumers for the harm its products cause.” The Guidance provides a non-exhaustive list of examples that may constitute violations of the UTPA, such as:
failing to disclose any known material defect or nonconformity when delivering an AI product;
misrepresenting that an AI product has characteristics, uses, benefits or qualities that it does not have;
using AI to misrepresent that real estate, goods or services have certain characteristics, uses, benefits or qualities (e.g., a developer or deployer using a chatbot while falsely representing that it is human);
using AI to make false or misleading representations about price reductions (e.g., using AI generated ads or emails indicating “limited time” or “flash sale” when a similar discount is offered year-round);
using AI to set excessively high prices during an emergency;
using an AI-generated voice as part of a robocall campaign to misrepresent or falsify certain information, such as the caller’s identity and the purpose of the call; and
leveraging AI to use unconscionable tactics regarding the sale, rental or disposal of real estate, goods or services, or collecting or enforcing an obligation (e.g., knowingly taking advantage of a consumer’s ignorance or knowingly permitting a consumer to enter into a transaction that does not materially benefit them).
Data Privacy – Oregon Consumer Protection Act (“OCPA”)
In addition, the Guidance notes that developers, suppliers and users of AI may be subject to OCPA, given that generative AI systems ingest a significant amount of words, images and other content that often consists of personal data. Key takeaways from the Guidance regarding OCPA include:
developers that use personal data to train AI systems must clearly disclose that they do so in an accessible and clear privacy notice;
if personal data includes any categories of sensitive data, entities must first obtain explicit consent from consumers before using the data to develop or train AI models;
if the developer purchases or uses another company’s data for model training, the developer may be considered a “controller” under OCPA, and therefore must comply with the same standards as the company that initially collected the data;
data suppliers and developers are prohibited from “retroactively or passively” altering privacy notices or terms of use to legitimatize the use of previously collected personal data to train AI models, and instead are required to obtain affirmative consent for any secondary or new uses of that data;
developers and users of AI must provide a mechanism for consumers to withdraw previously-given consent (and if the consent is revoked, stop processing the data within 15 days of receiving the revocation);
entities subject to OCPA must consider how to account for specific consumer rights when using AI models, including a consumer’s right to (1) opt-out of the use of profiling in decisions that have legal or similarly significant effects (e.g., housing, education or lending) and (2) request the deletion of their personal data; and
in connection with OCPA’s requirement to conduct data protection assessments for certain processing activities, due to the complexity of generative AI models and proprietary data and algorithms, entities “should be aware that feeding consumer data into AI models and processing it in connection with these models likely poses heightened risks to consumers.”
Data Security – Oregon Consumer Information Protection Act
The Guidance clarifies that AI developers (as well as their data suppliers and users) that “own, license, maintain, store, manage, collect, acquire or otherwise possess” personal information also must comply with the Oregon Consumer Information Protection Act, which requires businesses to safeguard personal information and implement an information security program that meets specific requirements. The Guidance also notes that to the extent there is a security breach, AI developers, data suppliers and users may be required to notify consumers and the Oregon Attorney General.
Anti-Discrimination – Oregon Equality Act
The Guidance explains that AI systems that “utilize discretionary inputs or produce biased outcomes that harm individuals based on protected characteristics” may trigger the Oregon Equality Act. The law prohibits discrimination based on race, color, religion, sex, sexual orientation, gender identity, national origin, marital status, age or disability, including in connection with housing and public accommodations. The Guidance also includes an illustrative example regarding how the law applies to the use of AI. Specifically, the Guidance notes that a rental management company’s use of an AI mortgage approval system that consistently denies loans to qualified applicants based on certain neighborhoods or ethnic backgrounds because the AI system was trained on historically biased data may be considered a violation of the law.
Crypto in the Courts: Five Cases Reshaping Digital Asset Regulation in 2025
There has rarely been a larger or more widely distributed financial market in a more uncertain regulatory context than cryptocurrencies and decentralized finance (DeFi) at the start of 2025. In the past several years, the regulatory status of this asset class in the United States has been at the center of a concerted effort by the US Securities and Exchange Commission (SEC) to apply the regime applicable to securities to diverse crypto instruments and methods of exchange and transfer. (Although the Commodity Futures Trading Commission (CFTC) has also consistently enforced its regulations on products it deems to be commodities, that effort has not led to the widespread litigation that is likely to define the regulatory status of these products.)
The SEC’s effort is now in jeopardy. As we begin 2025, the legal landscape surrounding digital assets stands at a critical inflection point, with several watershed cases poised to reshape how these assets will be governed, traded, and regulated in the United States. The convergence of these cases — spanning securities law, administrative procedure and federalism — presents opportunities to clarify how traditional legal frameworks apply to digital assets. Further, the Trump administration has promised that it will be a “pro-crypto” administration — driving the SEC towards a friendlier stance with the cryptocurrency industry and having cryptocurrency rules and regulations “written by people who love [the] industry, not hate [the] industry”1 — and that the United States will become the “crypto capital of the world.”2 President Donald Trump has nominated Paul Atkins, a former SEC Commissioner, to become the next SEC chairperson, stating in his announcement that Mr. Atkins “recognizes that digital assets & other innovations are crucial to Making America Greater than Ever Before.”3 The Trump administration’s announced intention to change the course of cryptocurrency regulation and the selection of an SEC chairperson who is an avowed advocate for innovation through blockchain technologies raise questions about the future of the pending litigation at the center of this industry.
This article examines five cases that may define the future of digital asset regulation in the United States and sets out the issues at stake in those cases. These cases are the Second Circuit’s review of SEC v. Ripple Labs, Inc., the interlocutory appeal in SEC v. Coinbase, Inc., and three cases representing the industry’s shift toward offensive litigation against federal agencies — Blockchain Association v. IRS, Bitnomial Exchange, LLC v. SEC, and Kentucky et al. v. SEC. The purpose of this article is not to predict how those cases will progress — that determination is going to lie in the hands of the courts and policymakers — but rather to make clear what is at stake, especially in light of an anticipated shift in regulatory priorities regarding digital assets with the Trump administration, which could decide to no longer support the government’s positions in these cases.
SEC v. Ripple Labs, Inc. (2d Cir.)
The SEC’s appeal in SEC v. Ripple Labs, Inc. follows a July 2023 ruling in the Southern District of New York in a case that began when the SEC charged Ripple Labs, Inc. (Ripple) with conducting an unregistered securities offering through sales of its XRP token. The SEC argued that the offer and sale of XRP tokens constituted an offer and sale of investment contracts under SEC v. W.J. Howey Co., which provides that an “investment contract” is a contract, transaction, or scheme whereby a person: (1) “invests his money” (2) “in a common enterprise” and (3) “is led to expect profits solely from the efforts of the promoter or a third party.”4 In response, Ripple advanced an “essential ingredients test,” arguing that in addition to the three-part Howey test, investment contracts must also contain “essential ingredients”: (1) “a contract between a promoter and an investor that establishe[s] the investor’s rights as to an investment,” which contract (2) “impose[s] post-sale obligations on the promoter to take specific actions for the investor’s benefit” and (3) “grant[s] the investor a right to share in profits from the promoter’s efforts to generate a return on the use of investor funds.”5
The district court, in its July 2023 ruling, rejected Ripple’s novel “essential ingredients” test, noting that “in the more than seventy-five years of securities law jurisprudence after Howey, courts have found the existence of an investment contract even in the absence of Defendants’ ‘essential ingredients,’ including in recent digital asset cases in this District.”6 Nevertheless, the district court found that, while Ripple’s institutional sales violated securities laws, the company’s programmatic sales (sales of XRP on digital asset exchanges) and other distributions (such as employee compensation and third-party development incentives) did not constitute securities offerings — marking the first major setback to the SEC’s digital asset enforcement initiative.7 Crucially, the district court distinguished between XRP sales based on their economic reality: institutional sales to sophisticated buyers under written contracts were deemed securities transactions because buyers reasonably expected profits from Ripple’s efforts, while programmatic sales on exchanges were not because buyers could not know they were purchasing from Ripple. The court also found that other distributions failed to meet the basic requirements of an “investment of money” since recipients did not provide payment to Ripple.
The SEC filed a notice of appeal on October 4, 2024, and Ripple has cross-appealed. The Second Circuit will likely be the first appellate court to consider how Howey applies to digital assets unless the Trump administration determines to freeze the litigation.8 The SEC filed its appellate brief on January 15, 2025, arguing that the district court erred in concluding that programmatic sales to retail investors were not offers or sales of investment contracts under Howey because “investors were led to expect profits” based on the efforts of Ripple.9 The SEC also argued that other distributions of XRP were offers or sales of investment contracts because the “recipients provided tangible and definable consideration in return for Ripple’s XRP.”10 Ripple will likely challenge whether digital assets are ever securities under the Howey framework.
The SEC maintains that the district court’s decision “conflicts with decades of Supreme Court precedent and securities laws.”11 If the SEC persists in the appeal, its resolution will provide important clarity on how Howey applies to particular types of primary sales of digital assets and, more broadly, how the federal securities laws are to be applied to the digital asset economy.
SEC v. Coinbase, Inc. (2d Cir.)
On January 7, 2025, a Southern District of New York court granted Coinbase Inc.’s motion to certify for interlocutory appeal the court’s March 2024 order denying in substantial part Coinbase’s motion for judgment on the pleadings.12 The certification permits the Second Circuit to address Howey’s reach and application to digital assets, particularly in secondary market transactions.
The case arose from the SEC’s June 2023 enforcement action, alleging that Coinbase operated as an unregistered national securities exchange, broker and clearing agency by intermediating transactions in 13 digital assets that the SEC claimed were investment contracts and, thus, securities. The district court in March 2024 rejected Coinbase’s argument that cryptoasset transactions could not be investment contracts absent post-sale contractual obligations between issuers and purchasers.13
In granting Coinbase’s motion to certify for interlocutory appeal, the court found that the case presents a “controlling question of law regarding the reach and application of Howey to cryptoassets, about which there is substantial ground for difference of opinion.”14 In particular, the court emphasized that applying Howey to cryptocurrencies “is itself a difficult legal issue of first impression for the Second Circuit” and questioned the adequacy of the SEC’s application of Howey to secondary market sales.15
The grant of interlocutory appeal is significant for several reasons. First, it creates parallel tracks of appellate review in the Second Circuit, as the SEC’s appeal in Ripple Labs will also be pending. Both cases will allow the Second Circuit to examine how Howey applies to digital assets but from different procedural postures — Ripple Labs on final judgment and Coinbase on interlocutory appeal from a motion for judgment on the pleadings.
Second, the interlocutory appeal addresses a fundamental split in the Southern District of New York regarding whether and how Howey applies to secondary market transactions of digital assets. Judge Torres in Ripple Labs drew a distinction between Ripple’s institutional sales, which satisfied Howey, and programmatic sales (i.e., blind bid-ask transactions on exchanges), which did not. In contrast, Judge Rakoff in SEC v. Terraform Labs and Judge Failla in Coinbase declined to differentiate based on the manner of sale, finding that Howey could apply equally to secondary market transactions.16 The Second Circuit’s resolution of this split will have profound implications for all regulatory disputes relating to digital asset trading platforms, as the designation as a security triggers the application of the securities laws for all participants in the industry, including issuers, traders, and trading platforms.
Third, the appeal will address the novel question of how a digital asset’s “ecosystem” factors in the Howey analysis. The district court in Coinbase found that, unlike traditional commodities, cryptoassets lack inherent value absent their digital ecosystem — a distinction that helped justify treating them as securities.17 However, the district court also recognized in its certification of its appeal that Coinbase raised “substantial ground” to dispute this view of the ecosystem, noting Coinbase’s argument that other commodities such as carbon credits, emissions allowances and expired Taylor Swift concert tickets similarly have no inherent value outside of the ecosystem in which they are issued or consumed.18 The Second Circuit’s treatment of this issue could influence how other courts analyze a wide range of digital assets.
The implications for the digital asset industry are substantial. Coinbase represents the largest US digital asset exchange, and the SEC’s theory would subject most major trading platforms to securities regulation. Resolution of the interlocutory appeal could, therefore, provide crucial guidance on whether and when trading platforms must register with the SEC.
Blockchain Association et al. v. IRS (N.D. Tex.)
On December 27, 2024, three blockchain industry organizations filed suit in the Northern District of Texas, challenging Department of the Treasury (Treasury) regulations that would impose “broker” reporting requirements on DeFi participants.19 The case represents a significant test of Treasury’s authority to regulate the digital asset industry through information reporting requirements.
The challenged regulations implement provisions of the Infrastructure Investment and Jobs Act of 2021 requiring certain digital asset brokers to report transaction information to the Internal Revenue Service (IRS) on Form 1099-DA. The plaintiffs argue that Treasury’s interpretation of who qualifies as a “broker” exceeds its statutory authority. While Congress defined brokers as persons who “effectuate transfers of digital assets” for consideration, Treasury regulations extend to anyone providing “facilitative services” who theoretically could request customer information — potentially including software developers, front-end interface providers and other technology participants who never take custody of assets or directly execute trades.
The complaint raises several significant challenges under the Administrative Procedure Act (APA) and the US Constitution. The plaintiffs argue that the regulations are arbitrary and capricious, violating the APA by failing to engage in reasoned decision-making and ignoring substantial evidence about the practical impossibility of compliance for many DeFi participants. They also contend that the rules violate the Fourth Amendment by compelling warrantless collection of private information and the Fifth Amendment’s due process requirements through unconstitutionally vague standards for determining who qualifies as a broker.
The case has significant implications for the DeFi industry’s future in the United States. According to the IRS’s calculations, compliance with the regulations would cost the industry over $260 billion annually — a potentially existential burden for many DeFi projects. The plaintiffs argue this would force US-based DeFi participants to either relocate overseas, cease operations or fundamentally alter their business models in ways that undermine decentralization.
The case is part of a recent trend of offensive litigation by the cryptocurrency industry against federal agencies, as the industry increasingly turns to the courts to challenge perceived regulatory overreach. In doing so, litigants can at least initially select the venue of these proceedings, subject to the restrictions of the Federal Rules of Civil Procedure. Venue selection can be critical, as certain courts in Texas, and the Fifth Circuit itself, have recently expressed criticism of expansive agency authority. In November 2024, the Northern District of Texas vacated the SEC’s rulemaking expanding the definition of “dealer” under the Securities Exchange Act of 1934 (Exchange Act).20 The same month, the Fifth Circuit reversed a decision upholding Treasury sanctions on Tornado Cash, a cryptocurrency software protocol that conceals the origins and destinations of digital asset transfers.21 The case remains in its early stages, as the government has yet to respond to the complaint.
Bitnomial Exchange, LLC v. SEC (N.D. Ill.)
Bitnomial Exchange, LLC v. SEC marks a notable offensive litigation against the SEC, with a futures exchange regulated by the CFTC directly challenging the SEC’s authority to regulate a cryptoasset security futures product.22 Filed in October 2024 in the Northern District of Illinois, the case stems from Bitnomial’s attempt to list XRP futures contracts after completing the CFTC’s self-certification process. The complaint seeks both a declaratory judgment that XRP futures are not security futures under the Exchange Act and injunctive relief to prevent SEC oversight of these products.
Bitnomial argues that the SEC has created an impossible regulatory situation by taking the view that XRP futures constitute security futures, requiring both registration of the underlying asset (XRP) as a security and Bitnomial’s registration as a national securities exchange. The exchange contends this position is legally untenable, particularly given the court’s ruling in SEC v. Ripple Labs, Inc. that “XRP, as a digital token, is not in and of itself a ‘contract, transaction[,] or scheme’ that embodies the Howey requirements of an investment contract,” and that anonymous secondary market sales of XRP do not constitute investment contracts.23
According to the complaint, even if Bitnomial were to accept the SEC’s position that XRP futures are security futures, compliance would be impossible because XRP itself is not registered as a security with the SEC — a prerequisite for listing single stock security futures under current regulations. Moreover, Bitnomial, as a trading venue rather than the issuer, lacks the authority to register XRP as a security.
The outcome of the litigation could have far-reaching implications for how digital asset futures products are regulated and traded in the United States. A ruling in Bitnomial’s favor would reinforce the CFTC’s exclusive jurisdiction over non-security futures products and potentially clear the way for other futures exchanges to list similar products. Conversely, if the SEC prevails, it could effectively prevent the listing of futures contracts on many digital assets, as the vast majority of digital assets are not registered as securities with the SEC and cannot be registered by the exchanges seeking to list futures on them. As cases are litigated across jurisdictions, there is also the possibility of a split in how federal circuits view secondary transfers of digital assets.
Kentucky et al. v. SEC (E.D. Ky.)
In November 2024, 18 states and a blockchain industry association filed a lawsuit against the SEC in the Eastern District of Kentucky, challenging the agency’s authority to regulate digital asset trading platforms as securities exchanges. The case, which remains in its initial stages, argues that the SEC’s approach improperly preempts state money transmitter laws and interferes with state unclaimed property regimes that many states have specifically adapted for digital assets.
The states detail how they have developed specific regulatory frameworks for crypto businesses, including licensing requirements and consumer protection measures. Under the SEC’s interpretation that most digital asset transactions constitute securities transactions, platforms facilitating these transactions would be required to register as securities exchanges, brokers or dealers. The states argue that this interpretation would effectively nullify their respective regulatory regimes, as the Exchange Act prohibits states from imposing certain requirements — including licensing and bonding requirements — on entities that qualify as securities brokers or dealers. For example, states such as Kentucky have issued guidance stating that transmitters of digital assets are money transmitters under state law, but this classification would be preempted if these entities must register with the SEC as securities intermediaries.
This case could help resolve a key question underlying several ongoing SEC enforcement actions against major crypto exchanges: whether secondary market transactions in digital assets on trading platforms constitute securities transactions subject to SEC oversight. A ruling that such transactions fall outside the SEC’s authority could undermine the agency’s enforcement strategy against these platforms. On the other hand, a decision upholding the SEC’s interpretation could strengthen the agency’s positions in these enforcement actions and potentially impact other trading platforms currently operating in the United States.
The timing of the lawsuit, filed just days after the 2024 presidential election, adds another layer of complexity to the litigation.
Conclusion
The five cases examined above will help define the coming shift in digital asset litigation under the new Trump administration. While the Second Circuit’s consideration of Ripple Labs and Coinbase will determine whether the manner of sale creates meaningful distinctions under Howey, the industry-led cases signal an equally important development: the emergence of coordinated challenges to agency authority. The Blockchain Association’s challenge to Treasury’s broker regulations, Bitnomial’s challenge to the SEC’s claim of authority over CFTC-regulated futures products, and 18 states’ defense of their regulatory frameworks collectively represent sophisticated attempts to define and limit federal oversight of digital assets.
The resolution of these cases, coupled with the anticipated regulatory shifts under the new administration, could fundamentally alter the landscape for digital asset innovation in the United States. Market participants should closely monitor these developments as they may significantly impact operational strategies and regulatory obligations in the digital asset space.
1 MacKenzie Sigalos, Here’s What Trump Promised the Crypto Industry Ahead of the Election, CNBC (Nov. 6, 2024), https://www.cnbc.com/2024/11/06/trump-claims-presidential-win-here-is-what-he-promised-the-crypto-industry-ahead-of-the-election.html.
2 Mauricio Di Bartolomeo, Trump’s Top 3 Bitcoin Promises and Their Implications, Forbes (Nov. 7, 2024), https://www.forbes.com/sites/mauriciodibartolomeo/2024/11/07/trumps-top-3-bitcoin-promises-and-their-implications/.
3 Rafael Nam, Trump Picks Crypto Backer Paul Atkins as New Securities and Exchange Commission Chair, NPR (Dec. 4, 2024), https://www.npr.org/2024/12/04/g-s1-36803/trump-crypto-paul-atkins-sec-chair.
4 SEC v. W.J. Howey Co., 328 U.S. 293 (1946).
5 SEC v. Ripple Labs, Inc., 682 F. Supp. 3d 308, 322 (S.D.N.Y. July 13, 2023).
6 Id.
7 Id.
8 Hanna Lang and Chris Prentice, Trump’s New SEC Leadership Poised to Kick Start Crypto Overhaul, Sources Say, Reuters (Jan. 15, 2025), https://www.reuters.com/world/us/trumps-new-sec-leadership-poised-kick-start-crypto-overhaul-sources-say-2025-01-15/ (noting top Republican officials at the SEC are “reviewing some crypto enforcement cases pending in the courts.”).
9 Brief for SEC at 27-28, SEC v. Ripple, No. 24-2648 (2d Cir. Jan. 15, 2025) (“Ripple publicly promised that it would create a rising tide that would lift the price of XRP for all investors, whether having purchased from Ripple, its affiliates, or a third party.”).
10 Id. at 49-50 (citing Int’l Bhd. of Teamsters v. Daniel, 439 U.S. 551, 560 n.12 (1979) for the proposition that an “investment of money” under Howey includes “goods and services” so long as the investor provides “some tangible and definable consideration.”).
11 Nikhilesh De, SEC Files Notice of Appeal in Case Against Ripple, CoinDesk (Oct. 2, 2024), https://www.coindesk.com/policy/2024/10/02/sec-files-notice-of-appeal-in-case-against-ripple.
12 SEC v. Coinbase, Inc., No. 1:23-cv-04738-KPF (S.D.N.Y. Jan. 7, 2025).
13 SEC v. Coinbase, Inc., 726 F. Supp. 3d 260 (S.D.N.Y. Mar. 27, 2024).
14 Supra note 9 at 12.
15 Id. at 26.
16 SEC v. Terraform Labs Pte. Ltd., 684 F. Supp. 3d 170, 197 (S.D.N.Y. July 31, 2023) (“It may also be mentioned that the Court declines to draw a distinction between these coins based on their manner of sale, such that coins sold directly to institutional investors are considered securities and those sold through secondary market transactions to retail investors are not.”); Coinbase, Inc., 726 F. Supp. 3d at 293 (“Contrary to Defendants’ assertion, whether a particular transaction in a crypto-asset amounts to an investment contract does not necessarily turn on whether an investor bought tokens directly from an issuer or, instead, in a secondary market transaction.”).
17 Coinbase, Inc., 726 F. Supp. 3d at 295.
18 Coinbase, Inc., No. 1:23-cv-04738-KPF at *28.
19 Blockchain Ass’n et al. v. IRS, No. 3:24-cv-03259-X (N.D. Tex. Dec. 27, 2024).
20 See Nat’l Ass’n of Private Fund Managers et al. v. SEC, No. 4:24-cv-00250 (N.D. Tex. Nov. 21, 2024); Crypto Freedom All. of Tex. et al. v. SEC, No. 4:24-cv-00361 (N.D. Tex. Nov. 21, 2024).
21 See Van Loon v. Department of the Treasury, No. 23-50669 (5th Cir. 2024).
22 Bitnomial Exch., LLC v. SEC, No. 1:24-cv-09904 (N.D. Ill. Oct. 10, 2024).
23 Ripple Labs, Inc., 682 F. Supp. 3d at 324.
Yawara Ng also contributed to this article.
WELL THAT WAS NICE: Democratic FCC Commissioner “Welcome[s]” The Opportunity to Work with Olivia Trusty
So last week now-President Trump announced he would nominate Olivia Trusty to serve as the fifth FCC commissioner and return the FCC to full strength.
We did a profile review of her last week and came to the conclusion she was really well qualified for the role.
Well apparently Democratic FCC Commissioner Anna M. Gomez is of the same mindset. Her office released a short statement on Thursday reading as follows:
“Congratulations to Olivia Trusty on the President-Elect’s announcement of his intent to nominate her as Commissioner of the Federal Communications Commission. She is widely respected, a consummate professional, and has a strong background on communications policy. I welcome the opportunity to work with her.”
Well look at that: a Democrat welcoming a Republican with open arms. Maybe there is hope for this country after all.
Olivia Trusty’s nomination is looking pretty good from everything I have seen or heard. Will keep an eye on it.
DORA Takes Effect: Key Next Steps for Firms
After a two-year implementation period, the EU Digital Operational Resilience Act (DORA) takes effect on 17 January 2025.
DORA is part of the EU’s Digital Finance Package and aims to strengthen the financial sector’s ability to withstand and recover from operational disruption.
Despite DORA coming into effect, many financial entities and information and communication technology (ICT) third-party service providers (TPPs) continue to work towards DORA compliance.
Following 17 January 2025, financial entities will need to, among other things:
continue negotiating DORA-compliant contractual arrangements with TPPs to ensure such arrangements include the minimum contractual provisions set out in DORA;
establish and maintain their registers of information related to their ICT services, and engage with their national competent authorities (NCAs) on the delivery of such information ahead of the deadline for the first submission of these registers by NCAs to the European Supervisory Authorities (ESAs) on 30 April 2025;
monitor the adoption of the remaining technical standards on the subcontracting of ICT services and threat-led penetration testing as well as the publication of other DORA-related materials such as the highly anticipated guidance on the scope of ICT services;
enhance legacy ICT systems and infrastructure or integrate them with new systems to assist with the implementation of DORA’s requirements;
engage across multiple internal departments to avoid siloed efforts, miscommunication and/or gaps in compliance implementation, and ensure that the organisation is appropriately staffed to deal with ongoing DORA obligations;
prepare for engagement with NCAs who will play a key role in the supervision and enforcement of DORA; and
monitor the ESAs’ designation of TPPs as “critical” and determine any impact that such a designation may have on them where they utilise such a provider.
For further information on developments regarding DORA, please see our recent article (available here).