Trump Alters AI Policy with New Executive Order
On January 23, 2025, President Trump issued an Executive Order entitled “Removing Barriers to American Leadership in Artificial Intelligence.” The Executive Order seeks to maintain US leadership in AI innovation. To that end, the Order “revokes certain existing AI policies and directives that act as barriers to American AI innovation,” but does not identify the impacted policies and directives. Rather, it appears those policies and directives are to be identified by the Assistant to the President for Science and Technology, working with agency heads. The Order also requires the development of a new AI action plan within 180 days. Although the details of the new AI action plan are forthcoming, the Order states that the development of AI systems must be “free from ideological bias or engineered social agendas.”
Earlier in the week, Trump also signed an executive order revoking 78 executive orders signed by President Biden, including Biden’s Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence, issued on October 30, 2023. Biden’s Executive Order sought to regulate the development, deployment, and governance of artificial intelligence within the United States, and the document offered insight into the types of issues that concerned the previous Administration (specifically, AI security, privacy and discrimination). More information on Biden’s Executive Order can be found here.
As relevant to employers and developers of AI tools for employers, the revocation of Biden’s Executive Order is largely symbolic, because it did not directly impose requirements on employers who use AI. Instead, it directed federal agencies to prepare reports or publish non-binding guidance on topics such as:
“the labor-market effects of AI,”
“the abilities of agencies to support workers displaced by the adoption of AI and other technological advancements,” and
“principles and best practices for employers” to “mitigate AI’s potential harms to employees’ well-being and maximize its potential benefits.”
Biden’s Executive Order had also directed agencies to provide anti-discrimination guidance to federal benefits programs and federal contractors over their use of AI algorithms and to coordinate on best practices for investigating and enforcing civil rights violations related to AI.
While employers may not experience any immediate effects from the two new Executive Orders this week, taken together, they lend support to predictions that the new Administration would take a more hands-off approach to regulating AI. We will continue to monitor how the AI legal landscape evolves under the new Administration and to report on AI developments that affect employers.
Don’t Forget the EU: Italy Issued First GenAI Fine of €15 Million Alleging GDPR Violations
At the end of 2024, the Italian Data Protection Authority issued a €15 million fine in the first generative AI-related case brought under the GDPR. According to Garante (the Italian authority), OpenAI trained ChatGPT with users’ personal data without first identifying a proper legal basis for the activity, as required under the GDPR. Garante’s order also alleges that OpenAI failed to notify the authority about a data breach the company experienced in March 2023, and that OpenAI did not provide proper age verification mechanisms for users under age 13.
In addition to the fine, OpenAI must conduct a six-month public education campaign on how ChatGPT works and how data is used to train AI products. The campaign must also inform individuals about their rights and how to exercise them. OpenAI intends to appeal the decision.
This decision follows the March 2023 temporary ban of ChatGPT in Italy. In July 2023, the FTC also issued a Civil Investigative Demand to OpenAI.
Putting it into Practice: While it is unclear to what extent AI will receive the same type of scrutiny in the US that it did under the prior administration, this decision is a reminder that EU regulators are keeping a close eye on AI activities, especially when personal data is used to train the tool.
Energy Demand for AI Drives the Midwest’s Focus on Resource Adequacy
As presidential administrations change and policy priorities shift, the steady hum of electricity demand from artificial intelligence (AI) and data centers presses forward. Last week, the President signed several executive orders to realign federal energy priorities. One recent executive order is crucial to data centers and artificial intelligence: Declaring a National Energy Emergency.
The executive order focuses on improving grid reliability and ensuring a reliable supply of energy (though not wind- or solar-powered energy). The impetus for declaring an emergency was, in part, “due to a high demand for energy and natural resources to power the next generation of technology.” The emergency declaration unlocks several powers for the President. Executive agencies were directed to exercise those powers to facilitate the siting, production, transportation and generation of domestic energy resources on federal (or even private) lands. These resources include fossil fuels, uranium, geothermal heat, hydropower and certain critical minerals. Agency heads are permitted to recommend the use of federal eminent domain authority if necessary to achieve these objectives.
This order comes at a time when AI-driven technology is rapidly developing. Some of the most popular AI models require massive computational resources. Training these models involves processing enormous amounts of data across thousands of servers, each consuming significant amounts of electricity to keep their hardware cool. Data centers, which house these servers, are at the heart of the AI revolution. As the executive order notes, “the United States’ ability to remain at the forefront of technological innovation depends on a reliable supply of energy and the integrity of our Nation’s electrical grid.”
Reliable, abundant, and affordable electricity is a critical reason why data centers are targeting the Midwest region for future development. The region has a diverse energy mix, including coal, natural gas, and nuclear, along with an increasing share of wind and solar. However, the boom in demand from AI and associated technology has complicated the region’s reliability and affordability picture. As data centers proliferate across the plains, demand during peak periods is intensified.
Investor-owned utilities report on the present circumstances in their public statements. The average project size in Ameren Missouri’s service territory increased from 3.2 megawatts (MW) in 2019 to 181.2 MW in 2024. Oklahoma Gas & Electric expects over 20% growth in its energy forecast for the next five years. In Evergy’s service territory in Missouri and Kansas, roughly 6 gigawatts (GW) of projects sit in its economic development queue. For context, the Wolf Creek nuclear plant in Kansas has a nameplate capacity of roughly 1.2 GW.
Increasing data center demand comes at a time when the region is also experiencing growth in other energy-intensive industries, such as electric vehicle manufacturing and semiconductor production. The electrical grid needs to be able to manage these surges in demand without compromising reliability, which poses a challenge for regulators and grid operators. More data centers operating in the region means that the peak demand could shift in new directions, with potential implications for the overall energy system.
Regulators, customers and developers must consider rate design and cost allocation to manage this new demand picture and ensure resource adequacy. While the federal government is staking out its position, state regulators, data center developers and utilities can also approach this task with several strategies:
Effective Rate Design: Managing increased demand will require significant investments in new energy infrastructure. State regulators should ensure developers can access reliable energy at a just and reasonable rate when data centers need it without expecting other customers to cover more than their fair share of new upgrades. Utilities and developers should craft tariffs that balance these needs.
Investment in Grid Infrastructure: Upgrading and modernizing the electrical grid will be essential to handle increased demand. Additional development of electric transmission infrastructure is vital to dispatch regional generation resources and meet growing demand. Smart grid technologies, which use digital communications to monitor and manage electricity flow, can also help improve efficiency and resilience.
Energy Efficiency in Data Centers: Data center operators can reduce their impact on peak demand by investing in energy-efficient technologies and practices. Many data center operators are already pursuing advanced cooling systems and optimizing server workloads to mitigate their electricity consumption. As the technology behind AI continues to evolve, the efficiency of the infrastructure supporting it will need to improve.
Demand Response Programs: Utilities can implement demand response programs, which incentivize consumers—including data centers—to reduce their electricity usage during peak periods. This could help balance the grid during times of high demand, ensuring that the system remains reliable.
The increasing demands placed on the electricity grid by AI and new data centers represent a significant challenge for resource adequacy in the Midwest region of the United States. However, with thoughtful planning, strategic investments in infrastructure and energy efficiency, the region can continue to support its technology-driven economy while ensuring the reliability and sustainability of its energy supply.
2025 Outlook: Recent Changes in Construction Law, What Contractors Need to Know
The construction industry is at a crossroads, influenced by shifting economic landscapes, technological advancements, and evolving workforce dynamics. With 2025 under way, businesses must stay ahead of key trends to remain competitive and resilient. Understanding these industry shifts is critical—not just for growth, but for long-term sustainability and safety.
Here’s what to expect in 2025:
Job Market
According to Michael Bellaman, President and CEO of the Associated Builders and Contractors (“ABC”) trade organization, the U.S. construction industry will need to “attract about a half million new workers in 2024 to balance supply and demand.” This estimate considers the 4.6% unemployment rate, the second lowest on record, and the nearly 400,000 average job openings per month. A primary concern entering 2025 is growing the younger employee pool, as 1 in 5 construction workers are 55 or older and nearing retirement.
While commercial construction has not yet been as heavily impacted by the lack of workers as residential construction, demand for commercial work will increase as more industries are anchored on U.S. soil. Think of bills such as the CHIPS and Science Act, which allocated billions in tax benefits, loan guarantees, and grants to build chip manufacturing plants here. This is true regardless of political party; investing in American goods and manufacturing appears to be a bipartisan priority.
AI and Robotics
At the end of 2024, PCL Construction noted that AI will be an integral part of the construction industry. Demand for control centers will drive up commercial production, though the workforce shortage may present some challenges for a construction company’s productivity and workload capacity.
AI will not just change the supply and demand market; it will also be integrated into the day-to-day mechanics and sensors used for safety measures within a construction zone. On top of the demand for microchips catalyzed by the CHIPS and Science Act, AI is used to “monitor real-time activities to identify safety hazards.” AI-assisted robotics can take on meticulous work such as “bricklaying, concrete pouring, and demolition while drones assist in surveying large areas.” We will start to see where the line is drawn between which jobs require a skilled worker and which can be handled by AI without disrupting the workforce.
Economic Factors
The theme of the years following COVID-19 has been returning the economy to its pre-pandemic footing, including cutting interest rates and controlling inflation. With this favorable economic outlook for 2025, construction companies can look to expand their project pipelines. On the residential side, the economic boom may drive housing construction to meet demand. On the commercial side, lower inflation and lower interest rates can lead to more development projects such as megaprojects and major public works. Economist Anirban Basu believes that construction companies may not reap these benefits until 2026 due to the financing and planning required.
Bringing production supply chains back to U.S. soil can help alleviate some of the global concerns such as the crisis in the Red Sea, international wars, and the high tariffs proposed by the Trump Administration. Again, economists are predicting this bountiful harvest in a few years rather than immediately.
Environmental Construction
Trends toward sustainability are leading the construction industry toward greener initiatives such as modular and prefabricated structures. Both approaches involve fabricating building components away from the construction site before assembly at the project location.
AI can also play a role in Building Information Modeling (“BIM”), helping teams understand the nuances, possible pitfalls, and visualization of a project before construction begins. Tech-savvy construction companies are already using programs such as The Metaverse or Unreal Engine for BIM, which can significantly reduce project time, resources, and operational costs.
Employee Safety and PPE: Smart PPE and Advanced Monitoring Systems
PPE requirements will far surpass traditional protective gear such as helmets, masks, and gloves. Construction sites may soon be required to supply smart PPE products that can scan a worker’s biometrics and environment to detect medical anomalies or hazardous environmental conditions. Smart PPE devices will be Internet of Things (“IoT”)-enabled, allowing real-time data transmission and data analytics to track patterns or predict risks.
Conclusion
The construction industry’s future hinges on adaptability and innovation. By addressing workforce shortages, integrating AI-driven solutions, and adopting sustainable practices, companies can position themselves for success in a dynamic market. Whether it’s preparing for the long-term economic upswing or enhancing employee safety through smart PPE, proactive measures today can lead to stronger, more resilient operations tomorrow. Staying informed and prepared will be crucial for navigating the challenges and seizing the opportunities ahead.
FDA Dumps Trio of Device-Related Guidances Prior to Administration Change
Among the wave of guidance documents issued by the U.S. Food and Drug Administration (“FDA” or the “Agency”) in the first week of 2025 were three notable draft guidance documents pertaining to medical devices (together, the “Draft Guidances”). The Draft Guidances hit on the topics of in vitro diagnostic (“IVD”) devices, artificial intelligence (“AI”) enabled device software functions, and pulse oximeters. This uncharacteristic deluge of guidance all within the span of a week illustrates the Agency’s desire to disseminate policy ahead of the incoming administration – especially as it relates to medical devices, which for a variety of reasons that any follower of this blog could intuit, have become a hot-button issue across the various corners of the healthcare and life sciences industries.
I. In Vitro Diagnostic Devices
On January 6, FDA released a draft guidance titled “Validation of Certain In Vitro Diagnostic Devices for Emerging Pathogens During a Section 564 Declared Emergency” (the “IVD Draft Guidance”).[1] This guidance aims to provide a framework for manufacturers to efficiently validate IVDs for emerging pathogens – part of FDA’s continuing effort to lay the groundwork for a timely and effective response to future public health emergencies. FDA is inviting comments to the Draft Guidance with a deadline set for March 7, 2025.
A. Background
The Food, Drug, and Cosmetic Act (“FD&C Act”) grants FDA authority to facilitate the availability and use of medical countermeasures (“MCMs”) to address chemical, biological, radiological, and nuclear threats to the nation’s public health.[2] This power is referred to as Emergency Use Authorization (“EUA”) and allows FDA to authorize the use of certain unapproved medical products if the Secretary of Health and Human Services (the “Secretary”) declares that justifying circumstances exist. FDA has used EUA to authorize emergency use of IVDs for eight infectious diseases over the years – most recently and notably, for COVID-19.
During COVID-19, FDA had to play catch-up by issuing enforcement discretion policies, through guidance, for certain unauthorized tests to help rapidly increase testing capacity on a nationwide scale – meaning certain tests were made available without EUA. Whether tests are authorized through EUA or made available under enforcement discretion policies, the key concern for FDA is that these tests are properly validated. To this end, FDA can take, and has taken, appropriate action against tests lacking proper validation. In the IVD Draft Guidance, FDA provides recommendations for test validation so that IVD manufacturers have a framework to efficiently secure authorization under EUA, and get much-needed tests to the public, in the event of a new infectious disease outbreak.
B. Takeaways
The IVD Draft Guidance is clearly underscored by a desire to be better prepared to deliver efficient, safe, and effective testing in the event of another disease outbreak like COVID-19 – in fact, FDA says as much in the guidance itself. What FDA does not explicitly say, but what could also explain the Agency’s timing in issuing the guidance when it did, is a concern about how the incoming administration might handle such an outbreak in terms of testing and therapeutics, given some of the discourse we’ve heard to date.
Aside from emergency preparation, the IVD Draft Guidance also underscores FDA’s concerns about the efficacy of IVDs generally, especially those subject to abbreviated validation standards. For example, last year the Agency issued a lengthy (and controversial) final rule outlining a plan to end its previous policy of enforcement discretion for laboratory-developed tests (“LDTs”) – a subset of IVDs – based on over a decade of concerns about the efficacy of these tests, which have historically not been subject to any oversight, including validation standards, from FDA at all. The framework outlined in the IVD Draft Guidance similarly addresses concerns over the efficacy of testing during emergency scenarios, when manufacturers are subject to urgent time constraints and abbreviated EUA standards.
II. AI-Enabled Device Software
On January 7, FDA released a draft guidance titled “Artificial Intelligence-Enabled Device Software Functions: Lifecycle Management and Marketing Submission Recommendations” (the “AI Draft Guidance”).[3] This is only the third guidance FDA has released related to Artificial Intelligence (“AI”), and the second relating specifically to AI-enabled device software functions.[4] The AI Draft Guidance was expected, as it appeared on FDA’s Center for Devices and Radiological Health’s (“CDRH”) “A-list” of guidances to publish in Fiscal Year 2025.[5] The AI Draft Guidance signifies an acknowledgement by FDA of the need to keep pace – as best it can – with technological advancements in the medical device space, particularly in light of the attention and concern swirling around AI use. FDA is inviting comments to the Draft Guidance with a deadline set for April 7, 2025.
A. Background
The rapid advance of AI technology in recent years has significantly influenced the development and implementation of medical device software functions. Device manufacturers are increasingly integrating these AI-enabled device software functions (“AI-DSFs”) to enhance diagnostic, monitoring, and treatment capabilities. In recent years, FDA has focused on promoting a Total Product Life Cycle (“TPLC”) approach to the oversight of AI-DSFs, which emphasizes ongoing management and continuous improvement of AI-enabled devices both pre- and post-market according to guiding principles for Good Machine Learning Practice (“GMLP”), to ensure that AI-DSFs remain safe and effective from design through decommissioning. In the AI Draft Guidance, FDA continues this effort by establishing lifecycle management and marketing submission recommendations for AI-DSFs and, as always, encouraging early interaction with FDA to ensure the development of a safe and effective product for patients.
B. Takeaways
In its AI Draft Guidance, FDA makes clear that the integration of AI modeling into medical device software is being scrutinized in a way that at least parallels, or even exceeds, the oversight given to general device software functions. The AI Draft Guidance sets forth many FDA recommendations for lifecycle management, suggesting that a large overhaul is needed to come up-to-speed with rapidly evolving AI development. Significantly, the scope of the AI Draft Guidance includes the device itself and any device constituent parts of a combination product, which may include AI-DSFs.
Much of the AI Draft Guidance focuses on premarket notification (e.g., 510(k)) submissions for devices that include AI-DSFs, notably requiring a thorough explanation of how AI is integrated into the device. In light of all the uncertainties surrounding AI use and its seemingly unlimited application, FDA seems to be looking for some level of assurance that AI-DSF developers will properly leverage parameters and safeguards so that this “unlimited” use potential does not transform a device beyond its cleared and/or approved intended use.
Another key focus of the AI Draft Guidance is transparency and bias reduction. This is a typical, and growing, area of concern for FDA when it comes to devices that collect and store information; however, incorporating AI complicates the issue because of the unknown risk of bias. Specifically, FDA notes that AI models may rely on data correlations and other machine learning-derived processes that do not connect to biologically plausible mechanisms of action. Therefore, while AI’s strength is its adaptability, risk lies in the fact that its decision-making processes are not fully predictable. To mitigate this risk, FDA provides a recommended design approach to transparency throughout the product lifecycle, especially with respect to data collection and monitoring.
Another key focus of the AI Draft Guidance – and another growing area of concern for FDA and stakeholders alike – is cybersecurity. Here, FDA builds on its 2023 guidance (“2023 Guidance”), which addressed cybersecurity in medical devices more generally,[6] to contextualize it within the AI sphere. The application of AI to medical device software adds a new layer of security concern because, if AI systems are hacked or otherwise accessed, the consequences can be much more widespread. To mitigate this risk, FDA provides comprehensive, AI-specific recommendations for handling cybersecurity threats, while also deferring to the 2023 Guidance for the complete framework that should be implemented prior to marketing submission.
The emergence of AI necessitates that FDA alter its long-standing framework for ensuring the safety and efficacy of medical devices in light of the unique way that AI-enabled device functions operate – and the AI Draft Guidance illustrates FDA’s continued recognition of, and response to this need.
III. Pulse Oximeters for Medical Purposes
On January 7, FDA released a draft guidance titled “Pulse Oximeters for Medical Purposes – Non-Clinical and Clinical Performance Testing, Labeling, and Premarket Submission Recommendations” (the “PO Draft Guidance”),[7] which provides recommendations for performance testing, labeling, and premarket submissions of pulse oximeters. Once finalized, the PO Draft Guidance will supersede FDA’s existing pulse oximeter guidance, which was issued on March 4, 2013 (“2013 Guidance”).[8] FDA is inviting comments to the PO Draft Guidance with a deadline set for March 10, 2025.
A. Background
In recent years, there has been growing concern over the accuracy of readings from pulse oximeters, which are devices that measure the amount of oxygen in arterial blood and pulse rate.[9] In addressing this concern, FDA found that a host of factors affects the accuracy of pulse oximeter readings, especially a person’s skin pigmentation. In light of this particular concern, FDA engaged interested parties, and partnered with the University of California San Francisco, as part of the Centers of Excellence in Regulatory Science and Innovation (“CERSI”) program, to conduct a study comparing pulse oximeter errors in clinical patients with varying skin tones. Based on results from this study, as well as input from interested stakeholders, FDA has created enhanced recommendations for marketing submissions to ensure that pulse oximeters used as standalone medical devices, or as part of a multi-parameter medical device, accurately fulfill their intended use. This comprehensive list of marketing submission recommendations is laid out in the new PO Draft Guidance and includes enhanced clinical performance testing procedures that specifically account for disparities in performance across different populations, such as diverse skin tones and pediatric populations.
FDA is showing that it is concerned not only with whether the device performs its intended function accurately, but whether that performance is consistent across all patient populations. Significantly, FDA is exhibiting concern regarding the diversity of patient populations across this country, urging manufacturers to ensure accuracy for all.
B. Takeaways
This new PO Draft Guidance underscores FDA’s continuing commitment to ensuring that regulated products are safe and effective for all individuals – not only those majority populations who have typically been the subject of clinical testing and validation. It is incumbent on manufacturers, FDA, and providers to ensure that medical devices perform properly for each and every person, irrespective of differences in identifying characteristics. And where a certain product has not been tested on and/or cannot be confirmed safe and effective for a certain population, FDA is clear that this limitation needs to be made known to prescribers and end users by limiting the product’s intended use and associated labeling. Bottom line – we can’t have patients relying on products that do not operate safely and/or specifically for them and others like them.
Conclusion
The common thread among these device-specific Draft Guidances is an emphasis on early collaboration with FDA to get ahead of certain identified issues that pose public health threats—an infectious disease emergency, an unmanageable and/or insecure AI algorithm, or a test result that was not clinically verified for a certain patient’s skin tone. Now, we have heard this refrain before, especially on the drug side of the house—we often roll our eyes when we see it, given that the Agency holds the ultimate power in just about any facet of inquiry, decision-making, and enforcement. But given the blistering speed with which these technologies have been and will continue to develop, FDA’s entreaties here might mean something more.
Indeed, despite the myriad of other critical issues that FDA needs to address, it is clear that CDRH policymakers did not intend for devices to fall by the wayside as the administration changed guard. Whatever happens over the coming months, all eyes in our industry will be on FDA policy—in guidance, enforcement, or otherwise.
FOOTNOTES
[1] IVD Draft Guidance available here: Validation of Certain In Vitro Diagnostic Devices for Emerging Pathogens During a Section 564 Declared Emergency | FDA
[2] FD&C Act Section 564 available here: 21 U.S.C. 360bbb-3.
[3] AI Draft Guidance available here: Artificial Intelligence-Enabled Device Software Functions: Lifecycle Management and Marketing Submission Recommendations | FDA
[4] See December 2024 guidance available here: Marketing Submission Recommendations for a Predetermined Change Control Plan for Artificial Intelligence-Enabled Device Software Functions | FDA; January 2025 draft guidance available here: Considerations for the Use of Artificial Intelligence To Support Regulatory Decision-Making for Drug and Biological Products | FDA
[5] 2025 A-List available here: CDRH Proposed Guidances for Fiscal Year 2025 (FY2025) | FDA
[6] 2023 Guidance available here: Cybersecurity in Medical Devices: Quality System Considerations and Content of Premarket Submissions | FDA
[7] PO Draft Guidance available here: Pulse Oximeters for Medical Purposes – Non-Clinical and Clinical Performance Testing, Labeling, and Premarket Submission Recommendations | FDA
[8] 2013 Guidance available here: Pulse Oximeters – Premarket Notification Submissions [510(k)s]: Guidance for Industry and Food and Drug Administration Staff | FDA
[9] See, e.g., Pulse Oximeter Accuracy and Limitations: FDA Safety Communication, FDA (Feb. 19, 2021); Multistate Letter Urging FDA to Address Concerns about Dangerous Pulse Oximeter Inaccuracies Impacting Communities of Color, Cal. Atty. Gen. (Nov. 1, 2023).
Is the Future of Digital Assets in the United States Bright Again?
Yes, indeed! What Brad Garlinghouse of Ripple Labs called “Gensler’s reign of terror” ended with Securities and Exchange Commission (SEC) Chair Gary Gensler’s resignation upon President Donald Trump’s inauguration. Paul Atkins, who has co-chaired the Token Alliance, spoke of the need for a “change of course” at the SEC and will be given charge of the SEC when he is confirmed as its new Chairman.
While the greatest deliberative body takes time to exercise its constitutional role of advice and consent, President Trump and Acting SEC Chairman Mark Uyeda are moving ahead at lightning speed, each taking action in the first week of the new administration. The long-awaited paradigm shift in regulation for digital assets is here and the market likes what it sees, with Bitcoin now trading near an all-time high and the total market capitalization of digital assets topping the US$3 trillion mark. Projects are once again being funded in—and development teams are returning to—the United States.
The day after his inauguration, President Trump signed an Executive Order, Strengthening American Leadership in Digital Finance Technology, aiming to “support the responsible growth and use of digital assets, blockchain technology, and related technologies across all sectors of the economy.” This comes on the heels of a newly announced Crypto Task Force at the SEC, dedicated to developing a comprehensive and clear regulatory framework for digital assets, including “crypto” assets.
The Executive Order
In his Executive Order, President Trump points to the crucial role that the digital assets industry plays in the innovation and economic development of the United States, declaring it to be the policy of his administration to:
Protect and promote public blockchain networks, mining and validating, and self-custody of digital assets.
Protect and promote the U.S. dollar by promoting stablecoins worldwide.
Provide regulatory clarity and certainty built on technology-neutral regulations, including well-defined jurisdictional regulatory boundaries.
President Trump’s 2025 Executive Order revokes former President Biden’s 2022 Executive Order regarding crypto assets and orders the Secretary of the Treasury to likewise revoke all prior inconsistent Treasury policies.
Most significantly, the Executive Order establishes the “President’s Working Group on Digital Asset Markets,” to be chaired by the “Special Advisor for AI and Crypto,” Silicon Valley venture capitalist David Sacks, who is sometimes called the “Crypto Czar.” Its Executive Director will be “Bo” Hines of North Carolina. The Working Group will consist of specified officials (or their designees) such as the Secretaries of the Treasury, Commerce, and Homeland Security, the Attorney General, the Director of the Office of Management and Budget, the Homeland Security Advisor, and the Chairs of the SEC and the Commodity Futures Trading Commission (CFTC).
The Working Group has been charged to hit the ground running:
By February 22, 2025, the Treasury, DOJ, SEC and other relevant agencies included in the Working Group shall identify all regulations, guidance documents, orders, or other items that affect the digital assets sector. In other words, what has the federal government done so far?
By March 24, 2025, each agency shall submit recommendations with respect to whether each identified regulation, guidance document, order, or other item should be rescinded or modified, or, for items other than regulations, adopted in a regulation.
By the end of last week, the SEC had already rescinded Staff Accounting Bulletin 121, an especially troubling piece of guidance that the SEC never approved and that Congress had sought to overturn but former President Biden retained. SAB 121 required crypto custodial banks to carry customer assets on their balance sheets—something required for no other asset. Upon rescinding SAB 121, SEC Commissioner Hester Peirce tweeted, “Bye, bye SAB 121! It’s not been fun.” Another piece of SEC guidance that might be on the chopping block is the so-called “Framework for ‘Investment Contract’ Analysis of Digital Assets,” which has confounded the digital assets industry since it was first adopted.
By July 22, 2025, the Working Group shall submit a report to the President recommending regulatory and legislative proposals that advance the policies established in the order. In particular:
The Working Group will propose a federal regulatory framework governing the issuance and operation of digital assets, including stablecoins, in the United States. The Working Group’s report shall consider provisions for market structure, oversight, consumer protection, and risk management.
The Working Group will have significant choices to make in this regard: Will it back the “FIT 21” bill that has already been approved by the U.S. House of Representatives, or will it seek to chart a different course? Will it back a merger of the CFTC with the SEC? How will it reconcile the desire to support technology innovation with national security interests and investor protection?
The Working Group will evaluate the potential creation and maintenance of a national digital asset stockpile and propose criteria for establishing such a stockpile, potentially derived from cryptocurrencies lawfully seized by the federal government through its law enforcement efforts. In this regard, President Trump might be seen as having backed off his earlier promise to create a Bitcoin reserve in the United States, as it is now being considered rather than proposed for immediate adoption. The word “Bitcoin” does not appear even once in the Executive Order.
President Trump’s Executive Order also prohibits the establishment, issuance, or promotion by federal agencies of Central Bank Digital Currencies (CBDCs) within the United States or abroad, terminating any ongoing plans or initiatives related to the creation of a CBDC within the United States. The libertarians who dominate appointments in the financial services sector of the administration are strongly opposed to CBDCs, viewing them as a threat to personal liberty.
In issuing this Executive Order, President Trump fulfilled his campaign promises relating to crypto assets. In a July 27, 2024, address to the Bitcoin 2024 Conference in Nashville, he promised to “end Joe Biden’s war on crypto.” He promised:
To “fire Gary Gensler,” who resigned upon Trump’s inauguration.
To “immediately shut down Operation Chokepoint 2.0,” which he is carrying out in his order to the Department of the Treasury.
To appoint the aforementioned Working Group.
To defend the right to self-custody.
To ban CBDCs.
In the first week, we are seeing that, at least thus far, promises made are promises kept.
SEC Crypto Task Force
On the SEC side, Commissioner Hester Peirce, known as “Crypto Mom,” will head the Crypto Task Force that will work to develop a “sensible regulatory path that respects the bounds of the law.” The SEC under former President Biden used “regulation by enforcement” rather than “regulation by rulemaking and interpretation” to regulate the crypto asset industry. President Trump’s SEC has already signaled the “course correction” that Paul Atkins called for before the election. Both Commissioners Peirce and Uyeda worked for Atkins in his prior stint as an SEC Commissioner. Others have observed that the Atkins-Peirce-Uyeda “triumvirate” might be the most powerful cohort of Commissioners that the SEC has ever seen.
The SEC announcement states that the Task Force will be focused on developing clear regulatory lines, realistic paths to registration, sensible disclosure frameworks, and deploying enforcement resources judiciously. The Task Force plans to hold future roundtables and is asking for public input as well.
The day that the SEC Crypto Task Force was announced, Foley & Lardner submitted suggestions to the SEC for roundtable topics. Our suggestions included:
What Securities Act registration exemptions should be adopted to broaden market access to digital assets? An example might be the “safe harbor” that Commissioner Peirce proposed and refined, only to have it ignored by the Gensler SEC.
What guidance should the staff have given that it has failed to give? What guidance should be withdrawn? There has been no guidance about how Regulation S applies to digital asset offerings, to point out one shortcoming. The staff might have given guidance, but Chairman Gensler prohibited it, adopting the view that the SEC does not give legal advice.
What needs to change for you to “come in and register” if you are a token “issuer”? Plainly the system is broken now, as those who have tried to register were delayed indefinitely and ultimately conceded defeat. Others, seeing this, never even tried.
What needs to change for you to “come in and register” if you are a token “dealer” or “exchange”? These questions are paramount for crypto exchanges that do business in the United States and have been sued by the SEC for failing to register.
What needs to change for you to “come in and register” your crypto brokerage firm? What more can be done for you to “come in and register” your crypto fund? How can the SEC facilitate trading in securities tokens and other tokenized assets? How can the SEC better collaborate with the CFTC regarding digital assets? What legislation should the SEC recommend for adoption by Congress? All these questions, and more, need to be addressed by the SEC, engaging the public as the answers are determined. In each case, the SEC would act consistently with its statutory mandate to protect securities investors and assure fair and orderly markets.
Next Steps
Foley has offered to assist the SEC in its consideration of these questions and expects to be involved in some capacity along the way. Likewise, we expect to make submissions to the President’s Working Group. If you would like to be represented in that process to make sure that your views are considered, please reach out to either of the authors. We are engaging with the House Financial Services Committee and the Senate Banking Committee in addition to the Trump Administration, the SEC, and the CFTC.
Similarly, if you have a development team or a product and are looking to access the U.S. digital asset markets lawfully, we are standing by to help.
5 Trends to Watch in 2025: AI and the Israeli Market
Israel’s AI sector is emerging as a pillar of the country’s tech ecosystem. Currently, approximately 25% of Israel’s tech startups are dedicated to artificial intelligence, according to The Jerusalem Post, with these companies attracting 47% of the total investments in the tech sector (Startup Nation Finder). This strong presence highlights Israel’s focus on AI-driven innovation and entrepreneurs’ belief in the growth opportunities related to AI. The Israeli AI market is expected to grow at a compound annual growth rate of 28.33% from 2024 through 2030, reaching a value of $4.6 billion by 2030 (Statista). This growth is driven by increasing demand for AI applications across diverse industries such as health care, cybersecurity, and fintech. Government-backed initiatives, including the National AI Program, play a critical role in supporting startups by providing accessible and non-dilutive funding for research and development (R&D) purposes. Despite facing significant challenges since the start of the war in Gaza, Israel has continued to produce cutting-edge technologies that are getting the attention of global markets. Additionally, Israel’s highly skilled workforce and partnerships with academic institutions provide a steady supply of talent to meet the sector’s demands. With innovation, resilience, and collaboration at its core, the Israeli AI landscape is poised to remain a global force in 2025 and beyond.
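For readers who want to sanity-check the cited projection, the short sketch below back-calculates the 2024 market size implied by the Statista figures quoted above (US$4.6 billion by 2030 at a 28.33% compound annual growth rate). The annual-compounding assumption and the implied 2024 baseline of roughly US$1 billion are our own illustrative back-of-the-envelope calculation, not figures reported by the cited sources.

```python
# Illustrative only: back-calculate the 2024 market size implied by the cited
# 2030 value and CAGR, assuming the 28.33% rate compounds annually from 2024 to 2030.
cagr = 0.2833        # compound annual growth rate cited above
value_2030 = 4.6     # projected 2030 market size, USD billions (Statista)
years = 2030 - 2024  # six annual compounding periods

implied_2024 = value_2030 / (1 + cagr) ** years
print(f"Implied 2024 market size: ${implied_2024:.2f}B")  # roughly $1.03B

# Year-by-year trajectory under the same assumption
for year in range(2024, 2031):
    value = implied_2024 * (1 + cagr) ** (year - 2024)
    print(f"{year}: ${value:.2f}B")
```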
Mergers and acquisitions to remain a cornerstone of deals. According to IVC Research Center, 47 Israeli AI companies successfully completed exits in 2024, showcasing the global demand for AI-driven innovation. Investors are continually identifying the differences between companies whose foundations were built on AI, versus those leveraging AI to enhance other core elements of their value proposition—sometimes only marginally. Savvy buyers look beyond the “AI label” and seek out companies with genuine, scalable AI solutions rather than superficial integrations, understanding that value lies in robust and transformative applications. AI is also sector agnostic and may disrupt virtually every vertical. From health care and finance to retail and manufacturing and others, numerous industries are increasingly leveraging AI to enhance or even change their core competency to gain competitive advantages. Deals in this space are coming from strategics such as automobile manufacturers, banks, digital marketing companies and life science firms, among others. As AI continues to permeate multiple sectors, Israeli companies are poised to receive increased attention from strategic M&A buyers looking to unlock new technologies and business opportunities in the market.
Intersection of PropTech and AI to further revolutionize the global real estate industry. Israeli innovation is expected to be at the forefront of this trend. According to IVC Research Center, over 70 PropTech companies headquartered in Israel are leveraging AI to develop cutting-edge technologies that are reshaping the industry on a global scale. We anticipate these companies will continue advancing AI-driven tools and third-party solutions to streamline acquisition strategies, enhance underwriting processes, and drive operational efficiencies. By harnessing AI to identify leasing opportunities, forecast rental trends, and optimize costs, Israeli PropTech firms are set to solidify their position as global leaders in real estate innovation in the year ahead.
AI to become increasingly important across global industries. Israeli companies have demonstrated genuine thought/R&D leadership in AI innovation. Some of the AI-centric legal trends that may stand out in 2025 include (1) a greater focus on data rights management as Agentic AI continues to carve new learning standards; (2) regulatory advancements in science, highlighted by two AI-related Nobel Prizes, which will likely materialize in the U.S. Food and Drug Administration adopting new rules for AI-driven drug approvals, as well as new AI patenting standards and requirements; (3) greater emphasis on responsible AI usage, particularly around ethics, privacy, and transparency; (4) the adoption of quantum AI across many industries, including in the area of securities trading, which will likely challenge securities regulators to address its implications; and (5) turning to AI-powered LegalTech strategies (both in Israel and in other countries). Israeli entrepreneurs are likely to continue working within each of these industries and help drive the AI transformation wave.
AI-based technology to continue changing how companies handle recruitment and hiring. While targeted advertising enables employers to find strong talent, and AI-assisted resume review facilitates an efficient focus on suitable candidates, the use of AI to identify “ideal” employees and filter out “irrelevant” applicants may actually discriminate (even if unintentionally) against certain groups protected under U.S. law (for example, women, older employees, and/or employees with certain racial profiles). In addition, AI-assisted interview analysis may inadvertently use racial or ethnic bias to eliminate certain candidates. Israeli companies doing business in the United States should not assume their AI-assisted recruitment and hiring tools used in Israel will be permitted to be utilized in the United States. Also, Israeli companies should be mindful of newly enacted legislation in certain U.S. states requiring companies to notify candidates of AI use in hiring, as well as conduct mandatory self-audits of AI-based employee recruitment and hiring systems. AI regulation on the state level in the United States is likely to increase, and Israeli companies that recruit and hire in the United States will be required to balance their use of available technology with applicable U.S. legal constraints.
Latest Changes to ISS and Glass Lewis Proxy Voting Guidelines
Institutional Shareholder Services (ISS) and Glass Lewis, two leading proxy advisory firms, recently announced updates to their U.S. proxy voting policies in advance of the 2025 proxy and annual meeting season. Public companies need to consider how these updates could impact voting recommendations and any governance changes that could be implemented to improve the likelihood of favorable recommendations.
Background on Proxy Advisory Firms
ISS and Glass Lewis have risen to prominence for making proxy voting recommendations to their investor clients ahead of shareholder meetings for public companies.
ISS and Glass Lewis each publish proxy voting guidelines and policies that describe the factors they will take into consideration in making voting recommendations. While these policies remain largely consistent year over year, the annual updates often address new and emerging issues, such as artificial intelligence, or revise or clarify existing stances on evolving matters of corporate governance, such as executive compensation, director independence, and environmental, social, and governance (ESG) policies and disclosures. These changes can be based on a number of factors, such as changing shareholder attitudes, new legislation or exchange rules, or general industry trends.
These proxy voting guidelines can be used by institutional investors as either determinative or informative of their voting decisions, and the recommendations by ISS and Glass Lewis can significantly sway the outcome of shareholder voting proposals.
2025 ISS Proxy Voting Guideline Changes
Executive Compensation
In addition to revisions to its proxy voting guidelines, ISS also updated its FAQs on executive compensation policies:
Computation of Realizable Pay – The realizable pay chart will not be displayed for companies that have experienced multiple (two or more) CEO changes within the three-year measurement period.
Pay-for-Performance Qualitative Review – ISS will place greater focus on performance‑vesting equity disclosures and plan designs, especially for companies with a quantitative pay-for-performance misalignment. Existing qualitative considerations around performance equity programs will be subject to greater scrutiny in the context of a quantitative pay‑for‑performance misalignment. ISS provided a non-exhaustive list of typical considerations for such analysis, including:
Non-disclosure of forward-looking goals (note: retrospective disclosure of goals at the end of the performance period will carry less mitigating weight than it has in prior years);
Poor disclosure of closing-cycle vesting results;
Poor disclosure of the rationale for metric changes, metric adjustments or program design;
Unusually large pay opportunities, including maximum vesting opportunities;
Non-rigorous goals that do not appear to strongly incentivize for outperformance; and/or
Overly complex performance equity structures.
Evaluation of Incentive Program Metrics – ISS reaffirmed its stance that it does not favor total shareholder return (TSR) or any specific metric in executive incentive plans, holding that the board and its compensation committee are best suited to choose metrics that lead to long-term shareholder value. However, ISS acknowledged that shareholders prefer an emphasis on objective metrics that lead to increased transparency in compensation decisions. In evaluating the metrics of an incentive program, ISS may consider several factors, including:
Whether the program emphasizes objective metrics linked to quantifiable goals, as opposed to highly subjective or discretionary metrics;
The rationale for selecting metrics, including the linkage to company strategy and shareholder value;
The rationale for atypical metrics or significant metric changes from the prior year; and/or
The clarity of disclosure around adjustments for non-GAAP metrics, including the impact on payouts.
Changes to In-Progress Incentive Programs – ISS reiterated its position against midstream changes to ongoing incentive programs (such as changes to metrics, performance targets, and/or measurement periods). Similar to other kinds of unusual pay program interventions, ISS states that companies should disclose a compelling rationale for such actions and explain how they do not circumvent pay-for-performance outcomes.
Robust Clawback Policies – This year, ISS added a new FAQ concerning the requirements for a clawback policy to be considered “robust” under the “Executive Compensation Analysis” section of the ISS research report. In order to qualify, a clawback policy must:
Extend beyond minimum Dodd-Frank requirements; and
Explicitly cover all time-vesting equity awards.
Poison Pills
ISS made significant revisions to its voting policies concerning shareholder rights plans, more commonly referred to as “poison pills,” which are used by boards of directors to prevent hostile takeovers. Currently, when considering whether or not to vote for director nominees who have adopted a short-term poison pill (one year or less) without shareholder approval, ISS evaluates director nominees on a case-by-case basis. This year, ISS revised its guidelines to increase transparency surrounding the factors considered in this evaluation.
The revised list of factors now includes (changes as marked):
The trigger threshold and other terms of the pill;
The disclosed rationale for the adoption;
The context in which the pill was adopted (e.g., factors such as the company’s size and stage of development, sudden changes in its market capitalization, and extraordinary industry-wide or macroeconomic events);
A commitment to put any renewal to a shareholder vote;
The company’s overall track record on corporate governance and responsiveness to shareholders; and
Other factors as relevant.
Natural Capital
Next, ISS renamed “General Environmental Proposals” as “Natural Capital‑Related and/or Community Impact Assessment Proposals.” ISS also revised the list of factors considered when evaluating requests for reports on policies and/or the potential (community) social and/or environmental impact of company operations.
The revised list of factors now includes (changes as marked):
Alignment of current disclosure of applicable company policies, metrics, risk assessment report(s) and risk management procedures with any relevant, broadly accepted reporting frameworks;
The impact of regulatory non-compliance, litigation, remediation, or reputational loss that may be associated with failure to manage the company’s operations in question, including the management of relevant community and stakeholder relations;
The nature, purpose, and scope of the company’s operations in the specific region(s);
The degree to which company policies and procedures are consistent with industry norms; and
The scope of the resolution.
SPAC Extensions
ISS also revised its policies with respect to SPAC termination dates and extension requests. Now, ISS will generally recommend that shareholders vote in favor of requests to extend the termination date of a SPAC by up to one year from the SPAC’s original termination date, inclusive of any built-in extension options, and accounting for prior extension requests.
ISS may also consider the following factors:
Any added incentives;
Business combination status;
Other amendment terms; and
If applicable, use of money in the trust fund to pay excise taxes on redeemed shares.
2025 Glass Lewis Proxy Voting Guideline Changes
Approach to Executive Pay Program
Glass Lewis provided clarification on its pay-for-performance policy to emphasize Glass Lewis’ holistic approach to analyzing executive compensation programs. Glass Lewis’ analysis reviews pay programs on a case-by-case basis, and there are few program features that, standing alone, will lead to an unfavorable recommendation from Glass Lewis on a say-on-pay proposal.
Glass Lewis does not utilize a pre-determined scorecard approach when considering individual features such as the allocation of the long-term incentive between performance-based awards and time-based awards. Unfavorable factors in executive compensation programs are reviewed in the context of rationale, overall structure, overall disclosure quality, the program’s ability to align executive pay with performance and the shareholder experience, and the trajectory of the pay program resulting from changes introduced by the board’s compensation committee, all as reflected in the compensation disclosures in the company’s proxy statement.
Additionally, while regulatory disclosure rules may allow for the omission of key executive compensation information, such as for smaller reporting companies, Glass Lewis believes that companies should use proxy statements to provide sufficient information to enable shareholders to vote in an informed manner.
Glass Lewis also revised how it identifies peer groups for its pay-for-performance model, including with reference to the peers of a company’s self-disclosed peers.
Board Oversight of Artificial Intelligence
Glass Lewis has adopted new guidelines dedicated to board oversight of AI, similar to the oversight of cybersecurity that was added in 2023. Glass Lewis believes that boards should take steps to mitigate exposure to material risks that could arise from their use or development of AI.
In the absence of material incidents related to a company’s use or management of AI-related issues, Glass Lewis’ policy will generally not make voting recommendations on the basis of AI‑related issues. However, when there is evidence that there is insufficient oversight and/or management of AI technologies that has resulted in material harm to shareholders, Glass Lewis will review a company’s overall governance practices and identify which directors or board-level committees have been charged with oversight of AI-related risks. Glass Lewis will also closely evaluate the board’s management of this issue, as well as any associated disclosures, and Glass Lewis may recommend against directors it deems appropriate should it find the board’s oversight, response, or disclosure concerning AI-related issues to be insufficient. Glass Lewis recommends that all companies that develop or use AI in their operations disclose the board’s role in AI oversight and how they are ensuring their directors are fully educated on this topic.
Change-in-Control Procedures
Glass Lewis has also updated its policy on change-in-control provisions to clarify that companies that allow for committee discretion over the treatment of unvested awards should commit to providing clear rationale for how such awards will be treated in the event of a change in control. This change underscores the importance of clear disclosure surrounding equity awards.
Board Responsiveness to Shareholder Proposals
Glass Lewis revised its policy for shareholder proposals to clarify that when shareholder proposals receive “significant” shareholder support (generally more than 30%, but less than a majority of votes cast), boards should engage with shareholders on the issue and provide future disclosure addressing shareholder concerns and outreach initiatives.
Reincorporation
Glass Lewis also revised its policy on reincorporation to reflect that it reviews all proposals to reincorporate in a different state or country on a case-by-case basis. Glass Lewis considers a number of factors resulting from the change in domicile, including changes in corporate governance provisions (especially those relating to shareholder rights), material differences in corporate statutes and legal precedents, and relevant financial benefits, among other factors.
Key Takeaways
You can find copies of the 2025 policies of ISS and Glass Lewis on their respective websites, as well as summaries of their 2025 policy updates. These policy updates will be important as public companies prepare for their 2025 proxy statements and annual shareholders’ meetings. Companies should review these voting guidelines to proactively make disclosures necessary to secure favorable voting recommendations from ISS and Glass Lewis. Companies may also want to consider changes in governance and compensation practices to decrease the likelihood of an adverse voting recommendation from ISS or Glass Lewis, although any such change should also be weighed against the overall governance needs and strategy of the company.
In addition to ISS, Glass Lewis, and other third-party proxy advisory firms, companies should review the voting policies of any large institutional investors who have significant shareholdings in the company. These institutional investors often have their own voting policies that, like those of ISS and Glass Lewis, can change over time.
President Trump’s Executive Order Steering Digital Assets Policy
As promised during his campaign, President Trump has taken significant steps to support the digital asset industry during his first week in office. On 23 January 2025, he signed an executive order initiating digital asset regulatory rollbacks and a new federal framework governing cryptocurrencies, stablecoins, and other digital assets (the Order).
On the same day, the Securities and Exchange Commission (SEC) rescinded the controversial Staff Accounting Bulletin 121, which required crypto custodians and banks to reflect digital assets in their custody as both an asset and a liability on their balance sheets. Earlier in the week, the SEC established Crypto 2.0, a crypto task force designed to provide paths for registration and reasonable disclosure frameworks, and to allocate enforcement resources “judiciously.”
The Order recognizes the role the digital asset industry serves in our economy and aims to support the responsible growth and use of digital assets by promoting dollar-backed stablecoins and providing regulatory clarity. The Order lays the groundwork for a regulatory shift furthering digital assets policy, focusing on the creation of “technology-neutral regulations” tailored to digital assets.
In addition to prohibiting agencies from facilitating any central bank digital currencies, the Order establishes a working group composed of the heads of various agencies (the Working Group) and sets three deadlines:
22 February 2025: Federal agencies must report to the Special Advisor for AI and Crypto with the regulations or other agency guidance that affect the digital asset sector.
24 March 2025: Federal agencies must submit recommendations on whether to rescind or modify these regulations and guidance.
22 July 2025: The Working Group must submit a report to the President on regulatory and legislative proposals to advance digital assets policy. This report must include a proposed Federal framework for the issuance and operation of digital assets, including stablecoins, and evaluate whether establishing a national digital assets stockpile is possible.
Cybersecurity Executive Order—Key Implications for the Manufacturing Industry
On January 16, 2025, President Joe Biden issued the “Executive Order on Strengthening and Promoting Innovation in the Nation’s Cybersecurity,” a comprehensive directive designed to address the growing complexity and sophistication of cyber threats targeting the United States. The Executive Order aims to establish a cohesive national strategy for improving cybersecurity across federal agencies, private businesses, and critical infrastructure sectors. The Executive Order governs a wide array of critical issues, including new cybersecurity standards for federal contractors, enhanced public-private information sharing, the promotion of advanced technologies like quantum-resistant cryptography and artificial intelligence (AI), and the imposition of sanctions on foreign cyber actors. The Executive Order’s initiatives demonstrate a commitment to strengthening the nation’s cybersecurity defenses in a rapidly evolving digital landscape and incorporate approaches generally understood as best practices to enhance cybersecurity.
To further advance the initiatives outlined in the order, the Cybersecurity and Infrastructure Security Agency (CISA), a key federal entity responsible for coordinating national efforts to safeguard critical infrastructure, expanded on the directive with detailed implementation frameworks and additional guidance. CISA’s involvement underscores its crucial role in operationalizing the Executive Order and transforming its policy directives into actionable strategies. Through collaboration with industry leaders, technology innovators, and government stakeholders, CISA has addressed specific challenges, including adopting quantum-resistant cryptography, deploying artificial intelligence in cybersecurity defenses, and improving public-private information-sharing mechanisms. These efforts emphasize fostering innovation, enhancing resilience, and protecting the nation’s digital ecosystem from emerging threats. By building on the Executive Order, CISA seeks to bridge the gap between policy objectives and on-the-ground cybersecurity practices, ensuring that the nation’s cybersecurity posture evolves in tandem with the rapidly changing threat landscape.
The transition of the presidency to President Donald Trump on January 20, 2025, has led to questions about the future of the Biden Executive Order. Historically, President Trump has favored deregulation and, during his first term, repealed several executive orders issued by previous administrations. The possibility that the Executive Order will be modified or repealed is particularly significant for the manufacturing sector, which is both a critical component of the U.S. economy and a frequent target of cyberattacks.
The purpose of this guide is three-fold. First, it examines the key elements of the existing Executive Order. Next, it explores the potential modifications that the Trump administration may implement. Finally, it provides guidance tailored to manufacturing companies for navigating this evolving regulatory and threat environment, building on previous related resources published by Foley & Lardner and the Cybersecurity Manufacturing Innovation Institute (CyManII), which are referenced at the end of this alert.
Key Provisions of the Executive Order and their Impact on Manufacturing
Minimum Cybersecurity Standards for Federal Contractors
A central provision of the Executive Order mandates baseline cybersecurity measures for federal contractors. These include securing access to critical systems and data with multi-factor authentication (MFA), deploying endpoint detection and response (EDR) tools to monitor, detect, and respond to cybersecurity threats, and using encryption to protect sensitive data both in transit and at rest.
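For manufacturers considering what the encryption-at-rest requirement could look like in practice, the following is a minimal illustrative sketch, not a compliance blueprint, using the open-source Python cryptography package to encrypt a sensitive file before it is stored. The filename is hypothetical, and a real deployment would hold keys in a key-management system or hardware security module rather than in application code.

```python
from cryptography.fernet import Fernet

# Generate a symmetric key. In practice, keys should be created and stored
# in a key-management system or HSM, not generated ad hoc in application code.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt a sensitive design file (hypothetical filename) before writing it
# to storage, so the data is protected "at rest."
with open("proprietary_design.cad", "rb") as f:
    ciphertext = fernet.encrypt(f.read())

with open("proprietary_design.cad.enc", "wb") as f:
    f.write(ciphertext)

# Later, an authorized process holding the key can recover the plaintext.
plaintext = fernet.decrypt(ciphertext)
```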
Manufacturers supplying goods or services to the federal government must adhere to these cybersecurity standards to maintain their eligibility for governmental contracts. For many companies, this may require substantial investments in upgrading systems, adopting new technologies, and training personnel. Non-compliance could lead to the loss of profitable federal contracts and potential reputational damage.
Enhanced Public-Private Information Sharing
The Executive Order directs federal agencies to enhance mechanisms for sharing threat intelligence with private-sector entities. This collaboration aims to provide timely and actionable insights to help businesses defend against emerging cyber threats.
This initiative benefits the manufacturing sector, which is a primary target for ransomware attacks and intellectual property theft. Access to real-time threat intelligence allows manufacturers to identify vulnerabilities, respond swiftly to incidents, and mitigate risks more effectively. A ransomware incident plan focused on manufacturing can be found here: Ransomware Playbook.
Transition to Quantum-Resistant Cryptography
The Executive Order highlights the urgent need to adopt quantum-resistant cryptographic algorithms to address the long-term threat arising from advancements in quantum computing. As manufacturing increasingly incorporates digital technologies and interconnected systems, safeguarding proprietary designs, supply chain data, and other sensitive information is essential to the business. Early adoption of quantum-resistant encryption may provide a competitive advantage and safeguard critical assets against existing and future threats. Guidelines for approaching quantum-resistant cryptography are available from NIST, and the first post-quantum encryption standards are found here.
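For teams beginning to plan this transition, the sketch below illustrates, at a very high level, how a post-quantum key-encapsulation mechanism establishes a shared secret between two parties. It assumes the open-source liboqs-python bindings (the oqs package); the algorithm name shown corresponds to NIST’s ML-KEM standard but may differ depending on the installed library version.

```python
import oqs  # liboqs-python bindings; available algorithm names vary by version

ALG = "ML-KEM-768"  # NIST FIPS 203 key-encapsulation mechanism (assumed name)

with oqs.KeyEncapsulation(ALG) as receiver:
    # The receiving party generates a post-quantum key pair and publishes
    # the public key.
    public_key = receiver.generate_keypair()

    with oqs.KeyEncapsulation(ALG) as sender:
        # The sending party encapsulates a fresh shared secret against the
        # receiver's public key, producing a ciphertext to transmit.
        ciphertext, shared_secret_sender = sender.encap_secret(public_key)

    # The receiver recovers the same shared secret from the ciphertext.
    shared_secret_receiver = receiver.decap_secret(ciphertext)

# Both sides now hold the same secret, usable to key symmetric encryption.
assert shared_secret_sender == shared_secret_receiver
```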
Leveraging AI for Cybersecurity
The Executive Order promotes the use of AI-driven cybersecurity tools to identify and counter advanced cyber threats in real time. AI is potentially transformative for the manufacturing sector because it can automate threat detection and response strategies. AI can also help minimize operational disruptions, protect intellectual property, and support the integrity of production lines. The pilot programs outlined in the Executive Order could serve as a model for broader adoption across the industry. AI may significantly accelerate the detection and mitigation of cyber-attacks, an area under development by CyManII.
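As one simple illustration of AI-assisted threat detection (not a description of any tool referenced in the Executive Order), the sketch below uses the open-source scikit-learn library to flag anomalous network-telemetry readings; the features and data are hypothetical.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical telemetry: each row is a time window, with features such as
# packets per second and failed-login counts from plant-floor systems.
rng = np.random.default_rng(0)
normal_traffic = rng.normal(loc=[500, 2], scale=[50, 1], size=(1000, 2))

# Train an unsupervised model on traffic assumed to be benign.
model = IsolationForest(contamination=0.01, random_state=0)
model.fit(normal_traffic)

# Score new observations; -1 indicates a likely anomaly worth investigating.
new_windows = np.array([[510, 3], [5000, 40]])
print(model.predict(new_windows))  # e.g., [ 1 -1 ]
```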
Sanctions on Foreign Cyber Actors
The Executive Order grants the federal government the authority to impose sanctions on individuals and entities responsible for cyberattacks targeting U.S. organizations. Sanctions serve as a deterrent against state-sponsored cyberattacks and industrial espionage. For manufacturers, this provision provides an extra layer of protection and highlights the government’s commitment to safeguarding critical industries.
Potential Changes Under the Trump Administration
Deregulation of Cybersecurity Standards
President Trump’s emphasis on minimizing regulatory burdens may result in a rollback of the cybersecurity requirements in the Executive Order. This could shift the responsibility for implementing robust cybersecurity measures from the federal government to individual companies.
Focus on Supply Chain Resiliency
Based on the criticality of U.S. manufacturing and its role in global competitiveness and economic stability, we anticipate President Trump will issue guidance on securing supply chain resiliency to enhance the productivity of U.S. manufacturers. We will monitor these anticipated changes and publish future alerts as applicable.
Reprioritization of Cybersecurity Initiatives
While the current Executive Order emphasizes quantum-resistant cryptography and AI, the Trump administration might focus first on immediate cybersecurity challenges and delay longer-term solutions that require significant investment.
Reduced Emphasis on Public-Private Collaboration
Changes to information-sharing initiatives could decrease government support for private-sector cybersecurity efforts, which may compel manufacturers to seek alternative sources of threat intelligence.
Selective Sanctions Enforcement
A more selective approach to sanctions could change the deterrent effect on foreign cyber actors, potentially raising the risk of targeted attacks on U.S. manufacturing companies.
Guidance for Manufacturing Companies
Given the uncertainty surrounding the future of the Executive Order, manufacturers must adopt a proactive approach to cybersecurity. Below are actionable steps to enhance resilience:
Strengthen Core Cybersecurity Measures
Adopt Industry Best Practices: Ensure the deployment of MFA, EDR, and encryption on all critical systems.
Secure Operational Technology (OT): Safeguard industrial control systems (ICS) and other OT components essential to manufacturing operations.
Conduct Regular Assessments: Regular audits can help identify vulnerabilities and prioritize remediation efforts.
Invest in Employee Training: Over 80% of ransomware and other cyber-attacks can be traced to the “human in the loop.” Thus, cybersecurity training is a solid investment to protect your company and its operations.
Monitor Regulatory Developments
Stay Informed: Monitor updates to the Executive Order and other relevant cybersecurity policies.
Engage Legal Counsel: Consult legal and compliance experts to assess the potential impact of policy changes on your business operations.
Invest in Advanced Cybersecurity Technologies
Explore AI Solutions: Leverage AI tools for predicting threats, identifying anomalies, and automating incident responses.
Transition to Quantum-Resistant Cryptography: Start planning cryptographic upgrades to protect sensitive data from emerging threats.
Collaborate with Industry Peers: Participate in forums and consortia to exchange best practices and establish standardized cybersecurity protocols.
Secure the Supply Chain
Evaluate Vendor Risks: Perform comprehensive cybersecurity assessments of suppliers and third-party partners.
Develop Redundancy Plans: Identify critical supply chain dependencies and develop contingency plans to mitigate potential disruptions.
Encrypt Communications: Safeguard data transfers throughout the supply chain to minimize the risk of interception.
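As a simple illustration of the “Encrypt Communications” point above, the following sketch shows how a Python client can enforce certificate validation and a modern TLS version when exchanging data with a supplier system; the endpoint URL is hypothetical.

```python
import ssl
import urllib.request

# Hypothetical supplier endpoint; replace with the partner's actual API URL.
SUPPLIER_URL = "https://supplier.example.com/orders"

# Create a TLS context that verifies the server certificate and hostname
# and refuses legacy protocol versions.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

# Data exchanged over this connection is encrypted in transit.
with urllib.request.urlopen(SUPPLIER_URL, context=context) as response:
    payload = response.read()
```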
Build Robust Incident Response Plans
Establish Comprehensive Protocols: Develop incident response plans tailored to manufacturing-specific threats, such as ransomware attacks on production systems. Industry guidance and a template are available in CyManII’s Ransomware Preparation Guide: Prevention, Mitigation, and Recovery for Manufacturers.
Train Employees: Provide ongoing cybersecurity training to improve awareness and minimize human error.
Test and Refine Plans: Perform regular simulations to assess the effectiveness of response strategies and implement necessary adjustments.
Final Thoughts
The “Executive Order on Strengthening and Promoting Innovation in the Nation’s Cybersecurity” highlights the urgent need for robust cybersecurity measures, particularly within the manufacturing sector, which is vital to national security, economic stability, and global competitiveness. The sector faces an increasing number of sophisticated threats, including ransomware attacks, vulnerabilities in the supply chain, and intellectual property theft. While the future of the Executive Order under the Trump administration is uncertain, manufacturers cannot afford to delay action. Cyber-attacks on manufacturers are likely to continue to rise in volume and sophistication over the coming years. Proactive measures such as implementing advanced security technologies, strengthening supply chain defenses, and keeping abreast of regulatory changes are essential for mitigating risks and ensuring operational continuity.
Furthermore, adhering to strict cybersecurity standards allows manufacturers to secure federal contracts, establish trust with stakeholders, and gain a competitive edge in the market. As potential changes to the Executive Order could lead to a fragmented regulatory landscape—spanning federal, state, and international levels—manufacturers must prepare for diverse compliance requirements. By prioritizing cybersecurity, the manufacturing sector not only safeguards its critical assets and processes but also reinforces its vital role in driving economic growth and technological innovation.
About CyManII
Launched in 2020 by the U.S. Department of Energy, CyManII works across the manufacturing industry, research and academic institutions, and federal government agencies to develop technologies that enable the security and growth of the U.S. manufacturing sector.
Additional information on cybersecurity risks faced by manufacturers can be found in prior articles authored by Foley & Lardner and CyManII, including:
Recommendations for Managing Cybersecurity Threats in the Manufacturing Sector
So, You Think of Cybersecurity Only as a Cost Center? Think Again.
CyManII also contributed to this article.