“False” Sense of Security: DOJ Announces False Claims Act Settlements Related to Failure to Comply with Cybersecurity Requirements
On July 31, 2025, the United States Department of Justice (DOJ) announced a pair of settlements with companies accused of having violated the False Claims Act (FCA) by falsely representing their compliance with certain cybersecurity requirements applicable to federal contractors. These two settlements highlight key aspects of DOJ’s enforcement priorities: (1) DOJ’s strong focus on enforcing the FCA in the cybersecurity space, and (2) DOJ’s willingness to reward companies that self-disclose violations. All government contractors certifying compliance with regulatory and contractual requirements must stay vigilant and take the steps needed to comply.
In one press release, DOJ announced a $9.8 million settlement with Illumina Inc., alleging that the company sold genomic sequencing systems with cybersecurity vulnerabilities to certain federal agencies and did not have an adequate product security program or sufficient systems to identify and address these vulnerabilities. This settlement arose out of a qui tam action filed by a former Illumina employee in the United States District Court for the District of Rhode Island.[1] According to DOJ, between February 2016 and September 2023, Illumina knowingly failed to incorporate sufficient cybersecurity protections and falsely represented that its software adhered to cybersecurity standards, including those of the International Organization for Standardization and the National Institute of Standards and Technology. While Illumina denied these allegations, it agreed to pay $9.8 million, of which $4.3 million was restitution. The settlement thus appears to have involved a multiplier exceeding the 2x multiplier that typically applies in FCA settlements.
In a second press release, DOJ announced a $1.75 million settlement with defense contractor Aero Turbine Inc. (ATI) and private equity company Gallant Capital Partners LLC (Gallant), which holds a controlling stake in ATI. There, DOJ alleged that ATI violated the FCA by knowingly failing to comply with cybersecurity requirements in its contract with the Department of the Air Force. The government further claimed that between January 2018 and February 2020, ATI failed to implement cybersecurity controls on an information system that contained controlled unclassified information. ATI's systems allegedly did not meet applicable cybersecurity standards issued by the National Institute of Standards and Technology, which could have led to significant exploitation of the system or exfiltration of sensitive defense information. Notably, unlike Illumina, ATI and Gallant voluntarily self-disclosed this issue, as detailed in the settlement agreement. Among other measures, ATI submitted two written disclosures, identified individuals involved in or responsible for the situation, disclosed facts from its internal investigation along with attribution to specific sources, and implemented remedial measures. DOJ apparently applied a lower multiplier of just over 1.5x, rather than the typical 2x multiplier, in exchange for the self-disclosure and cooperation.
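As a rough, back-of-the-envelope illustration of how these multipliers can be inferred, the sketch below assumes (as is common but not universal in FCA settlements) that the restitution figure approximates the government's single damages; the dollar amounts come from the press releases, but the arithmetic is illustrative only.

```python
# Hypothetical illustration of implied FCA settlement multipliers.
# Assumption: restitution approximates single damages, which is not always true.

def implied_multiplier(settlement_amount: float, single_damages: float) -> float:
    """Express the settlement as a multiple of single damages."""
    return settlement_amount / single_damages

# Illumina: $9.8M settlement, of which $4.3M was identified as restitution.
print(round(implied_multiplier(9_800_000, 4_300_000), 2))  # ~2.28, above the typical 2x

# Aero Turbine/Gallant: a roughly 1.5x multiplier on a $1.75M settlement would
# imply single damages of about $1.17M (illustrative only).
print(round(1_750_000 / 1.5))  # ~1,166,667
```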
These settlements highlight the government’s potent use of the FCA to enforce cybersecurity compliance. In this evolving enforcement landscape, any company certifying cyber compliance to a federal or state government entity should continually review its cybersecurity systems and protections to ensure compliance. And companies that discover non-compliance implicating the FCA should strongly consider making a self-disclosure.
[1] United States ex rel. Lenore v. Illumina, Inc., 1:23-cv-00372-MSM (D.R.I. 2023).
Senators Introduce Legislation to Curb Use of Personal Data and Copyrighted Works for Gen AI Training
We recently discussed the legal developments related to fair use in AI training. Through a bipartisan bill titled the AI Accountability and Personal Data Protection Act (the “Bill”), introduced on July 21, 2025, U.S. Senators Josh Hawley (R-Mo.) and Richard Blumenthal (D-Conn.) proposed legislation that would effectively render the fair use defense (the primary defense relied upon by AI companies) meaningless. The legislation would create a new federal cause of action empowering individuals to sue companies that train AI models using personal data or copyrighted works without clear, affirmative consent. The Bill remains with the Senate Judiciary Committee, and there is currently no indication whether it will be considered or what form it might ultimately take.
Proposed Federal Cause of Action
The Bill establishes a new federal tort for individuals whose “covered data” is used without express, prior consent. “Use” is broadly defined under the Bill and includes:
Collection, processing, sale, or exploitation of personal data
Training of generative AI systems using such data
Generation of content that imitates or derives from an individual’s data
“Express, prior consent” is narrowly defined to mean “a clear, affirmative act by an individual, made in advance . . . indicating a freely given, informed, and unambiguous consent to the specific appropriation, use, collection, processing, sale, or other exploitation of covered data of the individual.”
The Bill permits prevailing individuals to recover compensatory damages (actual damages, treble profits, or $1,000), as well as punitive damages, injunctive relief, and attorney’s fees and costs.
Broad Definition of “Covered Data”
While the definition of “covered data” under the introduced version of the Bill could be clearer as to its boundaries, the broadest reading would include:
personally identifiable information and unique identifiers (e.g., IP addresses, device IDs);
geolocation data;
biometric information;
behavioral data such as browsing history or purchasing patterns; and
copyrighted works, whether registered or unregistered.
Interestingly, the Bill does not require that a copyrighted work be registered before bringing suit. This marks a departure from the current Copyright Act, which requires registration prior to initiating a copyright infringement lawsuit.
Consent for Third-Party Use Must Be Separate
If a party intends to have a third party “use, collect, process, sell, or exploit” the covered data, then that third party must be explicitly disclosed separate and apart from any privacy policy, terms of service, or other general conditions or agreements. The Bill provides that a browsewrap agreement cannot be used (i.e., inclusion of a hyperlink or a general reference to a privacy policy or agreement is not sufficient disclosure of third parties).
Notably, the Bill is silent as to whether a browsewrap agreement could be used to provide express, prior consent from the person whose covered data is being used.
Arbitration Clauses Null and Void for this New Federal Cause of Action
The Bill would also preclude arbitration clauses or any contracts that limit the right to sue. A “predispute arbitration agreement or predispute joint-action waiver shall not be valid or enforceable with respect to any claim arising under the Act.” The Bill affirms the right of individuals to join class actions regardless of any agreement to the contrary.
So What Now?
While bipartisan support gives the Bill a promising start, its path through Congress remains uncertain. The Bill signals a growing appetite for stronger oversight of AI and data practices. On July 24, Senator Peter Welch reintroduced the Transparency and Responsibility for Artificial Intelligence Networks Act (“TRAIN Act”). The TRAIN Act would establish “an administrative subpoena process” enabling individuals to compel training-data disclosures from AI developers.
To stay ahead, companies should: (1) audit their AI training datasets and data collection practices to determine whether personal or copyrighted data is used without clear, documented consent; (2) update privacy policies and consent mechanisms to reflect heightened transparency and specificity; and (3) monitor legislative developments closely, especially if operating in data-intensive or AI-driven sectors. Proactive compliance today could help mitigate significant legal and reputational risks tomorrow.
Privacy Compliance Insights from Connecticut’s First Privacy Law Settlement
Can we take any insights from Connecticut’s first settlement under the state’s Data Privacy Act, reached with TicketNetwork, an online ticket marketplace? The AG’s concerns mirrored priorities outlined in Connecticut’s 2025 CTDPA Enforcement Report, which suggests that future cases may also draw from that report.
In its press release about the matter, the AG argued that the company’s privacy policy was “largely unreadable,” lacked required consumer rights information under Connecticut’s privacy law, and failed to provide a functioning mechanism for consumers to exercise their data rights even after the company received a notice of violation in late 2023.
Under the settlement, TicketNetwork has agreed to pay an $85,000 penalty, as well as to maintain records regarding the number and types of consumer rights requests received and its response timelines and outcomes—metrics the AG uses to assess compliance. As a reminder, the Connecticut cure period has now expired, but some other states still have them: Indiana, Texas, Utah, and Virginia (30 days); Tennessee (60 days); Iowa (90 days); Oregon (30 days, expires January 1, 2026); Delaware (60 days, expires January 1, 2026); and Montana (60 days, expires April 1, 2026).
It is hard to tell exactly what concerns the AG had with the privacy policy, as the AG did not provide any detail in its press release. However, since the AG first began its investigation of TicketNetwork in 2022, the company has made many modifications to its privacy policy (based on a comparison of its July 1, 2025 and 2023 policies, the latter available here). Changes included adding section headers, giving more detail about rights and information collection practices, and listing the Connecticut law by name.
Putting It Into Practice: This settlement suggests that the AG may rely on the priorities set out in its enforcement report when assessing if companies are in compliance with the Connecticut Data Privacy Act. If you have not done so already, you may want to review your organization’s privacy policy, including how you describe rights to consumers.
James O’Reilly also contributed to this post.
Armanino Expands Managed Service Offerings with Addition of Strategic Accounting Outsourced Solutions (SAOS)
San Ramon, Calif.-based IPA 100 firm Armanino (FY24 net revenue of $697.4 million) announced the addition of the team from Strategic Accounting Outsourced Solutions (SAOS), a specialized outsourced accounting firm. This combination strengthens Armanino’s position as a leader in delivering tech-driven outsourced finance and accounting solutions.
Recognized on the Inc. 5000 fastest-growing companies list for three consecutive years, SAOS brings a focused approach to outsourced accounting, combining deep expertise, a strong technology foundation and a transparent, relationship-first model. Beyond its rapid growth, SAOS is known for its commitment to long-term client success, operational excellence and forward-thinking strategy tailored to each client’s current and future needs.
“SAOS is a values-driven team with a passion for strategic client support in outsourced accounting and advice,” said Matt Armanino, CEO of Armanino Advisory LLC. “Their people, specialized expertise and strong service culture align perfectly with ours, and together we’ll deliver even greater impact for our clients. The need for outsourced services, especially in complex areas of finance, payroll, human resources, tax, fund administration and nonprofit strategic development outsourcing, is only growing, and this combination allows us to expand our outsourced finance offerings and scale faster to meet this major need.”
“For us, this was all about alignment of values, culture, and long-term vision,” said Kim Discenza, founder and CEO of SAOS. “Armanino has always stood out for its client-centric philosophy and forward-thinking mindset, which sets them apart from other large accounting and consulting firms. But what truly matters is that their culture reflects our own: one that prioritizes people, fosters a strong sense of belonging, and empowers employees as the foundation of long-term success. We share a commitment to integrity, innovation and delivering exceptional value to our clients. This partnership represents an exciting step forward for our employees and clients, and we’re proud to be joining forces with people who see the future the way we do.”
“Armanino and SAOS recognize the future of the profession is in high-touch turnkey solutions like outsourced accounting that are responsive to what clients need in today’s business environment,” said Allan Koltin, CEO at Koltin Consulting Group, who advised both firms on the combination. “Both firms share a commitment to growth grounded in client success and creating solutions designed to solve mission-critical challenges, making for a strong cultural alignment and an easy decision to come together.”
America’s AI Action Plan: What You Need to Know
In July 2025, the White House released Winning the Race: America’s AI Action Plan, a national policy memorandum outlining the federal government’s strategy to advance U.S. leadership in artificial intelligence. The Action Plan highlights AI’s transformative role across science, defense, education, and commerce, while positioning the United States as a global leader in AI innovation, infrastructure, and security. To support clients and stakeholders in understanding the implications of this initiative, Barnes & Thornburg’s Artificial Intelligence practice has prepared a legal and regulatory analysis of the Action Plan.
The AI Action Plan
Structured around three strategic pillars, the Plan outlines a broad federal commitment to:
1. Accelerate AI Innovation – Removing regulatory barriers, promoting open-source development, driving AI adoption across sectors, and ensuring systems reflect American values of free speech and objectivity.
2. Build American AI Infrastructure – Scaling energy capacity, expanding semiconductor manufacturing, and constructing secure data centers to support AI growth.
3. Lead in International AI Diplomacy and Security – Strengthening export controls, countering adversarial influence, and aligning technology protection measures with allies.
Key Themes in This Report
Our analysis provides insight into several cross-cutting themes for businesses and legal stakeholders, with a focus on:
Privacy and Security Perspectives – Data governance, cyber-resilience and legal considerations for AI-related privacy frameworks
Intellectual Property Perspectives – IP protection for AI models, datasets and innovations; enforcement against malicious actors
Infrastructure, Education, and Robotics – Regulatory updates on data centers, chip manufacturing, energy requirements and AI workforce development
A Focus on Potential Liability – Synthetic media risks, biosecurity obligations, international AI compliance strategies and consideration of potential liability for AI labs/system users and developers
President’s Working Group Issues Report on Digital Financial Technology
On July 30, 2025, the President’s Working Group on Digital Assets released its report entitled “Strengthening American Leadership in Digital Financial Technology.” The report champions American innovation in crypto, and “endorses the notion that digital assets and blockchain technologies can revolutionize not just America’s financial system, but systems of ownership and governance economy-wide.”
The report lays the framework for a comprehensive, government-wide approach to regulatory reform in the sector. Produced with input from key federal agencies, the report lays out a series of core recommendations for a “new American Golden Age”:
American citizens and businesses should be able to own digital assets and use blockchain technologies for lawful purposes without fear of prosecution. Likewise, American entrepreneurs and software developers should have the liberty, and regulatory certainty, to upgrade all sectors of our economy using these technologies.
Policymakers and market regulators should lay the groundwork for American digital asset markets to become the deepest and most liquid in the world.
Banking regulators should never again pursue Operation Choke Point 2.0 and should instead embrace the opportunities digital assets and blockchain technologies offer to banks nationwide.
U.S. dollar-backed stablecoins represent the next wave of innovation in payments, and policymakers should encourage their adoption to advance U.S. dollar dominance in the digital age.
U.S. law enforcement agencies should have the tools and authorities to hold those who use digital assets for illegal activities accountable. These tools should never be misused to target the lawful activities of law-abiding citizens.
Federal tax policy should recognize the unique characteristics of digital assets and address longstanding requests for guidance from investors and entrepreneurs.
Running over 160 pages, the report opens with a detailed glossary and lengthy background materials on the digital asset ecosystem. It then includes a series of detailed discussions around the following topics: digital asset market structure, banking and digital assets, stablecoins and payments, countering illicit finance, and taxation. The report considers both the current and future states of digital financial technology and regulation, and concludes with dozens of recommendations for Congress, Treasury, Commerce, IRS, the federal banking regulators, the SEC and the CFTC. The report advocates for a mixture of legislation, agency action through rulemaking and exemptive orders, and the issuance of agency interpretive guidance. It also urges agencies to act in coordination.
Many of the report’s key recommendations fall on the SEC. SEC Chairman Paul Atkins’s July 30 statement in support of the recommendations is here. On July 31, Chairman Atkins also announced a bold plan to begin implementing the report’s recommendations for the SEC, which he christened Project Crypto. “The days of convoluted offshore corporate structures, decentralization theater, and confusion over security status, are over,” Atkins proclaimed.
SIGNED BY SILENCE?: Court Finds Consumer Agreed to Arbitration By Failing To Respond to A Text Message – And It’s A Little Odd
Hello from the road!
So contract law generally requires a party to manifest their assent to the terms of any agreement.
That means while you cannot get away with saying “I didn’t read the contract I signed,” you generally can get away with “I didn’t sign the contract.” haha.
In Thompson v. Brew Culture, 2025 WL 2198616 (S.D. Miss. Aug. 1, 2025), however, the court enforced an arbitration clause in a contract that the plaintiff agreed to, if at all, only by failing to respond to a text message.
In Thompson, the plaintiff signed up for a coffee shop rewards text program to get a cheaper cup of joe. She contends, and the record apparently supports, that she was never told the text program included an arbitration agreement before she provided her number.
Only after she provided her number did she receive a text message with a hyperlink containing the terms of the promotion. Those terms included an arbitration provision.
Thompson sued the coffee shop for TCPA violations, and the coffee shop moved to compel arbitration. Thompson opposed, arguing: i) the arbitration clause didn’t even cover the coffee shop (the court did not address this argument in its opinion); and ii) she never knew about the clause until after she provided the number, which was the only action demonstrating assent.
While these are pretty strong arguments, the court granted the motion, concluding that Plaintiff had accepted the arbitration provision by failing to reply “stop” to the message campaign. In the court’s view, the Plaintiff had the opportunity to reject the arbitration provision upon learning of its existence by simply saying “stop.” By failing to do so, she agreed to the terms.
Interesting, no?
Now I can’t say the court got this wrong, but this is certainly not a result you see everyday.
Much better practice to advise consumers of all terms and conditions BEFORE you send a text message.
Still, this case is a good reminder that consumers bear some responsibility to use common sense when interacting with SMS messages. Let’s hope this trend continues as the onslaught of opt-out TCPA cases continues to crash the shores of TCPAWorld!
TALK TO A HUMAN OR PAY THE PRICE: Congress Proposes Call Center Bill Mandating AI and Location Disclosures
Greetings TCPAWorld!
Happy Monday! If you’ve ever been stuck shouting “representative or agent” or pressing zero repeatedly into the void of an automated system, this one’s for you.
So without further ado, let’s get into it as part of our latest updates, always bringing you up to speed. On July 30, 2025, Senators Ruben Gallego (D-AZ) and Jim Justice (R-WV) introduced the Keep Call Centers in America Act, a bipartisan bill aimed at regulating call center operations and customer service disclosures. The legislation seeks to address two primary concerns: the offshoring of call center jobs and the growing use of artificial intelligence (“AI”) in customer service interactions. According to the bill’s sponsors, the intent is to improve service quality for consumers while preserving domestic employment opportunities. The bill is motivated in part by Bureau of Labor Statistics data showing that call centers employ approximately 3 million Americans, with projections indicating a loss of 150,000 jobs by 2033. Additionally, a Data for Progress survey found that 70% of Americans find automated phone systems more frustrating than human customer support.
If enacted, the bill would require businesses to disclose, at the outset of any customer service interaction, whether the call is being handled by a human agent located outside the United States or by an AI system. If either is the case, consumers would have the right to request transfer to a human representative physically located in the U.S. Businesses would also be required to certify compliance with these disclosure obligations annually to the Federal Trade Commission. The disclosure requirement is accompanied by enforcement provisions that treat noncompliance as a violation of the Federal Trade Commission Act. With this in mind, the bill applies to businesses with 50 or more full-time employees or those employing 50 or more workers who collectively work at least 1,500 hours per week.
In addition to the disclosure mandates, the bill introduces federal funding consequences for companies that relocate or outsource call center work to overseas locations. Employers would be required to provide at least 120 days’ notice to the Department of Labor (“DOL”) before making such a change. Companies that relocate or contract call center work abroad would be placed on a public DOL list and become ineligible for new federal grants or guaranteed loans for a period of five years. Also, employers with existing federal awards may face monthly penalties equal to 8.3% of the total award already disbursed, and in some cases, cancellation of those awards if they remain on the list for one year. The bill additionally provides limited exceptions in cases where denial of federal funding would threaten national security, result in significant job losses in the domestic economy, or harm the environment. As such, companies can be removed from the list if they relocate call center work back to the U.S. with equal or greater employment levels, or amend contracts to require U.S.-based operations. In other words, if you want access to federal dollars, you’d better think twice before sending support operations abroad.
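To put that 8.3% monthly figure in perspective, here is a quick, hypothetical calculation (the dollar amounts are illustrative assumptions, not drawn from the bill text): at 8.3% per month, the penalties would roughly consume the full disbursed amount over the course of a year.

```python
# Hypothetical illustration of the proposed monthly penalty of 8.3% of a
# federal award already disbursed; dollar amounts are assumptions for example only.

disbursed_award = 1_000_000   # example: $1M of an award already disbursed
monthly_rate = 0.083          # 8.3% penalty per month while on the DOL list

monthly_penalty = disbursed_award * monthly_rate
penalties_after_one_year = monthly_penalty * 12

print(f"Monthly penalty: ${monthly_penalty:,.0f}")                      # $83,000
print(f"Penalties after 12 months: ${penalties_after_one_year:,.0f}")   # $996,000, nearly the full award
```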
The legislation would also require that all call center work performed under federal contracts be conducted within the United States. Federal agencies would be directed to give preference in contracting to companies that do not appear on the DOL’s list of offshore call center operators. Finally, the DOL would be tasked with submitting a report to Congress that identifies the scope and location of federal call center work, including any job displacement associated with the use of AI. Whether this creates additional incentive for businesses to invest in onshore human reps or just more risk to navigate remains to be seen.
Supporters of the bill, including the Communications Workers of America (“CWA”), have framed it as a measure to protect U.S. jobs and promote transparency in consumer interactions. CWA Director of Government Affairs Dan Mauer stated that companies have historically offshored customer service jobs to avoid paying good union wages and benefits, and are now using AI to deskill and speed up work, thereby displacing jobs. Critics may raise good questions about compliance burdens, operational flexibility, and the potential impact on businesses that rely on offshore or automated customer service models.
Although the bill does not directly amend the TCPA, it is relevant to organizations already operating under telemarketing and consumer communication regulations. The required disclosures could intersect with broader compliance frameworks governing consent, transparency, and consumer expectations. Companies using AI or offshore call routing in their customer service operations may want to monitor the bill’s progress and evaluate whether any proactive adjustments to internal policies are warranted. The bill’s requirements would take effect one year after enactment, providing companies time to adjust their operations.
To check out the full legislation, click here.
Overall, the Keep Call Centers in America Act remains in the early stages of the legislative process. Whether it advances through committee or gains broader congressional support remains to be seen at this time. However, it reflects a growing focus on regulating the intersection of technology, labor, and consumer interaction. We are witnessing this as an area of increasing relevance to businesses operating in highly regulated communications spaces.
As always,
Keep it legal, keep it smart, and stay ahead of the game.
Talk soon!
Minnesota Data Privacy Law Effective July 31, 2025
Effective July 31, 2025, the Minnesota Consumer Data Privacy Act (MCDPA) governs how the personal data of Minnesota residents is handled.
Who Does the Minnesota Consumer Data Privacy Act Apply To?
The MCDPA applies to entities that do business in Minnesota or produce products or services targeted to Minnesota residents, and that satisfy one or more of the following thresholds (a simple applicability check is sketched after the list below):
During a calendar year, controls or processes personal data of 100,000 consumers or more, excluding personal data controlled or processed solely for the purpose of completing a payment transaction; or
derives over 25 percent of gross revenue from the sale of personal data and processes or controls personal data of 25,000 consumers or more.
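For organizations working through whether the MCDPA applies, the threshold logic described above can be expressed as a simple check. This is a minimal sketch under stated assumptions: the parameter names are illustrative, and the statutory text controls, including the exclusion of payment-transaction-only data from the 100,000-consumer count.

```python
# Minimal sketch of the MCDPA applicability thresholds described above.
# Parameter names are illustrative; the statute's definitions control.

def mcdpa_applies(
    does_business_in_or_targets_mn: bool,
    mn_consumers_processed: int,            # excluding payment-only data for the 100k test
    pct_gross_revenue_from_data_sales: float,
) -> bool:
    if not does_business_in_or_targets_mn:
        return False
    volume_threshold = mn_consumers_processed >= 100_000
    sale_threshold = (
        pct_gross_revenue_from_data_sales > 25.0
        and mn_consumers_processed >= 25_000
    )
    return volume_threshold or sale_threshold

# Example: a retailer targeting Minnesota residents that processes data of
# 30,000 consumers and derives 40% of gross revenue from selling personal data.
print(mcdpa_applies(True, 30_000, 40.0))  # True
```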
What is a “Controller” and What are a Controller’s Obligations?
A “Controller” means the natural or legal person which, alone or jointly with others, determines the purposes and means of the processing of personal data.
The MCDPA obligates controllers to provide consumers with a clear and accessible privacy notice that sets forth the categories of personal data being processed and the purposes for which the data will be processed. The privacy notice must also set forth the categories of personal data sold or shared with third parties, identify those third parties, explain how consumers may exercise their privacy rights, set forth the controller’s contact information, and describe the controller’s personal data retention policy. Notably, controllers are expressly restricted to the collection of personal data that is “adequate, relevant, and reasonably necessary” for its intended processing.
How Does the MCDPA Require Controllers to Protect Personal Data?
According to the MCDPA, controllers are obligated to establish and maintain administrative, technical and physical data security practices that reasonably ensure the “confidentiality, integrity, and accessibility” of personal data. Records reflecting a detailed inventory of the data being managed must be maintained. A privacy or legal regulatory compliance officer must also be appointed.
What Rights do Minnesota Consumers Possess Under the Minnesota Consumer Data Privacy Act?
The MCDPA provides Minnesota consumers with the right to access, correct, delete, and obtain copies of their personal data, as well as the right to opt out of data sales and targeted advertising. Consumers must also affirmatively “opt in” before a controller may process their sensitive data.
Not unlike other states’ data privacy legislation, the MCDPA also affords Minnesota consumers the right to demand a list of the specific third parties to which their data has been sold or shared, and to seek information regarding profiling.
The MCDPA also requires controllers to respond to data privacy requests within forty-five (45) days and provides consumers with the right to appeal a controller’s decision.
Who Can Enforce the Minnesota Consumer Data Privacy Act?
The Minnesota Attorney General is vested with enforcement of the MCDPA. There is no private right of action under the MCDPA.
Until January 31, 2026, before commencing an enforcement action, the Attorney General must send the controller a warning letter, and the controller has thirty (30) days to cure the alleged violation. In the event of an enforcement proceeding, civil penalties of up to $7,500 per violation are available.
Bonus Depreciation and Fiber Optic Networks
The following post discusses legal and financial matters and is provided for general informational purposes only. It is not intended to serve as legal or financial advice, particularly with respect to an individual entity’s tax status. Readers should consult qualified legal, accounting, and tax planning professionals for advice as to their specific circumstances.
On July 4, 2025, President Donald Trump signed the One Big Beautiful Bill Act (OBBBA), a massive budget reconciliation bill that codified many of the Trump Administration’s tax and spending policy objectives. While the final version of the Act did not exclude broadband grants from treatment as gross income for purposes of federal taxation (as proposed under the Broadband Grant Tax Treatment Act re-introduced earlier this year), the OBBBA’s 100% bonus depreciation provision provides some consolation, as it promises to significantly benefit some broadband network owners.
In general, depreciation provides a tax deduction equivalent to the purchase price of the property, normally realized over the economic life of the property. Bonus depreciation enables the tax deduction to occur on an accelerated basis and is intended to incentivize capital investments by businesses. First enacted in 2002, some form of bonus depreciation has been in effect for most of the past two decades. The percentage rate of bonus depreciation, and property eligible for it, has changed under various legislative enactments over that period.
The 2017 Tax Cuts and Jobs Act established a 100% bonus depreciation rate (meaning a business may be able to deduct all of a qualifying asset’s cost in the year that asset was acquired) for assets acquired and placed in service between September 27, 2017 and January 1, 2023, but would have phased down the bonus depreciation rate to zero percent in 2027. The OBBBA, however, now provides permanent 100% bonus depreciation for eligible assets acquired and placed into service after January 19, 2025.
What does this mean for fiber optic networks and other Telephone Distribution Plant (including conduit and related outside plant and equipment (OSP))? AT&T and others in the industry are very bullish on it, touting bonus depreciation as enabling, in AT&T’s case, “over $1 billion annually in cash-tax deferrals, effectively reducing the cost of capital for fiber projects.”
But fiber network owners should note that bonus depreciation may or may not be available to them, depending on what accounting methods they follow.
Bonus depreciation is available only for assets with a recovery period of 20 years or less under the Modified Accelerated Cost Recovery System (MACRS). Fiber optic networks and other OSP, however, generally have a long economic life: the IRS lists the “class life” of Telephone Distribution Plant as 24 years, which is also the recovery period for depreciation of Telephone Distribution Plant under the Alternative Depreciation System.
At first glance, the 24-year class life of telecommunications OSP would suggest that fiber optic networks simply are not eligible for bonus depreciation. But that is not the case, as is evident from the reaction of AT&T and others.
The key is that bonus depreciation eligibility depends on the recovery period of the asset, not its class life. Under the Alternative Depreciation System (ADS), the recovery period of Telephone Distribution Plant is 24 years. But under the General Depreciation System (GDS), the recovery period is 15 years.
In the most general terms, then, eligibility of fiber optic network assets for bonus depreciation depends on the provider’s chosen accounting and depreciation methods.
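As a greatly simplified, hypothetical illustration of why that choice matters, the sketch below compares first-year deductions on a $10 million fiber asset under 100% bonus depreciation, a 15-year GDS recovery period, and a 24-year ADS recovery period. It assumes straight-line recovery with no conventions; actual MACRS calculations (including the declining-balance method that applies to 15-year GDS property) will differ.

```python
# Greatly simplified, hypothetical comparison of first-year depreciation
# deductions for a $10M fiber asset. Assumes straight-line recovery and ignores
# MACRS conventions and declining-balance methods; consult tax professionals
# for actual calculations.

asset_cost = 10_000_000

bonus_year1 = asset_cost        # 100% bonus depreciation: full cost in year 1
gds_year1 = asset_cost / 15     # straight-line over a 15-year GDS recovery period
ads_year1 = asset_cost / 24     # straight-line over a 24-year ADS recovery period

print(f"Year-1 deduction with 100% bonus: ${bonus_year1:,.0f}")  # $10,000,000
print(f"Year-1 deduction, 15-year GDS:    ${gds_year1:,.0f}")    # ~$666,667
print(f"Year-1 deduction, 24-year ADS:    ${ads_year1:,.0f}")    # ~$416,667
```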
We must emphasize that the above is a greatly simplified explanation, and there is considerable nuance in the tax treatment of capital assets. As noted in the introduction, readers should consult with their own legal, financial, and tax professionals with respect to their particular circumstances.
Ninth Circuit Affirms Disclosure of EEO-1 Reports Under FOIA
On July 30, 2025, the Ninth Circuit Court of Appeals upheld a district court order requiring the U.S. Department of Labor (DOL) to release EEO-1 reports previously withheld in response to Freedom of Information Act (FOIA) requests. In Center for Investigative Reporting v. U.S. Dep’t of Labor, the Ninth Circuit held that federal contractors’ workforce composition data is not protected “commercial” information under FOIA’s Exemption 4 and must be disclosed.
The case arose when the Center for Investigative Reporting (CIR) requested several years of federal contractors’ EEO-1 reports from the DOL. These reports contain aggregated demographic data, including race, ethnicity, and sex, organized by job category. The DOL initially withheld thousands of reports, determining that they might contain confidential commercial information protected from disclosure under FOIA’s Exemption 4. The agency then published a notice in the Federal Register, giving federal contractors the opportunity to object to the release of their EEO-1 data. After extending the objection deadline, the DOL continued to withhold the reports, prompting CIR to eventually file suit. Following the district court’s order compelling the release of the reports, the DOL filed an appeal.
The DOL argued the EEO-1 reports fall under FOIA’s Exemption 4 because the data “relates to commercial subject matter.” The Ninth Circuit disagreed. The court explained the information qualifies as “commercial” under Exemption 4 only if it is an object of commerce or “describes an exchange of goods or services for profit.” Finding the EEO-1 reports alone do not reveal details about federal contractors’ services, prices, profits, or other information typically considered commercial, the court held that the data is not protected information under FOIA. In addition, the court rejected the DOL’s argument that EEO-1 data is “indirectly” related to commercial activity as too attenuated to bring the reports within the scope of Exemption 4.
Because the DOL failed to show that the reports contained protected “commercial” information, the court ordered their disclosure to the CIR.
Under this decision, federal contractors cannot rely solely on FOIA’s Exemption 4 to keep their EEO-1 reports confidential. While the ruling promotes transparency and public access to diversity data for companies doing business with the federal government, it is narrowly focused on the aggregated data in the consolidated EEO-1 reports. Other types of sensitive commercial information may still be protected under FOIA or under trade secret protections.
The DOL has limited time to decide whether it will accept the opinion or request a rehearing of the matter.
New Texas Law Requires Storage of Electronic Health Records in U.S.
Starting September 1, 2025, health care practitioners in Texas are required to store electronic health records (EHRs) in the United States under a new Act. This requirement is found in a recently enacted law that also includes requirements for practitioners’ AI use.
Health care practitioners include providers licensed, certified, or otherwise authorized to provide health care services in Texas. Many practitioners use third-party software for electronic health record solutions, and this localization requirement also applies to those arrangements with vendors and cloud storage providers. The Act also requires that health care practitioners implement reasonable and appropriate administrative, physical, and technical safeguards to protect the confidentiality, integrity, and availability of electronic health records. The law does not specifically identify what those safeguards might be. The law also prohibits the collection and storage of individual credit scores or voter registration information in electronic health records.
Putting it into practice: While historically there have not been many laws expressly requiring that EHRs be hosted in the U.S., Texas joins Florida in enacting such a law. Before this law goes into effect, health care practitioners in Texas should assess their storage of electronic health records to ensure records are maintained in the United States. Providers will also want to confirm that the necessary safeguards are in place to protect EHRs. Lastly, credit scores and voter registration records should not be collected or stored in electronic health records. In addition, health care practitioners should assess vendor relationships to confirm compliance with the Act. Practitioners may also want to update template agreements to account for these offshoring considerations (if they have not already done so).