The Brief That Wrote Itself

Three weeks ago, we wrote that one company’s conscience and 220 engineers’ courage should not be the only things standing between frontier AI and unrestricted military deployment. We documented Anthropic’s refusal, the employee petition, and the structural argument: voluntary commitments hold only when they are specific enough that breaking them is indefensible.

Since then, the Pentagon has followed through on its threats. Defense Secretary Hegseth designated Anthropic a supply chain risk, a label previously reserved for foreign adversaries like Huawei and Kaspersky. President Trump ordered all federal agencies to stop using Claude. Anthropic’s classified military work is being shifted to Google, OpenAI, and xAI. The company that was the first frontier AI lab to deploy on classified U.S. networks is now the first American company ever designated a supply chain risk by its own government.

On March 9, Anthropic sued the Trump administration in two federal courts. On March 10, Microsoft filed an amicus brief in support.

Microsoft’s brief states that “American AI should not be used to conduct domestic mass surveillance or start a war without human control.” This is the right position. It is also the position that protects $5 billion in equity, $30 billion in committed cloud revenue, and a product line that ships with Claude in every tier.


The financial relationship

In November 2025, Microsoft announced a $5 billion investment in Anthropic. Anthropic committed to purchasing $30 billion in Azure compute capacity. Beyond the investment, Microsoft spends approximately $500 million per year licensing Claude for its own products, including Microsoft 365 Copilot. Microsoft is not just an investor. It is a customer, a distribution partner, and a platform host.

The supply chain risk designation does not just bar Anthropic from defense contracts. It requires contractors to stop using Claude in work tied to the Department of Defense. Microsoft is one of the largest defense contractors in the world. Its products ship with Claude integrated. When its brief argues that the court should block the designation to “avoid disrupting the American military’s ongoing use of advanced AI,” that disruption includes Microsoft’s own products.

The brief’s three arguments (the designation is being misused against a domestic company, the economic harm is disproportionate, and AI should not be deployed for mass domestic surveillance or autonomous weapons without human oversight) are all sound. The first two are also arguments about Microsoft’s own exposure. A precedent that allows the Pentagon to designate any domestic technology company a supply chain risk over a policy disagreement applies to every company with a government contract. The brief defends Anthropic’s stand, and it defends the principle that emergency designations cannot be weaponized as commercial leverage. In this case those are the same defense, and that defense happens to protect Microsoft’s entire government business.


The people with nothing to protect

The same week, 22 former high-ranking military officials filed their own brief. The group includes former CIA Director Michael Hayden, retired Coast Guard Admiral Thad Allen, and former secretaries of the Air Force, Army, and Navy.

Their filing states that the Pentagon’s conduct “threatens the rule-of-law principles that have long strengthened our military.” They describe the supply chain designation as a “misuse of government authority for retribution.”

On March 17, 150 retired federal and state judges filed a separate brief in support.

None of these people hold Anthropic equity, Azure commitments, or product lines that ship with Claude. Microsoft’s brief reaches the same conclusion they do. It reaches it from inside a $35 billion financial relationship. The conclusion is more credible when the path to it does not pass through a balance sheet.


The cost of showing up

This series has tracked a sequence of actions, each with a different price.

Anthropic refused the Pentagon’s demands and lost a $200 million contract, its classified network access, and its government business.

220 employees at Google and OpenAI signed a petition asking their leadership to hold the same lines.

22 retired military officials filed a brief stating the Pentagon is misusing its authority. They staked professional reputations built over decades on a position opposing the sitting defense secretary.

Microsoft filed an amicus brief. By the time it filed, public sentiment and the market had already sided with Anthropic. The brief cost legal fees and no political capital. Not filing would have left billions in committed revenue and a major equity position exposed.

The principle is the same in every case. The price is not. Anthropic paid the most. The employees risked the most relative to their resources. The retired officials spent reputational capital. Microsoft joined a consensus and defended a balance sheet. The brief wrote itself.


This is not a criticism of Microsoft. Microsoft’s brief is legally sound, factually accurate, and substantively correct. Microsoft should have filed.

The observation is narrower: when a corporation’s financial interests align with a principle, the corporation defends the principle. Regulation does by design what Microsoft’s investment portfolio did by accident. It makes the right outcome and the profitable outcome the same thing. The fact that a $35 billion partnership produced an amicus brief saying “do not build a surveillance engine” is fortunate. The fact that it took a partnership of that scale to produce one is the entire case for regulation in a single court filing.

The hearing before U.S. District Judge Rita Lin is scheduled for March 24. Anthropic is seeking a temporary restraining order. Microsoft, 22 retired military officials, 150 retired judges, and major tech industry groups have filed in support.


This post is part of a series on AI policy and accountability. See also: “Safety Was the Product. Now It Is the Obstacle.”, “Cannot in Good Conscience”, “They Asked for Regulation”, “You Built the Training Set”, and “The Foundation Is Physical”.