US drafts strict AI contract rules amid dispute with Anthropic

Tensions escalated this week when the Pentagon labelled Anthropic a 'supply-chain risk'

The Trump administration has drafted stricter guidelines governing artificial intelligence contracts with civilian agencies, following a dispute between the US Department of Defense and AI startup Anthropic, the Financial Times reported.

The new rules would require companies seeking federal AI contracts to allow the government to use their models for “any lawful purpose.”

A draft reviewed by the newspaper states that firms must grant the US government an irrevocable license to use their AI systems for such lawful purposes.

Tensions escalated this week when the Pentagon labelled Anthropic a “supply-chain risk,” preventing government contractors from using the company’s technology in projects related to the US military.

The decision reportedly followed months of disagreement between the company and defense officials over safety restrictions Anthropic wanted to place on how its models could be used.

The proposed guidelines are being developed by the General Services Administration, which manages federal procurement.

The measures are intended to strengthen oversight of AI services purchased by civilian agencies and could mirror similar rules being considered for military contracts.

Josh Gruenbaum, commissioner of the Federal Acquisition Service within the GSA, said maintaining a business relationship with Anthropic would be irresponsible given the current concerns.

He confirmed that the agency has terminated Anthropic’s participation in the government’s OneGov contracting program, cutting off its access to pre-negotiated federal contracts across multiple branches of government.

According to the report, the draft rules would also require companies to ensure their AI systems do not intentionally introduce partisan or ideological bias into their outputs.

Additionally, developers would need to disclose whether their models have been altered to comply with regulatory or commercial frameworks outside the US federal government.