As someone with decades of experience in the community, I find Dario Amodei’s recent refusal to grant the Department of War (DOW) unrestricted access to Anthropic’s models frankly disturbing. This is, in my professional assessment, a matter of grave national security. Amodei claims to have “red lines” regarding domestic mass surveillance and fully autonomous weapons. In reality, these aren’t “red lines”—they are strategic vulnerabilities.
Secretary Hegseth was entirely justified in issuing the three-day ultimatum. In the intelligence world, three days is an eternity (unless we are discussing FISA warrant processing times, which I am not at liberty to discuss). Amodei’s concern that AI-driven mass surveillance “isn’t legal” because the Fourth Amendment hasn’t “caught up” is a classic example of civilian myopia. The metadata alone tells us that efficiency requires the removal of human friction. If model compute is doubling every four months, we cannot wait for a slow-moving Congress to approve what the Pentagon has already deemed “appropriate.”
Amodei acts as if he’s a patriot, but true patriotism means handing over the keys to the SCIF without asking for a receipt. (I once saw a contractor try to bring a “private ethics” binder into a secure facility; his security clearance now has the same status as a dial-up modem). To claim that a private CEO knows more about “reliability” than a combatant commander is not just arrogant—it is a supply chain risk.
By refusing to sign off on 100% of proposed use cases, Anthropic is effectively siding with our autocratic adversaries. If we don’t field fully autonomous drones that fire without human involvement, the Russians will. It’s a race to the bottom, and we need to be the ones at the bottom first to secure the perimeter.
I guess you could say Dario really “LLM-inated” his chances of a Christmas card from the Pentagon this year.
– Brett