
Tue, 17 Mar 2026

AI firm Anthropic seeks weapons expert to prevent 'catastrophic misuse'

The artificial intelligence firm says it wants to prevent "catastrophic misuse" of its systems.

* Anthropic, an AI firm, is hiring a chemical weapons and high-yield explosives expert to prevent "catastrophic misuse" of its software.
* The expert must have at least five years of experience in "chemical weapons and/or explosives defence" and knowledge of radiological dispersal devices (dirty bombs).
* Anthropic and OpenAI are advertising similar positions, with salaries ranging from $300,000 to $455,000.
* Some experts are alarmed by the risks of this approach, warning that AI tools can still provide information about banned weapons despite guardrails.
* The issue has gained urgency as the US government turns to AI firms amid military operations in Iran and Venezuela.
* Anthropic is taking legal action against the US Department of Defense over a supply-chain risk designation.

