{"id":105230,"date":"2026-02-27T19:42:28","date_gmt":"2026-02-27T22:42:28","guid":{"rendered":"https:\/\/tech.einnews.com\/article\/896262971"},"modified":"2026-02-27T19:42:28","modified_gmt":"2026-02-27T22:42:28","slug":"trump-orders-all-federal-agencies-to-phase-out-use-of-tech-by-claude-maker-anthropic","status":"publish","type":"post","link":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/2026\/02\/27\/trump-orders-all-federal-agencies-to-phase-out-use-of-tech-by-claude-maker-anthropic\/","title":{"rendered":"Trump orders all federal agencies to phase out use of tech by Claude-maker Anthropic"},"content":{"rendered":"<div><img fetchpriority=\"high\" decoding=\"async\" src=\"https:\/\/i3.wp.com\/www.bostonherald.com\/wp-content\/uploads\/2026\/02\/dario-amodei.jpg?w=1024&h=683&ssl=1\" alt=\"Dario Amodei\" class=\"ff-og-image-inserted\"><\/div>\n<p>President Trump said he was ordering all federal agencies to phase out the use of Anthropic technology after the company\u2019s unusually public dispute with the Pentagon over artificial intelligence safety.<\/p>\n<p>Trump\u2019s comments came just over an hour before the Pentagon\u2019s deadline for Anthropic to allow unrestricted military use of its AI technology or face consequences \u2014 and nearly 24 hours after CEO Dario Amodei said his company \u201ccannot in good conscience accede\u201d to the Defense Department\u2019s demands.<\/p>\n<p>Trump said most agencies must immediately cease using Anthropic technology, but gave the Pentagon a 6-month period to phase out the technology that is already embedded in military platforms.<\/p>\n<p>\u201cWe don\u2019t need it, we don\u2019t want it, and will not do business with them again!\u201d Trump wrote.<\/p>\n<p>Anthropic didn\u2019t immediately reply to a request for comment on Trump\u2019s remarks.<\/p>\n<p>At issue in the defense contract was a clash over AI\u2019s role in national security and concerns about how increasingly capable machines could be used in high-stakes situations involving lethal force, sensitive information or government surveillance.<\/p>\n<p>The move is likely to benefit Elon Musk\u2019s competing chatbot, Grok, which the Pentagon plans to give access to classified military networks, and could serve as a 
warning to two other competitors, Google and OpenAI, that also have contracts to supply their AI tools to the military.<\/p>\n<p>Anthropic, maker of the chatbot Claude, could afford to lose the contract. But the ultimatum this week from Defense Secretary Pete Hegseth posed broader risks at the peak of the company\u2019s meteoric rise from a little-known computer science research lab in San Francisco to one of the world\u2019s most valuable startups.<\/p>\n<h4>Anthropic spurned Pentagon\u2019s latest proposal over its safeguards<\/h4>\n<p>If Amodei does not budge, military officials said they would not just pull Anthropic\u2019s contract but also \u201cdeem them a supply chain risk,\u201d a designation typically stamped on foreign adversaries that could derail the company\u2019s critical partnerships with other businesses. Trump didn\u2019t make such a designation in his announcement Friday, but said Anthropic could face \u201cmajor civil and criminal consequences\u201d if it\u2019s not helpful in the phase-out period.<\/p>\n<p>And if Amodei were to cave, he risked losing trust in the booming AI industry, particularly from top talent drawn to the company for its promises of responsibly building better-than-human AI that, without safeguards, could pose catastrophic dangers.<\/p>\n<p>Anthropic had said it sought narrow assurances from the Pentagon that Claude won\u2019t be used for mass surveillance of Americans or in fully autonomous weapons. 
But after months of private talks exploded into public debate, it said in a Thursday statement that new contract language \u201cframed as compromise was paired with legalese that would allow those safeguards to be disregarded at will.\u201d<\/p>\n<p>That was after Sean Parnell, the Pentagon\u2019s top spokesman, posted on social media that the military \u201chas no interest in using AI to conduct mass surveillance of Americans (which is illegal) nor do we want to use AI to develop autonomous weapons that operate without human involvement.\u201d He emphasized that the Pentagon wants to \u201cuse Anthropic\u2019s model for all lawful purposes,\u201d but he and other officials haven\u2019t detailed how they want to use the technology.<\/p>\n<h4>Dispute further polarizes the tech industry<\/h4>\n<p>Emil Michael, the defense undersecretary for research and engineering, later lashed out at Amodei, alleging on X that he \u201chas a God-complex\u201d and \u201cwants nothing more than to try to personally control the US Military and is ok putting our nation\u2019s safety at risk.\u201d<\/p>\n<p>That message hasn\u2019t resonated in much of Silicon Valley, where a growing number of tech workers from Anthropic\u2019s top rivals, OpenAI and Google, voiced support for Amodei\u2019s stand late Thursday in an open letter.<\/p>\n<p>OpenAI and Google, along with Elon Musk\u2019s xAI, also have contracts to supply their AI models to the military.<\/p>\n<p>Musk sided with Trump\u2019s Republican administration on Friday, saying on his social media platform X that \u201cAnthropic hates Western Civilization\u201d after Michael drew attention to a previous version of Claude\u2019s guiding principles that encouraged \u201cconsideration of non-Western perspectives.\u201d All of the leading AI models, including Musk\u2019s Grok and OpenAI\u2019s ChatGPT, are programmed with a set of instructions that guide a chatbot\u2019s values and behavior. 
Anthropic calls that guidance a constitution.<\/p>\n<p>While some Trump-allied tech leaders have joined the fray \u2014 including Musk and Palmer Luckey, co-founder of defense contractor Anduril \u2014 the polarizing debate over \u201cwoke AI\u201d has put others in a difficult position.<\/p>\n<p>\u201cThe Pentagon is negotiating with Google and OpenAI to try to get them to agree to what Anthropic has refused,\u201d the open letter from some OpenAI and Google employees says. \u201cThey\u2019re trying to divide each company with fear that the other will give in.\u201d<\/p>\n<p>But in a surprise move from one of Amodei\u2019s fiercest rivals, OpenAI CEO Sam Altman on Friday sided with Anthropic and questioned the Pentagon\u2019s \u201cthreatening\u201d move in a CNBC interview, suggesting that OpenAI and most of the AI field share the same red lines. Amodei once worked for OpenAI before he and other OpenAI leaders quit to form Anthropic in 2021.<\/p>\n<p>\u201cFor all the differences I have with Anthropic, I mostly trust them as a company, and I think they really do care about safety,\u201d Altman told CNBC. \u201cI\u2019ve been happy that they\u2019ve been supporting our warfighters. I\u2019m not sure where this is going to go.\u201d<\/p>\n<p>Also raising concerns about the Pentagon\u2019s approach were Republican and Democratic lawmakers and a former leader of the Defense Department\u2019s AI initiatives.<\/p>\n<p>\u201cPainting a bullseye on Anthropic garners spicy headlines, but everyone loses in the end,\u201d wrote retired Air Force Gen. Jack Shanahan in a social media post.<\/p>\n<p>Shanahan faced a different wave of tech worker opposition during the first Trump administration when he led Maven, a project to use AI technology to analyze drone footage and target weapons. 
So many Google employees protested its participation in Project Maven at the time that the tech giant declined to renew the contract and then pledged not to use AI in weaponry.<\/p>\n<p>\u201cSince I was square in the middle of Project Maven &amp; Google, it\u2019s reasonable to assume I would take the Pentagon\u2019s side here,\u201d Shanahan wrote Thursday on social media. \u201cYet I\u2019m sympathetic to Anthropic\u2019s position. More so than I was to Google\u2019s in 2018.\u201d<\/p>\n<p>He said Claude is already being widely used across the government, including in classified settings, and Anthropic\u2019s red lines are \u201creasonable.\u201d He said the AI large language models that power chatbots like Claude are also \u201cnot ready for prime time in national security settings,\u201d particularly not for fully autonomous weapons.<\/p>\n<p>\u201cThey\u2019re not trying to play cute here,\u201d he wrote.<\/p>\n<h4>Pentagon ready to punish Anthropic if it doesn\u2019t compromise<\/h4>\n<p>Parnell asserted Thursday that opening up use of the technology would prevent the company from \u201cjeopardizing critical military operations.\u201d<\/p>\n<p>\u201cWe will not let ANY company dictate the terms regarding how we make operational decisions,\u201d Parnell wrote. Anthropic has \u201cuntil 5:01 p.m. 
ET on Friday to decide\u201d if it would meet the demands or face consequences.<\/p>\n<p>When Hegseth and Amodei met on Tuesday, military officials warned that they could designate Anthropic as a supply chain risk, cancel its contract or invoke a Cold War-era law called the Defense Production Act to give the military more sweeping authority to use its products, even if the company doesn\u2019t approve.<\/p>\n<p>Amodei said Thursday that \u201cthose latter two threats are inherently contradictory: one labels us a security risk; the other labels Claude as essential to national security.\u201d He said he hopes the Pentagon will reconsider given Claude\u2019s value to the military, but, if not, Anthropic \u201cwill work to enable a smooth transition to another provider.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"<p>&#8230; the use of Anthropic <span class=\"match\">technology<\/span> after the company\u2019s &#8230; military use of its AI <span class=\"match\">technology<\/span> or face consequences \u2014 and &#8230; must immediately cease using Anthropic <span class=\"match\">technology<\/span>, but gave the Pentagon &#8230; use the <span class=\"match\">technology<\/span>. 
Dispute further polarizes the <span class=\"match\">tech<\/span> industry Emil &#8230;<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"fifu_image_url":"","fifu_image_alt":"","footnotes":""},"categories":[1],"tags":[],"class_list":["post-105230","post","type-post","status-publish","format-standard","hentry","category-news","wpcat-1-id"],"_links":{"self":[{"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/posts\/105230","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/comments?post=105230"}],"version-history":[{"count":0,"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/posts\/105230\/revisions"}],"wp:attachment":[{"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/media?parent=105230"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/categories?post=105230"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/tags?post=105230"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}