{"id":109994,"date":"2026-03-03T19:42:09","date_gmt":"2026-03-03T22:42:09","guid":{"rendered":"https:\/\/tech.einnews.com\/article\/897339140"},"modified":"2026-03-03T19:42:09","modified_gmt":"2026-03-03T22:42:09","slug":"this-tech-company-turned-into-a-resistance-icon-overnight-the-reality-is-much-darker","status":"publish","type":"post","link":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/2026\/03\/03\/this-tech-company-turned-into-a-resistance-icon-overnight-the-reality-is-much-darker\/","title":{"rendered":"This Tech Company Turned Into a Resistance Icon Overnight. The Reality Is Much Darker."},"content":{"rendered":"<p class=\"slate-paragraph slate-graf\" data-word-count=\"55\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmmb54fpy005tzum5ijs4g87l@published\">Last July, Anthropic signed a <a href=\"https:\/\/www.cnbc.com\/2025\/07\/14\/anthropic-google-openai-xai-granted-up-to-200-million-from-dod.html\">$200 million contract<\/a> with the Pentagon, allowing the department broad-based use of its Claude model while the two prospective partners gradually worked out the final terms of engagement. 
Those were supposed to get etched last week\u2014only for Anthropic to undergo a <a href=\"https:\/\/slate.com\/technology\/2026\/02\/anthropic-ai-pentagon-pete-hegseth.html\">decisive test<\/a> for its oft-professed ethical boundaries.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"111\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmmb5686o001p3b7dvssqyjcz@published\">Defense Secretary Pete Hegseth demanded that his team be allowed to deploy Claude\u2019s software in whatever manner they deemed pertinent, including applications for domestic surveillance and fully autonomous weaponry, which were \u201c<a href=\"https:\/\/slate.com\/podcasts\/what-next-tbd\/2026\/02\/anthropic-vs-the-pentagon\">red lines<\/a>\u201d for Anthropic CEO Dario Amodei. By Friday afternoon, President Donald Trump had ordered a <a href=\"https:\/\/federalnewsnetwork.com\/artificial-intelligence\/2026\/02\/anthropic-refuses-to-bend-to-pentagon-on-ai-safeguards-as-dispute-nears-deadline\/\">six-month phaseout<\/a> of all uses of Claude at the federal level. Hegseth then designated Anthropic a <a href=\"https:\/\/apnews.com\/article\/anthropic-pentagon-ai-dario-amodei-hegseth-0c464a054359b9fdc80cf18b0d4f690c\">supply-chain risk to national security<\/a>, all but forbidding any military contractors from doing future business with the company. 
A federal contract was subsequently bestowed upon Anthropic rival OpenAI, which <a href=\"https:\/\/bsky.app\/profile\/davelee.me\/post\/3mfvaqeyxys27\">unconvincingly<\/a> claimed that it would try to safeguard tools like ChatGPT from use in population surveillance and autonomous weapons.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"164\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmmb5686o001q3b7dibmay34a@published\">The fallout for Anthropic has been remarkable. It\u2019s the first-ever American company to be deemed a supply-chain risk, which means it\u2019s already lost <a href=\"https:\/\/www.axios.com\/2026\/03\/02\/treasury-trump-ai-anthropic-pentagon\">several users<\/a> across the federal government. But something even stranger emerged in the aftermath: a lotta liberal goodwill. Social media campaigners <a href=\"https:\/\/bsky.app\/profile\/altwatcher.bsky.social\/post\/3mfzrpjqfns2h\">encouraged<\/a> their followers, even the A.I. skeptics, to download Claude en masse. Extremely online observers came up with <a href=\"https:\/\/xcancel.com\/tinathesis\/status\/2027584169987977723\">bizarre metaphors<\/a> to characterize Anthropic\u2019s heroism and pushed Claude to the <a href=\"https:\/\/www.cnbc.com\/2026\/02\/28\/anthropics-claude-apple-apps.html\">top of the app-store charts<\/a> over the weekend. By Monday morning, there was a Claude <a href=\"https:\/\/www.bloomberg.com\/news\/articles\/2026-03-02\/anthropic-s-claude-chatbot-goes-down-for-thousands-of-users\">service outage<\/a> that Anthropic attributed to \u201cunprecedented demand\u201d for its products. Even <a href=\"https:\/\/xcancel.com\/brianschatz\/status\/2028891474721460430\">Sen. 
Brian Schatz<\/a> and <a href=\"https:\/\/x.com\/katyperry\/status\/2027619173325553765\">Katy Perry got in<\/a> on the whole thing. (The fact that American commandos had <a href=\"https:\/\/www.wsj.com\/livecoverage\/iran-strikes-2026\/card\/u-s-strikes-in-middle-east-use-anthropic-hours-after-trump-ban-ozNO0iClZpfpL7K7ElJ2\">amply used Claude to plan the Saturday strikes<\/a> on Iran did not appear to faze many of these folks.) Meanwhile, the corresponding OpenAI backlash has been so heated that it\u2019s pushed CEO Sam Altman <a href=\"https:\/\/www.cnbc.com\/2026\/03\/03\/openai-sam-altman-pentagon-deal-amended-surveillance-limits.html\">into claiming he will amend<\/a> the surveillance terms of the company\u2019s Pentagon partnership.<\/p>\n<aside class=\"clay-tweet\" data-uri=\"slate.com\/_components\/clay-tweet\/instances\/cmmb56ex300203b7dulxmxzha@published\">\n<blockquote class=\"twitter-tweet\">\n<p lang=\"en\" dir=\"ltr\">done <a href=\"https:\/\/t.co\/DkS9DmlUAR\">pic.twitter.com\/DkS9DmlUAR<\/a><\/p>\n<p>\u2014 KATY PERRY (@katyperry) <a href=\"https:\/\/twitter.com\/katyperry\/status\/2027619173325553765?ref_src=twsrc%5Etfw\">February 28, 2026<\/a><\/p><\/blockquote>\n<\/aside>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"83\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmmb5686o001s3b7dyr0mijh7@published\">It\u2019s understandable that the manic, first-term energy from the libs who embrace <em>any<\/em> Trump opponents has manifested yet again. 
But those who\u2019ve chosen Anthropic as a pro-democracy signifier should reconsider their choice of mascot\u2014because, as <a href=\"https:\/\/www.wheresyoured.at\/the-ai-bubble-is-an-information-war\/#anthropic-is-fully-supportive-of-the-us-military-using-claude-in-the-war-in-iran-wants-to-help-governments-go-to-war-and-kill-people-and-wants-you-to-believe-otherwise\">anyone who\u2019s paid close attention<\/a> to Anthropic over the past half-decade will tell you, not only is it far from an ethical company, but it embodies the very worst, most corrosive aspects of A.I.\u2019s impact on modern society, from creative exploitation to political opportunism to, yes, military lethality.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"144\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmmb5686o001t3b7dxi73ge64@published\">The hullabaloo around Anthropic\u2019s fight overshadowed another major development last week: The company was <a href=\"https:\/\/www.cnn.com\/2026\/02\/25\/tech\/anthropic-safety-policy-change\">ditching its \u201cresponsible scaling policy,\u201d<\/a> a safeguard, unique within the sector, meant to prevent it from developing risky A.I. tools too quickly. It\u2019s not the first time Anthropic has been so flexible with its self-imposed rules. In 2024, it scrapped its blanket ban against selling Claude products to government spy agencies; just after Trump\u2019s reelection, it also <a href=\"https:\/\/www.stripes.com\/theaters\/us\/2024-11-08\/ai-companies-military-contracting-anthropic-openai-15783799.html\">partnered with Palantir and Amazon<\/a> to sell its tools to U.S. military customers. 
This year, the Pentagon made use of the Palantir-Anthropic suite in <a href=\"https:\/\/www.wsj.com\/politics\/national-security\/pentagon-used-anthropics-claude-in-maduro-venezuela-raid-583aff17\">planning the kidnapping<\/a> of Venezuelan President Nicol\u00e1s Maduro, a campaign that <a href=\"https:\/\/www.newyorker.com\/news\/the-lede\/the-overlooked-deaths-of-the-attack-on-venezuela\">killed dozens of locals<\/a>. Even after the capture, Anthropic participated in a Pentagon bidding contest, proposing a system whereby Claude would interpret voice commands so as to <a href=\"https:\/\/www.bloomberg.com\/news\/articles\/2026-03-02\/anthropic-made-pitch-in-drone-swarm-contest-during-pentagon-feud\">guide offensive, autonomous drone swarms<\/a> that would employ some human backup.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"61\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmmb5686p001u3b7dklomjiac@published\">In the most technical sense, none of this violates the red lines that Amodei outlined around surveilling Americans or allowing its tech to power <em>fully<\/em> autonomous killing machines. 
But those lines <a href=\"https:\/\/www.reddit.com\/r\/ClaudeAI\/comments\/1qprovf\/anthropic_are_partnered_with_palantir\/\">appear all the thinner<\/a> when you consider that Anthropic willingly handed Claude off to two corporations\u2014Palantir and Amazon\u2014that are <a href=\"https:\/\/finance.yahoo.com\/news\/palantir-pltr-joins-forces-ondas-151809996.html\">actively<\/a> <a href=\"https:\/\/www.theguardian.com\/technology\/2025\/nov\/28\/amazon-ai-climate-change\"><em>enthusiastic<\/em><\/a> about both applications, especially in partnership with this administration.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"104\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmmb5686p001v3b7d7nhkkjxs@published\">That kind of convenient ethical punt has been a constant of Anthropic\u2019s brief history. Long before it reneged on its promise of \u201cresponsible\u201d and careful A.I. development, Anthropic used the same unethical shortcuts that have invited so much opprobrium upon competitors like Meta and OpenAI: <a href=\"https:\/\/slate.com\/technology\/2025\/11\/arcadia-publishing-artificial-intelligence-history-books-authors.html\">mass-pirating<\/a> copyrighted books and songs to speed up model training, circumventing Reddit\u2019s anti-A.I.-crawler <a href=\"https:\/\/slate.com\/technology\/2025\/06\/reddit-artificial-intelligence-chatgpt-openai-dead-internet-theory.html\">protections<\/a>, and extending its timeline for <a href=\"https:\/\/www.theverge.com\/anthropic\/767507\/anthropic-user-data-consumers-ai-models-training-privacy\">retaining users\u2019 private chats<\/a> and Claude sessions. 
For a company founded by ex-OpenAI executives disaffected with Sam Altman\u2019s business practices, Anthropic seemingly has little compunction about the aggressive tacks it\u2019s already taken to shore up <a href=\"https:\/\/www.reuters.com\/technology\/anthropic-valued-380-billion-latest-funding-round-2026-02-12\/\">its $380 billion valuation<\/a>.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"81\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmmb5686p001w3b7de1ro5rjf@published\">To be fair, Anthropic indeed deserves credit for holding to its red lines with the Trump administration, fending off Hegseth\u2019s explicit threats to <a href=\"https:\/\/www.nytimes.com\/2026\/03\/01\/technology\/anthropic-defense-dept-openai-talks.html\">force it into compliance<\/a> by invoking the Defense Production Act (threats that, thankfully for Anthropic, never came to fruition). That\u2019s no small thing when so many other tech companies and CEOs have discarded their professed Trump 1.0 principles\u2014defending immigrant workers, decrying Trump\u2019s racist statements, resigning from White House advisory positions\u2014for the sake of government cash and business-friendly deregulation.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"108\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmmb5686p001x3b7dc7ijuqyq@published\">But to celebrate Anthropic\u2019s move through a mass virtuous-capitalism campaign is to give it too much credit; the company did, after all, willingly lend itself to this administration and its most openly craven partners until the final minute. And considering Anthropic\u2019s consistent track record of forgoing the principles that supposedly animate its existence (including the \u201cresponsible development\u201d ethos it <a href=\"https:\/\/thehill.com\/policy\/technology\/5735767-anthropic-researcher-quits-ai-crises-ads\/\">cast off<\/a> last week), no one paying attention should expect 
this conscientious objection to last either. Enjoy Claude if you want; it\u2019s a remarkable chatbot. Just don\u2019t expect it to do anything further to save our democracy, or anyone\u2019s life, or your efforts to prevent A.I. from ruining <span class=\"slate-paragraph--tombstone\">everything.<\/span><\/p>\n","protected":false},"excerpt":{"rendered":"<p>&#8230; surveilling Americans or allowing its <span class=\"match\">tech<\/span> to power fully autonomous killing &#8230; thing when so many other <span class=\"match\">tech<\/span> companies and CEOs have discarded &#8230;<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"fifu_image_url":"","fifu_image_alt":"","footnotes":""},"categories":[1],"tags":[],"class_list":["post-109994","post","type-post","status-publish","format-standard","hentry","category-news","wpcat-1-id"],"_links":{"self":[{"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/posts\/109994","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/comments?post=109994"}],"version-history":[{"count":0,"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/posts\/109994\/revisions"}],"wp:attachment":[{"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/media?parent=109994"}],"wp:term":[{"taxonomy":"category","embeddable":true,"hr
ef":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/categories?post=109994"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/new7.shop\/zerocostfreehost\/index.php\/wp-json\/wp\/v2\/tags?post=109994"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}