Monday, March 2, 2026

Anthropic CEO responds to Trump order, Pentagon clash

By CBS News

Anthropic CEO Dario Amodei sat down with CBS News for an exclusive interview, hours after Defense Secretary Pete Hegseth declared the company a supply chain risk to national security, a designation that restricts military contractors from doing business with the AI giant. Amodei called the move “retaliatory and punitive,” and he said Anthropic sought to draw “red lines” in the government’s use of its technology because “we believe that crossing those lines is contrary to American values, and we wanted to stand up for American values.”

My take: Dario Amodei comes across as someone who has thought deeply about the responsibilities that come with building powerful technology. He makes a compelling case that Anthropic has been a genuine partner to the US government and military, while drawing a reasonable line at two specific use cases — domestic mass surveillance and fully autonomous weapons — that he argues are not ready technically, not aligned with American values, and not yet governed by adequate law. In a landscape where most tech CEOs either avoid government friction or capitulate to it, his willingness to hold firm on principle while remaining constructive is notable.

The interviewer’s performance is another matter. She asks essentially the same question over and over: “why should a private company have more say than the Pentagon?” gets recycled multiple times in different forms. Dario answers it clearly each time, and she circles right back to it anyway — time that could have been spent exploring the technology itself, which would have been more interesting.

The interviewer also conflates cooperation with unconditional compliance. Dario makes clear that Anthropic has been deeply cooperative — classified cloud, custom military models, intelligence community deployment — and that the dispute is over a narrow 1-2% of use cases. He is patient and presents his position clearly. These are important issues that deserve serious engagement.

Towards the end of the interview, Dario touches on something worth noting. The government’s approach — a three-day ultimatum, designation as a supply chain risk, communication largely via tweets — suggests an institution more interested in compliance than conversation. For a set of decisions that could prove life or death in the future, that’s a troubling posture. And there’s an irony: by designating Anthropic a supply chain risk, the government has potentially interrupted its own military operations while also opening the door to legal challenges. It’s hard to see how that serves anyone’s national security interests.

CBS News Interview with Anthropic CEO Dario Amodei: Transcript

Interviewer: We appreciate you taking the time. You are Dario Amodei, the CEO of Anthropic. Is that right?

Dario Amodei: That’s correct. Yeah.

Interviewer: Great. Well, my first question to you is: why won’t you release Anthropic’s AI without restrictions to the US government?

Dario Amodei: Yeah. So, we should maybe back up a bit for context. Anthropic has actually been the most forward-leaning of all the AI companies in working with the US government and the US military. We were the first company to put our models on the classified cloud. We were the first company to make custom models for national security purposes. We’re deployed across the intelligence community and the military for applications like cyber, combat support operations, and various other things. The reason we’ve done this is that I believe we have to defend our country from autocratic adversaries like China and Russia. So we’ve been very forward-leaning. We have a substantial public sector team.

But I have always believed that as we defend ourselves against our autocratic adversaries, we have to do so in ways that defend and preserve our democratic values. So we have said to the Department of War that we are okay with basically 98 or 99% of the use cases they want — except for two that we’re concerned about.

One is domestic mass surveillance. We’re worried that things may become possible with AI that weren’t possible before. An example is taking data collected by private firms, having it bought by the government, and analyzing it en masse via AI. That actually isn’t illegal — it was just never useful before the era of AI. So there’s a way in which domestic mass surveillance is getting ahead of the law. The technology is advancing so fast that it’s out of step with the law.

The second is fully autonomous weapons. This is not the partially autonomous weapons used in Ukraine or potentially in Taiwan today. This is the idea of making weapons that fire without any human involvement. Even those — I think our adversaries may at some point have them, so perhaps they may at some point be needed for the defense of democracy. But we have concerns. First, the AI systems of today are nowhere near reliable enough to make fully autonomous weapons work. Anyone who’s worked with AI models understands there’s a basic unpredictability to them that, in a purely technical way, we have not solved. And there’s an oversight question too. If you have a large army of drones or robots that can operate without any human oversight — where there aren’t human soldiers making the decisions about who to target, who to shoot — that presents serious concerns. We need to have a conversation about how that’s overseen, and we haven’t had that conversation yet. We feel strongly that those two use cases should not be allowed.

Interviewer: The Pentagon has told us that they have agreed in principle to these two restrictions and wanted to strike a deal. Why couldn’t an agreement be reached?

Dario Amodei: There were several stages to this, all done quickly and all determined by a very limited three-day window they gave us. They gave us an ultimatum — agree to their terms in three days or be designated a supply chain risk under the Defense Production Act. During that time there were a few back-and-forths. At one point they sent us language that appeared on the surface to meet our terms, but it was full of qualifiers like “if the Pentagon deems it appropriate” or “in line with laws” — so it didn’t actually concede anything in any meaningful way. The subsequent exchanges were no different.

We have wanted to strike a deal from the beginning. If you want a sense of the Pentagon’s position, their spokesman Sean Parnell reiterated just the day before that they insist on “all lawful use” — the same position they sent us in their terms. They have not agreed to our exceptions in any meaningful way.

Interviewer: The president posted today in response to the situation: “Their selfishness,” referring to Anthropic, “is putting American lives at risk, our troops in danger, and our national security in jeopardy.” What is your response?

Dario Amodei: In the statements we issued yesterday and today, we said that we are willing — even if the Department of War or the Trump administration takes these unprecedented measures against us, this supply chain designation that’s normally used against foreign adversaries — even if they take these extreme actions, we’ll do everything we can to support the Department of War and provide our technology for as long as it takes to offboard us and onboard a competitor who is willing to do the things we are not willing to do.

We have offered continuity. We’re actually deeply concerned about the interruption of service, which is exactly what happens when we’re designated a supply chain risk. When we’re designated a supply chain risk, they say you have to be off all of our systems. I’ve talked to people on the ground — uniformed military officers — who say this technology is essential, that not having it will set them back six months, twelve months, maybe longer. That’s why we’ve tried so hard to reach a deal. But again, the three-day ultimatum, the threat of the supply chain designation — the whole timeline has been driven by the Department of War, not by us. We are trying to provide continuity. We are trying to reach a deal.

Interviewer: So what does this mean for the safety of Americans?

Dario Amodei: In the short run, it’s up to the Department of War. We’re still trying to reach a deal with them. We’re still trying to talk to them.

Interviewer: Are they talking with you?

Dario Amodei: We’ve received various communications. We haven’t seen anything that satisfies our concerns. But we are still interested in working with them, as long as it is in line with our red lines.

Interviewer: It sounds like you’re still really far apart, and now Secretary Hegseth has designated you a supply chain risk. Do you think it’s possible at this point to come to an agreement?

Dario Amodei: An agreement requires both sides. For our part, we are willing to serve the national security of this country. We are willing to provide our models to all branches of the government — including the Department of War, the intelligence community, and the more civilian branches — under our red lines. We’re always willing to do that. The reason we’re providing our technology in this way is that we want to support the national security of the United States. We’re not doing it for the sake of Pentagon officials or for a particular administration. We’re doing it because it’s good for the national security of the United States, and we’re going to continue to do that.

Interviewer: Why do you think it is better for Anthropic, a private company, to have more say in how AI is used in the military than the Pentagon itself?

Dario Amodei: First, I’d point out that, to our knowledge, no one on the ground has actually run into the limits of any of these exceptions. These are 1% of use cases, and we’ve seen no evidence that they’ve actually occurred. We’ve been deployed across the Department of War and other parts of the government without running into any of these problems.

Now, in terms of those narrow exceptions — I actually agree that in the long run, we need to have a democratic conversation. In the long run, I believe it is Congress’s job. If, for example, it’s now possible to buy bulk data on Americans — their locations, personal information, political affiliations — and analyze it with AI to build profiles, and that’s technically legal, then the judicial interpretation of the Fourth Amendment and the laws passed by Congress have not caught up with where the technology is going. So in the long run, we think Congress should catch up. But Congress is not the fastest-moving body in the world, and for right now, we are the ones who see this technology on the front lines.

I would have expected the Department of War to be thoughtful about these issues — to proactively think about them — so that we could have a productive conversation. But in the absence of that, we need to look at what the technology is capable of, and the ways in which it’s getting ahead of the law and escaping the intent of the law. Those are narrow areas, but important ones. These are things that are fundamental to Americans — the right not to be spied on by the government, and the right for our military officers to make decisions about war themselves rather than turning it over completely to a machine. These are fundamental principles.

Interviewer: But in the name of fundamental principles, why should Americans trust you — the CEO of a private company — to make these decisions instead of the federal government?

Dario Amodei: I’d give two answers to that. One — we are a private company. We can choose to sell or not sell whatever we want. There are other providers. If the government doesn’t like the services we provide or the way we make them, they can use another contractor. That would have been the normal way to handle this. I would have disagreed, but I would have respected them if they’d simply said, “We don’t want to work with Anthropic — our principles aren’t aligned, we’re going to go with another model.”

But instead, they’ve extended this beyond the Department of War, tried to punitively revoke our contracts across other parts of government, and used this supply chain designation — which basically says that if you’re a private company with military contracts, you can’t use Anthropic in connection with those contracts. So they’re reaching into the behavior of private enterprise. It’s very hard to interpret this as anything other than punitive. To our knowledge, the supply chain designation has never been applied to an American company. It has only been applied to adversaries like Kaspersky Labs — a Russian cybersecurity company suspected of ties to the Russian government — or Chinese chip suppliers. Being lumped in with them feels very punitive and inappropriate given everything we’ve done for US national security.

Interviewer: So you say you’ve done so much for US national security, and you’re holding to these two restrictions. Do you think Anthropic knows better than the Pentagon?

Dario Amodei: We don’t. One of the things about the free market and free enterprise is that different providers can offer different products under different principles. And remember, this isn’t just about terms of use — this is about what our model is capable of doing reliably. It has a personality. It’s capable of certain things reliably and incapable of others. I think we are a good judge of what our models can and cannot do reliably. And I think we do have a good view into how this technology is getting ahead of the law.

But I would say again — I actually agree that this is not tenable in the long term. I don’t think the right long-term solution is for a private company and the Pentagon to argue about this. Congress needs to act, and we are thinking about what Congress could do to impose guardrails that don’t hinder our ability to defeat our adversaries, but that allow us to do so in a way that’s in line with the values of this country.

Interviewer: But as you know, Congress doesn’t move fast.

Dario Amodei: No. So in the meantime, I think we do need to draw a line in the sand.

Interviewer: So until Congress acts, you’re going to hold firm. But there are so many other companies that do business with the US government. Boeing builds aircraft for the military. Boeing doesn’t tell the US military what to do with that aircraft. How is this any different?

Dario Amodei: Two ways. First, I’d point again to the newness of the technology. When a technology is well-established, a general has a pretty good understanding of how it works. Aircraft have been around for a long time.

Interviewer: But there’s plenty of innovation inside the aerospace industry.

Dario Amodei: Sure, but not at the pace we see with AI. AI is moving so fast. I’ve talked often about how AI is on an exponential trend — the amount of computation that goes into the models doubles every four months. We have never seen anything like this pace of innovation.

Interviewer: But if that pace continues, the US government will never catch up. So how does that logic apply if you’ve long argued that you want to work with the government on national security? If development will be this fast for the foreseeable future and Congress can’t catch up, why turn your back?

Dario Amodei: I think there’s only catching up once. The pace of technology is fast, but the issues that arise are few and very important. We only have two — domestic mass surveillance and fully autonomous weapons. We need to have a conversation with Congress to help them understand some of the risks. And again, this is the most American thing in the world. No one wants to be spied on by the US government.

Interviewer: At the exact same time, some of our greatest adversaries have technology that is either quickly catching up to us or may already have done so. If our military is critical to defending the American people and our democracy and freedom, why hold this position and say we’re not going to cooperate?

Dario Amodei: That’s an abstract argument — let’s look at the actual two use cases. Domestic mass surveillance does not help the US catch up with its adversaries. It’s an abuse of the government’s authority even where it’s technically legal. So we can rule that one out. As for fully autonomous weapons — I am genuinely concerned we may need to keep up there. But the technology is not ready. We are not categorically against fully autonomous weapons. We simply believe the reliability isn’t there yet, and that we need to have a conversation about oversight. We offered to work with the Department of War to help develop and prototype these technologies in a sandbox — but they weren’t interested unless they could do whatever they wanted from the start.

We need to balance the existential need — and no one has emphasized this more than me — to defeat our adversaries. But we need to fight in the right way. This is like saying that because adversaries commit war crimes, we should too. I’m not saying this rises to the level of war crimes — what I’m saying is that the essence of our values is that we have to find a way to win that preserves those values. We can’t race to the bottom. We have to have some principles.

This technology can radically accelerate what our military can do. I’ve talked to admirals, generals, and combatant commanders who say it has revolutionized what they can do — and those are just the very limited use cases deployed so far. So why focus on the 1% of use cases that are against our values, when we can pursue the 99% that advance our democratic values and defend this country? And we can even study that last 1% to understand if there’s a way to do them consistent with our values. That is our position, and I think it’s very reasonable.

Interviewer: Give me one or two examples of what could go wrong.

Dario Amodei: There are two classes of things I can imagine going wrong. One is around reliability — it targets the wrong person, shoots a civilian, doesn’t show the judgment a human soldier would show. Friendly fire, shooting civilians, making the wrong calls. We don’t want to sell something we don’t think is reliable, and we don’t want to sell something that could get our own people killed or innocent people killed.

The second is the question of oversight. With human soldiers, there’s a whole chain of accountability that assumes a human applies their common sense. Suppose I have an army of ten million drones, all coordinated by one person or a small group. It’s easy to see the accountability issues there — concentrating that much power doesn’t work. It doesn’t mean we shouldn’t ever have such a fleet. Maybe we need it at some point because our adversaries will have it. But we need to have a conversation about accountability — about who is holding the button and who can say no. And I think that’s very reasonable.

Interviewer: President Trump has called Anthropic a “left-wing woke company.” Is this decision at all driven by ideology?

Dario Amodei: I can’t speak for what other parties are doing. But we at Anthropic have tried to be very neutral. We speak up on issues of AI policy where we have expertise. We don’t take positions on general political issues, and we try to work together whenever there’s common ground. For example, I went to an event in Pennsylvania with the president and Senator McCormick about provisioning enough energy to power our AI models in the US. I spoke to the president and expressed that I agreed with many aspects of what he’s doing. We also made a pledge around using AI for health, and we’ve done a number of other things. When the administration’s AI action plan came out, we said there were many — perhaps most — aspects of it that we agreed with.

So this idea that we’ve somehow been partisan or not evenhanded is simply not accurate — we’ve been studiously evenhanded. And we can’t control it if even the president has an opinion about us. What’s under our control is that we can be reasonable, we can be neutral, and we can stand up for what we believe.

Interviewer: On a scale of one to ten, will there be an agreement with the federal government on this, or do you think this is over?

Dario Amodei: I have no crystal ball. Our position is clear — we have these two red lines, we’ve had them from day one, and we’re not going to move on them. If we can reach a point where we and the Department see things the same way, then perhaps there could be an agreement. For our part and for the sake of US national security, we continue to want to make this work. But it takes two parties to reach an agreement.

Interviewer: If you had a moment with the president right now tonight, what would you say to him?

Dario Amodei: I would say: we are patriotic Americans. Everything we have done has been for the sake of this country and for the sake of supporting US national security. Our decision to lean forward in deploying our models with the military was made because we believe in this country — in defeating our autocratic adversaries, in defending America. The red lines we drew, we drew because we believe that crossing those lines is contrary to American values, and we wanted to stand up for American values. And when we were threatened with supply chain designation and the Defense Production Act — unprecedented intrusions into the private economy by the government — we exercised our First Amendment rights to speak up and disagree. Disagreeing with the government is the most American thing in the world, and we are patriots in everything we have done here. We have stood up for the values of this country.

Interviewer: Do you think Anthropic can survive this as a business?

Dario Amodei: When Secretary Hegseth tweeted out the supply chain designation, he said something inaccurate, a claim that far exceeds their lawful authority. He said that any company with a military contract can’t do business with Anthropic at all. That is not what the law says. We put out a statement pointing to the actual law, which says only that as part of its military contracts, a company cannot use Anthropic in connection with those contracts. That is a much more limited impact.

Interviewer: So you’re confident that Anthropic can survive this?

Dario Amodei: Not only survive it — we’re going to be fine. The impact of this designation is fairly small. The nature of Secretary Hegseth’s tweet was designed to create uncertainty, to create a situation where people believed the impact would be much larger, designed to create fear, uncertainty, and doubt. But we won’t let that succeed. We will be fine.

Interviewer: Critics call this an abuse of power — what the Pentagon and the White House are doing. Do you believe it is an abuse of power?

Dario Amodei: I would return to the idea that this is unprecedented. The designation has never been applied to an American company before. And I think it was made very clear in some of their statements and language that this was retaliatory and punitive. I don’t know what else to call it — retaliatory and punitive.

Interviewer: So will you take legal action?

Dario Amodei: All we’ve received so far is a tweet. We haven’t received an actual formal supply chain designation. There’s been no actual action by the government — just tweets saying what they claim they’re going to do.

Interviewer: You haven’t received any formal information?

Dario Amodei: We haven’t received any formal information whatsoever. All we’ve seen are tweets from the president and from Secretary Hegseth. When and if we receive some kind of formal action, we will look at it, we will understand it, and we will challenge it in court.

Interviewer: What do you think that says about their ability to navigate major national security issues, if this is how they’re communicating with you?

Dario Amodei: I don’t want to make this about this particular administration or particular people. We are trying to do whatever we can to support US national security. That’s why we’re committed to finding a deal. If we can’t find a deal, that’s why we’re committed to offboarding in a smooth way that allows our warfighters to continue to be supported as they go into conflicts. And that’s why we’re committed to standing up to actions we think are not in line with the values of this country. It’s not about any particular person or any particular administration. It’s about the principle of standing up for what’s right.

Interviewer: Dario Amodei, CEO of Anthropic — thank you very much.

Dario Amodei: Thank you so much for having me.
