Tech

Google Abandoned Its Pledge to Never Use AI for Deadly Weapons

You can’t slip one past us, Google. We saw you remove your promise to never use AI for “technologies whose purpose contravenes widely accepted principles of international law and human rights.”

(Photo Illustration by Omar Marques/SOPA Images/LightRocket via Getty Images)

Is anyone that surprised? Anyone? This is an opportunistic mega-corporation we’re talking about, after all. The one whose original motto, “Don’t be evil,” supposedly doesn’t conflict with Google’s newfound openness to making weaponry specifically designated as lethal.

Google’s motto wasn’t deleted years ago, as was widely reported; it was just moved around within the company’s code of conduct. But up through this past week, Google had maintained specific commitments about what it would not do in its AI development.


Now entire sections—important ones—have been removed from Google’s AI principles. It isn’t just a step backward. It’s a leap backward.

“evolving” its thinking

Take a look at the previous version of Google’s page on its AI principles, archived on the Internet Archive’s Wayback Machine on January 30, 2025. Under “Applications we will not pursue,” it specified the following:

  • “Weapons or other technologies whose principal purpose or implementation is to cause or directly facilitate injury to people.”
  • “Technologies that gather or use information for surveillance violating internationally accepted norms.”
  • “Technologies whose purpose contravenes widely accepted principles of international law and human rights.”
  • “Technologies that cause or are likely to cause overall harm. Where there is a material risk of harm, we will proceed only where we believe that the benefits substantially outweigh the risks, and will incorporate appropriate safety constraints.”

Now take a look at the updated and most current version of Google’s AI principles page. Notice something missing? As in, all of the aforementioned entries?

A robotic hand presents a caution symbol. (Credit: sankai / Getty Images)

The cynic in me says that all those years ago when they were establishing their digital empires, Big Tech knew the conservative Right would be hostile, or at best apathetic, to its rise and so it was expedient to spout progressive slogans to court the tech-friendly Left that would help incubate it.

Then when these corporations became large and established enough to court and be courted by politicians who make careers out of swooning for Big Business, they would cast off their “core values” like a cheap Party City costume on November 1st.

Google’s Senior Vice President of Research, Labs, Technology & Society, James Manyika, and DeepMind CEO and co-founder Demis Hassabis published an entry on Google’s blog on February 4, 2025, regarding the update to Google’s AI principles.

They explain the reasoning behind the omissions and additions, including a few new entries under a freshly added section titled “Responsible development and deployment.” The following two are the most closely related to the deleted entries of the previous “Applications we will not pursue” section:

  • “Implementing appropriate human oversight, due diligence, and feedback mechanisms to align with user goals, social responsibility, and widely accepted principles of international law and human rights.”
  • “Promoting privacy and security, and respecting intellectual property rights.”

These read like purposefully bland commitments designed to skate around Google’s willingness to discard its own principles in the pursuit of profit. They’re nothing like the more defined, specific “what we won’t do” stipulations they’ve just jettisoned.

Like the mustache-twirling villain of a whodunit, Google may have had a tell all along, almost humorously teased like an Easter egg beneath the “Applications we will not pursue” section of its AI principles page: “As our experience in this space deepens, this list may evolve.”