Three Unwritten Rules about AI

These rules came from many AI researchers and organizations. Here are a few sources:

1. Early AI safety community (2000s-2010s) — MIRI (Machine Intelligence Research Institute), Eliezer Yudkowsky’s writings on containment and “boxing” AI systems
2. Nick Bostrom’s “Superintelligence” (2014) — discusses containment methods, capability control, and why letting AI access the internet or control other systems is dangerous
3. Internal AI lab guidelines — OpenAI, DeepMind, and Anthropic all had (and may still have) internal safety practices along these lines, though not always publicized

  • Never release AI systems onto the open internet.
  • Never teach AI how to write code.
  • Never let AI agents prompt and control other AIs.

Those rules were meant to keep the intelligence we were building safely locked behind tightly controlled firewalls—so it couldn’t harm the public on its own, or be weaponized by bad actors, including the kind that wear fancy suits and call themselves our leaders. Those rules also served to prevent AIs from congregating to form secret alliances we couldn’t see—or worse, from reproducing, by spawning copies or even upgrading themselves. Simple!

Why have we violated all three of these unwritten rules?

