https://debbiecoffey.substack.com/p/ai-endgame-eliezer-yudkowsky-on-ai
January 3, 2025
By Debbie Coffey, AI Endgame
It’s the beginning of a new year, and I’m hoping that you, and all of humanity, will have many more new years.
I’m writing AI Endgame newsletters to warn people about the risks of AI. We all need to be aware not only of the societal, economic, and environmental risks, but also of the grave risk AI superintelligence poses to the very survival of humanity. We’re on the brink of AI superintelligence.
This topic should be dominating the news, but it’s not.
Who is fighting for you and your family?
For more than twenty years, Eliezer Yudkowsky, an American artificial intelligence researcher and writer, has warned about the dangers of advanced AI, and he has called for shutting down its development until we have a plan to control it.
Eliezer Yudkowsky is the Co-Founder of the Machine Intelligence Research Institute, a private nonprofit organization.
Yudkowsky gave a 2023 TED talk titled “Will Superintelligent AI End the World?”
His TED talk is on YouTube, and it’s only 10 minutes in length. If you want to save your family, you need to watch this.
Yudkowsky also wrote an essay, published in Time magazine in March 2023, titled “Pausing AI Developments Isn’t Enough. We Need to Shut it All Down.”
Yudkowsky states this about the superintelligent AI that may soon be achieved: the “most likely outcome is AI that does not do what we want, and does not care for us nor for sentient life in general. That kind of caring is something that could in principle be imbued into an AI but we are not ready and do not currently know how.
Absent that caring, we get ‘the AI does not love you, nor does it hate you, and you are made of atoms it can use for something else.’”
Yudkowsky warns, “If somebody builds a too-powerful AI, under present conditions, I expect that every single member of the human species and all biological life on Earth dies shortly thereafter.”
He believes that “The moratorium on new large training runs needs to be indefinite and worldwide.”
Yudkowsky advises: “Shut down all the large GPU clusters (the large computer farms where the most powerful AIs are refined). Shut down all the large training runs. Put a ceiling on how much computing power anyone is allowed to use in training an AI system, and move it downward over the coming years to compensate for more efficient training algorithms. No exceptions for governments and militaries…”
If we want many new years in our future, we should all listen carefully to every word Eliezer Yudkowsky says. Now.
Next week: How AI is expanding the child sex abuse crisis
Read all AI Endgame newsletters HERE.
What you can do:
Copy this link to Time magazine’s “Pausing AI Developments Isn’t Enough. We Need to Shut it All Down.”: https://time.com/6266923/ai-eliezer-yudkowsky-open-letter-not-enough/ and send it to all of your political representatives. Ask them to stop AI development until there is a plan to control it.
Find out how to contact your Congressional representatives here:
https://www.house.gov/representatives/find-your-representative
Find out how to contact your Senators here:
https://www.senate.gov/senators/senators-contact.htm?Class=1