April 19, 2024
Doomsday Clock Identifies AI as Existential Threat

On Jan. 23, the atomic scientists behind the renowned “Doomsday Clock” set it at 90 seconds to midnight, signaling that the planet remains at imminent risk. Artificial intelligence (AI) was among the factors cited for this perilous setting. The clock gauges “existential risks” encompassing nuclear threats, climate change, and disruptive technologies, with midnight signifying an “apocalyptic” state. This year’s 90-second mark is described as a “moment of historic danger.”

The Doomsday Clock is maintained by the Bulletin of the Atomic Scientists, founded in 1945 by scientists including Albert Einstein and J. Robert Oppenheimer. AI emerged as a new concern in 2023, which The Bulletin labeled the “most significant development” in disruptive technology. The statement highlights AI’s potential to corrupt the information environment, a hurdle in addressing urgent threats, as well as its use in military and information operations.

“It is clear that AI is a paradigmatic disruptive technology. Any physical threat posed by AI must be enabled by a link to devices that can change the state of the physical world.”

The statement acknowledges growing global recognition of AI concerns: China issued regulations in August 2023, and the EU passed provisional regulations in December 2023. The US, a hub for AI development, still lacks comprehensive AI legislation but established six new standards for AI safety in October 2023. However, The Bulletin warns that AI could enable authoritarian regimes to monitor citizens and influence political elections, raising concerns about fake news and information manipulation during the 2024 global election season.


