Description

THE INSTANT NEW YORK TIMES BESTSELLER

'The most important book of the decade'
MAX TEGMARK, author of Life 3.0

'A loud trumpet call to humanity to awaken us as we sleepwalk into disaster - we must wake up' STEPHEN FRY

'The best no-nonsense, simple explanation of the AI risk problem I've ever read' YISHAN WONG, former Reddit CEO

AI is the greatest threat to our existence that we have ever faced.

The scramble to create superhuman AI has put us on the path to extinction - but it's not too late to change course. Two pioneering researchers in the field, Eliezer Yudkowsky and Nate Soares, explain why artificial superintelligence would be a global suicide bomb and call for an immediate halt to its development.

The technology may be complex, but the facts are simple: companies and countries are in a race to build machines that will be smarter than any person, and the world is devastatingly unprepared for what will come next.

Could a machine superintelligence wipe out our entire species? Would it want to? Would it want anything at all? In this urgent book, Yudkowsky and Soares explore the theory and the evidence, present one possible extinction scenario and explain what it would take for humanity to survive.

The world is racing to build something truly new - and if anyone builds it, everyone dies.

*A GUARDIAN AND NEW STATESMAN BOOK OF THE YEAR*

About the Authors

Eliezer Yudkowsky (Author)
Eliezer Yudkowsky is a founding researcher of the field of AI alignment, with influential work spanning more than twenty years. As co-founder of the non-profit Machine Intelligence Research Institute (MIRI), Yudkowsky sparked early scientific research on the problem and has played a major role in shaping the public conversation about smarter-than-human AI. He appeared on Time magazine's 2023 list of the 100 Most Influential People in AI, and has been discussed or interviewed in the New York Times, New Yorker, Newsweek, Forbes, Wired, Bloomberg, The Atlantic, The Economist, Washington Post, and elsewhere.

Nate Soares (Author)
Nate Soares is the president of the non-profit Machine Intelligence Research Institute (MIRI). He has been working in the field for over a decade, after previous experience at Microsoft and Google. Soares is the author of a large body of technical and semi-technical writing on AI alignment, including foundational work on value learning, decision theory, and power-seeking incentives in smarter-than-human AIs.

Details
Year of publication: 2025
Genre: Imports, Political Science
Category: Sciences
Format: Book
ISBN-13: 9781847928924
ISBN-10: 1847928927
Language: English
Binding: Hardcover
Authors: Yudkowsky, Eliezer; Soares, Nate
Publisher: Vintage Publishing
Responsible person for the EU: Libri GmbH, Europaallee 1, D-36244 Bad Hersfeld, gpsr@libri.de
Dimensions: 236 x 158 x 27 mm
By: Eliezer Yudkowsky (et al.)
Publication date: 18.09.2025
Weight: 0.466 kg
Item ID: 134467935