
GLOBAL CATASTROPHIC RISKS SURVEY (2008)
Technical Report 2008/1
Published by Future of Humanity Institute, Oxford University
Anders Sandberg and Nick Bostrom

At the Global Catastrophic Risk Conference in Oxford (17-20 July, 2008) an informal survey was circulated among participants, asking them to make their best guess at the chance that there will be disasters of different types before 2100. This report summarizes the main results.

The median extinction risk estimates were:

Risk                                               At least 1      At least 1      Human
                                                   million dead    billion dead    extinction
Number killed by molecular nanotech weapons        25%             10%             5%
Total killed by superintelligent AI                10%             5%              5%
Total killed in all wars (including civil wars)    98%             30%             4%
Number killed in the biggest engineered pandemic   30%             10%             2%
Total killed in all nuclear wars                   30%             10%             1%
Number killed in the biggest nanotech accident     5%              1%              0.5%
Number killed in the biggest natural pandemic      60%             5%              0.05%
Total killed in all acts of nuclear terrorism      15%             1%              0.03%
Overall risk of extinction prior to 2100           n/a             n/a             19%

These results should be taken with a grain of salt. Non-responses have been omitted, although some might represent a statement of zero probability rather than no opinion.
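Because non-responses were dropped rather than counted as zeros, the reported medians depend on that choice. A minimal sketch of the difference, using made-up response values (the actual per-respondent data is not reproduced here):

    import statistics

    # Hypothetical responses (percent) for one risk question;
    # None marks a non-response. Illustrative values only.
    responses = [30, 10, 25, None, 5, None, 50, 19, 10]

    answered = [r for r in responses if r is not None]     # non-responses omitted
    as_zero = [0 if r is None else r for r in responses]   # blanks read as 0%

    print(statistics.median(answered))  # median over those who answered
    print(statistics.median(as_zero))   # lower median if blanks mean "zero probability"

If even a few blanks were intended as zero-probability answers, the omitted-response median overstates the group's central estimate, which is why the caveat above matters.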

There are likely to be many cognitive biases that affect the results, such as unpacking bias and the availability heuristic, as well as old-fashioned optimism and pessimism. In Appendix A the results are plotted with individual response distributions visible.

Other Risks

The list of risks was not intended to include all of the biggest risks. Respondents were invited to contribute their own global catastrophic risks, listing risks they considered significant. Several suggested totalitarian world government, climate-induced disasters, ecological/resource crunches, and other unspecified or unknowable threats. Other suggestions were asteroid/comet impacts, bad crisis management, high-tech asymmetric war attacking brittle IT-based societies, back-contamination from space probes, electromagnetic pulses, genocides/democides, risks from physics research, and degradation of quality assurance.

Suggestions

Respondents were also asked what they would recommend to policymakers. Several argued for nuclear disarmament, or at least lowering the number of weapons below the threshold for existential catastrophe, as well as reducing stocks of highly enriched uranium and making nuclear arsenals harder to launch accidentally. One option discussed was the formation of global biotech-related governance, legislation, and enforcement, or even a global body like the IPCC or UNFCCC to study and act on catastrophic risk. At the very least there was much interest in developing defenses against misuses of biotechnology, and a recognition of the need for unbiased early-detection systems for a variety of risks, be they near-Earth objects or actors with WMD capabilities. Views on emerging technologies such as nanotech, AI, and cognition enhancement were mixed: some proposed avoiding funding them; others proposed deliberate crash programs to ensure the technologies would be in the right hands, the risks understood, and the technologies usable against other catastrophic risks. Other suggestions included raising awareness of the problem, more research on cyber-security issues, building societal resiliency in depth, preparing for categories of disasters rather than individual types, building refuges, and changing energy-consumption patterns.

Appendix A

Below are the individual results, shown as grey dots (jittered for distinguishability) with the median as a bar.

[Figure panels, one per risk, showing individual responses and median bars: Total killed in all acts of nuclear terrorism; Total killed in all nuclear wars; Number killed in the biggest natural pandemic; Number killed in the biggest engineered pandemic; Total killed by superintelligent AI; Number killed in the biggest nanotech accident; Number killed by molecular nanotech weapons; Total killed in all wars (including civil wars); Total risk of extinction (median 19%). The medians match the table above.]
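Plots in this style (grey jittered dots with a vertical median bar) can be reproduced from raw responses roughly as follows; matplotlib is assumed, and the response values are hypothetical rather than the survey data:

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)

    # Hypothetical responses (percent) for one question; not the actual survey data.
    responses = np.array([30.0, 10.0, 25.0, 5.0, 50.0, 19.0, 10.0, 40.0])

    # Grey dots, jittered vertically so overlapping answers stay distinguishable.
    jitter = rng.uniform(-0.1, 0.1, size=responses.size)
    plt.scatter(responses, jitter, color="grey", alpha=0.7)

    # Median drawn as a vertical bar through the strip of dots.
    plt.axvline(np.median(responses), color="black", linewidth=2)

    plt.yticks([])
    plt.xlabel("Estimated probability (%)")
    plt.title("Total risk of extinction (illustrative)")
    plt.show()

The vertical jitter carries no information; it only spreads identical answers apart so each respondent's dot remains visible.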