DOI: 10.1126/science.abc1235
A field guide to existential risk
Stepan Jerabek
2020-05-08
Journal: Science
Publication year: 2020
Abstract: On 16 July 1945, in a desert near Socorro, New Mexico, the U.S. Army conducted a test of the first nuclear weapon. The code name for the experiment, "Trinity," was coined by J. R. Oppenheimer and inspired by one of his favorite poems, John Donne's Holy Sonnet XIV, "Batter my heart, three-person'd God." Oppenheimer was also among the observers who witnessed the detonation. While watching the mushroom cloud after the explosion, a verse from the Hindu scripture the Bhagavad Gita crossed his mind: "Now I am become Death, the destroyer of worlds." Earth was not destroyed in the summer of 1945, nor after. But we can achieve long-term security only if we put deliberate effort into reducing existential risks, writes Toby Ord in The Precipice. He argues that with the Trinity test, society entered a new era, the Precipice, in which we are capable of endangering our own future.

Ord, a philosopher at the University of Oxford's Future of Humanity Institute (FHI), begins the book with an overview of civilization's journey from early settlements to modern society. Over time, we progressed through a number of major technological revolutions to achieve unprecedented power and welfare. According to Ord, this could be just the beginning of our story. He presents strong moral arguments for why it would be reasonable to make safeguarding our future a top priority. Currently, he notes, "humanity spends more on ice cream every year than on ensuring that the technologies we develop do not destroy us."

It is in society's best interest to prevent existential catastrophes by reducing threats that Ord broadly classifies into three categories: natural risks, anthropogenic risks, and future risks. His goal is not merely to aggregate research findings but to assign each threat a quantitative estimate of its potential to cause an existential catastrophe. Although the fossil record can provide useful hints for assessing certain risks, in some cases we must rely on expert opinion. In such situations, especially when the stakes are very high, Ord advocates a Bayesian approach: he first sets the prior probability of the threat (on the basis of prior knowledge or an estimate of the event's likelihood) and then updates it in light of the available scientific evidence (a minimal numeric sketch of this updating follows the abstract). This method leaves some room for subjectivity, but it also enables comparison across different risks.

According to Ord's analysis, natural perils such as asteroids and supervolcanic eruptions represent only a minor existential risk over the next century, especially when compared with the profound anthropogenic risks we already face. Despite an 80% reduction in the number of nuclear warheads since the mid-1980s, for example, he argues that it is crucial to continue the disarmament process and to raise maintenance and security standards at the remaining facilities. Some of our biggest perils, however, lie ahead, namely pandemics and unaligned artificial intelligence (AI). Intentionally developing a novel virus with the "right balance" of virulence and contagiousness would be challenging, but Ord urges serious consideration of the possibility that some entity might weaponize known pathogens or that such pathogens could accidentally be unleashed from a laboratory. It would be even more difficult to mitigate the risk posed by an extremely powerful AI that is not aligned with human values.
[This particular threat is discussed in more detail in Superintelligence, written by Ord's FHI colleague Nick Bostrom (1).] Existential risks are often positively correlated; if we succumb to one threat, we become more vulnerable to another. Nevertheless, Ord remains an optimist who believes that we can fulfill our long-term potential. He summarizes specific policy and research recommendations in an appendix, advocating, for example, reviving the Intermediate-Range Nuclear Forces Treaty and increasing the annual budget of the Biological Weapons Convention ($1.4 million in 2019, less than the average annual budget of a single McDonald's restaurant). Given the significance of this material, the book would have benefited from a full chapter offering a deeper analysis of current policy and of future strategies to prevent a cataclysm. I would also caution against setting our threshold for action at the level of a global catastrophe, which could distort how we prioritize our next decisions. But Ord's map of the existential risk landscape is an engaging read for anyone who wants to learn more about this important and interdisciplinary research.

References
1. N. Bostrom, Superintelligence: Paths, Dangers, Strategies (Oxford Univ. Press, 2014).
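The Bayesian updating described in the abstract can be made concrete with a small sketch. The Python snippet below is illustrative only and does not come from the review or the book; the function name and every number in it are hypothetical assumptions, chosen simply to show how a prior risk estimate is revised in light of evidence.

```python
# A minimal sketch of Bayesian updating of a risk estimate.
# All values here are hypothetical assumptions, not Ord's estimates.

def update(prior: float, p_evidence_if_real: float, p_evidence_if_not: float) -> float:
    """Return the posterior P(threat | evidence) via Bayes' rule."""
    p_evidence = p_evidence_if_real * prior + p_evidence_if_not * (1 - prior)
    return p_evidence_if_real * prior / p_evidence

# Hypothetical prior: a 1-in-1000 chance of catastrophe this century.
prior = 0.001

# Suppose the observed evidence is five times likelier if the threat is real.
posterior = update(prior, p_evidence_if_real=0.5, p_evidence_if_not=0.1)
print(f"posterior = {posterior:.4f}")  # ~0.0050, roughly a fivefold increase
```

As in the approach the review attributes to Ord, the subjectivity lives in the choice of prior and likelihoods; the update rule itself is mechanical, which is what makes the resulting estimates comparable across different risks.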
Field: Climate change; Resources and environment
Document type: Journal article
Identifier: http://119.78.100.173/C666/handle/2XK7JSWQ/249768
Collections: Climate change; Resources and environment science
Recommended citation:
GB/T 7714: Stepan Jerabek. A field guide to existential risk[J]. Science, 2020.
APA: Stepan Jerabek. (2020). A field guide to existential risk. Science.
MLA: Stepan Jerabek. "A field guide to existential risk." Science (2020).
Files in this item: No files associated with this item.