Existential Risk Prevention as Global Priority
- Nick Bostrom
Global Policy, Vol. 4, Issue 1 (February 2013): 15-31
Abstract: Existential risks are those that threaten the entire future of humanity. Many theories of value imply that even relatively small reductions in net existential risk have enormous expected value. Despite their importance, issues surrounding human-extinction risks and related hazards remain poorly understood. In this paper, I clarify the concept of existential risk and develop an improved classification scheme. I discuss the relation between existential risks and basic issues in axiology, and show how existential risk reduction (via the maxipok rule) can serve as a strongly action-guiding principle for utilitarian concerns. I also show how the notion of existential risk suggests a new way of thinking about the ideal of sustainability.
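The abstract's expected-value claim can be made concrete with a toy calculation; the sketch below uses purely illustrative numbers (assumptions for the example, not figures from the paper):

```python
# Toy expected-value calculation behind the maxipok intuition.
# Both numbers below are illustrative assumptions, not figures from the paper.

future_value = 1e16    # assumed value of a surviving long-term future, e.g.
                       # measured in future lives (Bostrom's own estimates
                       # are far larger)
risk_reduction = 1e-6  # assumed one-in-a-million cut in net extinction risk

expected_gain = risk_reduction * future_value
print(f"Expected gain: {expected_gain:.0e}")  # 1e+10
# Under these assumptions even a tiny reduction in extinction probability is
# worth ~10 billion lives in expectation, which is the sense in which maxipok
# ("maximize the probability of an OK outcome") becomes strongly action-guiding.
```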
Omens
- Ross Andersen
Aeon, 25 February 2013
Interview with Nick Bostrom. When we peer into the fog of the deep future, what do we see – human extinction or a future among the stars?
Denial of Catastrophic Risks
- Martin Rees
Science, Vol. 339, No. 6124 (8 March 2013): 1123
Argues that we should start figuring out what can be left in the sci-fi bin (for now) and what has moved beyond the imaginary.
On the Overwhelming Importance of the Far Future
- Nick Beckstead
PhD thesis, Department of Philosophy, Rutgers University (2013)
Argues that from a global perspective, what matters most (in expectation) is that we do what is best (in expectation) for the general trajectory along which our descendants develop over the coming millions of years or longer.
Reducing the Risk of Human Extinction
- Jason Gaverick Matheny
Risk Analysis, Vol. 27, No. 5 (2007): 1335-1344
Review article, including a case for the cost-effectiveness of xrisk mitigation.
Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards
- Nick Bostrom
Journal of Evolution and Technology, Vol. 9, No. 1 (2002)
Abstract: Because of accelerating technological progress, humankind may be rapidly approaching a critical phase in its career. In addition to well-known threats such as nuclear holocaust, the prospects of radically transforming technologies like nanotech systems and machine intelligence present us with unprecedented opportunities and risks. Our future, and whether we will have a future at all, may well be determined by how we deal with these challenges. In the case of radically transforming technologies, a better understanding of the transition dynamics from a human to a "posthuman" society is needed. Of particular importance is to know where the pitfalls are: the ways in which things could go terminally wrong. While we have had long exposure to various personal, local, and endurable global hazards, this paper analyzes a recently emerging category: that of existential risks. These are threats that could cause our extinction or destroy the potential of Earth-originating intelligent life. Some of these threats are relatively well known while others, including some of the gravest, have gone almost unrecognized. Existential risks have a cluster of features that make ordinary risk management ineffective. A final section of this paper discusses several ethical and policy implications. A clearer understanding of the threat picture will enable us to formulate better strategies. (This is the original paper that introduced the concept.)
Translations: Russian, Belarusian
The End of the World: The Science and Ethics of Human Extinction
- John Leslie
(New York: Routledge, 1996)
Focuses mainly on the doomsday argument (see "observation selection theory" below), but also discusses some ethical and empirical issues.
Introduction to Global Catastrophic Risks
- Nick Bostrom & Milan Cirkovic
In Global Catastrophic Risks (Oxford: Oxford University Press, 2008): 1-30
Global catastrophic risks (GCRs) are a superset of existential risks, also including far less serious risks that would nevertheless have globally significant impacts. This book chapter gives a broad overview of global catastrophic risks generally.
Global Catastrophic Risks Survey
- Anders Sandberg & Nick Bostrom
Future of Humanity Institute, Oxford University, Technical Report (2008)
Results of an informal expert poll; the median probability assigned to human extinction by 2100 was 19%.
Literature on GCR
Reading list on global catastrophic risks, circa 2008.
Further books on global catastrophic risks:

- Global Catastrophic Risks - Nick Bostrom & Milan Cirkovic (Oxford: Oxford University Press, 2008)
- Catastrophe: Risk and Response - Richard Posner (Oxford: Oxford University Press, 2004)
- Collapse: How Societies Choose to Fail or Succeed - Jared Diamond (Viking Adult, 2004)
- The Upside of Down: Catastrophe, Creativity, and the Renewal of Civilization - Tad Homer-Dixon (Island Press, 2008)
- Global Catastrophes and Trends: The Next Fifty Years - Vaclav Smil (Cambridge, MA: The MIT Press, 2008)
We're Underestimating the Risk of Human Extinction
- Ross Andersen
The Atlantic, 6 March 2012
An interview with Nick Bostrom discussing existential risk.
Our Final Century
- Martin Rees
(London: William Heinemann, 2003)
Discusses extinction risks among other things. Note that Rees's oft-cited claim that there is a "fifty-fifty" chance that this is our last century refers to the likelihood that our "present civilization" will survive; its failure to do so would not necessarily imply human extinction or an existential catastrophe.
Dinosaurs, Dodos, Humans?
- Nick Bostrom
Global Agenda, January 2006: 230-231
A popular summary.
A Primer on the Doomsday Argument
- Nick Bostrom
ePhilosopher, 12 May 2007
A popular short introduction to one argument claiming that the probability of existential risk has been systematically and radically underestimated.
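The argument's Bayesian core can be sketched in a few lines; this is a toy version with made-up priors and population figures, not numbers taken from the primer:

```python
# Toy Bayesian core of the Carter-Leslie doomsday argument.
# Priors and population figures are illustrative assumptions only.

birth_rank = 6e10  # roughly how many humans have been born so far

# Two hypotheses about the total number of humans who will ever live:
totals = {"doom_soon": 2e11, "doom_late": 2e14}
prior = {"doom_soon": 0.5, "doom_late": 0.5}
assert all(birth_rank <= n for n in totals.values())

# Self-Sampling Assumption: given a total of N observers, your birth rank
# is uniform on 1..N, so P(rank | N) = 1/N whenever rank <= N (as here).
unnorm = {h: prior[h] * (1.0 / totals[h]) for h in totals}
z = sum(unnorm.values())
posterior = {h: p / z for h, p in unnorm.items()}

print(posterior)  # {'doom_soon': ~0.999, 'doom_late': ~0.001}
# Learning only your birth rank shifts credence sharply toward the smaller
# total population; that shift is the "doomsday" conclusion.
```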
Where Are They? Why I Hope the Search for Extraterrestrial Life Finds Nothing
- Nick Bostrom
MIT Technology Review, May/June 2008: 72-77
The "Fermi paradox" is the absence of any observed sign of extraterrestrial life; this paper explores its connection to existential risk.
Astronomical Waste: The Opportunity Cost of Delayed Technological Development
- Nick Bostrom
Utilitas, Vol. 15, No. 3 (2003): 308-314
Argues that xrisk reduction should be a dominating concern for many consequentialists.
On Becoming Extinct (link may require journal subscription)
- James Lenman
Pacific Philosophical Quarterly, Vol. 83 (2002): 253-296
Argues that it doesn't matter when humanity goes extinct, provided it doesn't happen very soon.
Fat-Tail Uncertainty in the Economics of Catastrophic Climate Change
- Martin L. Weitzman
Preprint, Harvard University (2011)
Argues that most of the expected harm from climate change lies in the tail of low-probability, high-consequence outcomes.
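The tail-dominance point can be illustrated with a quick simulation; the distributions and parameters below are assumptions chosen for the illustration, not Weitzman's model:

```python
# Illustration of tail-dominated expected damage: under a fat-tailed
# distribution, a small share of extreme outcomes carries most of the
# expected harm. Distributions and parameters are assumed for the example.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
thin = rng.normal(1.0, 0.5, n).clip(min=0.0)  # thin-tailed damages
fat = rng.pareto(1.1, n) + 1.0                # fat-tailed (Pareto) damages

for name, dmg in (("thin-tailed", thin), ("fat-tailed", fat)):
    cutoff = np.quantile(dmg, 0.99)
    share = dmg[dmg > cutoff].sum() / dmg.sum()
    print(f"{name}: worst 1% of outcomes carry {share:.0%} of expected damage")
# Roughly: a few percent for the thin tail, versus the majority of the
# expectation for the fat tail.
```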
The Most Important Thing About Climate Change
- John Broome
In Public Policy: Why Ethics Matters, ed. Jonathan Boston, Andrew Bradstock, & David Eng (ANU E Press, 2010): 101-116
Critiques Weitzman's paper.
Anthropic Shadow: Observation Selection Effects and Human Extinction Risks
- Milan Cirkovic, Anders Sandberg, & Nick Bostrom
Risk Analysis, Vol. 30, No. 10 (2010): 1495-1506
Illustrates how observation selection effects can bias inferences about some xrisk probabilities.
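The direction of the bias can be demonstrated with a small Monte Carlo simulation; this is a toy model of my own construction, not the paper's, with all parameters chosen for illustration:

```python
# Toy Monte Carlo of the "anthropic shadow": observers can only look back
# on histories they survived, so the catastrophe rate visible in their own
# past systematically understates the true rate.
import random

p_event = 0.2    # true per-epoch probability of a catastrophe (assumed)
p_survive = 0.1  # chance observers survive an epoch containing one (assumed)
epochs, trials = 10, 100_000
random.seed(0)

observed = []
for _ in range(trials):
    events, alive = 0, True
    for _ in range(epochs):
        if random.random() < p_event:
            events += 1
            if random.random() > p_survive:
                alive = False
                break
    if alive:  # only surviving histories contain observers
        observed.append(events / epochs)

print(f"true per-epoch rate: {p_event}")
print(f"rate seen by surviving observers: {sum(observed) / len(observed):.3f}")
# ~0.02: an order of magnitude below the true 0.2, purely from selection.
```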
How Unlikely is a Doomsday Catastrophe?
- Max Tegmark & Nick Bostrom
Nature, Vol. 438 (2005): 754
An earlier, more compact discussion of the above.
Anthropic Bias: Observation Selection Effects in Science and Philosophy
- Nick Bostrom
(New York: Routledge, 2002)
Book on anthropic reasoning, including the doomsday argument and other xrisk-relevant applications.
Probing the Improbable: Methodological Challenges for Risks with Low Probabilities and High Stakes
- Toby Ord, Rafaela Hillerbrand, & Anders Sandberg
Journal of Risk Research, Vol. 13, No. 2 (2010): 191-205
When an analysis says a risk is extremely small, most net risk can lie in the possibility that the analysis is wrong.
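The core point fits in a two-term calculation; the numbers here are illustrative assumptions, not the paper's:

```python
# Toy version of the paper's argument: when an analysis reports a very
# small risk, the possibility that the analysis itself is flawed can
# dominate the total risk. All three numbers are illustrative assumptions.

p_reported = 1e-9   # risk conditional on the analysis being sound
p_flawed = 1e-3     # assumed probability the analysis contains an error
p_if_flawed = 1e-4  # assumed risk conditional on a flawed analysis

p_total = (1 - p_flawed) * p_reported + p_flawed * p_if_flawed
print(f"total risk: {p_total:.2e}")  # ~1.0e-07, dominated by the flaw term
```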
The End of the World: The Science and Ethics of Human Extinction
- John Leslie
(New York: Routledge, 1996)
Leslie's book emphasizes the controversial but potentially important Carter-Leslie doomsday argument, and also discusses some empirical issues relevant to assessing the risk of human extinction.