A global catastrophic risk is a hypothetical
future event which could damage human well-being
on a global scale, even crippling or destroying
modern civilization.
An event that could cause human extinction
or permanently and drastically curtail humanity's
potential is known as an existential risk.

Potential
global catastrophic risks include anthropogenic
risks, caused by humans (technology, governance,
climate change), and non-anthropogenic or
external risks.
Examples of technology risks are hostile artificial
intelligence and destructive biotechnology
or nanotechnology.
Insufficient or malign global governance creates
risks in the social and political domain,
such as a global war, including nuclear holocaust,
bioterrorism using genetically modified organisms,
cyberterrorism destroying critical infrastructure
like the electrical grid; or the failure to
manage a natural pandemic.
Problems and risks in the domain of earth
system governance include global warming,
environmental degradation, including extinction
of species, famine as a result of non-equitable
resource distribution, human overpopulation,
crop failures and non-sustainable agriculture.
Examples of non-anthropogenic risks are an
asteroid impact event, a supervolcanic eruption,
a lethal gamma-ray burst, a geomagnetic storm
destroying electronic equipment, natural long-term
climate change, hostile extraterrestrial life,
or the predictable Sun transforming into a
red giant star engulfing the Earth.
== Classifications ==
=== Global catastrophic vs. existential ===
A "global catastrophic risk" is any risk that
is at least "global" in scope, and is not
subjectively "imperceptible" in intensity.
Those that are at least "trans-generational"
(affecting all future generations) in scope
and "terminal" in intensity are classified
as existential risks.
While a global catastrophic risk may kill
the vast majority of life on earth, humanity
could still potentially recover.
An existential risk, on the other hand, is
one that either destroys humanity (and, presumably,
all but the most rudimentary species of non-human
lifeforms and/or plant life) entirely or at
least prevents any chance of civilization
recovering.

Similarly, in Catastrophe: Risk
and Response, Richard Posner singles out and
groups together events that bring about "utter
overthrow or ruin" on a global, rather than
a "local or regional" scale.
Posner singles out such events as worthy of
special attention on cost-benefit grounds
because they could directly or indirectly
jeopardize the survival of the human race
as a whole.
Posner's events include meteor impacts, runaway
global warming, grey goo, bioterrorism, and
particle accelerator accidents.
Researchers have difficulty studying human
extinction directly, since humanity has never
been destroyed before.
While this does not mean that it will not
be in the future, it does make modelling existential
risks difficult, due in part to survivorship
bias.
However, civilizations have vanished rather
frequently in human history.
== Likelihood ==
Some risks are due to phenomena that have
occurred in earth's past and left a geological
record.
Together with contemporary observations, it
is possible to make informed estimates of
the likelihood such events will occur in the
future.
For example, the probability of an extinction-level
comet or asteroid impact before the year 2100
has been estimated at one in a million.
Supervolcanoes are another example.
There are several known supervolcanoes, including
Mt. Toba, which some say almost wiped out
humanity at the time of its last eruption.
The geologic record suggests this particular
supervolcano re-erupts about every 50,000
years.
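Such recurrence intervals translate into rough probabilities over a given horizon if eruptions and impacts are treated as a memoryless (Poisson) process; a minimal sketch, where the Poisson assumption and the function name are illustrative choices rather than anything asserted by the sources:

import math

# Probability of at least one event in a time window, treating events
# with a known average recurrence interval as a Poisson process.
# (The memoryless assumption is a simplification of the geological record.)
def prob_at_least_one(recurrence_years, window_years):
    return 1.0 - math.exp(-window_years / recurrence_years)

# Toba-scale supereruption, ~50,000-year recurrence, over the next century:
print(prob_at_least_one(50_000, 100))   # ~0.002, i.e. about 0.2%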
Without the benefit of geological records
and direct observation, the relative danger
posed by other threats is much more difficult
to calculate.
In addition, it is one thing to estimate the
likelihood of an event taking place, something
else to assess how likely an event is to cause
extinction if it does occur, and most difficult
of all, the risk posed by synergistic effects
of multiple events taking place simultaneously.

Given
the limitations of ordinary calculation and
modeling, expert elicitation is frequently
used instead to obtain probability estimates.
In 2008, an informal survey of experts on
different global catastrophic risks at the
Global Catastrophic Risk Conference at the
University of Oxford suggested a 19% chance
of human extinction by the year 2100.
The conference report cautions that the results
should be taken "with a grain of salt": they
were not meant to capture all large risks,
they did not include some risks such as climate
change, and they likely reflect many cognitive
biases of the conference participants.
Selected estimates from that survey of the
probability of human extinction by 2100:

Risk                      Estimated probability
Overall                   19%
Superintelligent AI       5%
Wars (all)                4%
Engineered pandemic       2%
Nuclear war               1%

Table source: Future of Humanity Institute, 2008.

The 2016 annual report by the Global
Challenges Foundation estimates that an average
American is more than five times more likely
to die during a human-extinction event than
in a car crash.
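The comparison rests on simple annual-probability arithmetic; a back-of-the-envelope sketch follows, in which the 0.1% annual extinction probability and the US traffic-fatality figures are illustrative assumptions, not the report's own inputs:

# Back-of-the-envelope version of the extinction-vs-car-crash comparison.
# All input figures are illustrative assumptions.
annual_extinction_prob = 0.001        # assumed 0.1% chance per year of an extinction event
us_road_deaths_per_year = 35_000      # rough order of magnitude for US traffic fatalities
us_population = 320_000_000

# In an extinction event everyone dies, so an individual's annual risk
# equals the event's annual probability.
p_extinction = annual_extinction_prob
p_car_crash = us_road_deaths_per_year / us_population   # ~1.1e-4 per year

print(p_extinction / p_car_crash)   # ~9x under these assumed inputs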
There are significant methodological
challenges in estimating these risks with
precision.
Most attention has been given to risks to
human civilization over the next 100 years,
but forecasting for this length of time is
difficult.
The types of threats posed by nature have
been argued to be relatively constant, though
this has been disputed, and new risks could
be discovered.
Anthropogenic threats, however, are likely
to change dramatically with the development
of new technology; while volcanoes have been
a threat throughout history, nuclear weapons
have only been an issue since the 20th century.
Historically, the ability of experts to predict
the future over these timescales has proved
very limited.
Man-made threats such as nuclear war or nanotechnology
are harder to predict than natural threats,
due to the inherent methodological difficulties
in the social sciences.
In general, it is hard to estimate the magnitude
of the risk from this or other dangers, especially
as both international relations and technology
can change rapidly.
Existential risks pose unique challenges to
prediction, even more than other long-term
events, because of observation selection effects.
Unlike with most events, the failure of a
complete extinction event to occur in the
past is not evidence against their likelihood
in the future, because every world that has
experienced such an extinction event has no
observers, so regardless of their frequency,
no civilization observes existential risks
in its history.
These anthropic issues can be avoided by looking
at evidence that does not have such selection
effects, such as asteroid impact craters on
the Moon, or directly evaluating the likely
impact of new technology.

In addition to known
and tangible risks, unforeseeable black swan
extinction events may occur, presenting an
additional methodological problem.
== Moral importance of existential risk ==
Some scholars have strongly favored reducing
existential risk on the grounds that it greatly
benefits future generations.
Derek Parfit argues that extinction would
be a great loss because our descendants could
potentially survive for four billion years
before the expansion of the Sun makes the
Earth uninhabitable.
Nick Bostrom argues that there is even greater
potential in colonizing space.
If future humans colonize space, they may
be able to support a very large number of
people on other planets, potentially lasting
for trillions of years.
Therefore, reducing existential risk by even
a small amount would have a very significant
impact on the expected number of people who
will exist in the future.
Exponential discounting might make these future
benefits much less significant.
However, Jason Matheny has argued that such
discounting is inappropriate when assessing
the value of existential risk reduction.
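The expected-value arithmetic behind this argument, and the effect discounting has on it, can be sketched with deliberately round, illustrative numbers (a Bostrom-style count of potential future lives and a one-in-a-million risk reduction, neither taken from a specific paper):

# Expected lives saved by a small reduction in extinction probability.
# Both inputs are round, illustrative assumptions.
future_lives = 1e16        # assumed number of potential future people
risk_reduction = 1e-6      # extinction probability reduced by one in a million
print(future_lives * risk_reduction)   # 1e10 expected lives saved

# Exponential discounting at rate r weights a benefit t years away
# by (1 - r) ** t, which collapses far-future value toward zero.
r = 0.03
for t in (100, 1_000, 10_000):
    print(t, (1 - r) ** t)   # ~0.048 at t=100; effectively zero beyond that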
Some economists have discussed the importance of
global catastrophic risks, though not existential
risks.
Martin Weitzman argues that most of the expected
economic damage from climate change may come
from the small chance that warming greatly
exceeds the mid-range expectations, resulting
in catastrophic damage.
Richard Posner has argued that we are doing
far too little, in general, about small, hard-to-estimate
risks of large-scale catastrophes.

Numerous
cognitive biases can influence people's judgment
of the importance of existential risks, including
scope insensitivity, hyperbolic discounting,
availability heuristic, the conjunction fallacy,
the affect heuristic, and the overconfidence
effect.

Scope insensitivity influences how
bad people consider the extinction of the
human race to be.
For example, when people are motivated to
donate money to altruistic causes, the quantity
they are willing to give does not increase
linearly with the magnitude of the issue:
people are roughly as concerned about 200,000
birds getting stuck in oil as they are about
2,000.
Similarly, people are often more concerned
about threats to individuals than to larger
groups.

There are economic reasons that can
explain why so little effort is going into
existential risk reduction.
Existential risk reduction is a global public
good: even if a large nation decreases the
risk, that nation will enjoy only a small
fraction of the benefit of doing so.
Furthermore, the vast majority of the benefits
may be enjoyed by far future generations,
and though these quadrillions of future people
would in theory perhaps be willing to pay
massive sums for existential risk reduction,
no mechanism for such a transaction exists.
== Potential sources of risk ==
Some sources of catastrophic risk are natural,
such as meteor impacts or supervolcanoes.
Some of these have caused mass extinctions
in the past.
On the other hand, some risks are man-made,
such as global warming, environmental degradation,
engineered pandemics and nuclear war.
=== Anthropogenic ===
The Cambridge Project at Cambridge University
states that the "greatest threats" to the
human species are man-made; they are artificial
intelligence, global warming, nuclear war,
and rogue biotechnology.
The Future of Humanity Institute also states
that human extinction is more likely to result
from anthropogenic causes than natural causes.
==== Artificial intelligence ====
It has been suggested that learning computers
that rapidly become superintelligent may take
unforeseen actions, or that robots would out-compete
humanity (one technological singularity scenario).
Because of its exceptional scheduling and
organizational capability and the range of
novel technologies it could develop, it is
possible that the first Earth superintelligence
to emerge could rapidly become matchless and
unrivaled: conceivably it would be able to
bring about almost any possible outcome, and
be able to foil virtually any attempt that
threatened to prevent it achieving its objectives.
If it chose, it could eliminate any challenging
rival intellects; alternatively it might manipulate
or persuade them to change their behavior
towards its own interests, or it may merely
obstruct their attempts at interference.
In his book Superintelligence: Paths, Dangers,
Strategies, Bostrom defines this as the control
problem.
Physicist Stephen Hawking, Microsoft founder
Bill Gates and SpaceX founder Elon Musk have
echoed these concerns, with Hawking theorizing
that such AI could "spell the end of the
human race".

In 2009, the Association for the
Advancement of Artificial Intelligence (AAAI)
hosted a conference to discuss whether computers
and robots might be able to acquire any sort
of autonomy, and how much these abilities
might pose a threat or hazard.
They noted that some robots have acquired
various forms of semi-autonomy, including
being able to find power sources on their
own and being able to independently choose
targets to attack with weapons.
They also noted that some computer viruses
can evade elimination and have achieved "cockroach
intelligence."
They noted that self-awareness as depicted
in science-fiction is probably unlikely, but
that there were other potential hazards and
pitfalls.
Various media sources and scientific groups
have noted separate trends in differing areas
which might together result in greater robotic
functionalities and autonomy, and which pose
some inherent concerns.

A survey of AI experts
estimated that the chance of human-level machine
intelligence having an "extremely bad (e.g., human
extinction)" long-term effect on humanity
is 5%.
A 2008 survey by the Future of Humanity Institute
estimated a 5% probability of extinction by
superintelligence by 2100.
Eliezer Yudkowsky believes that risks from
artificial intelligence are harder to predict
than any other known risks due to bias from
anthropomorphism.
Since people base their judgments of artificial
intelligence on their own experience, he claims
that they underestimate the potential power
of AI.
==== Biotechnology ====
Biotechnology can pose a global catastrophic
risk in the form of bioengineered organisms
(viruses, bacteria, fungi, plants or animals).
In many cases the organism will be a pathogen
of humans, livestock, crops or other organisms
we depend upon (e.g. pollinators or gut bacteria).
However, any organism able to catastrophically
disrupt ecosystem functions, e.g. highly competitive
weeds that outcompete essential crops, poses
a biotechnology risk.
A biotechnology catastrophe may be caused
by accidentally releasing a genetically engineered
organism from controlled environments, by
the planned release of such an organism which
then turns out to have unforeseen and catastrophic
interactions with essential natural or agro-ecosystems,
or by the intentional use of biological agents
in biological warfare or bioterrorism attacks.
Pathogens may be intentionally or unintentionally
genetically modified to change virulence and
other characteristics.
For example, a group of Australian researchers
unintentionally changed characteristics of
the mousepox virus while trying to develop
a virus to sterilize rodents.
The modified virus became highly lethal even
in vaccinated and naturally resistant mice.
The technological means to genetically modify
virus characteristics are likely to become
more widely available in the future if not
properly regulated.

Terrorist applications
of biotechnology have historically been infrequent.
To what extent this is due to a lack of capabilities
or motivation is not resolved.
However, given current development, more risk
from novel, engineered pathogens is to be
expected in the future.
Exponential growth has been observed in the
biotechnology sector, and Noun and Chyba predict
that this will lead to major increases in
biotechnological capabilities in the coming
decades.
They argue that risks from biological warfare
and bioterrorism are distinct from nuclear
and chemical threats because biological pathogens
are easier to mass-produce and their production
is hard to control (especially as the technological
capabilities are becoming available even to
individual users).
In 2008, a survey by the Future of Humanity
Institute estimated a 2% probability of extinction
from engineered pandemics by 2100.

Noun and
Chyba propose three categories of measures
to reduce risks from biotechnology and natural
pandemics: regulation or prevention of potentially
dangerous research, improved recognition of
outbreaks, and development of facilities to
mitigate disease outbreaks (e.g. better and/or
more widely distributed vaccines).
==== Cyberattack ====
Cyberattacks have the potential to destroy
everything from personal data to electric
grids.
Christine Peterson, co-founder and past president
of the Foresight Institute, believes a cyberattack
on electric grids has the potential to be
a catastrophic risk.
==== Environmental disaster ====
An environmental or ecological disaster, such
as world crop failure and collapse of ecosystem
services, could be induced by the present
trends of overpopulation, economic development,
and non-sustainable agriculture.
Most environmental scenarios involve one or
more of the following: Holocene extinction
event, scarcity of water that could lead to
approximately one half of the Earth's population
being without safe drinking water, pollinator
decline, overfishing, massive deforestation,
desertification, climate change, or massive
water pollution episodes.
One threat in this direction, detected in
the early 21st century, is colony collapse
disorder, a phenomenon that might foreshadow
the imminent extinction of the Western honeybee.
As the bee plays a vital role in pollination,
its extinction would severely disrupt the
food chain.
An October 2017 report published in The Lancet
stated that toxic air, water, soils, and workplaces
were collectively responsible for 9 million
deaths worldwide in 2015, particularly from
air pollution which was linked to deaths by
increasing susceptibility to non-infectious
diseases, such as heart disease, stroke, and
lung cancer.
The report warned that the pollution crisis
was exceeding "the envelope on the amount
of pollution the Earth can carry" and "threatens
the continuing survival of human societies".
==== Experimental technology accident ====
Nick Bostrom suggested that in the pursuit
of knowledge, humanity might inadvertently
create a device that could destroy Earth and
the Solar System.
Investigations in nuclear and high-energy
physics could create unusual conditions with
catastrophic consequences.
For example, scientists worried that the first
nuclear test might ignite the atmosphere.
Others worried that the RHIC or the Large
Hadron Collider might start a chain-reaction
global disaster involving black holes, strangelets,
or false vacuum states.
These particular concerns have been refuted,
but the general concern remains.
Biotechnology could lead to the creation of
a pandemic, chemical warfare could be taken
to an extreme, nanotechnology could lead to
grey goo in which out-of-control self-replicating
robots consume all living matter on earth
while building more of themselves—in both
cases, either deliberately or by accident.
==== Global warming ====
Global warming refers to the warming caused
by human technology since the 19th century
or earlier.
Projections of future climate change suggest
further global warming, sea level rise, and
an increase in the frequency and severity
of some extreme weather events and weather-related
disasters.
Effects of global warming include loss of
biodiversity, stresses to existing food-producing
systems, increased spread of known infectious
diseases such as malaria, and rapid mutation
of microorganisms.
In November 2017, a statement by 15,364 scientists
from 184 countries indicated that increasing
levels of greenhouse gases from use of fossil
fuels, human population growth, deforestation,
and overuse of land for agricultural production,
particularly by farming ruminants for meat
consumption, are trending in ways that forecast
an increase in human misery over coming decades.
==== Mineral resource exhaustion ====
Romanian American economist Nicholas Georgescu-Roegen,
a progenitor in economics and the paradigm
founder of ecological economics, has argued
that the carrying capacity of Earth — that
is, Earth's capacity to sustain human populations
and consumption levels — is bound to decrease
sometime in the future as Earth's finite stock
of mineral resources is presently being extracted
and put to use; and consequently, that the
world economy as a whole is heading towards
an inevitable future collapse, leading to
the demise of human civilization itself.
Ecological economist and steady-state theorist
Herman Daly, a student of Georgescu-Roegen,
has propounded the same argument by asserting
that "... all we can do is to avoid wasting
the limited capacity of creation to support
present and future life [on Earth]."
Ever since Georgescu-Roegen and Daly published
these views, various scholars in the field
have been discussing the existential impossibility
of allocating earth's finite stock of mineral
resources evenly among an unknown number of
present and future generations.
This number of generations is likely to remain
unknown to us, as there is little or no way
of knowing in advance whether or when mankind
will ultimately face extinction.
In effect, any conceivable intertemporal allocation
of the stock will inevitably end up with universal
economic decline at some future point.
==== Nanotechnology ====
Many nanoscale technologies are in development
or currently in use.
The only one that appears to pose a significant
global catastrophic risk is molecular manufacturing,
a technique that would make it possible to
build complex structures at atomic precision.
Molecular manufacturing requires significant
advances in nanotechnology, but once achieved
could produce highly advanced products at
low costs and in large quantities in nanofactories
of desktop proportions.
When nanofactories gain the ability to produce
other nanofactories, production may only be
limited by relatively abundant factors such
as input materials, energy and software.

Molecular
manufacturing could be used to cheaply produce,
among many other products, highly advanced,
durable weapons.
Equipped with compact computers and motors,
these could be increasingly autonomous and
have a large range of capabilities.

Chris
Phoenix and Treder classify catastrophic risks
posed by nanotechnology into three categories:
(1) from augmenting the development of other technologies such as AI and biotechnology;
(2) from enabling mass-production of potentially dangerous products that cause risk dynamics (such as arms races) depending on how they are used; and
(3) from uncontrolled self-perpetuating processes with destructive effects.

Several researchers
state that the bulk of risk from nanotechnology
comes from the potential to lead to war, arms
races and destructive global government.
Several reasons have been suggested why the
availability of nanotech weaponry may with
significant likelihood lead to unstable arms
races (compared to e.g. nuclear arms races):
(1) a large number of players may be tempted to enter the race, since the threshold for doing so is low;
(2) the ability to make weapons with molecular manufacturing will be cheap and easy to hide;
(3) lack of insight into the other parties' capabilities can therefore tempt players to arm out of caution or to launch preemptive strikes;
(4) molecular manufacturing may reduce dependency on international trade, a potential peace-promoting factor; and
(5) wars of aggression may pose a smaller economic threat to the aggressor, since manufacturing is cheap and humans may not be needed on the battlefield.

Since self-regulation by all state
and non-state actors seems hard to achieve,
measures to mitigate war-related risks have
mainly been proposed in the area of international
cooperation.
International infrastructure may be expanded
giving more sovereignty to the international
level.
This could help coordinate efforts for arms
control.
International institutions dedicated specifically
to nanotechnology (perhaps analogously to
the International Atomic Energy Agency, IAEA)
or general arms control may also be designed.
One may also jointly make differential technological
progress on defensive technologies, a policy
that players should usually favour.
The Center for Responsible Nanotechnology
also suggests some technical restrictions.
Improved transparency regarding technological
capabilities may be another important facilitator
for arms-control.
Grey goo is another catastrophic scenario,
which was proposed by Eric Drexler in his
1986 book Engines of Creation and has been
a theme in mainstream media and fiction.
This scenario involves tiny self-replicating
robots that consume the entire biosphere, using
it as a source of energy and building blocks.
Nowadays, however, nanotech experts—including
Drexler—discredit the scenario.
According to Phoenix, a "so-called grey goo
could only be the product of a deliberate
and difficult engineering process, not an
accident".
==== Warfare and mass destruction ====
The scenarios that have been explored most
frequently are nuclear warfare and doomsday
devices.
Although the probability of a nuclear war
per year is slim, Professor Martin Hellman
has described it as inevitable in the long
run; unless the probability approaches zero,
inevitably there will come a day when civilization's
luck runs out.
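Hellman's point follows from compounding: any fixed nonzero annual probability accumulates toward near-certainty over enough years. A minimal sketch, using a purely illustrative 1% annual probability rather than any published estimate:

# Cumulative probability of at least one nuclear war within n years,
# assuming a fixed, independent annual probability p (illustrative only).
p = 0.01
for n in (10, 50, 100, 500):
    print(n, 1 - (1 - p) ** n)
# 10 -> ~0.10, 50 -> ~0.39, 100 -> ~0.63, 500 -> ~0.99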
During the Cuban missile crisis, U.S. president
John F. Kennedy estimated the odds of nuclear
war at "somewhere between one out of three
and even".
The United States and Russia have a combined
arsenal of 14,700 nuclear weapons, and there
is an estimated total of 15,700 nuclear weapons
in existence worldwide.
Beyond nuclear, other military threats to
humanity include biological warfare (BW).
By contrast, chemical warfare, while able
to create multiple local catastrophes, is
unlikely to create a global one.
Nuclear war could yield unprecedented human
death tolls and habitat destruction.
Detonating large numbers of nuclear weapons
would have immediate, short-term and long-term
effects on the climate, causing cold weather
and reduced sunlight and photosynthesis that
could generate significant upheaval in advanced
civilizations.
However, while popular perception sometimes
takes nuclear war as "the end of the world",
experts assign low probability to human extinction
from nuclear war.
In 1982, Brian Martin estimated that a US–Soviet
nuclear exchange might kill 400–450 million
directly, mostly in the United States, Europe
and Russia, and maybe several hundred million
more through follow-up consequences in those
same areas.
In 2008, a survey by the Future of Humanity
Institute estimated a 4% probability of extinction
from warfare by 2100, with a 1% chance of
extinction from nuclear warfare.
==== World population and agricultural crisis ====
The 20th century saw a rapid increase in human
population due to medical developments and
massive increases in agricultural productivity
such as the Green Revolution.
Between 1950 and 1984, as the Green Revolution
transformed agriculture around the globe,
world grain production increased by 250%.
The Green Revolution in agriculture helped
food production to keep pace with worldwide
population growth or actually enabled population
growth.
The energy for the Green Revolution was provided
by fossil fuels in the form of fertilizers
(natural gas), pesticides (oil), and hydrocarbon-fueled
irrigation.
David Pimentel, professor of ecology and agriculture
at Cornell University, and Mario Giampietro,
senior researcher at the National Research
Institute on Food and Nutrition (INRAN), place
in their 1994 study Food, Land, Population
and the U.S. Economy the maximum U.S. population
for a sustainable economy at 200 million.
To achieve a sustainable economy and avert
disaster, the United States must reduce its
population by at least one-third, and world
population will have to be reduced by two-thirds,
says the study.

The authors of this study believe
that the mentioned agricultural crisis will
begin to have an effect on the world after
2020, and will become critical after 2050.
Geologist Dale Allen Pfeiffer claims that
coming decades could see spiraling food prices
without relief and massive starvation on a
global level such as never experienced before.

Wheat
is humanity's third-most-produced cereal.
Extant fungal infections such as Ug99 (a kind
of stem rust) can cause 100% crop losses in
most modern varieties.
Little or no treatment is possible and infection
spreads on the wind.
Should the world's large grain-producing areas
become infected, the ensuing crisis in wheat
availability would lead to price spikes and
shortages in other food products.
=== Non-anthropogenic ===
==== Asteroid impact ====
Several asteroids have collided with earth
in recent geological history.
The Chicxulub asteroid, for example, is theorized
to have caused the extinction of the non-avian
dinosaurs 66 million years ago at the end
of the Cretaceous.
No sufficiently large asteroid currently exists
in an Earth-crossing orbit; however, a comet
of sufficient size to cause human extinction
could impact the Earth, though the annual
probability may be less than 10⁻⁸.
Geoscientist Brian Toon estimates that a 60-mile
meteorite would be large enough to "incinerate
everybody".
Asteroids with around a 1 km diameter have
impacted the Earth on average once every 500,000
years; these are probably too small to pose
an extinction risk, but might kill billions
of people.
Larger asteroids are less common.
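One way to put such rare-but-severe events on a common scale is expected annual deaths, the product of frequency and consequence; the sketch below combines the recurrence interval above with an assumed casualty figure read off the phrase "might kill billions":

# Expected annual deaths from ~1 km asteroid impacts: frequency times
# consequence. The casualty figure is an assumption, not a measurement.
recurrence_years = 500_000
assumed_deaths = 2e9            # "might kill billions" taken as ~2 billion

print(assumed_deaths / recurrence_years)   # 4000.0 expected deaths per year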
Small near-Earth asteroids are regularly observed
and can impact anywhere on the Earth, injuring
local populations.
As of 2013, Spaceguard estimates it has identified
95% of all NEOs over 1 km in size.

In April
2018, the B612 Foundation reported "It's a
100 per cent certain we'll be hit [by a devastating
asteroid], but we're not 100 per cent sure
when."
Also in 2018, physicist Stephen Hawking, in
his final book Brief Answers to the Big Questions,
considered an asteroid collision to be the
biggest threat to the planet.
In June 2018, the US National Science and
Technology Council warned that America is
unprepared for an asteroid impact event, and
has developed and released the "National Near-Earth
Object Preparedness Strategy Action Plan"
to better prepare.
According to expert testimony in the United
States Congress in 2013, NASA would require
at least five years of preparation before
a mission to intercept an asteroid could be
launched.
==== Cosmic threats ====
A number of astronomical threats have been
identified.
Massive objects, e.g. a star, large planet
or black hole, could be catastrophic if a
close encounter occurred in the Solar System.
In April 2008, it was announced that two simulations
of long-term planetary movement, one at the
Paris Observatory and the other at the University
of California, Santa Cruz, indicate a 1% chance
that Mercury's orbit could be made unstable
by Jupiter's gravitational pull sometime during
the lifespan of the Sun.
Were this to happen, the simulations suggest
a collision with Earth could be one of four
possible outcomes (the others being Mercury
colliding with the Sun, colliding with Venus,
or being ejected from the Solar System altogether).
If Mercury were to collide with Earth, all
life on Earth could be obliterated entirely:
an asteroid 15 km wide is believed to have
caused the extinction of the non-avian dinosaurs,
whereas Mercury is 4,879 km in diameter.

Another
cosmic threat is a gamma-ray burst, typically
produced by a supernova when a star collapses
inward on itself and then "bounces" outward
in a massive explosion.
Under certain circumstances, these events
are thought to produce massive bursts of gamma
radiation emanating outward from the axis
of rotation of the star.
If such an event were to occur oriented towards
the Earth, the massive amounts of gamma radiation
could significantly affect the Earth's atmosphere
and pose an existential threat to all life.
Such a gamma-ray burst may have been the cause
of the Ordovician–Silurian extinction events.
Neither this scenario nor the destabilization
of Mercury's orbit is likely in the foreseeable
future.

A powerful solar flare or solar superstorm,
which is a drastic and unusual decrease or
increase in the Sun's power output, could
have severe consequences for life on Earth.

If
our universe lies within a false vacuum, a
bubble of lower-energy vacuum could come to
exist by chance or otherwise in our universe,
and catalyze the conversion of our universe
to a lower energy state in a volume expanding
at nearly the speed of light, destroying all
that we know without forewarning.
Such an occurrence is called vacuum decay.

The
most predictable outcome for the future of
the Earth is the Sun's expansion into a red
giant star.
This will occur when the Sun is about 12 billion
years old; it will expand to swallow both Mercury
and Venus, reaching a maximum radius of 1.2 AU
(180,000,000 km).
The Earth will interact tidally with the Sun's
outer atmosphere, which would serve to decrease
Earth's orbital radius.
Drag from the chromosphere of the Sun would
also reduce the Earth's orbit.
These effects will act to counterbalance the
effect of mass loss by the Sun, and the Earth
will probably be engulfed by the Sun.
==== Extraterrestrial invasion ====
Intelligent extraterrestrial life, if existent,
could invade Earth either to exterminate and
supplant human life, enslave it under a colonial
system, steal the planet's resources, or destroy
the planet altogether.
Although evidence of alien life has never
been documented, scientists such as Carl Sagan
have postulated that the existence of extraterrestrial
life is very likely.
In 1969, the "Extra-Terrestrial Exposure Law"
was added to the United States Code of Federal
Regulations (Title 14, Section 1211) in response
to the possibility of biological contamination
resulting from the U.S. Apollo Space Program.
It was removed in 1991.
Scientists consider such a scenario technically
possible, but unlikely.

An article in The New
York Times discussed the possible threats
for humanity of intentionally sending messages
aimed at extraterrestrial life into the cosmos
in the context of the SETI efforts.
Several renowned public figures such as Stephen
Hawking and Elon Musk have argued against
sending such messages on the grounds that
extraterrestrial civilizations with technology
are probably far more advanced than humanity
and could pose an existential threat to humanity.
==== Global pandemic ====
There are numerous historical examples of
pandemics that had a devastating effect on
large numbers of people.
The present, unprecedented scale and speed
of human movement make it more difficult than
ever to contain an epidemic through local
quarantines, and other sources of uncertainty
and the evolving nature of the risk mean
natural pandemics may pose a realistic threat
to human civilization.

There are several classes
of argument about the likelihood of pandemics.
One class of argument about likelihood stems
from the history of pandemics, where the limited
size of historical pandemics is evidence that
larger pandemics are unlikely.
This argument has been disputed on several
grounds, including the changing risk due to
changing population and behavioral patterns
among humans, the limited historical record,
and the existence of an anthropic bias.

Another
argument about the likelihood of pandemics
is based on an evolutionary model that predicts
that naturally evolving pathogens will ultimately
develop an upper limit to their virulence.
This is because pathogens with high enough
virulence quickly kill their hosts and reduce
their chances of spreading the infection to new
hosts or carriers.
This model has limits, however, because the
fitness advantage of limited virulence is
primarily a function of a limited number of
hosts.
Any pathogen with a high virulence, high transmission
rate and long incubation time may have already
caused a catastrophic pandemic before its
virulence is ultimately limited through natural selection.
Additionally, a pathogen that infects humans
as a secondary host and primarily infects
another species (a zoonosis) has no constraints
on its virulence in people, since the accidental
secondary infections do not affect its evolution.
Lastly, in models where virulence level and
rate of transmission are related, high levels
of virulence can evolve.
Virulence is instead limited by the existence
of complex populations of hosts with different
susceptibilities to infection, or by some
hosts being geographically isolated.
The size of the host population and competition
between different strains of pathogens can
also alter virulence.
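The virulence-transmission trade-off described above can be illustrated with a generic textbook-style model in which transmission gains saturate as virulence rises while infectious hosts die faster; the functional forms and parameter values below are illustrative assumptions, not the specific model the text refers to:

import numpy as np

# Generic trade-off model: transmission beta(v) saturates with virulence v,
# while higher virulence shortens the infectious period (recovery + v).
# Under these assumptions R0 peaks at an intermediate virulence level.
v = np.linspace(0.01, 5, 500)
beta = 3 * v / (1 + v)              # diminishing transmission gains
recovery = 0.5
r0 = beta / (recovery + v)          # basic reproduction number

print(f"R0 is maximized at v = {v[np.argmax(r0)]:.2f}")   # ~0.71, an interior optimum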
Neither of these arguments is applicable to
bioengineered pathogens,
and this poses entirely different risks of
pandemics.
Experts have concluded that "Developments
in science and technology could significantly
ease the development and use of high consequence
biological weapons," and these "highly virulent
and highly transmissible [bio-engineered pathogens]
represent new potential pandemic threats."
==== Natural climate change ====
Climate change refers to a lasting change
in the Earth's climate.
The climate has ranged from ice ages to warmer
periods when palm trees grew in Antarctica.
It has been hypothesized that there was also
a period called "snowball Earth" when all
the oceans were covered in a layer of ice.
These global climatic changes occurred slowly,
prior to the rise of human civilization about
ten thousand years ago, near the end of the
last major ice age, when the climate became
more stable.
However, abrupt climate change on the decade
time scale has occurred regionally.
Since civilization originated during a period
of stable climate, a natural variation into
a new climate regime (colder or hotter) could
pose a threat to civilization.

In the history
of the Earth, many ice ages are known to have
occurred.
An ice age would have a serious impact on
civilization because vast areas of land (mainly
in North America, Europe, and Asia) could
become uninhabitable.
Currently, the world is in an interglacial
period within a much older glacial event.
The last glacial expansion ended about 10,000
years ago, and all civilizations evolved later
than this.
Scientists do not predict that a natural ice
age will occur anytime soon.
The amount of heat-trapping gases emitted
into Earth's oceans and atmosphere will prevent
the next ice age, which otherwise would begin
in around 50,000 years, and likely further
glacial cycles after that.
==== Volcanism ====
A geological event such as massive flood basalt,
volcanism, or the eruption of a supervolcano
could lead to a so-called volcanic winter,
similar to a nuclear winter.
One such event, the Toba eruption, occurred
in Indonesia about 71,500 years ago.
According to the Toba catastrophe theory,
the event may have reduced human populations
to only a few tens of thousands of individuals.
Yellowstone Caldera is another such supervolcano,
having undergone 142 or more caldera-forming
eruptions in the past 17 million years.
A massive volcanic eruption would eject extraordinary
volumes of volcanic dust, toxic and greenhouse
gases into the atmosphere with serious effects
on global climate (towards extreme global
cooling: volcanic winter if short-term, and
ice age if long-term) or global warming (if
greenhouse gases were to prevail).
When the supervolcano at Yellowstone last
erupted 640,000 years ago, the thinnest layers
of the ash ejected from the caldera spread
over most of the United States west of the
Mississippi River and part of northeastern
Mexico.
The magma covered much of what is now Yellowstone
National Park and extended beyond, covering
much of the ground from Yellowstone River
in the east to Idaho Falls in the west,
with some of the flows extending north beyond
Mammoth Springs.

According to a recent study,
if the Yellowstone caldera erupted again as
a supervolcano, an ash layer one to three
millimeters thick could be deposited as far
away as New York, enough to "reduce traction
on roads and runways, short out electrical
transformers and cause respiratory problems".
There would be centimeters of thickness over
much of the U.S. Midwest, enough to disrupt
crops and livestock, especially if it happened
at a critical time in the growing season.
The worst-affected city would likely be Billings,
Montana, population 109,000, which the model
predicted would be covered with ash estimated
as 1.03 to 1.8 meters thick.

The main long-term
effect is through global climate change, which
reduces the temperature globally by about
5–15 degrees C for a decade, together with
the direct effects of the deposits of ash
on crops.
A large supervolcano like Toba would deposit
a layer of ash one or two meters thick over an
area of several million square kilometers. (1000
cubic kilometers is equivalent to a one-meter
thickness of ash spread over a million square
kilometers).
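This equivalence is a direct unit conversion: 1000 km³ = 1000 × (10³ m)³ = 10¹² m³ of ash, and one million km² = 10⁶ × (10³ m)² = 10¹² m², so the average depth is 10¹² m³ ÷ 10¹² m² = 1 m.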
If that happened in some densely populated
agricultural area, such as India, it could
destroy one or two seasons of crops for two
billion people.

However, Yellowstone shows
no signs of a supereruption at present, and
it is not certain that a future supereruption
will occur there.

Research published in 2011 finds evidence
that massive volcanic eruptions caused
large-scale coal combustion, supporting
models for significant generation of greenhouse
gases.
Researchers have suggested that massive volcanic
eruptions through coal beds in Siberia would
generate significant greenhouse gases and
cause a runaway greenhouse effect.
Massive eruptions can also throw enough pyroclastic
debris and other material into the atmosphere
to partially block out the sun and cause a
volcanic winter, as happened on a smaller
scale in 1816 following the eruption of Mount
Tambora, the so-called Year Without a Summer.
Such an eruption might cause the immediate
deaths of millions of people several hundred
miles from the eruption, and perhaps billions
of deaths worldwide, due to the failure of
the monsoons, resulting in major crop failures
and starvation on a profound scale.

A much
more speculative concept is the verneshot:
a hypothetical volcanic eruption caused by
the buildup of gas deep underneath a craton.
Such an event may be forceful enough to launch
an extreme amount of material from the crust
and mantle into a sub-orbital trajectory.
== Proposed mitigation ==
Planetary management and respecting planetary
boundaries have been proposed as approaches
to preventing ecological catastrophes.
Within the scope of these approaches, the
field of geoengineering encompasses the deliberate
large-scale engineering and manipulation of
the planetary environment to combat or counteract
anthropogenic changes in atmospheric chemistry.
Space colonization is a proposed alternative
to improve the odds of surviving an extinction
scenario.
Solutions of this scope may require megascale
engineering.
Food storage has been proposed globally, but
the monetary cost would be high.
Furthermore, it would likely contribute to
the current millions of deaths per year due
to malnutrition.

Some survivalists stock survival
retreats with multiple-year food supplies.
The Svalbard Global Seed Vault is buried 400
feet (120 m) inside a mountain on an island
in the Arctic.
It is designed to hold 2.5 billion seeds from
more than 100 countries as a precaution to
preserve the world's crops.
The surrounding rock is −6 °C (21 °F)
(as of 2015) but the vault is kept at −18
°C (0 °F) by refrigerators powered by locally
sourced coal.

More speculatively, if society
continues to function and if the biosphere
remains habitable, calorie needs for the present
human population might in theory be met during
an extended absence of sunlight, given sufficient
advance planning.
Conjectured solutions include growing mushrooms
on the dead plant biomass left in the wake
of the catastrophe, converting cellulose to
sugar, or feeding natural gas to methane-digesting
bacteria.
=== Global catastrophic risks and global governance ===
Insufficient global governance creates risks
in the social and political domain, but the
governance mechanisms develop more slowly
than technological and social change.
Governments, the private sector, and the general
public have expressed concern about the lack
of governance mechanisms to deal efficiently
with risks and to negotiate and adjudicate
between diverse and conflicting interests.
This is further underlined by an understanding
of the interconnectedness of global systemic
risks.
=== Climate emergency plans ===
In 2018, the Club of Rome submitted a plan
to the European Parliament urging it to address
the existential threat from climate change
more forcefully, and calling for a collaborative
climate action effort.
== Organizations ==
The Bulletin of the Atomic Scientists (est.
1945) is one of the oldest global risk organizations,
founded after the public became alarmed by
the potential of atomic warfare in the aftermath
of WWII.
It studies risks associated with nuclear war
and energy and famously maintains the Doomsday
Clock established in 1947.
The Foresight Institute (est. 1986) examines
the risks of nanotechnology and its benefits.
It was one of the earliest organizations to
study the unintended consequences of otherwise
harmless technology gone haywire at a global
scale.
It was founded by K. Eric Drexler who postulated
"grey goo".Beginning after 2000, a growing
number of scientists, philosophers and tech
billionaires created organizations devoted
to studying global risks both inside and outside
of academia.

Independent non-governmental organizations
(NGOs) include the Machine Intelligence Research
Institute (est. 2000), which aims to reduce
the risk of a catastrophe caused by artificial
intelligence, with donors including Peter
Thiel and Jed McCaleb.
The Nuclear Threat Initiative (est. 2001)
seeks to reduce global threats from nuclear,
biological and chemical weapons, and to contain
damage after an event.
It maintains a nuclear material security index.
The Lifeboat Foundation (est. 2009) funds
research into preventing a technological catastrophe.
Most of the research money funds projects
at universities.
The Global Catastrophic Risk Institute (est.
2011) is a think tank for catastrophic risk.
It is funded by the NGO Social and Environmental
Entrepreneurs.
The Global Challenges Foundation (est. 2012),
based in Stockholm and founded by Laszlo Szombatfalvy,
releases a yearly report on the state of global
risks.
The Future of Life Institute (est. 2014) aims
to support research and initiatives for safeguarding
life in light of new technologies and challenges
facing humanity.
Elon Musk is one of its biggest donors.

University-based organizations include the Future of Humanity
organizations include the Future of Humanity
Institute (est. 2005) which researches the
questions of humanity's long-term future,
particularly existential risk.
It was founded by Nick Bostrom and is based
at Oxford University.
The Centre for the Study of Existential Risk
(est. 2012) is a Cambridge-based organization
which studies four major technological risks:
artificial intelligence, biotechnology, global
warming and warfare.
All are man-made risks, as Huw Price explained
to the AFP news agency, "It seems a reasonable
prediction that some time in this or the next
century intelligence will escape from the
constraints of biology".
He added that when this happens "we're no
longer the smartest things around," and will
risk being at the mercy of "machines that
are not malicious, but machines whose interests
don't include us."
Stephen Hawking was an acting adviser.
The Millennium Alliance for Humanity and the
Biosphere is a Stanford University-based organization
focusing on many issues related to global
catastrophe by bringing together members of
academia in the humanities.
It was founded by Paul Ehrlich among others.
Stanford University also has the Center for
International Security and Cooperation focusing
on political cooperation to reduce global
catastrophic risk.
The Center for Security and Emerging Technology
was established in January 2019 at Georgetown's
Walsh School of Foreign Service and will focus
on policy research of emerging technologies
with an initial emphasis on artificial intelligence.
It received a grant of US$55 million from
Good Ventures, as suggested by the Open
Philanthropy Project.

Other risk assessment groups are based
in or are part of governmental organizations.
The World Health Organization (WHO) includes
a division called the Global Alert and Response
(GAR) which monitors and responds to global
epidemic crises.
GAR helps member states with training and
coordination of response to epidemics.
The United States Agency for International
Development (USAID) has its Emerging Pandemic
Threats Program which aims to prevent and
contain naturally generated pandemics at their
source.
The Lawrence Livermore National Laboratory
has a division called the Global Security
Principal Directorate which researches, on
behalf of the government, issues such as
bio-security and counter-terrorism.
== See also ==
== Notes ==
== Further reading ==
Avin et al. (2018). "Classifying global catastrophic risks".
Corey S. Powell (2000).
"Twenty ways the world could end suddenly",
Discover Magazine
Martin Rees (2004).
Our Final Hour: A Scientist's Warning: How
Terror, Error, and Environmental Disaster
Threaten Humankind's Future in This Century—On
Earth and Beyond.
ISBN 0-465-06863-4
Jean-Francois Rischard (2003).
High Noon: 20 Global Problems, 20 Years to
Solve Them.
ISBN 0-465-07010-8
Edward O. Wilson (2003).
The Future of Life.
ISBN 0-679-76811-4
Roger-Maurice Bonnet and Lodewijk Woltjer,
Surviving 1,000 Centuries: Can We Do It?
(2008), Springer-Praxis Books.
Derrick Jensen (2006) Endgame (ISBN 1-58322-730-X).
Jared Diamond, Collapse: How Societies Choose
to Fail or Succeed, Penguin Books, 2005 and
2011 (ISBN 9780241958681).
Huesemann, Michael H., and Joyce A. Huesemann
(2011).
Technofix: Why Technology Won't Save Us or
the Environment, Chapter 6, "Sustainability
or Collapse", New Society Publishers, Gabriola
Island, British Columbia, Canada, 464 pages
(ISBN 0865717044).
Joel Garreau, Radical Evolution, 2005 (ISBN
978-0385509657).
John A. Leslie (1996).
The End of the World (ISBN 0-415-14043-9).
Donella Meadows (1972).
The Limits to Growth (ISBN 0-87663-165-0).
Joseph Tainter (1990).
The Collapse of Complex Societies, Cambridge
University Press, Cambridge, UK (ISBN 9780521386739).
== External links ==
Annual Reports on Global Risk by the Global
Challenges Foundation
"What a way to go" from The Guardian.
Ten scientists name the biggest dangers to
Earth and assess the chances they will happen.
April 14, 2005.
Stephen Petranek: 10 ways the world could
end, a TED talk
"Top 10 Ways to Destroy Earth".
livescience.com.
LiveScience.
Archived from the original on 2011-01-01.
"Are we on the road to civilisation collapse?".
BBC. 19 February 2019.
