Existential risks are the subject of much debate today, probably because so many global events feed the feeling that, as a species, we’re drawing close to an abyss. Yet, although the debate may seem recent, it’s actually much older. In fact, the first reflections on the extinction of our species date back almost to our origins.
Human beings have always faced existential risks. At the dawn of our species, threats were many and resources few: a volcano, a meteorite, or a simple disease could have wiped us out. It didn’t happen, but the risks never disappeared. We have certain limits, even if we sometimes ignore them. Indeed, compared to other species, our biological capacity to adapt is remarkably limited.
Today, we can control many factors, but not all of them. With this in mind, the Centre for the Study of Existential Risk was founded in 2012 at the University of Cambridge (UK). Its researchers propose that there are currently four basic factors that could end our species. They’re as follows.
“In our time the greatest risks we face have a high probability of being the result of our activities.”
-Seán Ó hÉigeartaigh-
Artificial intelligence, the greatest existential risk?
Many experts think that artificial intelligence is one of the fronts where we could run into trouble. Daniel Dewey, a researcher in the field, suggests that we’re beginning to trust technologies that seem to produce positive results but that we don’t really understand. We know there’s a model, but it’s so sophisticated that we may not truly grasp how it works. We understand that, with certain input parameters, it succeeds, and with others, it fails. But why?
There’s a risk that computer intelligence will be put at the service of hegemonic interests, with the purpose of exercising dominance and absolute control over humans. It’s also possible that the systems could end up being able to program themselves and create a superintelligence, superior to that of any human being. We’re talking about science fiction that’s becoming increasingly less fictional.
Climate change
Another of the great existential risks is climate change, a phenomenon that’s already with us. However, action against it hasn’t been taken quickly or efficiently enough. Data indicates that, during the last century, the 20 richest countries in the world consumed more raw materials and non-renewable energy resources than the whole of humanity had throughout its history and prehistory. At the same time, since the middle of the 20th century, more people have been born than in all previous history combined.
The risk is that consumerism and the intensive use of dirty technologies will cause irreversible changes in the climate, deplete basic resources such as water, and alter ecosystems. This could lead to our extinction.
Biotechnology and nanotechnology
Experiments with microorganisms, which today can even be carried out at home, could give rise to an uncontrolled planetary plague. Likewise, genetic manipulation could set off a sequence of unforeseen events culminating in a major biological accident.
Nanotechnology also falls within the range of existential risks. Microscopic robots that can be inserted into the body could generate a ‘gray goo’ scenario: an uncontrolled, self-replicating swarm that consumes living matter.
Nuclear war
During the Cold War, we came to the brink of nuclear war many times, mainly due to reporting errors or misinterpretation of the facts. Indeed, two nuclear-armed nations don’t need to deliberately clash for an atomic catastrophe to occur; an accident would suffice.
If there were a nuclear war between the great powers, millions of people would die from radiation sickness. A nuclear winter would follow: the excess matter in the atmosphere would block the sun’s rays, causing extreme darkness and cold. This would dramatically reduce food production and put the survival of the species at risk.
Existential risks are often seen as remote from our everyday lives. That’s precisely one of the things that makes them so dangerous. It’s extremely important that we’re all aware of these dangers and that, wherever we are, we contribute to reducing them.
The post The Main Existential Risks for Humanity appeared first on Exploring your mind.