In July, Ronald Bailey wrote several articles for Reason while attending the Global Catastrophic Risks Conference in Oxford. You can read Bailey’s dispatches here, here and here.
The conference was sponsored by Oxford’s Future of Humanity Institute, which is run by the always interesting Nick Bostrom. Bostrom opened the conference with perhaps the only bit of good news about existential threats, which is that so far none of them have come to pass,
The good news is that no existential catastrophe has happened. Not one. Yet.
On the other hand, depending on how far back you want to go to date the first Homo sapiens or Homo sapiens-like ancestor, the best explanation for this could be that we simply haven’t been around long enough to face an existential catastrophe. And, of course, the evidence supports the claim that at times the breeding population of our distant ancestors was reduced to extremely low levels.
Bostrom himself noted the debate surrounding the Toba supervolcano eruption, which some have speculated may have reduced the human population to a few thousand people, though there is also some evidence that the reduction in population may not have been quite that severe.
According to Bailey, Bostrom argued the biggest existential threats facing humanity are self-induced,
Bostrom did note that people today are safer from small to medium threats than ever before. As evidence he cites increased life expectancy from 18 years in the Bronze Age to 64 years today (the World Health Organization thinks it’s 66 years). And he urged the audience not to let future existential risks occlude our view of current disasters, such as 15 million people dying of infectious diseases every year, 3 million from HIV/AIDS, 18 million from cardiovascular diseases, and 8 million per year from cancer. Bostrom did note that, “All of the biggest risks, the existential risks are seen to be anthropogenic, that is, they originate from human beings.” The biggest risks include nuclear war, biotech plagues, and nanotechnology arms races. The good news is that the biggest existential risks are probably decades away, which means we have time to analyze them and develop countermeasures.
In his final dispatch from the conference, Bailey reported on Joseph Cirincione, who noted how human civilization almost ended in 1995 due to, of all things, a Norwegian weather satellite,
With regard to the possibility of an accidental nuclear war, Cirincione pointed to the near miss that occurred in 1995 when Norway launched a weather satellite and Russian military officials mistook it for a submarine-launched ballistic missile aimed at producing an electro-magnetic pulse to disable a Russian military response. Russian nuclear defense officials opened the Russian “football” in front of President Boris Yeltsin, urging him to order an immediate strike against the West. Fortunately, Yeltsin held off, arguing that it must be a mistake.
Cirincione noted that worldwide stockpiles of nuclear weapons have been reduced dramatically since the end of the Cold War, and the possibility for a worldwide disarmament of nuclear weapons is higher than at any time since 1945.
Bailey also reported on a few folks who argued that strong AI and/or nanotechnology pose serious existential risks, but the arguments (at least as filtered through Bailey) seemed shallow,
In addition, an age of nanotech abundance would eliminate the majority of jobs, possibly leading to massive social disruptions. Social disruption creates the opportunity for a charismatic personality to take hold. “Nanotechnology could lead to some form of world dictatorship,” said [the Center for Responsible Nanotechnology’s Michael] Treder. “There is a global catastrophic risk that we could all be enslaved.”
Ok, but the reason jobs would be eliminated and this would be “an age of nanotech abundance” is precisely that the little nanobots would be doing all the work and the resulting goods would be essentially free. I guess if by “massive social disruptions” you mean everyone skiing and hanging out at the beach instead of working, then yeah, ok, but I doubt that’s going to lead to a worldwide dictator (who, as a reactionary, is probably going to want to force people to go back to work, about as attractive an offer as religious sects that demand celibacy).
Maybe it’s just me, but I’m worried about more abstract possibilities, such as a devastating gamma-ray burst that would wipe out all of humanity except for Bruce Banner. And there’s always that old standby, entropy.