We can't compartmentalize away our apocalyptic future
Like most people, I'm a compartmentalizer. For years I went blithely about my business — doing my work, watching movies, celebrating birthdays — while only rarely thinking about the end of the world.
But as I get older and as the threats to people and the planet grow more grave and imminent, I find it increasingly difficult to go too long without a pang of panic.
It was not particularly helpful that I recently read a paper from the U.S. National Intelligence Council talking about "existential threats" to mankind. They included "runaway artificial intelligence, engineered pandemics, nanotechnology weapons [and] nuclear war."
These perils, as the report put it, "could damage life on a global scale." They could mean humanity's extinction in the relatively short term. And they're all dangers to us, created by us.
Once, I might have brushed that realization off and headed out to lunch. This time, I mentally added climate change to the list of potential calamities, and grew worried.
William MacAskill, an Oxford University philosophy professor, recently put threats like these in their proper historical context, noting that for most of mankind's existence, we humans didn't have the ability to destroy ourselves, at least not entirely. Of course we were often vicious and violent, and we killed each other to the very best of our abilities. But until the mid-20th century, we didn't have the technological wherewithal to wipe ourselves out.
But then, thanks to the brilliance of our species — the same brilliance that cures diseases, erects skyscrapers and launches moon rockets — we developed the atomic bomb.
I was born in the early years of the nuclear age, only a decade after Hiroshima, when the notion of looming Armageddon was still relatively new. In my childhood, we ducked and covered beneath our school desks. Bob Dylan released "Talkin' World War III Blues." During the 1962 Cuban Missile Crisis, even President John F. Kennedy believed the chance of nuclear war was "between one-in-three and even."
But those days seem almost quaint and comforting now. The apocalyptic hazards have multiplied.
"A worrying number of risks conspire to threaten the end of humanity … ," writes MacAskill in the current issue of Foreign Affairs, a staid journal not known for sensationalism. "Advances in weaponry, biology and computing could spell the end of the species, either through deliberate misuse or a large-scale accident."
"There are deadly risks over the horizon for which we are not prepared," Sen. Rob Portman, R-Ohio, said recently as he and a Democratic colleague introduced the Global Catastrophic Risk Mitigation Act, to ensure the U.S. is better prepared for "high consequence events, regardless of low probability."
Shaken, I began to read up. I hadn't focused on the dangers of runaway artificial intelligence or worried much when Elon Musk (a known shoot-from-the-hipper) said machines would overtake humans by 2025 and constituted a "fundamental existential risk." But it seems that plenty of other scientists, chief executives and government officials, including Bill Gates and Stephen Hawking (before he died), have also worried about whether we're in full control of the technology we're developing. The nightmare scenario appears to be that machine intelligence could surpass human intelligence and turn destructive, either maliciously or by accident. It doesn't seem imminent, and the threat of AI is often hyped or conflated with science fiction, but it is not nonexistent either.
Of more immediate concern is climate change. It's less dramatic, perhaps, but also harder to stop because we've dithered for so long. The parade of climate horribles if emissions continue to rise unabated goes well beyond hot days, brownouts and lawn-watering restrictions. Ultimately, water scarcity and intensified heat could lead to food shortages and malnutrition, mass migrations of tens of millions of people, conflict and war born of heightened competition for minerals and water, and collapsed economies.
As for pandemics, we'd been warned for years — and COVID-19 should have been our wake-up call. It has killed 6.5 million people so far and cost the world economy trillions of dollars. Future pandemics, though, will emerge more often, spread more rapidly and kill more people without transformative change in our approach to infectious diseases, experts say. And do you really believe we're better prepared for a worst-case pandemic now — or will we be plunged right back into the world of anti-maskers, anti-vaxxers and science deniers?
What's more, a bioengineered pandemic seems possible and potentially deadlier.
Finally, the dangers of nuclear war haven't gone away. The U.S. still has some 5,425 nuclear warheads in its arsenal and Russia has 5,977 — at a moment when relations between the two are increasingly hostile. Seven other countries possess nuclear weapons, and others hope to acquire them.
Plenty of rational people have proposals to address these challenges. They include enhanced global cooperation, better risk assessment, the development of advance mitigation strategies and adoption of multinational rules to rein in work that could lead to dangerous outcomes.
I'm for all that, but it'll be tough. We live in a time of resurgent hostility among the great powers, of renewed territorial and imperial ambitions. Russia's angry and China's rising. The United Nations is on the defensive; the U.S., for its part, is politically polarized and divided.
The risks are so profound that, as the National Intelligence Council put it, they "challenge our ability to imagine and comprehend their potential scope and scale."
We're not wired — biologically as individuals or politically as a society — to respond to long-term threats. We don't worry much about the future or take its needs into account. As individuals, we feel powerless; compartmentalization is a natural defense mechanism.
But as much as I'd like to bluster through life enjoying myself and ignoring the impending threats, that's an increasingly irresponsible stance. I'll keep watching movies and celebrating birthdays, but we all need to get focused on the future, and on making the world a safer place for our children's children.
— Nicholas Goldberg is an associate editor and Op-Ed columnist for the Los Angeles Times.