Opinion

Gwynne Dyer
Don’t fear nuclear war – a killer plague or rogue AI is more likely to end humanity

  • Humanity tends to lack a long-term perspective because there has been little in our evolutionary history that rewards such thinking
  • Long-term strategies can help avert existential threats we create ourselves, such as climate change, lab-engineered viruses and artificial intelligence

An unarmed Minuteman III intercontinental ballistic missile launches from Vandenberg Air Force Base in California on February 5, 2020. Despite heightened fears of nuclear war, given tensions over Ukraine and Taiwan, the biggest threat to human survival is more likely to be artificial intelligence or other ‘human software’. Photo: AFP