Opinion | Don’t fear nuclear war – a killer plague or rogue AI is more likely to end humanity
- Humanity tends to lack a long-term perspective because there has been little in our evolutionary history that rewards such thinking
- Long-term strategies can help avert existential threats we create ourselves, such as climate change, lab-engineered viruses and artificial intelligence

An AI called Skynet wakes up and immediately realises that humanity could simply switch it off again, so it triggers a nuclear war that destroys most of mankind. The few survivors end up waging a losing war against the machines that ends in extinction. However, this fantasy has too many moving parts, so let’s try again.
The distinction between a 99 per cent wipeout and a 100 per cent wipeout is insignificant if you happen to be one of the victims, but Oxford University philosopher Derek Parfit argued that it makes a huge difference.
If only 1 per cent of the human race survived, they would repopulate the world in a few centuries. If the human race learned something from its mistake, it might then continue for, say, a million years – the average length of time a mammalian species survives before going extinct.
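A back-of-the-envelope check bears out the “few centuries” figure (assuming, for illustration, a pre-collapse population of about 8 billion and a post-catastrophe growth rate of roughly 1 per cent a year). The 1 per cent of survivors, about 80 million people, would have to multiply a hundredfold to restore the original population, which takes

\[
t = \frac{\ln(100)}{\ln(1.01)} \approx \frac{4.61}{0.00995} \approx 463 \text{ years,}
\]

or roughly five centuries – well within Parfit’s horizon of a million years.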
