For a species that loves new technology - that lives and dies by it - it's strange how we sometimes demonise our own creations. Last time, it was genetically modified foods. They might just save the world, except that there's something "sciencey" and unnatural about them. Hard-to-understand foods? No, thank you.
Drones, or unmanned military systems, are GM's heirs. Everywhere we're encouraged to fear and detest these newfangled planes without pilots. We read about China and Japan embarking on a "drone arms race", or about the US developing a new "killer drone", or about autonomous "killer robots" that will snuff out human lives with all the moral uncertainty of a vending machine delivering cups of tea.
A lot of this is hysterical and alarmist. But there are also genuine concerns. Much like GM foods, drones lead us into a maze of legal and ethical questions that require serious debate - not facile conclusions one way or the other.
So first, the benefits. Most unmanned aerial vehicles (UAVs) are harmless, unarmed systems for surveillance and reconnaissance. Many countries already operate them. They're cheaper than manned aircraft, they can stay in the air for longer, and no one dies when they crash. They're masters of the three Ds: "dull, dirty and dangerous" jobs that humans don't want to do, or can't do. They save money, time, effort and lives. Tick!
The controversy arises when you start strapping bombs onto them. Weaponised UAVs like the US's notorious Predators and Reapers have given drones a bad reputation because of the way Washington has been using them to attack enemy targets, mainly in Pakistan.
This policy isn't as outrageous as it seems. It's an open secret that the Pakistani government has given the US permission to conduct its drone missions there - it's just that neither side openly admits it. That denial sends a garbled message to other countries acquiring armed UAVs: that drones are somehow a weapon apart, exempt from the rules of war and sovereignty. They are not.
It's true that America's use of drone strikes has risen sharply. An estimated 4,700 people have now died in these strikes but, like it or not, this is the way of the future.
The apparent impunity of these things upsets people most of all. There's an impression that UAVs wipe out a lot of innocent people as they hunt their targets - and that no one ever has to answer for it. Their operators are caricatured as unable - or too lazy and distracted - to differentiate between a target and an innocent bystander as they stare at a monochrome screen from far away. And there's a sense that it's inherently ghoulish to use a glorified flying PC to kill human beings.
But this portrayal doesn't stand up. The idea that a UAV operator has scant regard for human life simply because he pulls the trigger remotely is unfair. UAV pilots have baulked at firing on those little black blobs precisely because they were unconvinced of the validity of the targets. Contrast that with the footage - leaked by whistleblower Bradley Manning - of US Apache helicopter pilots casually blitzing a crowd of children and journalists in Iraq. Immediacy is no guarantor of humanity, just as remoteness does not necessarily invite recklessness.
As for our discomfort about the use of robots to kill humans, it's important to remember what we're really dealing with here. Predators might come across as chilling, brutal even, but in the end they're not so different from a piloted fighter plane: they're still weapons operated by a human. The burden of responsibility still rests squarely with that operator, and with the politicians who give the orders.
The biggest controversy of all comes when UAVs become truly autonomous, and the only human in the "kill chain" is the poor bloke at the end of it. We aren't there yet, but UAV technology is advancing at an extraordinary rate. Aeroplanes have always had to be designed around a human pilot; now the aircraft has been liberated. Soon there will be UAVs smaller than a bee, and bigger than a 747. Some will be able to stay airborne perpetually. The possibilities are incredibly exciting.
But they are also disturbing, at least if we allow governments to replace our air force pilots with autonomous combat UAVs. Right now, we seem to be drifting much too casually towards this risky outcome. After all, if a human operator can't tell the difference between a gunman and a child, or a jihadi training camp and a wedding, then how are these flying hard-drives supposed to know?
So, it's time to propose a new arms treaty: the Convention on the Use of Autonomous UAVs. It will stipulate that no machine can kill without there being a human controller somewhere in the loop. Arms conventions have been successful in curbing cluster bombs, anti-personnel mines, and other nasty military creations. Let's forestall the "killer robot" before drones, for all their benefits, really become something to panic about.
Trefor Moss is an independent journalist based in Hong Kong and former Asia-Pacific editor for Jane's Defence Weekly. Follow him on Twitter @Trefor1