Important news stories sometimes slip under the radar when much is happening on a given day. That can be the only explanation for why the world's media gave such limited attention to what is surely among the most significant items of this past year: the US military's deployment of machine-gun-toting robots in Iraq.
Having been a science fiction enthusiast since childhood, I am well aware of the ethical conundrum posed by robots that have the power to kill. The prolific author Isaac Asimov set the ground rules in the early 1940s with his Three Laws of Robotics, which governed the behaviour of the machines throughout his fiction and were adopted by many other writers of the genre. The laws were: a robot may not injure a human being or, through inaction, allow a human being to come to harm; a robot must obey orders given to it by human beings except where such orders would conflict with the first law; and a robot must protect its own existence as long as such protection does not conflict with the first or second laws.
Given such laws, I assumed that any robots created for war zones would only support human soldiers. Little did I realise that the brains behind the US military do not much care for science fiction. If they did, they would know that three generations of writers have concluded that the machines are best left to doing humankind's dirty and mundane work. That does not mean killing people, but cleaning, performing repetitive tasks and venturing into dangerous places.
This latter use was certainly the reason the first robots to venture onto a battlefield were deployed in Afghanistan and Iraq. They have become an essential component, spying on and searching for the enemy, and detecting and clearing explosive devices.
South Korea's Samsung Corporation, Korea University and two other institutions took the step that Asimov had foreshadowed - and fictionally legislated against - when they unveiled a static, remotely controlled robot turret sentry armed with a 5.5mm machine gun in September last year. The country's government later announced the robot would be deployed along the Demilitarised Zone between the two Koreas. In June, Israel revealed that it had rolled out a similar system along its border with the Gaza Strip, to protect against attacks by militants.
But the most chilling revelation was the US military's announcement in August that it had robots armed with guns moving remotely about the battlefields of Iraq. News wires ran the story and a few media features followed, but the item quickly fell off news agendas. It has all been forgotten as an important event of the year, buried by global warming, the downfall of long-time political leaders and the subprime credit crisis.
But the fact that lawless robots armed with automatic weapons are on the prowl somewhere in the world is, surely, just as far-reaching. A human can be charged and prosecuted for murder - but a machine cannot. How many of the robots have been deployed in Iraq is unknown. The three Talon Sword robots revealed to have been purchased by the US 3rd Infantry Division, based south of Baghdad, were among 80 ordered by the military.
Armed with M240 machine guns or .50 calibre rifles, the Sword robots are a modified version of the tracked bomb-disposal devices used throughout the world. They are operated by soldiers with modified laptop computers and joysticks.
Now it is not too hard to imagine robots taking on an enemy without human intervention. As dark as that scenario is, there is a bright spot, courtesy of another much-overlooked story this year. In March, South Korea's government said it was drafting an ethical charter to govern how robots would function beside humans.
A spokesman for the Commerce, Industry and Energy Ministry showed that South Korean authorities were better-read than their American counterparts. In a pleasing reassurance for me and other science fiction lovers, he said the guidelines were expected to reflect the laws originally proposed in 1942 by Asimov.
Peter Kammerer is the Post's foreign editor