Unmanned aerial vehicles (UAVs), or drones as they are often called, have had a good press recently: from humanitarian rescues to the promotion of free trade, these remote-controlled, pilotless robots are the heroes of the moment.
Still, their biggest use continues to be for military purposes. And although the deployment of drones in military operations is nowhere near as morally objectionable as some critics allege, the inevitable proliferation of UAVs does raise some serious security questions.
The real challenge is not to prevent nations or corporations from acquiring them but, rather, to adopt international safeguards on how independent such drones are from their owners and operators.
For controlling the rapidly evolving artificial intelligence of "killer drones" could well become one of the key disarmament questions of our time.
Undoubtedly, the beneficial, civilian uses of drones remain huge. UAVs can deliver supplies such as medicines and food to remote areas.
Drones can also ensure respect for law and property. Germany's national railways, for instance, are deploying mini-drones to catch graffiti vandals and reduce damage to the country's transport infrastructure.
Meanwhile, Kruger National Park in South Africa, which is losing 1,000 rhinos a year to illegal poaching, hopes to stem this threat to endangered animal species when its new fleet of drones goes into action.
UAVs have revolutionised media coverage of global conflicts, and the response to such crises. And they will transform the rules of commerce, due to their ability to deliver goods cheaply and fast.
Amazon, the world's top online retailer, is not the only corporation testing a fleet of small drones to deliver its packages; Google X, a secretive unit of the internet information giant, is developing its own prototype drones.
Yet, at least for the moment, the single biggest user of drones remains the military: all told, there are more than 8,000 unmanned aircraft in the US military inventory alone, on top of another 12,000 unmanned ground vehicles.
And although the United States and Israel retain a substantial lead in UAVs - with China alongside Europe, India and Russia still in the second tier - up to 87 countries are known to have such aircraft in their military inventories.
To date, only the US, Britain and Israel have used armed drones in an overt, operational environment in which they killed opponents. But the reason other nations have not done so is political, not technological, for almost every government is developing an offensive UAV capability.
Some of the humanitarian and moral arguments against the use of drones are either unproven or nonsensical.
The most common criticism is that drones often kill innocent civilians. True, but other weapon systems produce much more "collateral damage" - as the military euphemistically likes to refer to civilian casualties - than UAVs.
Organisations which monitor the performance of US drones indicate that in Pakistan, for instance, an average of 6 per cent of those killed by drone attacks are civilians, while the rest of those killed have some known association with al-Qaeda or the Taliban.
Such a civilian death rate compares very favourably with the 60 per cent civilian death rates in World War II, the 70 per cent civilian casualty rate in Korea, or the 50 per cent civilian share of fatalities recorded as late as the 1990s during the Yugoslav civil wars.
Critics also claim that killing people with drone attacks generates resentment in places such as Pakistan or Yemen.
"It creates enemies just as it obliterates them," wrote Mark Manzzetti, who won a Pulitzer Prize for his investigative journalism into America's drone wars.
But nobody has ever produced any evidence for this assertion. And some of the strongest anti-American sentiments are recorded in Muslim countries which were never subjected to US drone attacks, places such as Egypt and Jordan.
It is also suggested that the introduction of drones makes going to war more likely. This argument is more persuasive but ultimately not very conclusive either, for it can be argued that, had US President Barack Obama not been able to use drones against terrorist organisations, the American military would have been embroiled in even bigger and bloodier ground offensives in places such as Pakistan or Yemen.
That does not mean that the advent of the UAVs is not disruptive for the current global order. The increasing use of drones forces other countries to engage in ever more sophisticated deception techniques to hide their military capabilities.
South Korea, for instance, wants to buy America's Global Hawks, one of the biggest drones in existence, costing US$250 million each, to peer deep inside North Korea.
But it's easy to predict the North Korean response: even deeper tunnels to hide artillery pieces and troops. So, instead of greater military transparency, drones risk promoting more deception, thereby increasing the risk of countries stumbling into conflicts.
Drones also act as entry points to technologies which could be used for other purposes. One reason the Americans hesitated to sell Global Hawks to South Korea was that the aircraft fall under the prohibitions of the Missile Technology Control Regime; the arrival of these machines in South Korea's arsenal is also unlikely to be welcomed by China.
But the biggest problem is no longer the reality of drones and their likely proliferation; it is the fact that they are getting both much smaller and more "intelligent" at the same time, unleashing a new arms race with incalculable consequences.
The first generation of unmanned systems looked much like piloted aircraft; most even had cockpits, although nobody ever sat in them.
But UAV prototypes now being tested resemble birds rather than aircraft, and are increasingly disconnected from the human brain which invented them.
One example is the new Switchblade surveillance drone, which is carried in a tube the size of a shoebox but can fly at the speed of an ordinary car, and can detonate itself against a target it selects of its own accord.
At the other end of the spectrum, there is the Taranis, a highly complex drone being tested in Australia by Britain's BAE Systems: it is capable of refuelling in mid-air with no guidance, and of selecting its own targets.
Because the radar-evading requirements that dictate its shape mean it cannot be fully controlled remotely, the Taranis has to fly itself. So, in effect, it is an independent killer agent of a state.
There is also the Transformer drone which saves fuel by effectively morphing into three different drones when it reaches its target, with each drone "doing its own thing".
And if that is not enough, the British are working on a UAV which will carry on board a 3D printer tasked with "producing" extra mini-drones as required; the age of the drone which not only acts independently but also spawns a further generation of drones in mid-air is now upon us.
And that's before one even considers the US Navy's latest generation of stealth jet drones launched not only from aircraft carriers, but also from submarines; all of these may be available by the end of the decade.
A report issued by a team of top British security service experts last week argued that the advent of these "autonomous killer robots" represents a serious future danger to all the international law norms which underpin global security.
"We are not persuaded that it will ever be possible to programme the laws of war into a killer robot," says Sir David Omand, who served as Britain's top spy and who chaired the task group.
For the moment, no government or senior military commander is prepared to respond to such calls for regulation; the technologies are far too secret, and the capabilities far too speculative, for any nation to propose restrictions.
Still, there are indications that the bond between drones and humans will not be cut as quickly as some fear. A recent study by American academic Julie Carpenter found that soldiers are developing an "emotional attachment" to their UAV machines, in "ways similar to a human or a pet".
Whether that affects the way the UAVs themselves behave, or whether the drones will reciprocate the affection heaped upon them by their handlers, remains to be explored. But it's clear that the days when humans alone controlled warfare between states may be numbered.
©2014 the Asia News Network (Hamburg, Germany)