A UN report Friday called for a moratorium on the "testing, production, assembly, transfer, acquisition, deployment and use" of lethal autonomous robotics (LARs), known as 'killer robots', until an international framework on their use has been agreed upon.

Unlike drones, LARs are robots that can attack targets without any human input or control. The report warns that "there is widespread concern that allowing LARs to kill people may denigrate the value of life itself."

Although such technology may sound like science fiction, experts increasingly believe that it could become a reality within 20-30 years. Proponents argue that the use of LARs could save the lives of human soldiers, that they are not subject to human emotions such as anger, fear, and panic, and that they could be programmed to act only in accordance with the law.

Opponents, however, contend that LARs cannot distinguish between civilians and combatants, lack human compassion, and raise serious issues concerning liability for crimes committed.

South African professor Christof Heyns, who wrote the 22-page report on so-called killer robots for the Geneva-based UN Human Rights Council, recommends an international moratorium on LARs and the establishment of a high-level panel to create a policy on the issue. Such deadly weapons "should not have the power of life and death over human beings," the report said. It deals with the legal and philosophical issues of giving robots lethal powers over humans, echoing the "Terminator" movies and countless other science fiction novels and films.

It said the United States, Britain, Israel, South Korea and Japan have developed various types of fully or semi-autonomous weapons. Heyns is to deliver his recommendations to the Human Rights Council in Geneva on May 29.

The report comes on the heels of a call from Human Rights Watch, a New York-based watchdog body, to ban all 'killer robots.'

HRW is taking the initial lead on behalf of a group of international NGOs calling for a ban on all fully autonomous weapons.

“Lethal armed robots that could target and kill without any human intervention should never be built,” said Steve Goose, Arms Division director at HRW.

“A human should always be ‘in-the-loop’ when decisions are made on the battlefield. Killer robots would cross moral and legal boundaries, and should be rejected as repugnant to the public conscience.”

In the report, Heyns said: "Decisions over life and death in armed conflict may require compassion and intuition. Humans — while they are fallible — at least might possess these qualities, whereas robots definitely do not."