Killer robots are weapons that can select and strike a target without any human involvement. Some people think this is a good idea, while others argue that machines with that much power are dangerous. In August, the UN held a meeting in Geneva to discuss these weapons. Many attendees expected that by the end of the meeting, all countries would agree to oppose such machines. However, a few countries supported the weapons, which means their development and use will continue. Russia and Israel were the only countries expected to oppose a ban, since both have advanced programs of their own. However, the U.S. also opposed the ban, which came as a surprise to many.
In July, many researchers declared that they would not work on projects involving these weapons, since the systems would attack without any human involvement. Most Google employees rebelled against research related to the weapons, which forced Google to abandon the project. A university in South Korea also stopped working on the robots after students boycotted their studies, not just locally but internationally. Various campaigns aimed at stopping these deadly weapons have been successful.
The U.S. government claims that halting work on these robots would block developments that could improve the safety of ordinary citizens. The Pentagon's policy states that the weapons are safe only when a human being controls them. However, the U.S. argued that killer robots would identify targets and carry out strikes more accurately, and that humans would perform this job poorly because of flawed judgement. Professor Ron Arkin of Georgia Tech further claimed that a robot's system could decide not to fire, even when an operator commands it, if firing would be inappropriate.
Arkin also argued that instead of banning the weapons outright, governments should regulate them so that they strike only valid targets. He claims that the systems in smarter machines help differentiate valid from invalid targets, something humans cannot always do. Arkin says that human beings can make wrong judgements in stressful situations because they may see what they expect to see. He further claims that killer robots can be designed so that, unlike humans, their behavior does not change under stress.
The U.S. should, however, look carefully into the matter of autonomous weapons before investing a huge amount of money in such projects.