08/09/2019 / By Lance D Johnson
The nations of the world are heading into forbidden territory, merging artificial intelligence with weapon systems. Future wars could be waged by criminals and terrorists hiding behind the mask of AI-powered weapons. These weapons could be programmed for genocide and, once unleashed, would engage targets without human intervention. Robots would calculate a target’s next move in real time and eliminate it, bound by no creed or code. Warfare is entering a new frontier, and the United Nations has been holding meetings on this serious topic since 2016.
Now Nobel laureates are sounding the alarm about the development of autonomous killer robots. Academics and activists from thirty-five countries are calling for the U.N. to ban AI-powered weapon systems. More than one hundred non-governmental groups gathered in Berlin ahead of the Geneva talks to press their U.N. representatives to stop killer robots. Great Britain, Russia, and the U.S. are all working on their own fleets of AI-powered weapons, some of which can already kill without oversight from a human. These autonomous weapon systems can be as simple as programmable drones equipped with assault weapons.
The International Physicians for the Prevention of Nuclear War, winners of the 1985 Nobel Peace Prize, joined the protests and wrote an open letter to the U.N. “Fully autonomous weapons select and engage targets without human intervention, representing complete automation of lethal harm,” the letter states. “This ability to selectively and anonymously target groups of people without human oversight would carry dire humanitarian consequences and be highly destabilizing.” Jody Williams, the 1997 Nobel Peace Prize winner who helped get landmines banned in the 1990s, is calling on Germany to lead on this issue and take steps toward a ban.
Because AI weapons are cheap and easy to mass-produce, they could easily fall into the hands of terrorist organizations. Autonomous weapons also make killing easier because no human armies are needed to fill the trenches or risk their lives in combat. For this reason, a handful of machines could cause a great number of casualties in a short amount of time. Dr. Noel Sharkey, a professor of AI and robotics who helped start the Campaign to Stop Killer Robots, warns, “It is not too late to prevent autonomous weapons. Decades of advocacy efforts have shown, however, that once created and in military use, entire classes of weapons are extremely difficult to eliminate: the threat of nuclear war is growing despite the non-proliferation treaty.”
Not only would these machines make conflicts easier to start and stealth missions easier to carry out, but the technology could also be hacked, and programming errors could kill many innocent people. Terrorists could use the systems to seize power very quickly. Activists are also concerned that autonomous weapons will become common law-enforcement tools. When a robot takes a life in the heat of the moment, which person, country, or entity will be liable for the death? Intelligence agencies could conduct covert operations, using robots to kill, with no human or organization held accountable. Bonnie Docherty, senior Arms Division researcher at Human Rights Watch, warns, “No accountability means no deterrence of future crimes, no retribution for victims, no social condemnation of the responsible party.”
“Each nation must take a stand that autonomous weapons must never come into existence. And we have to do it now, before it’s too late,” says Dr. Emilia Javorsky from the group Scientists Against Inhumane Weapons. Javorsky believes the world’s superpowers must agree that the weapons are “fundamentally wrong” and should never be used.
For more on AI weapon systems, visit Robotics.News.