Killer Robots and the Losing Battle for Humanity

Closing the distance between you and your opponent is what secures victory in an attack. That principle of combat is as old as combat itself.

Closing the distance breaks your opponent’s structure, collapsing any type of defense he may be capable of mounting.

This is why maintaining distance is critically important in any defensive maneuver – be it in a street brawl or an all-out naval battle.

This is also why any weapon that allows you to kill your enemy from a distance provides a telling advantage.

For thousands of years, innovators have imagined and built such weapons. And for thousands of years, generals and military tacticians have employed such weapons on the battlefield.

Now it appears we are just a few years from seeing weapons capable of selecting targets and delivering force without any human interaction at all.

Any weapon that allows you to kill your enemy from a distance provides a telling advantage.

The Rise of Killer Robots

Deadly, largely autonomous weaponry isn’t entirely new. The US military employs such systems for defensive roles, such as shooting down missiles headed toward ships.

The drones the US Central Intelligence Agency used to assassinate Al Qaeda commanders in Pakistan and the Hindu Kush were, by contrast, offensive weapons.

But they were controlled by human pilots, some of them thousands of miles away.

With the development of artificial intelligence, however, military organizations throughout the developed world are now on the verge of something entirely different.

The US Central Intelligence Agency used drones to assassinate Al Qaeda commanders in Pakistan and the Hindu Kush.

Robot Gunboats

If you happen to live along the east coast of Virginia, then you may have heard of Wallops Island. The US government uses that remote area to launch rockets and test new weapons technology.  

Had you been strolling along the shore there in October 2018 you would have glimpsed a few 35-foot inflatable boats darting through the shallows.

Chances are, you’d have thought little of them. Thousands of boats crisscross the Virginia coastline every day. But had you looked closer, you would have realized that none of the inflatable boats had a crew.

The US government uses the Wallops Island Research Range to launch rockets and test new weapons technology.
(Photo: NASA/Wikimedia Commons) 

They were using high-tech gear to sense their surroundings, communicate with one another, and position themselves in the water.

The boats form part of a US Marine Corps program called Sea Mob. Sources familiar with the program say it’s a major step toward the weapons technology of the future.

If killing your enemy from a distance is an advantage, then Sea Mob is beyond cutting edge. The boats in the program are basically autonomous killer robots.

In theory, the .50-caliber Browning machine guns that can be strapped to the vessels could provide cover fire for US Marines taking a beachhead.

The guns could pulverize the trees and rocks beyond with a wall of bullets, suppressing enemy movement and fire – all without human supervision.

The .50-caliber machine guns that can be strapped to Sea Mob vessels could provide cover fire for US Marines taking a beachhead.

“Active Technology Transfers”

The US Marines are not alone in the effort to develop lethal autonomous weapons systems. Every branch of the US military is currently seeking ways to create a new – and less human – kind of warfare.

“We have active technology transfers with the Office of Naval Research, Naval Research Laboratory, Air Force Research Laboratory, Army Research Laboratory, and the Defense Advanced Research Projects Agency,” Dr. William Roper, former head of the Pentagon’s Strategic Capabilities Office, told US senators in May 2017.

For instance, the US Navy is experimenting with a 135-ton autonomous warship called Sea Hunter that can hunt and destroy enemy submarines on its own.

The US Navy is experimenting with a 135-ton autonomous submarine destroyer called Sea Hunter. (Photo: John F. Williams/US Navy)

The US Army is meanwhile developing a system called the Joint Air-to-Ground Missile (JAGM). The system has the ability to pick out vehicles to attack and destroy without human say-so.

In discussing these AI systems, US military officials generally say that humans will retain some level of supervision over decisions to use lethal force.

But their statements grimly leave open the possibility that robots could one day make such choices on their own.

The US Army’s Joint Air-to-Ground Missile system has the ability to pick out vehicles to attack and destroy without human supervision. (Photo: Tad Browing/Wikimedia Commons)

Beyond the Bleeding Edge

The trend is by no means confined to American fighting forces. The world’s most powerful militaries, eyeing each other in a silent but fierce arms race, are financing the most cutting-edge trials.

In May 2018, the Russian military revealed it had combat-tested its Uran-9 robot tank in Syria.

China is developing large, smart and relatively low-cost unmanned submarines that can roam the world’s oceans to perform a wide range of missions.

The Chinese company Ziyan is already selling its Blowfish A3 – an autonomous aerial drone with a machine gun – in the Middle East, Futurism reports.

The Russian military revealed it had combat-tested its Uran-9 robot tank in Syria.
(Photo: Vitaly Kuzmin/Wikimedia Commons)

Israel already uses some of the most advanced autonomous killing machines. The Israeli military regularly deploys an armed ground robot to patrol the Gaza border.

Up above, it has the Harpy: an autonomous missile – or loitering munition – that circles the skies until it finds its target.

“There are not only legal and ethical concerns about lethal autonomy but practical ones as well,” says Paul Scharre, a former US Army Ranger who wrote the Pentagon’s earliest policy statement on killer robots. “How does one control an autonomous weapon?”

The Israeli Harpy is an autonomous missile that circles the skies until it finds its target.
(Photo: Julian Herzog/Wikimedia)

“Any Other Avenue”

No law governs the AI arms race. Countries developing killer robots are in a free-for-all.

Scientists have sounded the alarm. Military personnel, philosophers, and lawyers have contributed to the discussion.

More than 250 research and academic institutions and 3,000 prominent players in the field have called for a ban on killer robots.

Most states participating in diplomatic talks on lethal autonomous weapons have expressed a strong desire to negotiate a treaty amid mounting public concern.

China is developing large, smart and relatively low-cost unmanned submarines that can roam the world’s oceans.

Some 30 countries urged a ban on killer robots during the Convention on Conventional Weapons meeting in Geneva last year.

But Russia and the United States repeatedly rejected any reference in the meeting’s final report to the need for “human control” over the use of force.

The meeting subsequently ended without an agreement.

“We’d still like to see the CCW succeed,” Mary Wareham, the global coordinator of the Campaign to Stop Killer Robots, told Politico recently. “But what happened … is forcing us, and I think others, to explore any other avenue – because the CCW is not going to produce.”

The Israeli military regularly deploys an armed ground robot to patrol the Gaza border.
(Photo: Israeli Defense Forces)

A Hard-Won Regret

Throughout history, war and death have proliferated. Yet we have learned to understand war not as politics worth pursuing but as a cause of profound regret.

In that light, humanity has come to see war as a negative force whose only meaning is death on a massive scale.

Against that dark backdrop, our lives, frail and perilous as ever, have often become conscious endeavors of resistance to such death.

Killing machines are the very definition of cold-blooded slaughter. They will allow no room for any shred of humanity that attempts to emerge in times of war.

The robot Achilles will not be moved to return Hector’s body to Priam. There will be no truce, no carols in the trenches of the Western Front on Christmas Eve.

Should we really relinquish control over the conduct of our deadliest and most egregious folly to artificial intelligence?
