Opinion: Artificial Intelligence’s Military Risks, Potential

In the future, all combat decisions, including target selection and how much to fire to minimize collateral damage, could be made by robots, with humans just monitoring the battlefield situation from a central command.

Image: Neural networks try to simulate the brain by processing data through layers of artificial neurons. (Shutterstock/Tatiana Shepeleva)
(TNS) — Former Secretary of the Navy J. William Middendorf II, of Little Compton, lays out the threat posed by the Chinese Communist Party in his recent book, "The Great Nightfall."

With artificial intelligence (AI) emerging as a priority, China is shifting away from a strategy of neutralizing or destroying an enemy’s conventional military assets: its planes, ships and army units. Its AI strategy is now evolving toward dominating what are termed adversaries’ “systems-of-systems,” the combination of all their intelligence and conventional military assets.

China would first attempt to disable all of its adversaries’ information networks, which bind their military systems and assets together. It would then destroy individual elements of these now-disaggregated forces, probably with missile and naval strikes.

Now, everything from submarines to satellites, tanks to jets, and destroyers to drones is being AI-connected by China. The People’s Liberation Army is developing autonomous vehicles that scout ahead of manned machines or provide supporting fire alongside them. These machines would be smart enough that a single human could supervise a whole pack of them. By replacing humans with electronics, combat vehicles will be more fuel-efficient, harder to hit and cheaper to build and operate. With AI at the helm, a central command could launch a multi-pronged attack from land, air and water simultaneously without any humans at the warfront.

All combat decisions, such as target selection and how much to fire to minimize collateral damage, would be made by robots. Humans could end up merely monitoring the battlefield while sitting safely in a central command center, taking corrective action as needed. AI vastly increases the speed at which tactical and even strategic decisions will be made.

The rise in the use of unmanned aerial vehicles — commonly known as drones — in both military and commercial settings has been accompanied by a heated debate as to whether there should be an outright ban on what some label “killer robots.” Such robots, which could be in the air, on the ground, or in and under water, theoretically incorporate AI that would make them capable of executing missions on their own. The debate concerns whether AI machines should be allowed to execute military missions, especially if there is a possibility that human life is at stake.

To understand AI, it is important first to understand the difference between an automated and an autonomous system. An automated system is one in which a computer reasons by a fixed set of rules, so the same inputs always produce the same output. An autonomous system (AI) is one that reasons probabilistically given a set of inputs, meaning it will produce a range of behavior. With AI, the human faculties of judgment, feeling and belief may no longer be taken into consideration.
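A minimal Python sketch, offered purely for illustration, can make that distinction concrete; the sensor reading, the 0.8 confidence threshold and the Gaussian noise model below are hypothetical assumptions, not drawn from any real system. The automated function applies a fixed rule and always returns the same answer for the same input, while the autonomous one reasons probabilistically and can behave differently on identical inputs.

```python
import random

# Illustrative sketch only: the 0.8 confidence threshold, the sensor reading
# and the Gaussian noise model are hypothetical, not taken from any real system.

def automated_decision(sensor_reading: float, threshold: float = 0.8) -> bool:
    """Automated: a fixed rule, so the same input always yields the same output."""
    return sensor_reading >= threshold

def autonomous_decision(sensor_reading: float, noise: float = 0.1) -> bool:
    """Autonomous: reasons probabilistically, so identical inputs can
    produce a range of behavior."""
    estimated_confidence = sensor_reading + random.gauss(0, noise)
    return estimated_confidence >= 0.8

reading = 0.78
print(automated_decision(reading))                        # always False for this input
print([autonomous_decision(reading) for _ in range(5)])   # may vary run to run
```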

AI technologies are now widely used in tactical warfare situations, such as target acquisition for missiles launched from drones. But the actual command to fire the missile is reserved for the human operator. What might happen if the decision time is reduced from minutes to seconds, removing the human operator from the process entirely? And might that scenario be adopted by one of our less responsible adversaries? Or might it simply happen by accident, an accident far more plausible now than ever before? A global leader in AI will emerge in the near future, achieving enormous international clout and the power to dictate the rules governing AI. The world will be safer and more peaceful with strong U.S. leadership in AI.

(c)2021 The Providence Journal (Providence, R.I.). Distributed by Tribune Content Agency, LLC.