By SYDNEY J. FREEDBERG JR.
on November 30, 2018 at 4:25 PM
DETROIT: Army soldiers are testing goggles with an image-recognition system that can automatically spot threats like tanks and warn the rest of the squad — or transmit the target data to a distant missile battery so they can take it out. It’s the long-awaited realization of the desire to make “every soldier a sensor” feeding intelligence over wireless networks to the rest of the force.
The artificially intelligent target detection will be part of the Integrated Vision Augmentation System (IVAS), for which Microsoft won a two-year, $480 million contract earlier this week, building on its HoloLens headset. I’ve written extensively about IVAS’s other features back when it was still called HUD 3.0. It’s essentially an infantryman’s version of a fighter pilot’s Heads-Up Display, combining advanced night vision with augmented reality technology (much like Google Glass) to superimpose targeting crosshairs — wirelessly linked to the soldier’s weapon to show exactly where it’ll shoot — and tactical data over the wearer’s field of vision. Other aspects will monitor the medical condition of the soldier and record such things as blast overpressure from roadside bombs to assist medical treatment.
But this is the first I’ve heard about the image-recognition feature. Brig. Gen. Chris Donahue — until recently director of the Army’s infantry modernization team — mentioned it in a discussion of IVAS features he said “we’re already prototyping.” It’s not clear whether it’ll be an integral part of IVAS or an add-on, though he did say “it’s on the body,” which implies there’s either some kind of body camera or a supplemental processor not built into the goggles themselves.
The hard part yet to be figured out, Donahue acknowledged, is how to transmit the data from frontline troops to distant artillery. But the Army has two other modernization teams — for the network and long-range precision firepower (LRPF) — already working hard on how to share targeting data: Once they have a solution, the infantry should simply be able to plug in.
Tracked variant of the Russian Pantsir S1 anti-aircraft missile system (NATO reporting name SA-22 Greyhound)
AI Targeting, In Detail
Here’s how Donahue described the auto-targeting at this week’s AUSA AI conference, with our annotations:
“Imagine you’re fighting some enemy out there and you’ve determined what your target is: The first thing you want to kill is a SA-22, the second thing is an S-300, the next thing is an S-400 (all three of these are Russian anti-aircraft systems, which new Army multi-domain doctrine prioritizes destroying, preferably with long-range missiles, so the Air Force can strike freely — ed.) and the fourth thing is a T-90 (tank).”
“Because of the machine learning and AI built into IVAS, it’s going to instantly look across the battlefield and pick out the SA-22. Why? Because it has the data, it’s seen thousands of images of this, back in its cloud, which probably sits back on the vehicle.” That might be a Humvee, for example, or an armored troop carrier like the 8×8 Stryker or tracked Bradley. This approach is an example of the Army moving away from central servers on fixed bases and instead pushing computing power to individual vehicles or even foot soldiers on the “tactical edge.”
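The kind of edge inference Donahue describes — matching what the camera sees against a model trained on thousands of labeled images, stored locally on a vehicle rather than on a distant server — can be sketched in miniature. Everything here is invented for illustration: a real system would use a deep neural network, not the toy nearest-centroid classifier below, and the feature vectors are made-up numbers.

```python
import math

# Hypothetical class "signatures": feature centroids a real system would learn
# from thousands of labeled images. The three-number vectors are fabricated
# purely for illustration.
CENTROIDS = {
    "SA-22": (0.9, 0.2, 0.1),
    "S-300": (0.1, 0.8, 0.3),
    "T-90":  (0.4, 0.1, 0.9),
}

def classify(feature_vec, threshold=0.5):
    """Return the closest known target class, or None if nothing is close enough.

    Stands in for the on-vehicle "cloud" Donahue mentions: inference runs
    locally against a stored model, not on a fixed-base server.
    """
    best_label, best_dist = None, float("inf")
    for label, centroid in CENTROIDS.items():
        dist = math.dist(feature_vec, centroid)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist <= threshold else None

print(classify((0.85, 0.25, 0.15)))  # close to the SA-22 centroid -> "SA-22"
print(classify((0.0, 0.0, 0.0)))     # far from every centroid -> None
```

The threshold is the important design choice: a detection is only reported when the match is close, which is one crude guard against the misclassification problem discussed below.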
“With the right network … it then shoots directly from that individual, goes all the way back, goes back to the LRPF (missile battery). Probably humans in the loop. They hit a button; LRPF shoots and kills the SA-22.” Pentagon policy requires a human being to make all decisions to use lethal force, although a less scrupulous adversary could save precious seconds by automating the decision to fire. Given how often even cutting-edge image recognition AI screws up, however, it seems like a very good idea to have a trained human check the target really is an enemy tank and not, say, a friendly one or a school bus full of orphans.
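The “humans in the loop” gate Donahue describes — AI nominates, a person hits the button — amounts to a simple control-flow rule: no fire order leaves the system without explicit operator approval. A minimal sketch, with every name (`Detection`, `release_weapon`, the confidence cutoff) invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    target_type: str   # e.g. "SA-22"
    grid: str          # reported location
    confidence: float  # classifier's score, 0.0-1.0

def release_weapon(detection, operator_approves):
    """Fire only if a human confirms; the AI alone never closes the loop."""
    if detection.confidence < 0.8:
        return "rejected: low confidence"
    if not operator_approves(detection):
        return "held: operator declined"
    return f"fire mission sent against {detection.target_type} at {detection.grid}"

d = Detection("SA-22", "38SMB123456", 0.93)
print(release_weapon(d, lambda det: True))   # operator confirms -> fire mission sent
print(release_weapon(d, lambda det: False))  # operator vetoes -> held
```

The veto path is the whole point: automating `operator_approves` away is exactly the shortcut a “less scrupulous adversary” might take to save seconds.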
An example of the shortcomings of artificial intelligence when it comes to image recognition. (Andrej Karpathy, Li Fei-Fei, Stanford University)
“As that individual (soldier) continues to scan over to the left, they see a T-90 tank. It instantly kicks over to the person in their squad who’s holding the Javelin (anti-tank missile); they shoot the Javelin, they kill the T-90.” Again, Donahue’s talking about the AI automatically warning a human being, who then decides whether to shoot. Given the other features of IVAS already described, it’s likely this warning would consist of some kind of red icon popping up on the augmented reality goggles of every other soldier in the squad, probably with some kind of arrow pointing towards the threat.
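That squad-wide warning — a red icon plus an arrow toward the threat — boils down to a small message broadcast to every goggle on the network. IVAS’s actual message format is not public; the JSON schema, field names, and weapon-pairing rule below are all assumptions made for illustration:

```python
import json
import math

def squad_alert(shooter_pos, target_pos, target_type):
    """Build a hypothetical warning message for every squad member's goggles.

    Positions are (east, north) offsets in meters; the bearing lets the AR
    display draw an arrow toward the threat.
    """
    dx = target_pos[0] - shooter_pos[0]
    dy = target_pos[1] - shooter_pos[1]
    bearing = math.degrees(math.atan2(dx, dy)) % 360  # 0 = north, clockwise
    return json.dumps({
        "icon": "red",
        "target": target_type,
        "bearing_deg": round(bearing, 1),
        # Invented pairing rule: cue the Javelin gunner for tanks.
        "recommended_weapon": "Javelin" if target_type == "T-90" else None,
    })

# T-90 spotted 100 m due east of the spotter:
print(squad_alert((0.0, 0.0), (100.0, 0.0), "T-90"))
```

Pushing the cue to the right shooter — here, whoever carries the Javelin — is what turns a single soldier’s detection into a squad-level response.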
This may sound awfully distracting, but it’s a lot better than soldiers having to take their eyes off the battle to look down at a modified smartphone for tactical updates, as with the current Nett Warrior system running the Android Tactical Assault Kit (ATAK) app. Years of “attention management” research on aircraft pilots show that people in combat hyperfocus on the threat in front of them and stop checking instrument displays — which is what you want them to do, rather than wander into combat staring down at a screen like an iPhone addict about to walk into an open manhole.
So in Army testing, soldiers vastly prefer the IVAS approach of putting vital data right in their field of vision. In fact, young troops wanted more data than their seniors originally thought they could handle, another Army one-star said. “What they ask for is more data, not less, because they’re just comfortable, because of the way they’ve been raised, with screens and video games,” said Brig. Gen. Joseph McGee, head of the Army’s talent management task force. “In the experiment with the heads-up display, we gave them just one or two pieces of data, and the soldiers came back and said, ‘I don’t just want one or two, I want 10 or 15 different data feeds coming in.’”
It’s also important to note that the Army is not talking about the AI warning the soldier actually wearing the goggles that made the detection. That may seem counterintuitive, but given that the human eye and brain are still much more accurate than computerized image recognition, the wearer would probably have already seen anything the IVAS detected.
How ready is this technology? “We’re already doing it. It’s here,” Donahue said. “Now, how you build the infrastructure back behind to do that, we’re not there.” Presumably Donahue’s referring to the wireless networking to transmit the targeting data to artillery batteries many miles away. He might also be referring to the extensive database of images required to teach AI to recognize a tank or anti-aircraft missile.
“Probably nobody’s there,” Donahue added. “We’ve got to build it. It’s a race.”