This Man Is Not an Agent of Self-Aware Killer Drones

Dick Stottler is the 47-year-old founder of a California software company specializing in artificial intelligence. The Air Force wants him to teach its drones to anticipate the movements of human pilots. Which raises an obvious question: is he preparing the robots to rebel against their human masters?

"No, I am not," Stottler promises. He doesn't look forward to the robot apocalypse, "other than the fact that such things are not technologically possible." A robot sympathizer would want us to think that, of course. But let's drop it for now.

The Air Force recently gave Stottler Henke Associates $100,000 to deliver a software package that can keep drones from colliding with human-piloted planes as they take off and land. Stottler's proposal, called the Intelligent Pilot Intent Analysis System, models pilots' behavior in both observed and predicted scenarios: how they take off, how they land, how they maneuver in between. It also incorporates information from Air Traffic Control and guidance for specific runways. All of that tells the drone how to react when a plane veers close, or when the two aircraft's trajectories portend a collision.
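The article doesn't detail Stottler Henke's actual algorithms, but the basic conflict check it describes can be sketched: project both aircraft forward along their predicted velocities and flag a conflict if their closest point of approach falls inside a safety bubble within some look-ahead window. Everything below, including the function names and the thresholds, is illustrative, not the company's real code.

```python
import math

def closest_approach(p1, v1, p2, v2):
    """Time (s) and distance (m) of the closest point of approach for
    two aircraft moving along straight lines; 2-D for simplicity.
    Positions are (x, y) in meters, velocities (vx, vy) in m/s."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]        # relative position
    dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]      # relative velocity
    rel_speed_sq = dvx * dvx + dvy * dvy
    if rel_speed_sq == 0:
        t = 0.0                                  # separation never changes
    else:
        # Time that minimizes separation, clamped to the future.
        t = max(0.0, -(dx * dvx + dy * dvy) / rel_speed_sq)
    cx, cy = dx + dvx * t, dy + dvy * t
    return t, math.hypot(cx, cy)

def conflict(p_drone, v_drone, p_plane, v_plane,
             min_sep=500.0, horizon=60.0):
    """Flag a conflict if the closest approach is under min_sep meters
    within the look-ahead horizon (s). Thresholds are made up here."""
    t, dist = closest_approach(p_drone, v_drone, p_plane, v_plane)
    return t <= horizon and dist < min_sep
```

The interesting part of the real system is what feeds `v_plane`: rather than assuming straight-line flight, it would come from a model of what a human pilot is likely to do next on that runway, under those ATC instructions.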

Put simply, it's about getting a drone to think like a pilot, getting inside his head. And it's a big step for drone autonomy. "We're encoding that knowledge that human pilots have, what they're going to do," Stottler says.

The model can't do everything, Stottler concedes. It has limited ability to deal with anomalous or erratic pilot behavior. There's an allowance for a damaged plane, and the software's algorithms incorporate Air Traffic Control procedures for an aircraft in trouble. Still, "if a pilot were to do something unusual, unexpected, we have nothing to say about that," Stottler says.

A Michigan-based company also designing an algorithm for drones to anticipate pilot behavior, Soar Technology, declined a request for an interview.

For all the effort to increase a drone's autonomy, the project Stottler's working on is limited to how drones behave while taking off and landing. And while Stottler wrote in his proposal that his algorithms are "directly applicable to finding terrorists and smugglers," they won't be applied that way here.

In short: it won't teach a drone to fire its missiles on its own. Which, from humanity's perspective, is auspicious, considering the military's expanding reliance on drone warfare, as demonstrated by Thursday's announcement that Predators will stalk Libya.

"The military is pretty conservative, old school," he says. When it comes to firing weapons, "they always want to have a man in the loop."

Stottler estimates that even if his software convinces the Air Force to choose his company for the second phase of the contract, it won't get situated aboard a drone for another three and a half years. That suits him fine. "I've always been interested in autonomous aircraft," he says. "It's a cool intersection of two things for me: aircraft and robotics." Hmm -- and just when it seemed like he wasn't a Cylon agent...

Photo: U.S. Air Force

See Also:
- Coming Soon From the Air Force: Mind-Reading Drones