Spotting a terrorist
Next-generation system for detecting suspects in public settings holds promise, sparks privacy concerns
CAMBRIDGE - Metal detectors, X-ray machines, and dogs are used at security checkpoints to look for bombs. Now a next-generation technology under development in Cambridge will look for the bomber.
With funding from the US Department of Homeland Security, Draper Laboratory and other collaborators are building technology to detect potential terrorists with cameras and noninvasive sensors that monitor eye blinks, heart rate, and even fidgeting.
The project, called the “Future Attribute Screening Technology,” is aimed at allowing security checkpoint personnel at airports or large public events to make better, faster decisions about whether a person should get follow-up screening.
At a demonstration of the technology this week, project manager Robert P. Burns said the idea is to track a set of involuntary physiological reactions that might slip by a human observer. These occur when a person harbors malicious intent - but not when someone is late for a flight or annoyed by something else, he said, citing years of research into the psychology of deception.
The development team is investigating how effective its techniques are at flagging only people who intend to do harm. Even if it works, the technology raises a slew of questions - from privacy concerns, to the more fundamental issue of whether machines are up to a task now entrusted to humans.
“I know what they’re doing, and I’m ambivalent,” said Paul Ekman, a consultant on the project and an eminent psychologist who pioneered the study of facial expression and emotion.
“I can understand why there’s an attempt being made to find a way to replace or improve on what human observers can do: the need is vast, for a country as large and porous as we are. However, I’m by no means convinced that any technology, any hardware will come close to doing what a highly trained human observer can do,” said Ekman, who directs a company that trains government workers, including for the Transportation Security Administration, to detect suspicious behavior.
The researchers hope to have the device ready for field testing in 2011, perhaps at a border crossing. If it works - even Burns concedes that’s no sure thing - it could be used by government agencies. There are no immediate plans for commercializing the technology, which has cost about $20 million to develop.
At the demonstration, actors walked one-by-one into a room with a metal detector, a guard, and a set of sensors that monitored their reactions while they spent a few minutes answering a dozen questions, ranging from where they lived to whether they planned to detonate a device.
Most of the system’s sensors are commercially available. An eye tracker measures blinks, gaze direction, and pupil dilation. Two separate devices track heart rate and respiration. A thermal camera measures the way heat changes on a person’s face. And underfoot, an accessory normally used with the Nintendo Wii gaming system has been repurposed to detect fidgeting.
Burns said being able to simultaneously observe a suite of traits - such as slight variations in the interval between heartbeats, the way a person’s pupils dilate, the way the heat on their face changes, or whether they stop moving when asked a certain question - makes the system more accurate.
“I think it is very interesting, but it also sets off alarm bells,” said Jennifer Lerner, a professor of public policy and management at the Harvard Kennedy School who has studied how facial expressions can be a readout of biological responses to stress. “The key here is not to get too impressed by the physiological measurement and to pay attention to the validity of the science.”
Researchers are evaluating how well the technology can detect a person’s intentions to do harm, compared with a human observer’s ability to do the same thing. They say initial results are promising.
They also plan to test whether the software can distinguish malicious intent from other things that fluster a person. In the future, for example, experiments might include subjects who have to run to get to the screening checkpoint, or have an initial experience with a rude guard.
Hardened terrorists might be able to control their heartbeat or breath, but researchers say that is one reason they are looking at multiple traits thought to be involuntary. Even if someone could control many physiological factors, researchers said that a person who lacked normal bodily reactions to questions would also raise a red flag.
A privacy watchdog, after being told of the technology, expressed concern that it would misidentify innocent people, who might then be branded as potential terrorists by the Department of Homeland Security.
“For goodness sakes, you are at an airport,” said Lillie Coney, associate director of the Electronic Privacy Information Center. “How many people are calm? People running to get to the gate, sweating through the security line, will I get there before my plane takes off?”
“This agency does maintain watch lists, it does maintain a number of other programs. It’s important for them not to create files or reports or records on individuals because this technology picked up something,” Coney said.
Burns said the technology would erase data after each screening, and no personal information would be used to identify subjects, create files, or make lists. He said there would be close oversight, and regulations would be put in place to protect privacy if and when the technology is deployed.
Ultimately, he hopes the technology can be developed to a point where it does not depend on an interviewer asking questions, but scans people as they walk through a checkpoint.
“I remember when I could freely go through airports, buildings,” Burns said. “I’d like to restore that - walk through security with my 4-year-old daughter, and not have her walk in front of me.”
Carolyn Y. Johnson can be reached at firstname.lastname@example.org.