
Robot may furnish lesson in human trust

Interactions could show the basis for harmony

With expressive eyebrows and transfixing blue eyes, a robot named Nexi is helping researchers understand how trust is developed. (David L. Ryan/Globe Staff)
By Carolyn Y. Johnson
Globe Staff / July 5, 2010


Scientists have long wondered how people decide, even after a brief interaction, whether or not to trust a person. What subtle cues are telegraphed in a handshake, a facial twitch, or a simple change in posture?

Now, to shed light on that age-old question, Boston-area scientists are turning to Nexi, a moonfaced robot with expressive eyebrows, dexterous mechanical hands, and a face that can flick from boredom to happiness.

By controlling how 4-foot-tall Nexi interacts with people, scientists have a new and powerful way to study the signals that allow people to trust one another, or not, within minutes of meeting.

“There should be some signal for trustworthiness that’s subtle and hard to find, but [it is] there,’’ said David DeSteno, a psychologist at Northeastern University and one leader of the experiment.

Nexi offers advantages over using a human participant because people give off subtle gestures, or engage in unintentional mimicry, that can be hard to measure or control, and probably influence whether someone trusts them. Nexi has many of the expressive abilities of a person, but researchers can tightly control every aspect of her behavior — allowing them to test what nonverbal cues might make her seem more or less trustworthy.

Because the next generation of robots may be more like teammates, and less like tools, the research could help roboticists design machines that will be trusted partners.

On the flip side, trust can be used not only by people seeking to do good, but also by marketers and con men. If trust can be deconstructed, it could also be manipulated. That makes understanding trust’s origin even more important.

“It’s a very serious question if you can design these robots and influence people’s judgments of trust,’’ said Cynthia Breazeal, director of the personal robots group at the MIT Media Lab, which led development of Nexi. “You have to understand what would be appropriate in different contexts.’’

The experiment — a collaboration between MIT, Northeastern, and Cornell University — has not yet revealed the basic components of signaling trust. But so far, researchers believe that perceiving trust is not merely a matter of one person projecting a shifty eye or some other untrustworthy vibe; instead it is a complicated interaction in which people may unconsciously mimic one another, and through their own motions learn something about the other person’s internal motivations.

“We think it’s a dynamic unfolding dance between two people,’’ DeSteno said. “You start fidgeting; I’ll start fidgeting. . . . My muscular movements mirror yours. We know from psychological research the body and brain can sense these movements and that builds a gist of what people are feeling.’’

The work builds on an earlier study by Cornell researchers that suggested such signals must exist. In that study, published in 1993, researchers put people in a situation where they could either cooperate or defect to win money. Before they played, participants interacted with their partners in the game and even talked about their strategies.

Researchers found that people predicted whether their partners would cooperate or defect better than chance, suggesting that somehow they could discern whether the other person was trustworthy.
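The comparison at the heart of that finding — observed prediction accuracy against a random-guessing baseline — can be illustrated with a minimal sketch. The choices and predictions below are hypothetical, invented for illustration only, and are not data from the 1993 study:

```python
import random

def accuracy(predictions, actual):
    """Fraction of partners whose cooperate/defect choice was predicted correctly."""
    return sum(p == a for p, a in zip(predictions, actual)) / len(actual)

# Hypothetical data: ten partners' actual choices ("C" = cooperate, "D" = defect)
# and one player's predictions after brief conversations with each partner.
actual      = ["C", "C", "D", "C", "D", "C", "C", "D", "C", "C"]
predictions = ["C", "C", "D", "C", "C", "C", "C", "D", "C", "D"]

observed = accuracy(predictions, actual)  # 8 of 10 correct -> 0.8

# Baseline: average accuracy of purely random guessing, estimated by simulation.
random.seed(0)
guesses = [[random.choice("CD") for _ in actual] for _ in range(10_000)]
baseline = sum(accuracy(g, actual) for g in guesses) / len(guesses)  # close to 0.5

print(observed, round(baseline, 2))
```

In the study, the analogous result was that real participants' accuracy landed reliably above the chance baseline, which is what implies some trustworthiness signal is being read.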

In experiments with Nexi designed to zero in on nonverbal cues and unconscious mimicry that could signal trustworthiness or a lack of it, research participants sat down in a wooden chair across from Nexi. Unlike in the Cornell study, participants were told nothing about the experiment; they simply talked with Nexi about everyday topics, from summer plans to basketball, to ensure that the interaction mimicked the kind people have every day with strangers.

Controlled by a team of scientists in an adjacent room, Nexi systematically made gestures that researchers think may signal untrustworthiness in half of the conversations. In the other half, she made random conversational gestures.

One subject, Alex Deixler, a sophomore at Northeastern, chatted with Nexi about the Lakers and Celtics and her favorite TV shows.

“At first it was kind of a little weird, obviously, because you’re talking to a piece of metal,’’ Deixler said. But things began to feel more natural, and she said that the robot seemed a lot like what you would expect from the ways robots are portrayed in science fiction.

Still, Deixler said that while interacting socially with Nexi at a superficial level was fine, she would find it hard to trust a robot.

At the end of the experiment, researchers measured how trustworthy participants found Nexi using an economic task: each participant decided how many tokens to exchange with Nexi and predicted how many tokens Nexi would give in return.

The researchers have yet to analyze their data, and plan in their next step to test their results by giving Nexi the ability to predict how trustworthy her conversation partners are. Ultimately the project, funded by the National Science Foundation, could have the practical benefits of creating a robot that humans can trust, but scientists are also probing a deeper question about human existence.

“How do nice people manage to survive in the world?’’ said Robert H. Frank, an economist at the Johnson Graduate School of Management at Cornell University, who is collaborating on the project. “The issues in our project are to try and understand the signals people rely on to decide whom to trust. . . . What’s interesting to me is how mechanical the process of interacting with another being turns out to be.’’

Carolyn Y. Johnson can be reached at cjohnson@globe.com.
