Newswise — What can a wide-eyed, talking robot teach us about trust?

A lot, according to Northeastern psychology professor David DeSteno and his colleagues, who are conducting innovative research into how humans decide to trust strangers, and whether those decisions are accurate.

The interdisciplinary research project, funded by the National Science Foundation (NSF), is being conducted in collaboration with Cynthia Breazeal, director of the MIT Media Lab’s Personal Robots Group, and with two Cornell University researchers: Robert Frank, an economist, and David Pizarro, a psychologist.

The researchers are examining whether nonverbal cues and gestures affect our judgments of a stranger’s trustworthiness. “People tend to mimic each other’s body language,” said DeSteno, “which might help them develop intuitions about what other people are feeling — intuitions about whether they’ll treat them fairly.”

The project tests these theories by having humans interact with the social robot Nexi and then judge her trustworthiness. Unbeknownst to participants, Nexi has been programmed to make certain gestures while speaking with some of them, gestures that the team hypothesizes could determine whether she is deemed trustworthy.

“Using a humanoid robot whose every expression and gesture we can control will allow us to better identify the exact cues and psychological processes that underlie humans’ ability to accurately predict if a stranger is trustworthy,” said DeSteno.

During the first part of the experiment, Nexi makes small talk with her human counterpart for 10 minutes, asking and answering questions about topics such as traveling, where they are from, and what they like most about living in Boston.

“The goal was to simulate a normal conversation with accompanying movements to see what the mind would intuitively glean about the trustworthiness of another,” said DeSteno.

The participants then play an economic game called “Give Some,” in which they must predict how much money Nexi will give them at the expense of her own profit, while simultaneously deciding how much, if any, they will give to Nexi. The rules allow two distinct outcomes: a higher individual profit for one partner and a loss for the other, or smaller but equal profits for both.
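The article does not spell out the game’s actual parameters, but the payoff logic of a give-some exchange can be sketched in a few lines of Python. The four-token endowment, the $1 value of a kept token, and the doubled value of a given token below are illustrative assumptions, not the study’s confirmed values; they are chosen only to reproduce the two outcomes described above.

```python
# Minimal sketch of a "Give Some"-style payoff calculation.
# The endowment, keep value, and give multiplier are assumptions for
# illustration; the article does not specify the study's parameters.

ENDOWMENT = 4    # tokens each player starts with (assumed)
KEEP_VALUE = 1   # dollars per token a player keeps (assumed)
GIVE_VALUE = 2   # dollars per token received from the partner (assumed)

def payoffs(given_by_a: int, given_by_b: int) -> tuple[int, int]:
    """Return (payoff_a, payoff_b) for one round.

    Each player keeps ENDOWMENT - given tokens at KEEP_VALUE each,
    and receives the partner's given tokens at GIVE_VALUE each.
    """
    a = (ENDOWMENT - given_by_a) * KEEP_VALUE + given_by_b * GIVE_VALUE
    b = (ENDOWMENT - given_by_b) * KEEP_VALUE + given_by_a * GIVE_VALUE
    return a, b

# The two outcomes the article describes:
print(payoffs(0, 4))  # one-sided: (12, 0) -- the withholder profits, the giver loses out
print(payoffs(4, 4))  # mutual giving: (8, 8) -- smaller but equal profits for both
```

Under this parameterization, withholding while the partner gives yields the highest individual payoff, while mutual giving leaves both players better off than mutual withholding, which is exactly the tension between exploiting and cooperating that makes a trustworthiness prediction consequential.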

“Trust might not be determined by one isolated gesture, but rather a ‘dance’ that happens between the strangers, which leads them to trust or not trust the other,” said DeSteno, who, with his colleagues, will continue testing their theories by seeing if Nexi can be taught to predict the trustworthiness of human partners.
