Spotlight: Research scientist Peter Khooshabeh

Peter Khooshabeh is an ARL research fellow in ICT’s virtual humans group. His work explores the social effects that virtual humans can have on people in areas including culture, thought and emotion.

By Orli Belman, USC Institute for Creative Technologies

When ICT’s Peter Khooshabeh was an undergraduate at the University of California, Berkeley, he worked on developing a virtual practice tool for surgeons. The idea was that an individual who trained in this simulated scenario would show improved outcomes in the operating room. But when Khooshabeh spent time in a real hospital, he observed that technical skill was just one aspect of surgical success. Any useful virtual environment would also need to capture the interpersonal dynamics of such a high-stress, multi-person setting.

“At first we were focused on putting just one person in this virtual environment, but there are many players involved in any given surgery,” said Khooshabeh, a research fellow in ICT’s virtual humans research group. “I came to understand that the key to improving performance may not be in the quality of the technology, but in how much you understand about people and how they perceive one another.”

Khooshabeh went on to earn a Ph.D. in cognitive psychology from UC Santa Barbara and continues to leverage technology as a tool to better understand people.

In August 2013, Khooshabeh, along with Jonathan Gratch, ICT’s associate director for virtual humans research, and co-authors from USC’s Marshall School of Business and UCSB, presented research that uses virtual humans to advance knowledge of non-verbal thought and emotion in real humans. In the study, participants took part in a negotiation game in which the opposing negotiator was a virtual human programmed to play either cooperatively or competitively while displaying either an angry or a happy facial expression.

“The expectation was that facial expression would override behavior, meaning people would be threatened by anger no matter if the virtual human was helping them or working against them in the negotiation,” Khooshabeh said.

However, the results showed that players’ stress was caused neither by a competitive negotiation strategy nor by the angry expression alone, but by whether the virtual human’s strategy and emotional display matched. Specifically, physiological data showed that virtual humans who played cooperatively but looked angry caused participants to show signs of distress, measured by lower cardiac output and increased eye fixations on the virtual human’s face.

“People don’t always respond to angry faces the same way,” Khooshabeh said.

“These results are significant because they suggest context matters in the perception of emotion.”

In another study, to be published in an upcoming issue of the Journal of Artificial Intelligence and Society, Khooshabeh and ICT colleagues Morteza Dehghani, Angela Nazarian and Jonathan Gratch gave an otherwise identical virtual character different accents (Iranian-, Chinese- or California-accented spoken English) and analyzed how study subjects who shared an ethnic background with the accented virtual character responded to the differently accented characters.

Across the two studies, Iranian-Americans interacting with an Iranian-accented, English-speaking virtual human were more likely to make decisions congruent with Iranian cultural customs. Similarly, Chinese-Americans listening to a Chinese-accented, English-speaking virtual human were more likely to make causal attributions congruent with collectivistic, East Asian cultural ideologies. A more recent study replicated this pattern with Mexican-Americans.

“Accents matter just as much as, or possibly more than, visual information in forming impressions of others and in shaping how those others affect our thinking,” Khooshabeh said. “Our work provides experimental evidence that accent can affect individuals differently depending on whether they share the same ethnic cultural background as the target culture.”

In addition to informing the design of virtual humans for training and research tasks, Khooshabeh hopes his research makes people aware of biases they might not realize they possess and contributes to a greater understanding of how people interact in and respond to stressful situations, whether they are performing surgery, negotiating or engaging in cross-cultural dialogue.

“In the real world everything is mixed up,” he said. “If we want to understand the role of a single informational cue – be it an emotion or an accent – we have to take it into the lab.”

ICT partners with the Army Research Laboratory, part of the U.S. Army Research, Development and Engineering Command (RDECOM), whose mission is to develop technology and engineering solutions for America’s Soldiers.

RDECOM is a major subordinate command of the U.S. Army Materiel Command. AMC is the Army’s premier provider of materiel readiness — technology, acquisition support, materiel development, logistics power projection, and sustainment — to the total force, across the spectrum of joint military operations. If a Soldier shoots it, drives it, flies it, wears it, eats it or communicates with it, AMC provides it.