Robots that Influence Human-Human Trust-Related Behaviors in a Team

Group Trust

As robots become increasingly incorporated into teams with humans, they must not only perform their tasks accurately, but also contribute positively to the social dynamics of the team. Groups with high levels of group trust or psychological safety (the shared belief that group members are safe to take risks, admit mistakes, and ask for help without fearing social judgment) can learn from mistakes more effectively and grow more productive as a team. In a study with groups of three adult participants, we explored whether a social robot can improve interpersonal trust between members of a group by making vulnerable statements consistently throughout a collaborative game. We found that during times of tension, human teammates in a group with a robot making vulnerable statements were more likely to explain their failures to the group, console team members who had made mistakes, and laugh together, all actions that reduce the tension experienced by the team. These results suggest that a robot’s vulnerable behavior can have “ripple effects” on its human team members’ expressions of trust-related behavior.

Our results were published at HRI 2018.


Promoting Collaboration between Children with a Social Robot

Child Collaboration

Despite the growing body of research on human-robot collaboration, there has been little focus on how social robots can support human-to-human teaming. I am interested in investigating whether a social robot can improve human-human collaborative behavior during teaming tasks. In a study with six- to nine-year-old children, my colleagues and I investigated whether asking children task-focused questions, relationally-focused questions, or no questions had an effect on their performance in a collaborative task (a touch-screen build-a-rocket game) and their perceptions of that performance. We found that participants who were asked task-focused questions scored higher in the collaborative game than the other groups, but rated their own performance lower than the participants who were asked relationally-focused questions.

Our results were published at RO-MAN 2016.


The Detection of Social Dominance in Children

Social Dominance

As robots become more widely used in educational contexts, there is an increasing need for robots to understand group dynamics and tailor their interactions to individual differences between children. One important factor in group interactions is social dominance, which is expressed through both verbal and nonverbal behaviors. I explored a method for determining whether a child in a group interaction is 'high' or 'low' in social dominance based on domain-independent verbal and nonverbal behaviors, and implemented several machine learning models to classify the social dominance levels of children interacting with social robots, as sketched below.
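
As a loose illustration of this kind of classification pipeline, the sketch below trains a standard binary classifier on hand-crafted behavioral features. The specific features (speaking time, interruption counts, gaze shifts) and the choice of logistic regression are assumptions made for illustration, not the exact features or models reported in the paper.

# Illustrative sketch only: feature names and classifier choice are assumptions,
# not the pipeline from the ICMI 2015 paper.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical per-child feature vectors, e.g. total speaking time (s),
# number of interruptions, and number of gaze shifts toward peers.
X = np.array([
    [120.0, 9, 34],
    [ 45.0, 2, 51],
    [ 98.0, 7, 28],
    [ 30.0, 1, 60],
    [110.0, 8, 25],
    [ 52.0, 3, 47],
])
# Labels from human annotation: 1 = 'high' dominance, 0 = 'low' dominance.
y = np.array([1, 0, 1, 0, 1, 0])

# Standardize the features, then fit a linear classifier.
model = make_pipeline(StandardScaler(), LogisticRegression())

# Cross-validated accuracy gives a rough sense of how separable the two classes are.
scores = cross_val_score(model, X, y, cv=3)
print("Mean CV accuracy:", scores.mean())

In practice, such features would be extracted from annotated audio and video of the group interaction, and the choice of model and evaluation scheme would depend on how much labeled data is available.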

Our results were published at ICMI 2015.