Quantifying Gestural Mimicry in Social Interactions

Keywords: Unobtrusive Human Sensing, Social Interactions, Rapport

This project was done individually, during my first semester at Carnegie Mellon, as a part of the “Math Fundamentals” course.

Research in psychology describes a phenomenon called the chameleon effect: the tendency of people to unconsciously mimic each other during a social interaction. This kind of mimicry can include gestural and postural mimicry, as well as mimicry of facial expressions, and has often been linked to feelings of affiliation or rapport. In this project, I implemented an algorithm that estimates the amount of gestural mimicry in a social interaction, given the positions of joints over time for a pair of participants. The dataset used was collected using Microsoft Kinect SDK v2.0’s skeleton tracking feature in a pilot study conducted at the Connected Experience Lab (advised by Prof. Laura Dabbish) at CMU.

We first extracted unit motions from each joint’s motion trajectory. A unit motion is defined as a segment of a joint’s trajectory that starts at zero speed, gains speed, and then ends at zero speed (e.g., moving your hand from rest to the top of your head and back to rest). Mimicry is calculated by matching unit motions across the partners in the dyad, using cosine similarity between features extracted from the unit motions. Preliminary results show that gestural mimicry calculated using this method has a stronger correlation with rapport than alternatives such as lagged correlation. In future work, annotators will watch video clips and rate gestural mimicry, and these ratings will be used to properly validate our method.
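The pipeline above can be sketched as follows. This is a minimal illustration, not the project’s exact implementation: the rest-speed threshold, frame rate, and the specific features (net displacement, path length, duration) are assumptions made here for the example.

```python
import numpy as np

def segment_unit_motions(positions, fps=30, speed_eps=0.02):
    """Split one joint's trajectory (N x 3 array of positions, in meters)
    into unit motions: spans that start and end at near-zero speed.
    `speed_eps` (m/s) is an assumed rest threshold, not the project's value."""
    speed = np.linalg.norm(np.diff(positions, axis=0), axis=1) * fps
    moving = speed > speed_eps
    segments, start = [], None
    for i, m in enumerate(moving):
        if m and start is None:
            start = i                      # motion begins: speed leaves ~0
        elif not m and start is not None:
            segments.append((start, i + 1))  # motion ends: speed returns to ~0
            start = None
    if start is not None:
        segments.append((start, len(moving)))
    return segments

def motion_features(positions, seg):
    """Illustrative feature vector for one unit motion:
    net displacement (3), total path length (1), duration in frames (1)."""
    s, e = seg
    path = positions[s:e + 1]
    disp = path[-1] - path[0]
    length = np.sum(np.linalg.norm(np.diff(path, axis=0), axis=1))
    return np.concatenate([disp, [length, e - s]])

def cosine_similarity(a, b):
    """Similarity score used to match unit motions across the dyad."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Toy trajectory: 10 frames at rest, 10 frames moving 1 m along x, 10 at rest.
rest1 = np.zeros((10, 3))
move = np.linspace([0.0, 0.0, 0.0], [1.0, 0.0, 0.0], 11)[1:]
rest2 = np.tile([1.0, 0.0, 0.0], (10, 1))
traj = np.vstack([rest1, move, rest2])

segs = segment_unit_motions(traj)          # expect exactly one unit motion
feat = motion_features(traj, segs[0])
sim = cosine_similarity(feat, feat)        # identical motions score ~1.0
```

In a full analysis, each unit motion from one partner would be compared against temporally nearby unit motions from the other partner, with high-similarity matches counted toward the dyad’s mimicry score.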

For more details, view a brief summary at: http://prernac.com/reports/mathproject.pdf