Laughter Data Collection Session in Paris

Posted on Mon 10 Dec 2012 by Radoslaw Niewiadomski
Within the ILHAIRE project, we organized several motion capture sessions between 26 and 30 November 2012. The sessions took place at the studio of CNRS/Telecom-ParisTech, with the participation of ILHAIRE partners from CNRS, the University of Augsburg (UAU), the Università degli Studi di Genova (UNIGE), and University College London (UCL).

The main objective of the motion capture sessions was to collect multimodal data for different types of laughter episodes. For this purpose, we built a complex setup that allows us to record synchronized multimodal data comprising body motion, video, audio, and respiration signals. The setup uses 3 inertial motion capture systems, multiple web cameras, Kinects, microphones, and a respiration sensor. All of the sensors are connected through the SSI software provided by UAU, which synchronizes the data from the different sources.

The three motion capture systems are provided by CNRS, UAU, and UCL. Each system is composed of about 20 sensors placed on different parts of the body. The sensors are equipped with accelerometers and gyroscopes, and they stream data on each joint's position in real time. The data contain the body rotations and can easily be visualized with a virtual agent. At the same time, we used multiple 60 fps web cameras and Kinects (including data on Kinect facial actions, 3D facial points, head rotations, and Kinect skeletons). In addition, we recorded each participant's voice through a personal wireless microphone. The participants were also equipped with additional body markers that will be used to analyze body movement with the vision processing algorithms included in the EyesWeb software provided by UNIGE. Last but not least, we used a respiration sensor to capture data on breathing during laughter. Since all of the data are synchronized through the SSI software, we will be able to analyze not only the synchronization between the different modalities within a laughter episode but also inter-subject synchronization. The latter is very important, as within the ILHAIRE project we also want to study the role of laughter in conversation (e.g. as a backchannel), in group dynamics, and in laughter contagion.
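As a rough illustration of what this kind of synchronization enables, the sketch below (written for this post, not part of the ILHAIRE tools, which rely on SSI and EyesWeb) aligns two hypothetical streams recorded at different rates onto a common timeline and estimates the lag between them with a simple cross-correlation. The stream names, sampling rates, and the 0.3 s delay are invented for the example, and only NumPy is assumed.

    import numpy as np

    def align_to_timeline(timestamps, values, timeline):
        """Resample a stream onto a common analysis timeline by linear interpolation."""
        return np.interp(timeline, timestamps, values)

    def lag_seconds(sig_a, sig_b, rate_hz, max_lag_s=2.0):
        """Estimate by how many seconds sig_b lags behind sig_a, using
        normalised cross-correlation over a bounded window of lags."""
        a = (sig_a - sig_a.mean()) / (sig_a.std() + 1e-9)
        b = (sig_b - sig_b.mean()) / (sig_b.std() + 1e-9)
        max_lag = int(max_lag_s * rate_hz)
        lags = np.arange(-max_lag, max_lag + 1)
        scores = []
        for lag in lags:
            if lag >= 0:
                seg_a, seg_b = a[: len(a) - lag], b[lag:]
            else:
                seg_a, seg_b = a[-lag:], b[: len(b) + lag]
            n = min(len(seg_a), len(seg_b))
            scores.append(np.dot(seg_a[:n], seg_b[:n]) / n)
        return lags[int(np.argmax(scores))] / rate_hz

    # Illustrative synthetic streams stamped on the same session clock: a 60 fps
    # video-derived feature and a 20 Hz respiration trace that follow the same
    # underlying rhythm, with the respiration delayed by 0.3 s.
    rng = np.random.default_rng(0)
    duration, rate = 10.0, 50.0                            # seconds, common analysis rate (Hz)
    fine_t = np.arange(-1.0, duration + 1.0, 0.001)        # fine grid for a latent rhythm
    latent = np.convolve(rng.standard_normal(fine_t.size), np.ones(500) / 500, mode="same")
    t_video = np.arange(0.0, duration, 1.0 / 60.0)
    t_resp = np.arange(0.0, duration, 1.0 / 20.0)
    video_feature = np.interp(t_video, fine_t, latent)     # e.g. mouth opening
    respiration = np.interp(t_resp - 0.3, fine_t, latent)  # same rhythm, 0.3 s later

    timeline = np.arange(0.0, duration, 1.0 / rate)        # shared analysis clock
    v = align_to_timeline(t_video, video_feature, timeline)
    r = align_to_timeline(t_resp, respiration, timeline)
    print(f"Estimated lag of respiration behind the video feature: {lag_seconds(v, r, rate):.2f} s")

The same idea applies across participants rather than across modalities: once every stream shares one clock, the audio or movement energy of two friends can be compared in exactly the same way to look at laughter contagion.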

The second aim of the sessions was to capture data for the analysis of the occurrence and role of laughter in group interaction. For this purpose, we invited triads of friends to our data collection and asked them to take part in a series of enjoyable tasks. The experimental protocol includes scenarios such as watching funny films together and playing several social games. Among others, the sessions included games such as Pictionary, tongue twisters, and the French "barbichette" game. We expected that these scenarios would elicit many humorous reactions and laughter episodes. For the same reason, it was important that the participants knew each other well and did not feel embarrassed.

Over 5 days we were able to record 6 sessions with the participation of 16 persons. Despite the quite intrusive capture hardware, the participants enjoyed the motion capture sessions, and laughter occurred very often, involving not only the participants but also the experimenters. This is a particularly encouraging result, as the sessions were quite long: each session lasted around 2 hours on average. To recapitulate, we collected many hours of synchronized multimodal laughter data that will be used in many ILHAIRE tasks.

We would like to thank again all our participants, as well as Telecom-ParisTech/CNRS, who helped us organize these sessions.
Radoslaw Niewiadomski