
INTETAIN 2013 and CUTE Workshop

Posted on Mon 04 Feb 2013 by Jerome Urbain
Please join us and submit your research paper to the 5th International Conference on Intelligent Technologies for Interactive Entertainment (INTETAIN 2013), which will be held in Mons, Belgium, from 3-5 July 2013.


In addition, participants in INTETAIN 2013 will be invited to take part in the CUlture and TEchnology (CUTE) workshop at no additional cost. The CUTE workshop consists of a digital artists' exhibition, held in the evenings after the INTETAIN sessions, and an additional master class day on the 2nd of July with four exciting tracks (eye-tracking and mocap, reactive speech synthesis, Kinect-based interactions, sensor-based interactions).


HIGHLIGHTS

  • The event is endorsed by the European Alliance for Innovation (www.eai.eu), a leading community-based organisation devoted to the advancement of innovation in the field of ICT
  • All accepted papers will be published by Springer and made available through SpringerLink Digital Library, one of the world’s largest scientific libraries
  • Proceedings will be submitted for indexing by Google Scholar, ISI, EI Compendex, Scopus and many more

Scope

The NumediArt Institute of the University of Mons invites the academic, industrial and government community to submit work in all categories of participation for publication and presentation at the conference.

INTETAIN follows the usual international academic standards to evaluate and disseminate full papers, posters and demos. The conference intends to stimulate interaction among academic researchers and commercial developers of interactive entertainment systems. In addition to paper presentations, posters, and demos, the conference will foster discussion through special events.

As in the INTETAIN tradition, besides high-quality paper presentations, posters and demos, an interactive hands-on session will be organized, along the lines of the Design Garage held in the conference's first edition.
Individuals who want to organize special sessions during INTETAIN 2013 may contact the organizers (intetain13@numediart.org).

The global theme of this 5th edition of the INTETAIN conference is "Where technology meets culture", in line with Mons 2015, European Capital of Culture. Contributions may, for example, address non-verbal full-body interaction, automated measurement of expressive and affective features of human behavior, and automated measurement of social behavior in entertainment, edutainment, artistic, and cultural applications.

The INTETAIN 2013 conference invites full papers for oral presentation (up to 10 pages) and short papers for poster presentation (up to 6 pages). Submissions should describe original research that has not been previously published or accepted for publication, and is not under consideration for publication at another conference or journal.

Important Dates

Paper Submission Deadline: February 28th, 2013
Notification of Acceptance: March 31st, 2013
Camera Ready Deadline: April 15th, 2013
Conference Dates: July 3-5, 2013

For further information please visit this link.

Laughter Motion Capture Samples

Posted on Mon 24 Dec 2012 by Laurent Ach

Here are a few video samples from the laughter data collection:


Laughter Data Collection Session in Paris

Posted on Mon 10 Dec 2012 by Radoslaw Niewiadomski
Within the ILHAIRE project, we organized several motion capture sessions between 26 and 30 November 2012. The sessions took place at the CNRS/Telecom-ParisTech studio with the participation of ILHAIRE partners from CNRS, the University of Augsburg (UAU), the Università degli Studi di Genova (UNIGE), and University College London (UCL).

The main objective of the motion capture sessions was to collect multimodal data for different types of laughter episodes. For this purpose, we built a complex setup that allows us to collect synchronized multimodal data covering body motion, visual, audio and respiration signals. The setup uses three inertial motion capture systems, multiple web cameras, Kinects, microphones, and a respiration sensor. All of the sensors are connected through the SSI software, provided by UAU, which synchronizes the data from the different sources.

The three motion capture systems were provided by CNRS, UAU and UCL. Each system is composed of about 20 sensors placed on different parts of the body. The sensors are equipped with accelerometers and gyroscopes, and they stream data on each joint position in real time. The data contains body rotations and can easily be visualized with a virtual agent. At the same time, we also used multiple 60 fps web cameras and Kinects (including the Kinect data on facial actions, 3D facial points, head rotations and skeletons). In addition, we recorded the voice of each participant through a personal wireless microphone. Our participants were also equipped with some additional body markers that will be used to analyze body movement with the vision processing algorithms included in the EyesWeb software provided by UNIGE. Last but not least, we also used the respiration sensor to capture data on respiration during laughter.

All of this data is synchronized through the SSI software, which will allow us to analyze not only the synchronization between different modalities within a laughter episode but also inter-subject synchronization. The latter is very important, as within the ILHAIRE project we also want to study the role of laughter in conversation (e.g. as a backchannel), in group dynamics, and in laughter contagion.
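To give an idea of what synchronizing such heterogeneous streams involves, here is a minimal, hypothetical sketch (not the actual SSI pipeline) of resampling timestamped streams with different rates onto a shared clock by nearest-timestamp matching. The stream names and rates are illustrative only.

```python
# Hypothetical sketch: aligning multimodal recordings (e.g. mocap and
# respiration) to a common timeline by nearest-timestamp matching.
# This is NOT the SSI implementation, just an illustration of the idea.
from bisect import bisect_left

def nearest_sample(timestamps, samples, t):
    """Return the sample whose timestamp is closest to time t (seconds)."""
    i = bisect_left(timestamps, t)
    if i == 0:
        return samples[0]
    if i == len(timestamps):
        return samples[-1]
    before, after = timestamps[i - 1], timestamps[i]
    return samples[i] if after - t < t - before else samples[i - 1]

def align_streams(streams, rate_hz):
    """Resample every stream onto a shared clock ticking at rate_hz.

    streams maps a stream name to (timestamps, samples),
    where timestamps is a sorted list of seconds.
    """
    # Use only the interval covered by every stream.
    start = max(ts[0] for ts, _ in streams.values())
    end = min(ts[-1] for ts, _ in streams.values())
    step = 1.0 / rate_hz
    frames, t = [], start
    while t <= end:
        frames.append({name: nearest_sample(ts, xs, t)
                       for name, (ts, xs) in streams.items()})
        t += step
    return frames

# Toy example: a 4 Hz "mocap" stream and a 2 Hz "respiration" stream,
# aligned on a common 2 Hz clock.
streams = {
    "mocap": ([0.0, 0.25, 0.5, 0.75, 1.0], [10, 11, 12, 13, 14]),
    "respiration": ([0.0, 0.5, 1.0], [0.1, 0.2, 0.3]),
}
frames = align_streams(streams, rate_hz=2)
```

In a real setup like the one described above, hardware clock drift and variable sensor latencies make this harder; frameworks such as SSI exist precisely to handle those issues across many devices at once.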

The second aim of the sessions was to capture data for analyzing the occurrences and role of laughter in group interaction. For this purpose, we invited triads of friends to take part in our data collection and asked them to participate in a series of enjoyable tasks. The experimental protocol includes scenarios such as watching funny films together and playing several social games; among others, our sessions included Pictionary, tongue twisters and the French "barbichette" game. We expected these scenarios to elicit humorous reactions and laughter episodes. For the same reason, it was important that our participants knew each other well and did not feel embarrassed.

Over 5 days we were able to record 6 sessions with the participation of 16 people. Despite the rather intrusive capture hardware, the participants enjoyed the motion capture sessions, and laughter occurred very often, involving not only the participants but also the experimenters. This is a particularly encouraging result given that the sessions were quite long (around 2 hours each on average). In summary, we were able to collect many hours of synchronized multimodal laughter data that will be used in many ILHAIRE tasks.


We would like to thank again all our participants, as well as Telecom/CNRS, who helped us organize these sessions.