Colloquium, September 28, 2018
Movement Computation and Affect
Movement Signal Processing and Movement Computing are emerging research areas in human-computer interaction. I will introduce recent advances in the field through a number of projects from the Metacreation Lab (http://metacreation.net/) and the Moving Stories (http://movingstories.ca/) research effort. I will present a series of Movement Computing tools, such as the Movement Database (MODA), the Movement Visualisation tool (MOVA), and the Movement Comparison tool (MOCOMP). I will then present fundamental empirical results that answer the following questions in the affirmative: Can humans perceive the affect of a mover simply by observing body movement (without facial expression)? Can a machine recognize the affect of a human mover with human-competitive accuracy? Can a machine learn the movement style of a given mover? Can we generate new movement based on such a computational model? Can we train a virtual avatar to dance to any music?