Conferences Gesture Immersion Interaction Virtual Reality

2019-10-10 MOCO’19 Paper & Demo – Tempe, AZ, USA

Meet me at MOCO’19, where I present a workflow for real-time gesture analysis that visualizes gesture kinematics features (velocity, acceleration, jerk) from heterogeneous data (video, motion capture, and gesture annotations) on the same time base.
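The kinematics features named above are successive time derivatives of marker position. As a minimal sketch (not the paper's implementation), velocity, acceleration, and jerk can be estimated from sampled motion-capture positions by repeated finite differencing; the sampling rate and toy trajectory here are assumptions for illustration:

```python
import math

fs = 120.0       # assumed motion-capture sampling rate (Hz)
dt = 1.0 / fs

# Toy 1-D trajectory of one marker coordinate: sin(2*pi*t) over one second.
pos = [math.sin(2 * math.pi * i * dt) for i in range(int(fs))]

def derivative(signal, dt):
    """Estimate d(signal)/dt with central differences
    (one-sided differences at the endpoints)."""
    n = len(signal)
    out = []
    for i in range(n):
        if i == 0:
            out.append((signal[1] - signal[0]) / dt)
        elif i == n - 1:
            out.append((signal[-1] - signal[-2]) / dt)
        else:
            out.append((signal[i + 1] - signal[i - 1]) / (2 * dt))
    return out

vel = derivative(pos, dt)    # 1st derivative: velocity
acc = derivative(vel, dt)    # 2nd derivative: acceleration
jerk = derivative(acc, dt)   # 3rd derivative: jerk
```

Because each feature is sampled on the same frame grid as the input, the three curves can be plotted against the same time axis, which is what makes a shared time base across video, mocap, and annotations possible.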

PAPER: J.-F. Jégo, V. Meyrueis, D. Boutet. A Workflow for Real-time Visualization and Data Analysis of Gesture using Motion Capture. In Proceedings of the Movement and Computing conference (MOCO ’19), Tempe, AZ, USA, October 2019, 6 pages, ACM, New York, USA, 2019.

We investigate new ways to understand and analyze human gesture in a research context applied to co-verbal gesture across languages. The research project focuses on the quality of the movement and considers the gesture’s “pulse of effort.” The tools designed here provide immersive and interactive explorations of data: users can test hypotheses and embody gesture visualizations and descriptors, adopting different Frames of Reference using augmented reality. We conducted an evaluation protocol in the field of linguistics, comparing 496 annotated gestures to benchmark the workflow.