One of the primary focuses of the ACELab has been the development of a methodology for analyzing behavioural data from multiple time-synchronized sources, chiefly eye tracking and motion capture data. This analysis approach is accompanied by a custom software platform for performing integrated analysis of eye and motion tracking data (or either stream individually). In the lab, this research area is commonly referred to as GaMA – for Gaze and Movement Analysis.

The GaMA approach and software platform are the result of an interdisciplinary collaboration between the ACELab and members of the BLINC and Neuromuscular Control & Biomechanics Labs at the University of Alberta, and they have been applied in many projects across these three labs, as well as by collaborators at other institutions.

Projects

GaMA Software Development

The GaMA software platform is the result of nearly five years of development effort. The software provides a graphical user interface that allows users to visualize raw motion and eye data and to augment those data with 3D objects, environments, kinematics, gaze vectors, and the interactions between these entities. Analysis steps can be saved and applied across multiple trials and participants, enabling custom, automated analysis of gaze and motion data with no programming knowledge required.
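
As a rough illustration of the "saved analysis steps" idea, the sketch below shows how a fixed sequence of processing steps could be defined once and then re-applied to every trial and participant. The class and function names are entirely hypothetical; GaMA itself exposes this workflow through its graphical interface rather than through code.

```python
# Hypothetical sketch of a reusable analysis pipeline: a saved sequence of
# named steps applied uniformly to every trial and participant.
# All names here are illustrative, not GaMA's actual API.

from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Trial:
    participant: str
    name: str
    data: Dict[str, object]  # e.g. raw mo-cap markers, gaze samples


# An analysis step is a named function that augments a trial's data
Step = Callable[[Trial], Trial]


class Pipeline:
    def __init__(self, steps: List[Step]):
        self.steps = steps

    def run(self, trials: List[Trial]) -> List[Trial]:
        # Apply the same saved sequence of steps to every trial
        return [self._run_one(t) for t in trials]

    def _run_one(self, trial: Trial) -> Trial:
        for step in self.steps:
            trial = step(trial)
        return trial


def build_gaze_vectors(trial: Trial) -> Trial:
    trial.data["gaze_vectors"] = "..."  # placeholder for real computation
    return trial


def detect_object_interactions(trial: Trial) -> Trial:
    trial.data["interactions"] = "..."  # placeholder for real computation
    return trial


# The same pipeline definition is reused across all participants and trials
pipeline = Pipeline([build_gaze_vectors, detect_object_interactions])
results = pipeline.run([Trial("P01", "trial_01", {"mocap": None, "gaze": None})])
```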

Gaze Vector Prediction

The integration of eye tracking and motion capture (mo-cap) data in the GaMA approach is achieved by combining eye tracking data with an eye/mo-cap calibration to predict the 3D direction of an individual’s gaze in the mo-cap space (i.e., their gaze vector). We have investigated a number of factors in gaze vector prediction, including the calibration routine, the type of model used, and the impact of using monocular vs. binocular eye tracking data.
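
A minimal sketch of the calibration idea is shown below. It assumes a simple linear least-squares mapping from eye-tracker features to a head-fixed gaze direction, fitted from calibration samples in which the participant fixates markers at known mo-cap positions; the function names, the linear model, and the binocular feature layout are illustrative assumptions, not the lab's actual implementation.

```python
# Sketch only: fit a linear map from eye-tracker features (e.g. binocular
# pupil coordinates plus a bias term) to unit gaze directions expressed in
# a head-fixed (mo-cap) coordinate frame, using calibration fixations.

import numpy as np


def fit_gaze_model(eye_features: np.ndarray, gaze_dirs: np.ndarray) -> np.ndarray:
    """Least-squares fit of a linear map from eye features to gaze direction.

    eye_features: (n_samples, n_features) pupil coordinates with a bias column.
    gaze_dirs:    (n_samples, 3) unit vectors from the eye to the fixated
                  calibration marker, in the head-fixed mo-cap frame.
    """
    coeffs, *_ = np.linalg.lstsq(eye_features, gaze_dirs, rcond=None)
    return coeffs  # shape (n_features, 3)


def predict_gaze_vector(coeffs: np.ndarray, eye_features: np.ndarray) -> np.ndarray:
    """Predict and normalize 3D gaze vectors for new eye-tracker samples."""
    g = eye_features @ coeffs
    return g / np.linalg.norm(g, axis=-1, keepdims=True)


# Toy calibration data: x/y pupil coordinates for both eyes plus a bias term
rng = np.random.default_rng(0)
features = np.hstack([rng.uniform(-1, 1, (20, 4)), np.ones((20, 1))])
true_dirs = rng.normal(size=(20, 3))
true_dirs /= np.linalg.norm(true_dirs, axis=1, keepdims=True)

model = fit_gaze_model(features, true_dirs)
gaze = predict_gaze_vector(model, features[:1])
```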

Task Segmentation

It is often valuable to break tasks into smaller phases during analysis (e.g., the reach, grasp, transport, and release phases of an object interaction task). We have developed a detailed, robust approach for automatically identifying these phases during an upper-limb object interaction task, as well as user interface tools for implementing this approach in our GaMA software.
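
For illustration only, the sketch below labels each frame of a trial as one of these four phases using a generic hand-speed and grip-aperture thresholding heuristic; the thresholds and decision logic are assumptions made for the example and are not the criteria implemented in GaMA.

```python
# Generic illustration of phase segmentation for an object-interaction trial,
# based on hand speed and grip aperture thresholds. Thresholds are assumed.

import numpy as np


def segment_phases(hand_speed: np.ndarray,
                   grip_aperture: np.ndarray,
                   speed_thresh: float = 0.05,     # m/s: below this, hand is "still"
                   aperture_thresh: float = 0.06,  # m: below this, hand is "closed"
                   ) -> np.ndarray:
    """Label each frame as 'reach', 'grasp', 'transport', or 'release'."""
    labels = np.empty(hand_speed.shape, dtype=object)
    holding = False
    for i, (speed, aperture) in enumerate(zip(hand_speed, grip_aperture)):
        closed = aperture < aperture_thresh
        if not holding:
            # Moving toward the object until the hand closes around it
            if closed and speed < speed_thresh:
                labels[i] = "grasp"
                holding = True
            else:
                labels[i] = "reach"
        else:
            # Holding the object until the hand opens again
            if not closed:
                labels[i] = "release"
                holding = False
            elif speed >= speed_thresh:
                labels[i] = "transport"
            else:
                labels[i] = "grasp"
    return labels


# Toy example: approach, close on the object, move it, set it down, let go
speed = np.array([0.40, 0.30, 0.02, 0.01, 0.30, 0.02, 0.01])
aperture = np.array([0.10, 0.09, 0.05, 0.05, 0.05, 0.05, 0.10])
print(segment_phases(speed, aperture))
# ['reach' 'reach' 'grasp' 'grasp' 'transport' 'grasp' 'release']
```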

Applications

  • Visuomotor behaviour of upper-limb prosthesis users during functional tasks [link to project and/or publications]
  • User gaze and mouse movement while interacting with a video game interface
  • Comparison of visuomotor behaviour in Virtual Reality vs. real-world tasks