The ACELab studies decision-driven movement as a means to understand the cognitive processes underlying the decisions that shape our daily lives and our interactions with our environment. To provide a full picture of human behaviour during decision making, the ACELab employs a variety of methods, including electroencephalography (EEG), eye-tracking, motion-tracking, non-invasive surface electromyography (EMG), physiological monitoring, virtual reality, mathematical modelling, audio/visual recordings, and surveys/questionnaires. From decisions requiring reaches toward real-world objects to those made in virtual spaces (e.g., on a computer or smartphone), the ACELab investigates how decision processes interact with the environment and with the actions required to enact them.


Many experiments in psychology and neuroscience ask participants to press buttons to indicate their decisions. In real life, however, our decisions often involve movements such as reaching to grab a drink or walking to a restaurant. In the ACELab, we often ask people to reach out and touch a screen or object to indicate their decision while motion-tracking cameras record their movements. These experiments show that how people move reveals what they are thinking about and how they arrive at their final decision.
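To illustrate the kind of measure such reach-tracking experiments yield, the sketch below computes the maximum perpendicular deviation of a 2-D reach path from the straight line between its start and end points, a common proxy for how strongly a competing option "pulled" the movement. The function name, data, and units here are hypothetical illustrations, not the lab's actual analysis code.

```python
import math

def max_deviation(trajectory):
    """Maximum perpendicular distance of a 2-D reach trajectory from
    the straight line joining its start and end points. Larger values
    suggest the movement was pulled toward a competing option.
    (Hypothetical measure extraction, not the lab's pipeline.)"""
    (x0, y0), (x1, y1) = trajectory[0], trajectory[-1]
    dx, dy = x1 - x0, y1 - y0
    length = math.hypot(dx, dy)
    if length == 0:
        return 0.0
    # Point-to-line distance for every sample along the path
    return max(abs(dy * (x - x0) - dx * (y - y0)) / length
               for x, y in trajectory)

# A reach that bows upward (e.g., toward a distractor) before
# settling on the target at (6, 0)
path = [(0, 0), (1, 2), (2, 3), (4, 2), (6, 0)]
print(max_deviation(path))  # 3.0
```

A straighter path gives a smaller value, so this single number lets trajectories from many trials and conditions be compared directly.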

Whether deciding between different chocolate bars for a snack or judging which of two circles appears brighter, the ACELab is interested in how the human brain makes up its mind and enacts its decisions. To investigate what the brain is doing as it decides and acts, we record EEG activity in the lab to track brain dynamics in the moments leading up to a decision.

Social interactions and contexts influence decision making and the movements made to enact or convey the eventual choice. Yet most relevant research is conducted on a single participant in a controlled lab setting. In collaboration with Dr. Dana Hayward's Visual Attention and Social Processes Lab, we are investigating the decision making of two people (a dyad) as they play an online game in either a cooperative or a competitive context. Using the online experimentation platform Labvanced, we record eye gaze and mouse movements from both players. We then segment and analyze these complex data with our GaMA platform, automatically extracting detailed measures of dyadic decision making.
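The kind of segmentation such an analysis pipeline performs can be illustrated with a minimal sketch: thresholding frame-to-frame cursor speed to split a trial into a pre-movement and a movement phase. The function, threshold, and units below are assumptions chosen for illustration and do not reflect GaMA's actual interface.

```python
import math

def segment_movement(timestamps, positions, speed_threshold=0.05):
    """Split a cursor trace into pre-movement and movement phases by
    thresholding frame-to-frame speed, returning (reaction_time,
    movement_time) relative to trial start. Times in ms, positions in
    pixels, threshold in px/ms; all names and units are hypothetical,
    not GaMA's actual interface."""
    # Speed of each inter-sample interval
    speeds = [math.hypot(positions[i][0] - positions[i - 1][0],
                         positions[i][1] - positions[i - 1][1])
              / (timestamps[i] - timestamps[i - 1])
              for i in range(1, len(positions))]
    onset = next(i for i, s in enumerate(speeds) if s > speed_threshold)
    offset = len(speeds) - 1 - next(
        i for i, s in enumerate(reversed(speeds)) if s > speed_threshold)
    reaction_time = timestamps[onset]          # last still sample
    movement_time = timestamps[offset + 1] - timestamps[onset]
    return reaction_time, movement_time

# Cursor sits still for 100 ms, moves for 300 ms, then stops
times = [0, 100, 200, 300, 400, 500]
points = [(0, 0), (0, 0), (10, 0), (30, 0), (50, 0), (50, 0)]
print(segment_movement(times, points))  # (100, 300)
```

Once each player's trace is segmented this way, per-trial measures such as reaction time and movement time can be compared across cooperative and competitive conditions, or between the two members of a dyad.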