Spatial Navigation
In these projects and collaborations, the goal is to find neural and behavioral markers of spatial (dis-)orientation. All projects leverage synchronized motion and EEG recordings. Using VR hardware, novel paradigms prompt human navigators to make efficient decisions about spatial goals. By modeling the navigator’s brain and movement patterns, I strive to predict future success in overcoming spatial challenges. Critically, spatial orientation skills worsen with the progression of several neurodegenerative diseases such as Alzheimer’s. Patients often describe the struggle to maintain their sense of direction as a core limitation in their day-to-day lives. This research can help design adaptive systems that ease key symptoms. A noteworthy implication is finding the right level of automation that keeps the human engaged in the spatial task.
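As a hedged illustration of the prediction step, the sketch below trains a simple classifier on synthetic per-trial features standing in for combined EEG and movement measures. The feature set, the data, and the model choice are assumptions for illustration only, not the actual experimental pipeline.

```python
# Minimal sketch: predicting navigation success from combined EEG and motion
# features. All features and labels are synthetic placeholders (assumptions),
# standing in for task-specific markers from synchronized recordings.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

n_trials = 200
# Hypothetical per-trial features, e.g., two spectral EEG measures plus
# head-turn rate and walking speed, averaged over a decision window.
X = rng.normal(size=(n_trials, 4))
# Hypothetical labels: 1 = the navigator reached the spatial goal efficiently.
y = (X[:, 0] - 0.5 * X[:, 2] + rng.normal(scale=0.5, size=n_trials)) > 0

# Standardize features, then fit a logistic regression and estimate
# out-of-sample accuracy with 5-fold cross-validation.
clf = make_pipeline(StandardScaler(), LogisticRegression())
scores = cross_val_score(clf, X, y.astype(int), cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```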
Applications in Human-Computer Interaction (HCI)
In extended realities (VR/AR/MR), sampling body movements enables the design of natural interactions. In concert with neural markers, it becomes feasible to detect when an experience starts to feel off, as well as to build novel interactions. My core motivation is bringing neuroscientific research findings into application, leveraging a variety of supplementary signals, such as body movements and data from other wearables, to design robust interactive experiences that feel natural.
Methods for an Integrated Analysis of Movement and Physiology
In several projects, we advanced the investigation of human brain dynamics with the aid of information derived from full-body movement and, conversely, enriched studies of motor behavior with brain imaging as an additional source of information.
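To make the integration concrete, here is a minimal sketch of one common preliminary step: resampling a motion-capture stream onto the EEG timeline so both signals can be analyzed jointly. Sampling rates, signal content, and the correlation measure are assumptions for illustration; real recordings are synchronized at acquisition and carefully preprocessed before any joint analysis.

```python
# Minimal sketch: bringing a motion-capture stream and an EEG stream onto a
# common timeline before joint analysis. Sampling rates and signals are
# assumed for illustration only.
import numpy as np

eeg_sfreq, mocap_sfreq = 500.0, 90.0     # assumed sampling rates (Hz)
duration = 10.0                          # seconds of recording

# Simulated streams, each with its own timestamps.
t_eeg = np.arange(0, duration, 1 / eeg_sfreq)
t_mocap = np.arange(0, duration, 1 / mocap_sfreq)
eeg = np.sin(2 * np.pi * 10 * t_eeg)                         # stand-in EEG channel
head_yaw = np.cumsum(np.random.randn(t_mocap.size)) * 0.01   # stand-in head rotation

# Resample the slower motion stream onto the EEG timeline by interpolation,
# so every EEG sample has a matching movement value.
head_yaw_on_eeg_time = np.interp(t_eeg, t_mocap, head_yaw)

# Joint feature: correlation between head rotation and the EEG channel,
# computed on the shared timeline.
r = np.corrcoef(eeg, head_yaw_on_eeg_time)[0, 1]
print(f"EEG-movement correlation on the shared timeline: {r:.3f}")
```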