I research how to leverage neural interface technologies for human-computer interaction (HCI). Currently, I am finishing my PhD in a joint project between TU Berlin and the Swartz Center for Computational Neuroscience at UC San Diego.
Recently, I interned at Chatham Labs (acquired by Facebook Reality Labs).
In the future, I want to realize novel AR/VR experiences that feel natural. My goal is to design adaptive technologies in which computer and user coexist and together create truly connected experiences for the user.
In these projects and collaborations, the goal is to find neural and behavioral markers of spatial (dis-)orientation. All projects leverage synchronized motion and EEG recordings. Using VR hardware, novel paradigms probe human navigators to make efficient decisions about spatial goals. By modeling the navigators' brain and movement patterns, I strive to predict their future success in overcoming spatial challenges. Critically, spatial orientation skills worsen with the progression of several neurodegenerative diseases such as Alzheimer's. Patients often describe difficulties maintaining their sense of direction as a core limitation in their day-to-day lives. This research can help design adaptive systems that ease key symptoms. A noteworthy implication is finding the right level of automation to keep the human engaged in the spatial task.
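As a minimal sketch of how such synchronized recordings can be acquired, the snippet below pulls samples from one EEG and one motion capture Lab Streaming Layer (LSL) stream and stores them on a shared clock. The stream types, timeouts, and segment length are illustrative assumptions, not the setup used in these studies.

```python
# Minimal sketch: pull EEG and motion capture samples on a shared LSL clock.
# Stream types and the segment length are illustrative assumptions.
from pylsl import StreamInlet, resolve_byprop

# Resolve one EEG stream and one motion capture stream on the local network.
eeg_inlet = StreamInlet(resolve_byprop('type', 'EEG', timeout=5)[0])
mocap_inlet = StreamInlet(resolve_byprop('type', 'MoCap', timeout=5)[0])

eeg_data, mocap_data = [], []
for _ in range(1000):  # collect a short synchronized segment
    sample, ts = eeg_inlet.pull_sample(timeout=0.0)
    if sample is not None:
        # time_correction() maps the sender's clock onto the local LSL clock
        eeg_data.append((ts + eeg_inlet.time_correction(), sample))
    sample, ts = mocap_inlet.pull_sample(timeout=0.0)
    if sample is not None:
        mocap_data.append((ts + mocap_inlet.time_correction(), sample))
```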
5. The Audiomaze: An EEG and motion capture study of human spatial navigation in sparse augmented reality. Makoto Miyakoshi, Lukas Gehrke, Klaus Gramann, Scott Makeig, John R. Iversen | Eur J Neurosci. 2020; 00: 1–10.
We developed the Audiomaze, a novel paradigm in which participants freely explore a room-sized virtual maze while EEG is recorded synchronized to motion capture. Participants (n=16) were blindfolded and explored different mazes, each in three successive trials, using their right hand as a probe to "feel" for virtual maze walls. We found behavioral evidence of navigational learning in a sparse-AR environment, and a neural correlate of navigational learning was found near lingual gyrus.
Paper to appear | Project on ResearchGate
In this work, we focused on landmark-based navigation in actively behaving young adults solving a virtual reality Y-maze task. Our results confirm that combining mobile high-density EEG and biometric measures can help unravel the brain network and neural modulations subtending ecological landmark-based navigation.
3. The Invisible Maze Task (IMT): Interactive Exploration of Sparse Virtual Environments to Investigate Action-Driven Formation of Spatial Representations. Lukas Gehrke, John R. Iversen, Scott Makeig and Klaus Gramann | In: Creem-Regehr S., Schöning J., Klippel A. (eds) Spatial Cognition XI. Spatial Cognition 2018. Lecture Notes in Computer Science, vol 11034. Springer, Cham
The neuroscientific study of human navigation has been limited by requiring participants to remain stationary during data recording. With the Invisible Maze Task (IMT), we provide a novel VR paradigm to investigate freely moving, naturally interacting navigators.
2. Mobile Brain/Body Imaging (MoBI) of Spatial Knowledge Acquisition during Unconstrained Exploration in VR. Lukas Gehrke and Klaus Gramann | In Proceedings of the First Biannual Neuroadaptive Technology Conference (pp. 22–23). Berlin, Germany.
Using the Invisible Maze Task (IMT) we show that study participants navigate with increasing efficiency as they learn the layout of a maze.
1. Heading computation in the human retrosplenial complex during full-body rotation. Klaus Gramann, Friederike U. Hohlefeld, Lukas Gehrke and Marius Klug | bioRxiv (2018). https://doi.org/10.1101/417972
We contrasted participants physically rotating their bodies with a traditional joystick setup in which rotations were based on visual flow only. We show that varying rotation velocities were accompanied by pronounced beta synchronization during physical rotation but not during joystick control.
Together with 6sept13, I produced a promotional film about the Berlin Mobile Brain/Body Imaging Lab.
Applications in Human-Computer Interaction (HCI)
In extended realities (VR/AR/MR), sampling body movements enables the design of natural interactions. Combined with neural markers, it becomes feasible both to detect where interactions start to feel off and to build novel ones. My core motivation is bringing neuroscientific research findings to application, leveraging a variety of supplementary signals, such as body movements and other wearables, to design robust interactive experiences that feel natural.
2. Detecting Visuo-Haptic Mismatches in Virtual Reality using the Prediction Error Negativity of Event-Related Brain Potentials. Lukas Gehrke, Sezen Akman, Pedro Lopes, Albert Chen, Avinash Kumar Singh, Hsiang-Ting Chen, Chin-Teng Lin and Klaus Gramann | In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19). ACM, New York, NY, USA, Paper 427, 11 pages. DOI: https://doi.org/10.1145/3290605.3300657
We detected conflicts in visuo-haptic integration by analyzing event-related potentials (ERPs) during interaction with virtual objects. In our EEG study, participants touched virtual objects and received either no haptic feedback, vibration, or vibration and EMS (electrical muscle stimulation). We report that an early negative component at electrode FCz (the prediction error negativity) was sensitive to unrealistic VR situations, indicating that we successfully detected haptic conflicts.
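As an illustrative sketch (not the published analysis pipeline), the snippet below shows how such a per-condition ERP contrast at electrode FCz could be computed with MNE-Python; the file name, trigger codes, filter settings, and epoch window are assumptions.

```python
# Illustrative ERP sketch with MNE-Python. File name, trigger codes, filter
# settings, and the epoch window are assumptions, not the published parameters.
import mne

raw = mne.io.read_raw_fif('visuo_haptic_raw.fif', preload=True)  # hypothetical file
raw.filter(0.1, 30.0)  # band-pass typical for ERP analysis

events = mne.find_events(raw)
event_id = {'no_haptics': 1, 'vibro': 2, 'vibro_ems': 3}  # assumed trigger codes

epochs = mne.Epochs(raw, events, event_id, tmin=-0.2, tmax=0.8,
                    baseline=(None, 0), preload=True)

# Average per condition and compare the early negativity at electrode FCz.
evokeds = {cond: epochs[cond].average() for cond in event_id}
mne.viz.plot_compare_evokeds(evokeds, picks='FCz')
```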
1. Neurofeedback during Creative Expression as a Therapeutic Tool. Stephanie Scott and Lukas Gehrke | Springer Series on Bio- and Neurosystems, Vol. 10, Jose L. Contreras-Vidal et al. (Eds): Mobile Brain-Body Imaging and the Neuroscience of Art, Innovation and Creativity, ISBN 978-3-030-24325-8
Using EEG power, we generated live visual feedback of white lines on a black background, borrowing from Joy Division's famous album cover. This closed-loop neurofeedback stimulates creativity by making users aware of their own brain activity.
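A minimal sketch of such a closed loop, assuming an incoming EEG chunk from some acquisition source and using alpha-band power to scale the displacement of each drawn line (sampling rate, band limits, and function names are illustrative, not the implementation used in the chapter):

```python
# Minimal closed-loop neurofeedback sketch: map alpha-band power to the
# amplitude of one "Joy Division"-style line. Sampling rate, band limits,
# and line length are assumptions for illustration only.
import numpy as np
from scipy.signal import welch

FS = 250  # assumed EEG sampling rate in Hz

def alpha_power(chunk):
    """Return mean 8-12 Hz power of a 1-D EEG chunk."""
    freqs, psd = welch(chunk, fs=FS, nperseg=min(len(chunk), FS))
    return psd[(freqs >= 8) & (freqs <= 12)].mean()

def next_line(chunk, n_points=120):
    """One feedback line: a random profile scaled by the current alpha power."""
    amplitude = alpha_power(chunk)
    profile = np.abs(np.random.randn(n_points))
    profile *= np.hanning(n_points)   # taper toward the edges of the line
    return amplitude * profile        # drawn as a white line on black
```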
Methods for an Integrated Analysis of Movement and Physiology
In several projects, we advanced the investigation of human brain dynamics with the aid of information derived from full-body movement, as well as studies of motor behavior that use brain imaging as an additional source of information.
4. MoBI – Mobile Brain/Body Imaging. Evelyn Jungnickel, Lukas Gehrke, Marius Klug and Klaus Gramann | Academic Press, Neuroergonomics, 59–63, 2019
Mobile brain/body imaging (MoBI) is a method to record and analyze brain dynamics and motor behavior under naturalistic conditions. In this chapter, we give an overview of its suitability for a wide range of scientific problems, including analyses of human brain dynamics with the aid of information derived from movement, and studies of motor behavior that use brain imaging as an additional source of information.
3. Extracting Motion-Related Subspaces from EEG in Mobile Brain/Body Imaging Studies using Source Power Comodulation. Lukas Gehrke*, Luke Guerdan* and Klaus Gramann | 9th International IEEE/EMBS Conference on Neural Engineering (NER), San Francisco, CA, USA, 2019, pp. 344-347 | * contributed equally
We propose the use of a supervised spatial filtering method, Source Power Co-modulation (SPoC), for extracting source components that co-modulate with body motion. We illustrate the approach by investigating the link between hand and head movement kinematics and the power dynamics of EEG sources while participants explore an invisible maze in virtual reality.
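A minimal sketch of this kind of analysis using the SPoC implementation in MNE-Python, assuming epoched EEG data X and a per-epoch movement-velocity target y (data shapes, the regularization choice, and the kinematic target are illustrative assumptions, not the parameters of the paper):

```python
# Illustrative SPoC sketch with MNE-Python. Shapes, regularization, and the
# kinematic target are assumptions for demonstration only.
import numpy as np
from mne.decoding import SPoC

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 64, 250))  # (epochs, EEG channels, samples)
y = rng.random(200)                      # per-epoch hand-movement velocity

# Fit spatial filters whose source power co-modulates with the target.
spoc = SPoC(n_components=4, reg='oas', rank='full')
spoc.fit(X, y)

sources = spoc.transform(X)  # per-epoch power of the extracted components
```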
2. The BeMoBIL Pipeline – Facilitating Mobile Brain/Body Imaging (MoBI) Data Analysis in MATLAB. Marius Klug, Lukas Gehrke, Friederike U. Hohlfeld and Klaus Gramann | In Proceedings of the 3rd International Conference on Mobile Brain/Body Imaging, 2018, https://doi.org/10.14279/depositonce-7236
A collection of MATLAB scripts and functions for an integrated and streamlined analysis of EEG and movement data.
1. Prototypical Design of a Solution to Combine Head-Mounted Virtual Reality and Electroencephalography. Richard Wenzel, Lukas Gehrke and Klaus Gramann | In Proceedings of the 3rd International Conference on Mobile Brain/Body Imaging, 2018, https://doi.org/10.14279/depositonce-7236
We developed and tested a prototype of printable spacers to improve signal quality by preventing electrode movement when combining high-density active electrode EEG headsets with modern VR headsets.