In this project we propose a prototype that combines an existing AR headset, the Microsoft HoloLens 2, with a Brain-Computer Interface (BCI) system based on our AttentivU project, and we perform several tasks to validate this concept.
Application 1. Assessing Internal and External Attention in AR using Brain Computer Interfaces.
Most research combining AR and Brain-Computer Interface (BCI) systems does not take advantage of the opportunity to integrate the two streams of data. Additionally, AR devices that use a Head-Mounted Display (HMD) face one major problem: constant closeness to a screen makes distractions within the virtual environment hard to avoid. In the first application, we propose to reduce this distraction by incorporating information about the user's current attentional state. We first introduce a clip-on solution for AR-BCI integration. A simple game was designed for the Microsoft HoloLens 2 that changed in real time according to the user's state of attention measured via electroencephalography (EEG); the system only responded if the attentional orientation was classified as "external." Fourteen users tested the attention-aware system, and we show that this augmentation of the interface improved the system's usability. We conclude that more systems would benefit from clearly visualizing the user's ongoing attentional state, as well as from more efficient integration of AR and BCI headsets.
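Although the paper's actual classification pipeline is not reproduced here, a common way to estimate attentional orientation from EEG is via posterior alpha-band (8–12 Hz) power, which tends to increase during internally oriented attention. A minimal sketch of that idea, with an assumed sampling rate and a per-user threshold left as a free parameter:

```python
import numpy as np
from scipy.signal import welch

FS = 256          # sampling rate in Hz (assumed)
ALPHA = (8, 12)   # alpha band in Hz

def alpha_power(eeg_window, fs=FS):
    """Mean alpha-band power of a 1-D EEG window."""
    freqs, psd = welch(eeg_window, fs=fs, nperseg=fs)
    band = (freqs >= ALPHA[0]) & (freqs <= ALPHA[1])
    return psd[band].mean()

def attention_orientation(eeg_window, threshold):
    """Label a window 'internal' (high alpha) or 'external' (low alpha).

    In practice the threshold would be calibrated per user during a
    baseline session; here it is simply a free parameter.
    """
    return "internal" if alpha_power(eeg_window) > threshold else "external"

# Example on synthetic data: a strong 10 Hz oscillation plus noise
t = np.arange(0, 2, 1 / FS)
rng = np.random.default_rng(0)
window = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
state = attention_orientation(window, threshold=0.05)
```

An AR interface like the one described above would poll this classification on a sliding window and only react to input while the returned state is "external."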
Application 2. A Pilot Study using Covert Visuospatial Attention as an EEG-based Brain Computer Interface to Enhance AR Interaction.
In the second application, we investigated the feasibility of using a BCI based on covert visuospatial attention (CVSA) – a process of focusing attention on different regions of the visual field without overt eye movements. We operated without relying on any stimulus-driven responses.
The proof-of-concept presented in this application opens up interesting possible applications of AR EEG-BCIs that use CVSA. The inherent gaze independence of CVSA makes it a potential alternative for patients who cannot produce any overt eye movements. Its intuitiveness, the natural attraction toward regions or objects of interest in the visual field, makes it a possible candidate for BCI-driven navigation devices (such as wheelchairs or robots), as well as for yes–no communication. The absence of external stimulation, as required by ERP or SSVEP paradigms, may make it more suitable for use over longer periods of time, as it allows more engaging, comfortable, and direct operation, and it is better adapted to out-of-lab interactions for different user groups.
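As a sketch of how gaze-independent CVSA decoding can work, one widely used marker is parieto-occipital alpha lateralization: attending one hemifield suppresses alpha power over the contralateral hemisphere. The two-channel setup and the simple sign-based rule below are illustrative assumptions, not the classifier used in this study:

```python
import numpy as np
from scipy.signal import welch

FS = 256  # sampling rate in Hz (assumed)

def band_power(x, fs=FS, band=(8, 12)):
    """Mean power of a 1-D signal within a frequency band (default: alpha)."""
    freqs, psd = welch(x, fs=fs, nperseg=fs)
    sel = (freqs >= band[0]) & (freqs <= band[1])
    return psd[sel].mean()

def decode_cvsa(left_po_channel, right_po_channel):
    """Decode the attended hemifield from two parieto-occipital channels.

    Attention to one hemifield suppresses alpha over the contralateral
    hemisphere, so lower right-hemisphere alpha suggests leftward
    attention. The channel pairing (e.g. PO7 vs. PO8) is an assumption.
    """
    left_alpha = band_power(left_po_channel)
    right_alpha = band_power(right_po_channel)
    lateralization = (left_alpha - right_alpha) / (left_alpha + right_alpha)
    return "left" if lateralization > 0 else "right"

# Synthetic example: strong alpha on the left hemisphere, suppressed on
# the right, which a lateralization rule should decode as "left".
t = np.arange(0, 2, 1 / FS)
rng = np.random.default_rng(1)
left_hemi = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
right_hemi = 0.1 * rng.standard_normal(t.size)
attended_side = decode_cvsa(left_hemi, right_hemi)
```

A continuous decoder of this kind could drive the binary left/right selections mentioned above (navigation commands or yes–no answers) without any flickering stimuli.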
Please check this presentation I gave at the Society of Photo-Optical Instrumentation Engineers (SPIE) conference in February 2020, as well as our papers at IEEE BSN 2021.
A big thank you to the collaborators on this project: Yujie Wang, Qiuxuan Wu, and Chia-Yun Hu, who contributed equally to this work. Chia-Yun Hu and Yujie Wang designed the clip-on holder for the HoloLens 2 that hosts all the electrodes and electronics for the brain-sensing component of our system; we will share it once the papers are out. Qiuxuan Wu designed the video game applications using Unity.