Biologically encoded augmented reality: cockpit


Copyright 2020 Everett Lawson

Information floods the center of our visual field and often saturates the focus of our attention, yet parallel channels in the visual system constantly and unconsciously process our environment. These channels hold a dormant potential that, once activated, can challenge the limits of perception.

This research explores the potential to augment the perceived driving experience through the delivery of adaptive, context-aware stimuli. Peripheral vision is far more efficient than central vision at estimating self-motion. Vection, the perception of self-motion from visual stimulus alone, is heavily influenced by peripheral cues and produces strong illusory effects for the observer. Psychophysical stimuli are delivered as fast-adapting manipulations of the animated motion trajectories in raw scene data streams. These algorithmically generated signals are subtly presented to affect the observer's sensation of speed and rate of turn in a first-person point-of-view (POV) driving environment.
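As a toy illustration (not drawn from this work), the kind of peripherally weighted manipulation described above can be sketched as a radial gain applied to an optic-flow field: flow vectors near the fovea are left untouched while peripheral flow is smoothly amplified, which in a vection display would bias the sensation of speed. The function name, gain value, and fovea radius below are hypothetical choices, assuming only NumPy.

```python
import numpy as np

def peripheral_flow_gain(flow, gain=1.5, fovea_radius=0.3):
    """Scale a per-pixel optic-flow field in the periphery only.

    flow         : (H, W, 2) array of flow vectors (dx, dy) per pixel.
    gain         : multiplier reached at maximum eccentricity
                   (hypothetical value, not from the thesis).
    fovea_radius : fraction of normalized eccentricity left unmodified.
    """
    h, w = flow.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    # Normalized eccentricity: 0 at screen center, 1 at the corners.
    r = np.hypot((ys - cy) / cy, (xs - cx) / cx) / np.sqrt(2)
    # Smooth ramp from 1.0 inside the fovea to `gain` at the edge.
    t = np.clip((r - fovea_radius) / (1.0 - fovea_radius), 0.0, 1.0)
    g = 1.0 + (gain - 1.0) * t
    return flow * g[..., None]

# Example: uniform rightward flow, doubled only in the periphery.
flow = np.zeros((64, 64, 2))
flow[..., 0] = 1.0
out = peripheral_flow_gain(flow, gain=2.0, fovea_radius=0.3)
```

Because the gain ramps smoothly from the untouched fovea outward, the manipulation stays below the threshold of central attention while still driving the peripheral motion channels that dominate vection.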

This work represents a new intersection of the fields of vision science, computational imaging, and display technologies and could challenge the way we generate media for human consumption in active environments.