Abstract
In situ resource utilization (ISRU) technologies are a key advancement required to make long-term human habitation on the Moon and Mars viable. The upcoming Volatiles Investigating Polar Exploration Rover (VIPER) mission will provide crucial correlations between volatiles and the lunar environment and geologic setting to begin to understand the water content available for ISRU on the Moon. The mission, slated to launch in 2023, will require the coordination of multi-disciplinary teams across the country making real-time decisions based on rover instrument data. The virtual Mission Simulation System (vMSS) is a virtual reality platform designed at MIT in support of the Resource Exploration and Science of our Cosmic Environment (RESOURCE) team to provide geographically distributed teams with a collaboration interface for planetary missions like VIPER. Herein we describe a preliminary assessment of vMSS using a mobile rover platform that integrates two onboard depth cameras and synthetic instrument data representative of a VIPER onboard instrument, the Neutron Spectrometer System (NSS).

Current lunar rover exploration missions arrange their console positions such that the science operations team is separated from the science backroom team. Logistically, this allows the science backroom team to focus on detailed analyses and advise the operations team of potential new points of interest while the operations team focuses on executing the traverse. However, this physical separation can hamper communication of priorities and may become a detriment to maximizing science return. More efficient communication methods are needed to ensure that this next phase of exploration gives geological investigation every advantage.

We propose vMSS as a collaborative environment equipped with visualization tools that present real-time analyses of instrument data in easily digestible displays, allowing the science analysis team to continuously monitor data streams during the rover traverse and rapidly communicate recommendations to the operations team. We present a prototype vMSS along with proposed tests to demonstrate its effectiveness in real-time decision making, rover traverse planning, and situational awareness, and to determine the minimum image resolution that keeps communication bandwidth requirements low. The VR platform is designed with two primary views: 1) a mini overview with a bird's-eye map that allows real-time traverse monitoring, and 2) an immersive point of view that provides an annotatable, on-the-ground view of the surface.

The rover used for this testing carried an onboard Intel RealSense D435i depth camera with integrated RGB imagery. The testing procedures were designed to evaluate the user's ability to identify objects and obstructions, make real-time decisions, re-plan traverses, and complete increasingly complex tasks within the VR environment. The rover was operated in manual mode, with the maps available to support user decision making. These capabilities will be tested for mission planning, in-mission traverse, science station exploration, and drill site selection and assessment. The test area was predefined with analog NSS data so that the user could view both a location's camera imagery and the overlaid simulated instrument data. Preliminary data captures were completed for a basic traverse. Planned testing procedures are described herein.
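As a point of reference for the onboard imaging setup described above, the following is a minimal sketch of how aligned depth and RGB frames might be pulled from an Intel RealSense D435i using the pyrealsense2 library. The stream resolutions, frame rate, and variable names are illustrative assumptions rather than the configuration used in our tests.

```python
import numpy as np
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
# Illustrative stream settings; the D435i supports 640x480 at 30 fps for both streams.
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
pipeline.start(config)

align = rs.align(rs.stream.color)  # map depth pixels onto the color frame

try:
    frames = pipeline.wait_for_frames()
    frames = align.process(frames)
    depth_frame = frames.get_depth_frame()
    color_frame = frames.get_color_frame()
    depth = np.asanyarray(depth_frame.get_data())   # uint16 depth image in sensor units
    color = np.asanyarray(color_frame.get_data())   # uint8 BGR image
    depth_m = depth * depth_frame.get_units()       # convert depth to meters
finally:
    pipeline.stop()
```

Aligning depth to the color stream keeps the range data and RGB imagery in a single pixel grid, which simplifies projecting both into the VR views.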
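The analog NSS overlay described above could, in principle, be rendered as a semi-transparent colormap registered to the bird's-eye map. The sketch below shows one way to do this with matplotlib; the base map, grid of simulated count rates, map extent, and traverse waypoints are all placeholder values, not data from our test area.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical inputs: a grayscale bird's-eye base map of the test area and a
# coarse grid of simulated NSS count rates registered to the same extent.
basemap = np.random.rand(200, 200)       # placeholder overhead map image
nss_counts = np.random.rand(20, 20)      # placeholder synthetic NSS data grid
extent = (0.0, 10.0, 0.0, 10.0)          # map extent in meters (x_min, x_max, y_min, y_max)

fig, ax = plt.subplots()
ax.imshow(basemap, cmap="gray", extent=extent, origin="lower")
overlay = ax.imshow(nss_counts, cmap="viridis", alpha=0.5,
                    extent=extent, origin="lower", interpolation="bilinear")
fig.colorbar(overlay, ax=ax, label="Simulated NSS count rate (arb. units)")
ax.plot([1.0, 3.5, 6.0, 8.5], [1.0, 2.0, 4.5, 7.0], "w--", marker="o",
        label="Planned traverse")        # hypothetical traverse waypoints
ax.set_xlabel("x (m)")
ax.set_ylabel("y (m)")
ax.legend(loc="upper left")
plt.show()
```

A similar overlay could be refreshed as new simulated NSS samples arrive along the traverse, keeping the mini overview current for the science analysis team.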
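For the bandwidth question, one simple way to bound the minimum usable image resolution is to sweep candidate resolutions, encode each frame, and compare the encoded sizes against the available downlink. The following is an illustrative sweep using OpenCV; the resolutions, JPEG quality, and 30 fps assumption are placeholders and do not represent our test protocol.

```python
import cv2
import numpy as np

def encoded_size_bytes(image, width, height, jpeg_quality=80):
    """Downscale an RGB frame and return its JPEG-encoded size in bytes."""
    resized = cv2.resize(image, (width, height), interpolation=cv2.INTER_AREA)
    ok, buf = cv2.imencode(".jpg", resized, [cv2.IMWRITE_JPEG_QUALITY, jpeg_quality])
    if not ok:
        raise RuntimeError("JPEG encoding failed")
    return buf.nbytes

# Placeholder frame standing in for a D435i color capture.
frame = (np.random.rand(480, 640, 3) * 255).astype(np.uint8)

for w, h in [(640, 480), (424, 240), (320, 240), (160, 120)]:
    size = encoded_size_bytes(frame, w, h)
    # At 30 fps the per-stream data rate is roughly size * 30 * 8 bits per second.
    print(f"{w}x{h}: {size / 1024:.1f} kB/frame, ~{size * 30 * 8 / 1e6:.2f} Mbit/s at 30 fps")
```

The perceptual side of the trade, i.e., whether users can still identify objects and obstructions at a given resolution, would still need to be established through the user tests described above.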
VR environments are not generally recommended for long-term continuous use; thus, it is important to identify the tasks for which vMSS provides the greatest advantage. The experimental setup and early-phase demonstration of vMSS will be the basis for determining the points during a rover mission when VR can reduce decision-making time, enable more efficient cross-team communication, and reduce task loads.