Publication

Towards the development of 3D lunar surface depth-data collection for geology in virtual reality

Paige, C., Ward, F., Haddad, D. D., Forsey-Smerek, A., Sanneman, L., Todd, J., ... & Newman, D. (2021, December). Towards the development of 3D lunar surface depth-data collection for geology in virtual reality. In AGU Fall Meeting Abstracts (Vol. 2021, pp. P41C-01).

Abstract

Lunar surface exploration missions are being planned for as early as 2022. With increasing mission operational complexity, the development of heavy-lift launch capabilities, and growing funding for surface missions, mission frequency will gain significant momentum as well, demanding new enabling technologies and capabilities.

The development of a three-dimensional map of the lunar surface has the potential to provide a basis for analysis tool development and in-situ lunar geology. Given the challenges associated with perceiving scale on the lunar surface, having high-resolution depth data available for geological analysis, or later for situational awareness as humans return to the Moon, will be invaluable. The use of stereoscopic imagery to create a 3D environment in virtual reality for surficial geological analysis was demonstrated with the Mars Curiosity rover for the Kimberly outcrop in Gale Crater on Mars. However, using stereo photogrammetry in concert with multiple overlapping image sources led to challenges with calibration and large data sets. With the growing number of commercial off-the-shelf depth cameras available, MIT's Resource Exploration and Science of our Cosmic Environment (RESOURCE) team is exploring new methods for depth-data collection and integration into virtual reality (VR) for future exploration missions.

Herein we explore four different depth-data collection techniques: 1) multi-camera stereo photogrammetry, 2) dual-lens 360 RGB imagery, 3) RGB in combination with short-range time-of-flight, and 4) LiDAR. Using the Boston Dynamics Spot robot, we collected depth data with each of the four techniques in a lunar analogue terrain containing geologically significant points of interest (POIs). We compare the techniques in terms of depth-of-view, field-of-view, lighting-condition capabilities, resolution, and bandwidth requirements, and evaluate each technique on its ability to support assessment of the POIs. Based on these results, we recommend a combination of RGB imagery integrated with an optimized time-of-flight camera for 3D mapping in a VR environment on future lunar rover exploration missions.
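To make the recommended RGB plus time-of-flight pipeline concrete, the sketch below shows one common way such data can be fused into a colored point cloud suitable for import into a VR environment. It is not drawn from the paper itself: it assumes the Open3D library, a depth frame already registered to the color frame, and placeholder file names and pinhole intrinsics.

    # Minimal sketch (not the authors' pipeline): fuse an RGB frame with a
    # registered time-of-flight depth frame into a colored point cloud.
    # File names and camera intrinsics below are placeholder assumptions.
    import open3d as o3d

    # Load a registered RGB + depth pair (depth in millimeters, aligned to color).
    color = o3d.io.read_image("poi_color.png")   # hypothetical rover color capture
    depth = o3d.io.read_image("poi_depth.png")   # hypothetical ToF depth map

    # Combine into an RGB-D image; truncate depth beyond a few meters,
    # roughly matching short-range ToF sensing limits.
    rgbd = o3d.geometry.RGBDImage.create_from_color_and_depth(
        color, depth, depth_scale=1000.0, depth_trunc=4.0,
        convert_rgb_to_intensity=False)

    # Back-project to a colored point cloud using assumed pinhole intrinsics.
    intrinsic = o3d.camera.PinholeCameraIntrinsic(
        o3d.camera.PinholeCameraIntrinsicParameters.PrimeSenseDefault)
    pcd = o3d.geometry.PointCloud.create_from_rgbd_image(rgbd, intrinsic)

    # Export for downstream use, e.g., as a point-cloud asset in a VR engine.
    o3d.io.write_point_cloud("poi_pointcloud.ply", pcd)

In practice the intrinsics would come from calibration of the actual camera pair, and per-frame clouds would be registered and merged before being rendered in VR.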
