CocoVerse: A Playground for Cocreation and Communication in Virtual Reality

Scott Greenwald

Real-time collaborative self-expression in virtual reality. 

We present CocoVerse, a shared immersive virtual reality environment in which users interact with each other and create and manipulate virtual objects using a set of hand-based tools. Simple, intuitive interfaces make the application easy to use, and its flexible toolset facilitates constructivist and exploratory learning. The modular design of the system allows it to be easily customized for new room-scale applications.

Introduction and motivation

While the potential of multi-user immersive VR to facilitate collaborative learning is well established, few research applications currently exist in this field. CocoVerse is intended to serve as a platform for collaborative experiences in VR, and provides a broad set of creative affordances to users in a shared virtual space. The suite of functionality within this application gives users the capability for both primary content authorship and interaction with pre-existing environments.

Starting in a shared virtual space, users can:

  • create 3D sketches with a virtual paintbrush; 
  • create and manipulate objects; 
  • capture images with a camera, and place them as pictures; and
  • write phrases using a speech-to-text system. 

These affordances effectively provide a 3D whiteboard for teaching and learning. The interactions provided relate consistently to one another; for example, falling objects will rest on painted surfaces, and any user-created element can be moved or erased. This consistency ensures that users’ actions produce logical results, helping to build a strong sense of presence. The sense of immersion in the virtual space is further enhanced when users are also present in a shared physical space.

Real-time co-creation in VR enables a broad set of educational interactions. Teachers can develop and present 3D content to students. Users can learn by interacting with dynamic systems, or by exploring and annotating environments, models, and datasets. 

Since all of these interactions are fully realized in the virtual space, they can be recorded and played back in full for immediate or later review.
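One natural way to realize such recording is a timestamped event log that can be serialized and replayed in order. The sketch below is a minimal, hypothetical illustration (the class and method names are our own, not part of the CocoVerse implementation): each user action is stored as an event, and replay re-applies the events in timestamp order.

```python
import json
from dataclasses import dataclass, field

@dataclass
class EventLog:
    """Hypothetical sketch of a record-and-replay log for VR interactions."""
    events: list = field(default_factory=list)

    def record(self, t, user, action, payload):
        # Capture one user action (e.g. a brush stroke segment) with its timestamp.
        self.events.append({"t": t, "user": user, "action": action, "payload": payload})

    def serialize(self):
        # Persist the whole session for later review.
        return json.dumps(self.events)

    def replay(self, apply):
        # Re-execute every action in timestamp order; `apply` is whatever
        # function re-enacts an event in the virtual scene.
        for ev in sorted(self.events, key=lambda e: e["t"]):
            apply(ev)
```

Because the log is ordinary data rather than a screen capture, a replayed session can be viewed from any vantage point, paused, or scrubbed.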

Design and Implementation

Our application uses the HTC Vive, which comprises a head-mounted display and two handheld controllers. All three devices are tracked with six degrees of freedom, mapping the user’s physical movements onto a room-scale virtual space.

Each user has a virtual toolbelt positioned at waist level, which they can operate in a hands-free fashion, thereby leveraging the spatial nature of the VR interface. This interaction model helps users to quickly explore the range of capabilities available to them, and to mix and match their active abilities, such as a brush and an eraser.
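A body-anchored interface like this can be sketched by deriving the belt's position from the tracked headset pose: a fixed drop below the head, offset slightly along the user's horizontal facing direction. The function below is an illustrative assumption on our part (the offsets and the name `toolbelt_anchor` are hypothetical), not the actual CocoVerse code.

```python
import math

def toolbelt_anchor(head_pos, head_yaw_deg, waist_drop=0.45, forward_offset=0.1):
    """Place the belt a fixed drop below the head, slightly in front.

    head_pos: (x, y, z) headset position in meters, y up.
    head_yaw_deg: headset heading around the vertical axis, in degrees.
    waist_drop / forward_offset: hypothetical tuning constants.
    """
    x, y, z = head_pos
    yaw = math.radians(head_yaw_deg)
    fx, fz = math.sin(yaw), math.cos(yaw)  # horizontal forward vector
    return (x + forward_offset * fx, y - waist_drop, z + forward_offset * fz)
```

Anchoring to the head pose rather than a fixed world position keeps the tools within arm's reach as the user walks around the room-scale space.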

When using the paintbrush, users can choose to draw at one of a range of fixed distances from the controller, or to draw directly onto surfaces in their surroundings, allowing greater accessibility for users of different physical sizes and arm lengths. Users can remotely “feel” virtual objects through haptic feedback, and can also teleport to a new position in the virtual environment.
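The choice of where paint is deposited can be sketched as a small selection rule along the controller's forward ray: use the fixed distance when one is chosen, otherwise the nearest surface hit, otherwise the controller tip itself. This is a hedged illustration under assumed names (`brush_tip`, `surface_hit`), not the application's actual code.

```python
def brush_tip(origin, direction, fixed_distance=None, surface_hit=None):
    """Return the 3D point where paint is deposited.

    origin, direction: controller position and forward ray (direction
        assumed unit-length), as 3-tuples.
    fixed_distance: draw at this distance from the controller, if set.
    surface_hit: distance to the nearest surface along the ray, if any
        (e.g. from a raycast against the scene).
    """
    if fixed_distance is not None:
        d = fixed_distance       # user-selected drawing distance
    elif surface_hit is not None:
        d = surface_hit          # paint directly onto the surface
    else:
        d = 0.0                  # fall back to the controller tip
    return tuple(o + d * v for o, v in zip(origin, direction))
```

Because the distance is decoupled from arm reach, a shorter user can paint on the same far wall as a taller one simply by selecting a larger fixed distance.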

Conclusions and future work

The CocoVerse application shows great promise as an engine for learning and creativity. At a time when VR lacks a set of canonical interface elements, such as the pinch-to-zoom functionality that is now ubiquitous in mobile applications, our tool-based interaction model and toolbelt are contributions that demonstrate robustness and extensibility. As we polish the current feature set, we intend to develop specific educational use cases, characterize the needs of collaborative teaching and learning, and offer appropriate design guidelines.