Digital creation tools tend to have imposing interfaces and steep learning curves because they must enable the manipulation of a 3D environment on a 2D monitor with an abstract controller such as a mouse. VR overcomes this by making navigation and manipulation completely natural. Rather than ‘hold right-click, then WASD’ to move a camera around, you simply...walk! Crouch, lean, tilt your head! Instead of needing four viewport cameras and a three-axis gizmo to accurately move an object, you just...pick it up and move it! Want to rotate it? Twist your hand!
Why is it so much easier to perform these tasks in VR than on a monitor? The short version is that our perceptual system has some fancy tricks (stereopsis, parallax, proprioception and many more) that VR hijacks. These evolved perceptual cues let us judge distances, positions and scale, and work out how to move objects, far more easily, yet most of them are entirely absent when working on a flat monitor. Ultimately, we evolved to view and manipulate objects in three dimensions, and VR takes advantage of this in a way 2D screens cannot.
Some of the most rewarding VR experiences I’ve had have been watching non-technical people use VR and just get it. Seeing my partner’s mother pick up Tilt Brush and simply know how to fill the air with glowing paint was magical.
Similarly, apps like SoundStage have taken the complex and counterintuitive world of digital music creation software and made it accessible through VR. As a frustrated musician, I’ve tried multiple times to create music on a laptop, only to be confused by the interface. SoundStage turns the various elements of the software into objects that you physically wire together (a metaphor instantly familiar to anyone who has ever plugged a guitar and effects pedal into an amplifier), and abstract concepts suddenly become familiar.
This same power has been extended to multiple creative domains, from animation (Tvori, Mindshow) to 3D modelling (Oculus Medium, SculptrVR). Even at this early stage in VR’s development, most creative mediums have a VR application, and almost all of them offer something unique that cannot be achieved on a flat screen.
And what about creating VR itself? Making interactive applications brings an extra level of complexity. At one of the VR Manchester events I help organise, an older gentleman asked me how to start making VR experiences. He had no prior game creation experience or technical knowledge but was keen to express himself in VR. Currently, there is no realistic way for someone to build robust VR experiences without spending months or years becoming adept with an engine such as Unity or Unreal. However, this may soon change.
Both Unity and Unreal are developing VR editors that allow people to create interactive VR experiences from within VR in a natural way. Users enter VR, select objects from a palette and drop them into the world. Abstract operations like scaling objects are achieved with familiar metaphors like ‘pinch to zoom’ (sketched in code below). Very quickly, even the most inexperienced user can create real-time VR applications from within VR itself. Unity is looking to take this even further with a project called ‘Carte Blanche’, which abstracts the programming and logical aspects of interactive development into the visual metaphor of a deck of cards.

The most compelling part of all this for me is how it democratises creation. While there is a surprising amount of nuanced work being created with these nascent tools, it’s exciting to see how much potential they have as a “gateway drug”. With almost no technical knowledge required, these tools are arguably more intuitive than MS Paint. In a future where VR is ubiquitous, it’s easy to imagine young children becoming engaged with creative acts in VR from a very early age. In the same way that we are all now familiar with the image of a toddler unlocking and using an iPad with a few flicks of the finger, we will likely become accustomed to kids navigating VR and enthusiastically painting the world.
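To make the ‘pinch to zoom’ scaling metaphor concrete, here is a minimal sketch in Python. It is my own illustration, not how Unity’s or Unreal’s VR editors are actually implemented: it assumes a hypothetical tracking API that reports each controller’s position as a 3D point once per frame, and it scales the grabbed object by the ratio of the current hand separation to the separation at the moment the gesture began.

```python
import math

def distance(a, b):
    """Euclidean distance between two (x, y, z) points."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

class PinchScaler:
    """Two-handed 'pinch to zoom' scaling for a grabbed object."""

    def __init__(self, initial_scale=1.0):
        self.scale = initial_scale
        self._grab_separation = None
        self._grab_scale = initial_scale

    def begin_gesture(self, left_hand, right_hand):
        # Called when both grip buttons are pressed on the same object.
        # Guard against hands starting at (almost) the same point.
        self._grab_separation = max(distance(left_hand, right_hand), 1e-6)
        self._grab_scale = self.scale

    def update(self, left_hand, right_hand):
        # Called once per frame while the gesture is held: the object's
        # scale tracks how far apart the hands are relative to the start.
        if self._grab_separation is None:
            return self.scale
        ratio = distance(left_hand, right_hand) / self._grab_separation
        self.scale = self._grab_scale * ratio
        return self.scale

    def end_gesture(self):
        self._grab_separation = None

# Hands start 0.2 m apart and are pulled to 0.4 m apart,
# doubling the object's scale.
scaler = PinchScaler()
scaler.begin_gesture((-0.1, 1.2, 0.5), (0.1, 1.2, 0.5))
print(scaler.update((-0.2, 1.2, 0.5), (0.2, 1.2, 0.5)))  # 2.0
```

The nice property of this mapping is that it is relative, just like pinch-to-zoom on a phone: your hands can begin at any separation and the object never jumps, which is part of why the metaphor feels so natural.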
Before we get carried away with our own hyperbole, you may be saying: “hey, hasn’t this kind of natural interface been tried before? I thought Kinect and the Leap Motion sucked?” Well, yes. There are a number of drawbacks to motion-controller interfaces:
- Waving your arms around is tiring. As dramatic as the Minority Report interface looks, it soon wears you out, so keeping your arms below chest height for prolonged periods is vital. The Vive controllers do not need to be raised excessively to work; tracking devices such as the Leap Motion, however, need your hands in view of the HMD, which typically means holding them at or above chest height.
- Natural interfaces need VR. The ability to naturally judge scale and distance with our eyes’ stereoscopic and parallax abilities is vital. When you can’t intuitively tell where an object is in space, you spend a lot of time flailing your arms around uselessly.
- The mouse is still a superhuman input device, but only if you're sitting down and manipulating a 2D plane. There will always be a place for mice in creation; a mouse was key in creating this document and that will likely never change.
We should remember that not all the complexity in digital content creation (DCC) tool interfaces exists purely for navigation and manipulation. The more nuance a tool offers, the more options it has, and therefore the more complex its interface. Many of the nascent VR DCC tools are deliberately simple, but as they grow in capability they will need to find ways to let users access that complexity. It may transpire that this complexity does not translate to VR, and that DCC tools are hamstrung by an inability to manage it: slightly moving a mouse to select one of a hundred icons is faster and easier than waving my arm to perform the same action. However, I don’t believe this will be the case. Natural metaphors can be discovered for most nuanced actions that rival or better their 2D equivalents. Physically moving your hand back and forth to change the diffusion of an airbrush seems quicker, more natural and more nuanced to me than the equivalent in Photoshop.
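To make that airbrush example concrete, here is one possible mapping, sketched in Python. The function and its parameters are hypothetical rather than taken from any shipping VR paint app: it simply maps the distance between your hand and the canvas to a spray radius, clamped to a usable range.

```python
def airbrush_radius(hand_to_canvas_m,
                    min_radius=0.005,   # tight spray when close (5 mm)
                    max_radius=0.15,    # wide, diffuse spray at arm's length
                    max_distance=0.6):  # distance at which the spray is widest
    """Map hand-to-canvas distance (metres) to a spray radius, like a real airbrush."""
    t = min(max(hand_to_canvas_m / max_distance, 0.0), 1.0)
    return min_radius + t * (max_radius - min_radius)

# Pulling your hand back widens the spray; no slider or dialog in sight.
for d in (0.05, 0.3, 0.6):
    print(f"{d:.2f} m -> radius {airbrush_radius(d) * 100:.1f} cm")
```

One continuous hand movement replaces a trip to a settings panel, which is exactly the kind of nuance that seems likely to translate well into VR.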
Jon Dadley
05 October 2016