The goal of this post is to create a reference document for virtual reality. I wanted to focus on the technology itself and look for any compelling engineering applications that exist in the current swath of available consumer headsets. I’m excluding Cave Automatic Virtual Environment (CAVE) applications, which are used by companies like Jaguar Land Rover for training, maintenance, visual collaboration, simulation and immersive visualizations for customers and clients.
The promise of ubiquitous virtual reality (VR) has been around since the early 1990s, when one of the first commercial VR headsets, the Forte VFX1, launched. That era of VR was unimpressive and was sidelined by the ascending vortex of the Internet, which commanded the interest of the computing industry as a whole.
The idea of commercial VR popped back to life in 2012, when a new epoch of VR headsets began with the development and interest in the Oculus Rift. Since then, HTC Vive and PlayStation VR have emerged as commercially promising headsets. The global popularity of smartphones allowed interested parties to create demand and test the public’s appetite for VR headsets with projects like Google Cardboard and Samsung Gear VR.
Professionals like architects and engineers benefit immensely from crystal-clear visual communication of their designs. Without question, it is becoming more important than ever to collaborate visually and immerse decision-makers and clients in designs. Some industries have been using VR for quite some time and others are new to it, but adoption of the technology will only increase as more impactful applications are created.
Right now, it’s being used primarily for job training and as a tool for immersive visualization. These two common uses are employed in order to help a variety of audiences understand concepts, projects and complex processes by enveloping them in 3D.
VR is different from augmented reality in that it completely blocks out users’ views of physical reality, whereas augmented reality overlays digital layers of 3D data onto physical reality. If done alone, VR should be done in a padded room (unless you’re just sitting down); augmented reality is only as limited as its tether is long (unless it’s wireless).
Coined as “augmented reality” by Boeing researcher Tom Caudell in 1990, the technology traces its roots back to the first head-mounted display, which was invented by legendary computer scientist Ivan Sutherland and Bob Sproull, who was his student at the time. Called “The Sword of Damocles,” the device needed the support of a mechanical arm to stay on a user’s head and created a very simple virtual environment of wireframe walls transposed over physical reality.
Interest has resurged in recent years for augmented reality headsets with consumer and industrial applications, beginning with developer headsets like the Microsoft HoloLens, Meta 2 and DAQRI Smart Helmet. Along with these, the rise of the smartphone and tablet devices has pioneered various augmented reality applications with both commercial and industrial purposes.
However, we’ll save the engineering applications of augmented reality for another day.
There are many terms that would be helpful for you to familiarize yourself with in order to navigate the fields of augmented reality and VR. Here are my top five, with some honorable mentions.
1. Field of View (FOV): This term refers to the number of degrees of visual angle that you can see when your eyes are fixed on one point. Monocular FOV describes the angle for one of our eyes, which ranges from 170 to 175 degrees. Nasal FOV is typically around 65 degrees, depending on the size of a person’s nose. Binocular FOV is the combination of monocular FOV in both eyes and typically gives humans a FOV of 220 degrees. For VR and augmented reality, the FOV where the two monocular FOVs combine yields approximately 114 degrees. This is called the stereoscopic binocular FOV. This is where humans can experience 3D immersion in VR and augmented reality.
2. Latency: If you could somehow add delay to the visual feedback your brain receives when you move, you would experience the most common complaint about today’s VR and augmented reality headsets: “simulator sickness.” The term comes from institutions like the U.S. Navy and NASA, which used it to describe the limit on how long people can participate in a virtual environment before they get unbearably nauseous. When you move your head in VR and augmented reality, the digital imagery lags slightly behind your head movement; the mismatch between what your eyes see and what your inner ear senses results in nausea.
3. Head Tracking: There are two categories of head tracking. You’ll find the more basic orientation tracking on most mobile VR headsets such as Google Cardboard. This tracks the rotation of your head: yaw (turning left and right), pitch (looking up and down) and roll (tilting in a circular fashion like the hands of a clock). Positional tracking is more sophisticated because it captures those head rotations and also incorporates movements of your body, like walking or moving in any direction (don’t run). Positional tracking is achieved by combining the input of an infrared camera, gyroscopes and magnetometers.
4. Resolution: This refers to the image quality of a headset screen, measured in horizontal by vertical pixels. Most of the popular nonmobile VR headsets have a single display panel whose resolution is split down the middle to maximize use of the 114-degree stereoscopic binocular FOV. When Sony announces, for example, that its PlayStation VR has a resolution of 1920 x 1080, this means that each eye sees 960 x 1080 pixels.
5. Input Controllers: These are the controllers wielded by users in virtual and augmented reality environments. They vary from an Xbox controller to a joystick, smart gloves or even a treadmill. These controllers are tethered to a computer, much like the majority of VR headsets, though there are also untethered wireless options; the most popular are Oculus Touch and HTC’s controllers based on Valve’s Lighthouse positional system (Valve and HTC developed the Vive together, in case you didn’t know). Of the currently available options, the HTC Vive’s controllers seem to yield the best experience, but this is speculative and based solely on word of mouth. The Lighthouse system integrates two base stations mounted on opposite ends of a normal-sized room. These base stations sweep lasers across small sensor nodules that cover the head-mounted display (HMD) and the controllers. The system captures the relative position of the controllers in motion compared to the stationary base stations, which gives it lightning-fast input control and positional head tracking.
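The per-eye split described under Resolution is simple arithmetic, and a minimal sketch makes it concrete (the function name here is my own, not from any SDK):

```python
def per_eye_resolution(panel_width: int, panel_height: int) -> tuple:
    """Each eye sees half of the panel's horizontal pixels when a
    single display is split down the middle between the two eyes."""
    return (panel_width // 2, panel_height)

# PlayStation VR example from the text: a 1920 x 1080 panel
# leaves 960 x 1080 for each eye.
print(per_eye_resolution(1920, 1080))  # -> (960, 1080)
```

This is also why VR headsets demand higher panel resolutions than flat screens: every advertised pixel count is effectively halved per eye.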
Frame rate, or the rate at which a VR or augmented reality headset displays consecutive images in frames per second, is important to note for high-quality VR and augmented reality experiences. A lower frame rate is easier to process and costs less processing power, but a higher frame rate results in a better experience.
Refresh rate is another important term; it is measured in hertz (Hz) and is literally the number of times per second your display can redraw the screen. A refresh rate of 96 Hz means your TV or display screen can redraw the screen 96 times in one second.
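The refresh rate also sets a hard budget for how long an application has to produce each frame, which ties directly back to latency. A rough sketch (the function name is mine):

```python
def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds available to render each frame if the app is to
    keep pace with the display's refresh rate."""
    return 1000.0 / refresh_hz

print(round(frame_budget_ms(96), 2))  # -> 10.42  (96 Hz VR panel)
print(round(frame_budget_ms(60), 2))  # -> 16.67  (typical phone screen)
```

Miss that budget and the display repeats or tears frames, which is exactly the judder problem discussed next.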
Time warp, asynchronous time warp and judder are a bundle of terms that are important to unpack a bit because they affect the user’s experience tremendously. Time warp is a technique that manipulates a rendered image just before sending it to the display. It does this to make corrections for head motion that occurs right after the image was rendered. This is key to reducing latency. The most common form of time warp corrects the rotational change of a head pose, which is only a 2D change, meaning it doesn’t cost too much in terms of performance. This saves the VR device from having to render a new frame altogether. Asynchronous time warp does this same process on a separate thread running in parallel with rendering, generating a new time-warped frame from the latest frame completed by the rendering thread. Though this requires parallel processing, it can help reduce judder. Judder is the undesirable experience of visual stuttering, strobing or smearing, and it happens when you move your head rapidly in the headset. When the frame rate of an app dips out of sync with the refresh rate of the display panel, you see judder.
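The rotational correction described above can be sketched as a simple horizontal reprojection. This toy version is my own construction, not any vendor’s implementation: it shifts the already-rendered frame by the yaw change that occurred after rendering, instead of rendering a new frame.

```python
import numpy as np

def rotational_timewarp(frame, yaw_at_render_deg, yaw_at_display_deg,
                        pixels_per_degree):
    """Approximate rotational time warp: shift the rendered image
    horizontally to compensate for yaw rotation that happened between
    render time and display time. np.roll stands in for the real
    resampling a VR compositor would perform."""
    delta = yaw_at_display_deg - yaw_at_render_deg
    shift = int(round(delta * pixels_per_degree))
    return np.roll(frame, -shift, axis=1)

frame = np.arange(12).reshape(3, 4)                  # toy 3 x 4 "image"
warped = rotational_timewarp(frame, 0.0, 2.0, 1.0)   # head turned 2 deg right
```

An asynchronous implementation would run this warp on its own thread at the display’s refresh rate, reusing the last completed frame whenever the renderer falls behind.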
Sketchfab stands out because it allows users to view any of the 3D models hosted on the popular platform in VR using mobile VR headsets like Google Cardboard or Samsung Gear VR. Use a WebGL-enabled browser and click on the VR button after inserting your smartphone into your headset (we’re getting to those).
You can also download an app from Sketchfab that allows you to view a selection of 3D models in the HTC Vive, Oculus Rift, Google Cardboard and Samsung Gear VR. Download the app for whichever one of the headsets you’re using (you can also download WebVR’s experimental Chrome browser to view on desktop PCs).
Right now, Iris VR is the most promising VR application I’ve seen for the headset community interested in immersive CAD. Iris VR gives users access to an app that allows them to transfer their own 3D models and view them in VR. It has a three-part questionnaire to get you started. First, which headset are you using? You select from the Oculus DK1 or DK2, Samsung Gear VR, Microsoft HoloLens, Google Cardboard or HTC Vive. Then it prompts you to enter which 3D file formats you use most often from the following selection: SKP, RVT, ARC, 3DS, C4D, Maya, 3DM, VWX, FBX and OBJ. Finally, after you select your GPU from a massive list, you get a custom set of recommendations configured from your choices. Users can then view their different CAD file formats in VR.
Smartphone-Enabled VR Headsets
It would take much more space than this article allows to include every HMD and mobile VR device. So, to keep it straightforward, here is a cross-section of available headsets, both those powered by smartphones and those powered by mobile and desktop workstations with powerful GPUs.
1. Google Cardboard
Besides investing nearly half a billion dollars in Florida-based startup Magic Leap, Google has invested time in the mixed reality space by making noticeable efforts to popularize Google Cardboard. If you look through Google Play for engineering apps, you won’t find any at this time. One of the better uses for Google Cardboard from a collaborative standpoint is the ability to share and view 3D models in VR through online communities such as Sketchfab.
This can help a geographically dispersed product engineering team visualize and collaborate on 3D models they’ve uploaded to the Sketchfab community. It could be of use with clients and customers as well, to explain or display 3D models using Google Cardboard, Samsung Gear VR, Oculus Rift or the HTC Vive.
You can use Google Cardboard for iOS and Android smartphones. I didn’t find any good engineering applications besides sharing and visualizing 3D models in the Sketchfab community.
2. Samsung Gear VR
This South Korea–based electronics giant jumped into the mobile VR market with its smartphone-powered headset, developed jointly with Oculus. It has a 96-degree FOV and works with a limited number of phones; currently, the S7, S7 Edge, Note5, S6 Edge+, S6 and S6 Edge are supported.
No unique engineering applications exist as of yet.
3. Google Daydream

A minor evolution of the Google Cardboard platform, Google Daydream was announced at Google I/O 2016 and is very similar to Samsung Gear VR. Daydream aims to be a gift to hardware manufacturers that use Google’s Android OS to power their smartphones. It is compatible with Samsung, LG, Xiaomi, ZTE, Asus, Huawei and Alcatel smartphones.
Even great lenses can’t increase the level of experience to that of the Oculus Rift or HTC Vive headsets. As more stereoscopic content and apps for VR are built, the potential usefulness could increase.
5. Avegant Glyph

The design of the Avegant Glyph is interesting in that it is a pair of stereophonic headphones with display screens built into the band that normally sits on top of your head. You end up looking a bit like Geordi La Forge from Star Trek, but light leaks in, so you won’t be surrounded by darkness like a fully immersive experience requires. The Glyph is unique in that its micro-HDMI-to-HDMI port can be tethered to almost any device, from Android smartphones to MacBooks. The content is shown via virtual retinal display: light is projected onto two million micromirrors in the headset and then directly onto your retinas. If you use a Lightning-to-HDMI adapter, you can connect it to your iPhone.
6. Freefly VR
Well-designed for a comfortable experience on Android and iOS smartphones, this headset comes with a wireless glide controller that gives Android users a way to control different apps. At this point, all iPhone VR apps are controlled by the headset alone.
7. LG 360 VR
This mobile VR headset is an interesting variation from another Android OS smartphone producer in that you do not stick your smartphone in the headset. However, the design results in tons of light leakage and ruins the immersion. Though the headset has its own screen that is tethered to your smartphone, the design is way off. If you can’t seal out the light to create the immersive environment people expect from VR, then you have a bad VR product. This is a bandwagon device and is arrogantly expensive at $200.
Workstation-Powered VR Headsets

1. HTC Vive
HTC Vive is getting the most buzz as an entertainment headset, but the company also released an enterprise edition recently: For $400 more (plus tax and shipping), you get the same hardware but with a dedicated Business Edition customer support line and can have a technician set up your HTC Vive. You can tell that it’s a Business Edition HTC Vive because it has blue markings on it.
Dassault Systèmes is transforming its VR operations from CAVEs to headsets, touring the globe through different events and showing off VR demos that focus on its 3DEXPERIENCE platform. I went to an event in Detroit at the Henry Ford Center and experienced a phenomenal demo that allowed you to sit in and make alterations to a new car.
The representatives from Dassault Systèmes made a good case for training workers to perform complex operations in the oil and gas industry through the HTC Vive. The HomeByMe 3DVIA platform, where users can walk through architectural and interior designs, is also a promising direction to move in.
2. Oculus Rift
Palmer Luckey’s Oculus VR first became a sensation by raising $2.5 million on Kickstarter, then cemented its place by accepting a $2-billion buyout from Facebook. The first two development kits were shipped in 2013 and 2014 to give developers time to create apps and content. In between the release of the DK1 (Development Kit 1) and DK2, Oculus revealed two prototypes.
The first, an HD prototype, addressed the resolution concerns of early users with a 1080p LCD panel that roughly doubled the resolution. This HD version was shown but not offered to the public as a developer kit.
The second prototype, called Crystal Cove, was revealed at CES 2014. It featured a low-persistence OLED display and a motion-tracking system that leveraged an external camera tracking infrared dots on the headset. The net effect was that the headset could detect more kinds of motion, such as leaning, hunching or crouching, which was supposed to help reduce the nausea and motion-sickness symptoms that would frequently occur.
As far as engineering applications go, developers have created an open-source CAD application called FreeCAD VR for use in headsets like the Oculus Rift. AutoCAD can be integrated as well, and SOLIDWORKS has VR support for the Oculus Rift; given the headset’s popularity with developers, the outlook is promising.
3. PlayStation VR (Project Morpheus)
Sony has held an interest in VR since the 1990s, and though the PlayStation VR headset is meant to be sold as a new experience that’s fundamentally rooted in PlayStation games, the technology of the headset is competitive with the HTC Vive and is expected to make an impact when it is released in October 2016.
Video can be displayed differently on the headset and on the television screen at the same time, so headset wearers and onlookers see different views. This ability allows developers to create games in which headset and non-headset users compete against each other or work together.
PlayStation VR works with the standard DualShock 4 controller or the PlayStation Move controllers. Nine positional LEDs are embedded in the surface of the headset, which allows the PlayStation Camera to track 360-degree head movement.
4. FOVE

FOVE is an interesting startup that won nearly $500,000 in Kickstarter funding. Despite delays due to the company’s decision to create its own version of Valve’s Lighthouse positional tracking system and trouble sourcing optical components, it is set to ship its headset in fall 2016.
It is using a technique called foveated rendering to mimic more accurately how human vision naturally focuses on some areas and blurs others. To understand, just try to use your peripheral vision: things outside your central focus aren’t as clear. Our brains reroute processing power to focus and blur depending on our visual intentions. The promise of foveated rendering, along with controlling apps with your eyes, is what initially made FOVE a hit.
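The idea can be sketched as a shading schedule: full detail near the gaze point, progressively coarser shading in the periphery. The function and thresholds below are illustrative only, not FOVE’s actual pipeline:

```python
import math

def shading_rate(px, py, gaze_x, gaze_y, fovea_radius_px=200.0):
    """Return how coarsely a pixel region should be shaded:
    1 = full rate (central focus), 2 = half rate, 4 = quarter rate."""
    dist = math.hypot(px - gaze_x, py - gaze_y)
    if dist <= fovea_radius_px:
        return 1            # inside the fovea: render sharp
    if dist <= 2 * fovea_radius_px:
        return 2            # mid-periphery: half the shading work
    return 4                # far periphery: quarter the shading work

# Pixels near the gaze point get full detail; distant ones are cheaper.
print(shading_rate(100, 0, 0, 0))  # -> 1
print(shading_rate(900, 0, 0, 0))  # -> 4
```

Since the periphery covers far more of the frame than the fovea, even this crude falloff saves a large share of the GPU work, which is the payoff eye tracking makes possible.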
As you can see, there are still a number of fundamental limitations in the user experience and hardware of VR that prevent it from being used more consistently to view and share 3D models and perform training simulations. These flaws, along with the industry’s focus on gaming applications, leave me feeling that despite the faint presence VR has in true engineering applications, its absence is more noteworthy at this point.