The future of immersive virtual reality gaming is in Macau, where cutting-edge gaming centers offer players the most realistic VR experience yet. At Zero Latency Macau, a 2,150-square-foot warehouse-style VR arena lets up to eight players join first-person shooter games, each wearing a VR headset and earphones, carrying a military-grade backpack and equipped with a gun controller.
The controller switches between weapon types at the press of a button, with automatic minigun and sniper rifle options as well as beam and rail gun selections. Players can move anywhere on the gaming floor and even ascend from one level to another as they fight zombies, robots and drones or cooperate on team challenges. Although the playing floor is flat, the VR display makes it feel as if you're walking uphill or downhill, or even standing upside down.
Virtual experiences such as this depend on technological innovations that make realistic VR immersion possible. Here’s a look at some of the technology that lies behind virtual reality gaming.
At the heart of virtual reality technology is the 3D graphics display technique used by VR headsets such as Samsung Gear VR, Sony PlayStation VR, HTC Vive and Oculus Rift. VR headsets receive a video feed sent from a computer, console or smartphone. Feeds from a computer or console are tethered to their source by a cable, while smartphone feeds use wireless mobile platforms. Today's leading smartphone platforms, such as the Snapdragon mobile platform, are designed to support VR. Snapdragon not only enables a more immersive VR experience but also reduces power consumption for longer-lasting battery life.
To achieve the illusion of 3D, the video feed sent to a VR headset is split into two images, shown either side by side on a single display or on a separate display for each eye. Lenses placed between the display and the user's eyes focus and angle each image for the eye viewing it. This creates a sense of depth by mimicking the way each eye normally sees physical objects from a slightly different angle. High-end headsets typically offer a 100- or 110-degree field of view, which is sufficient to create the illusion of 360-degree immersion.
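The per-eye offset described above can be sketched in a few lines. This is a minimal illustration, not any headset's actual rendering code, and the 64 mm interpupillary distance is just a common default assumption:

```python
import numpy as np

def eye_view_matrices(head_view, ipd_m=0.064):
    """Return (left, right) 4x4 view matrices for the two eyes, offset
    by half the interpupillary distance (IPD) along the head's x axis."""
    left, right = head_view.copy(), head_view.copy()
    # Shifting the view translation moves each virtual camera sideways,
    # so each eye renders the scene from a slightly different angle.
    left[0, 3] += ipd_m / 2.0
    right[0, 3] -= ipd_m / 2.0
    return left, right

left, right = eye_view_matrices(np.eye(4))
```

Rendering the scene once with each matrix produces the two slightly different images whose disparity the brain fuses into depth.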
To maintain a sense of 3D when your eyes, head or body move, virtual reality gaming also depends on movement-tracking technology. To sync your head movements with changes to the image you're viewing, VR tracks six degrees of freedom (6DoF): translation along the X, Y and Z axes, representing movements of your head forward and backward, left and right, and up and down, plus rotation around each axis. Tracking your head's movement through these coordinates is done through means such as gyroscopes, accelerometers and magnetometers, and by monitoring reference LED lights on the VR headset with external cameras.
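A classic way those sensors are combined is a complementary filter: the gyroscope is trusted for fast, short-term motion, while the accelerometer's gravity reading slowly corrects the drift the gyroscope accumulates. Here is a simplified one-axis sketch (the 0.98 blend factor is illustrative):

```python
def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """One step of a complementary filter (angles in degrees):
    integrate the gyroscope rate for fast response, and blend in the
    accelerometer's gravity-based tilt estimate to cancel gyro drift."""
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# With the head held still (true tilt 0 degrees) but a stale estimate of
# 10 degrees, the accelerometer term pulls the estimate back toward 0.
angle = 10.0
for _ in range(100):
    angle = complementary_filter(angle, gyro_rate=0.0, accel_angle=0.0, dt=0.01)
```

Real headsets fuse all three rotation axes (and often the magnetometer) with more sophisticated filters, but the drift-correction idea is the same.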
Some VR systems also track the movement of your hands or the movement of your body through space. This can be done through means such as placing sensors on headsets and controllers and setting up base stations around the room to bounce lasers off the sensors, allowing movements to be calculated based on the timing of the signals.
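The timing-based calculation can be sketched as follows. Assume a base station emits a sync flash and then sweeps a laser at a known rotation rate: the delay before the laser hits a sensor gives an angle, and angles from two stations pin down the sensor's position. The 60 Hz rotor speed and the 2D geometry are simplifying assumptions, not the specification of any real system:

```python
import math
import numpy as np

SWEEP_PERIOD_S = 1.0 / 60.0  # hypothetical 60 Hz spinning-laser rotor

def sweep_angle(t_hit_s, t_sync_s, period_s=SWEEP_PERIOD_S):
    """Convert the delay between the sync flash and the laser sweeping
    past a sensor into the rotor angle (radians) at the moment of the hit."""
    return 2.0 * math.pi * ((t_hit_s - t_sync_s) % period_s) / period_s

def triangulate_2d(station1, angle1, station2, angle2):
    """Intersect the two rays (one per base station) pointing at the
    sensor; their crossing point is the sensor's position in the plane."""
    d1 = np.array([math.cos(angle1), math.sin(angle1)])
    d2 = np.array([math.cos(angle2), math.sin(angle2)])
    a = np.column_stack([d1, -d2])
    t = np.linalg.solve(a, np.asarray(station2, float) - np.asarray(station1, float))
    return np.asarray(station1, float) + t[0] * d1
```

With many sensors on a headset or controller, the same idea recovers full position and orientation rather than a single point.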
The current frontier of movement tracking in VR is eye movement tracking. While not yet integrated into most VR systems, some systems use an infrared monitor to detect where your eye is looking in virtual reality space. This allows VR displays to adjust the image you’re seeing so the area you’re looking at is more focused and the area you’re not focused on is more blurred, similar to the way the eye sees physical objects, in contrast to digital displays where everything is in sharp focus.
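This gaze-dependent sharpening, often called foveated rendering, can be sketched as a function that maps a pixel's angular distance from the gaze point to a detail level. The 5-degree and 20-degree radii here are illustrative choices, not figures from any real headset:

```python
import math

def foveation_level(pixel_deg, gaze_deg, inner_deg=5.0, outer_deg=20.0):
    """Map a pixel's angular distance from the gaze point to a render
    detail level: 1.0 = full resolution inside the foveal region,
    falling linearly to 0.0 (heavy blur) in the periphery."""
    dist = math.hypot(pixel_deg[0] - gaze_deg[0], pixel_deg[1] - gaze_deg[1])
    if dist <= inner_deg:
        return 1.0
    if dist >= outer_deg:
        return 0.0
    return 1.0 - (dist - inner_deg) / (outer_deg - inner_deg)
```

Because only the small foveal region needs full resolution, this technique also saves significant GPU work, which is one reason eye tracking is attractive for VR.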
For greater realism, VR displays are designed to support integration of visual 3D with audio immersion. This is typically done using either binaural or 3D audio sound systems. Binaural sound systems are an advance on stereo systems, which use a pair of left and right speakers to create a sense of sonic dimensionality. Binaural recordings are typically made with two special microphones set in ear-shaped molds and mounted on a dummy head, which stands in for a future listener's head. This allows the recording to register where the sounds are in relation to the head. The sounds are then played back to a real listener through a pair of speakers or headphones, replicating how they sounded in their original 3D environment.
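One of the cues a binaural recording captures is the interaural time difference: a sound from the side reaches the far ear slightly later than the near ear. Woodworth's classic model estimates that delay from the source's azimuth angle; the head radius below is a rough average, not a measured value:

```python
import math

SPEED_OF_SOUND_M_S = 343.0
HEAD_RADIUS_M = 0.0875  # roughly an average adult head

def itd_seconds(azimuth_rad):
    """Woodworth's interaural time difference (ITD) model: for a distant
    source at azimuth theta, the far ear hears the sound
    r * (theta + sin(theta)) / c seconds after the near ear."""
    return HEAD_RADIUS_M * (azimuth_rad + math.sin(azimuth_rad)) / SPEED_OF_SOUND_M_S
```

Even though the delay for a source directly to the side is well under a millisecond, the brain uses it, along with level differences and the ear's filtering, to localize the sound.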
3D audio sound systems are a newer approach to audio immersion designed to integrate with VR displays. In a 3D audio system, the VR game designer places sounds at locations within the VR visual environment, and what the user hears shifts as they turn toward or away from those locations. This can be done in conjunction with binaural recording for greater realism.
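At its simplest, positioning a sound in the scene means computing a volume and a left/right balance from the source's location relative to the listener. This toy panner works in the horizontal plane and is only a sketch of the idea; real 3D audio engines use far richer models:

```python
import numpy as np

def positional_gains(source, listener, forward):
    """Toy 3D-audio panner in the horizontal plane: volume falls off
    with distance, and the left/right balance follows the angle between
    the listener's facing direction and the source."""
    to_src = np.asarray(source, float) - np.asarray(listener, float)
    dist = np.linalg.norm(to_src)
    direction = to_src / dist
    right = np.array([forward[1], -forward[0]], float)  # listener's right
    pan = float(direction @ right)        # -1 = hard left, +1 = hard right
    gain = 1.0 / max(dist, 1.0)           # inverse-distance rolloff
    return gain * (1.0 - pan) / 2.0, gain * (1.0 + pan) / 2.0
```

Feeding the head-tracking orientation into `forward` each frame is what makes a sound appear to stay put in the virtual room as the player turns.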
Another emerging dimension of virtual reality gaming is haptic feedback, which simulates the sense of touch. Haptic feedback creates a sense of touch by using means such as special gloves or suits fitted with pneumatic systems to create pressure against the skin.
For instance, HaptX has developed gloves fitted with a system of air channels and air sacs that can inflate or deflate against the skin, simulating the pressure you would feel if you were touching something. In demonstrations, users have felt the sensation of a virtual spider crawling over their palm. Disney is currently developing a haptic jacket, based on a modified life jacket, that can inflate or deflate to replicate sensations such as being struck by a snowball.
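The control problem behind such a sac is simple in outline: convert the contact force computed by the game's physics into a target air pressure (pressure = force / area), clamped to what the actuator can safely deliver. The sac area and pressure limit below are illustrative assumptions, not HaptX or Disney specifications:

```python
def sac_pressure_kpa(contact_force_n, sac_area_m2=2.0e-4, max_kpa=30.0):
    """Convert a simulated contact force (newtons) into a target air-sac
    pressure via P = F / A, clamped to a safe actuator maximum.
    The 2 cm^2 sac area and 30 kPa ceiling are made-up example values."""
    pressure_kpa = (contact_force_n / sac_area_m2) / 1000.0  # Pa -> kPa
    return min(pressure_kpa, max_kpa)
```

A glove with dozens of independently driven sacs runs this kind of mapping per sac, many times per second, to paint a pressure pattern that the skin reads as touch.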
3D displays form the foundation for immersive gaming by creating a sense of visual depth. Movement tracking coordinates the visual VR world with movements of the head, hands, body and eyes in the physical world. Audio immersion integrates realistic sound with VR. Haptic technology adds a sense of touch to the VR experience. As these techniques for simulating real-life experience continue to advance, the gaming experience will become progressively more realistic, more engaging and more entertaining.