NVIDIA’s Newest GPU Extends VR Senses

NVIDIA has announced the release of its most advanced GPU for virtual reality (VR) applications, the GeForce GTX 1080.

Although NVIDIA is touting the GTX 1080 as a gaming GPU, with its Pascal architecture and, according to NVIDIA, up to double the VR performance of the previous-generation GTX TITAN X, make no mistake: the GTX 1080 has all the power and infrastructure to be a CAD/CAE workhorse too.

Underpinning the breakthrough performance of the GTX 1080 are a number of new technologies. First is NVIDIA's Pascal architecture, which maximizes the card's performance per watt; NVIDIA claims Pascal is roughly three times more power efficient than the previous Maxwell architecture, making the GTX 1080 cooler, quieter and less expensive to run. The GTX 1080 compounds that efficiency with a cutting-edge 16 nm FinFET fabrication process, which packs smaller, faster transistors onto the chip. All told, the GTX 1080 has 7.2 billion transistors at its core. Finally, the GTX 1080 comes equipped with plenty of memory muscle: 8 GB of GDDR5X RAM in total.
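That GDDR5X memory is worth a quick back-of-the-envelope calculation. Assuming NVIDIA's published figures for the GTX 1080 (a 10 Gbps effective data rate per pin and a 256-bit memory interface, neither of which appears in the article itself), peak memory bandwidth works out as follows:

```python
# Back-of-the-envelope peak memory bandwidth for the GTX 1080's GDDR5X,
# assuming NVIDIA's published specs: 10 Gbps effective data rate per pin
# and a 256-bit memory interface.
data_rate_gbps_per_pin = 10   # effective transfer rate per pin (Gbps)
bus_width_bits = 256          # memory interface width (pins)

# bits per second across the bus, divided by 8 to get bytes
peak_bandwidth_gb_s = data_rate_gbps_per_pin * bus_width_bits / 8
print(f"Peak memory bandwidth: {peak_bandwidth_gb_s:.0f} GB/s")  # 320 GB/s
```

That 320 GB/s figure is what lets the card stream large textures and geometry fast enough for two high-resolution eye views per frame.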

But raw power aside, the GTX 1080 might be an equally suitable GPU for CAD applications now and in the future because NVIDIA has built a robust VR development suite to work in concert with its most advanced GPU.

Named VRWorks, NVIDIA's VR development software first began as a means of addressing performance issues like lag that can quickly mar the delicate tapestry that is a VR experience. Since those early days, NVIDIA has added even greater functionality to the VRWorks suite. Its latest release adds two new image-rendering methods to the VR developer's toolkit, ensuring that complex virtual geometries and environments can be rendered efficiently and without experiential interference.
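The lag that VRWorks targets is easy to quantify. As a rough sketch, assuming the 90 Hz refresh rate of typical 2016-era VR headsets (an assumption; the article names no specific headset), the renderer gets only about 11 ms to draw both eye views before a frame is late:

```python
# Illustrative VR frame-time budget, assuming a 90 Hz headset refresh
# rate (an assumption; the article does not name a specific headset).
refresh_rate_hz = 90
frame_budget_ms = 1000 / refresh_rate_hz  # milliseconds available per frame
print(f"Frame budget: {frame_budget_ms:.1f} ms")  # 11.1 ms
```

Missing that budget means dropped or repeated frames, which is exactly the kind of "experiential interference" more efficient rendering methods are meant to avoid.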

Beyond graphics, NVIDIA is also adding greater support for the other senses that help complete one's experience of reality. To that end, the graphics giant has developed a path-traced audio suite named VRWorks Audio to tune a VR audio experience into something that sounds more authentic. With VRWorks Audio, engineers can develop sonic environments that react to materials, space and other properties that make sound muffled, tinny or shifted. With the ability to tailor the acoustics of an environment, additional layers of carefully crafted simulacra can be added to a scene, making it all the more real.
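To make the idea concrete, here is a toy model in the spirit of path-traced audio: sound intensity falls off with the square of distance, and each surface a sound path bounces off absorbs some energy depending on its material. This is an illustrative sketch only, not the VRWorks Audio API, and the absorption coefficients are invented for the example:

```python
# Toy model of how an environment shapes perceived sound: inverse-square
# distance falloff plus per-material absorption at each bounce.
# Not the VRWorks Audio API; the coefficients below are made up.
ABSORPTION = {"concrete": 0.02, "carpet": 0.60, "glass": 0.05}

def perceived_intensity(source_intensity, distance_m, surfaces):
    """Attenuate a sound by distance and by each surface it reflects off."""
    intensity = source_intensity / (distance_m ** 2)  # inverse-square law
    for material in surfaces:
        intensity *= (1.0 - ABSORPTION[material])     # energy lost per bounce
    return intensity

# The same sound 2 m away sounds much brighter after a concrete bounce
# than after a carpet bounce.
print(perceived_intensity(1.0, 2.0, ["concrete"]))  # 0.245
print(perceived_intensity(1.0, 2.0, ["carpet"]))    # 0.1
```

A real path-traced solution traces many such reflection paths per source and sums them, but the material-dependent attenuation is the core effect that makes a virtual room sound like concrete rather than carpet.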

Finally, because accurate physics and touch interactions are crucial to maintaining the illusion of VR, NVIDIA also offers its PhysX physics engine. With PhysX, developers can create realistic haptic experiences that make objects and interactions in a virtual world mimic reality with even more crispness.
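At its core, the kind of work a physics engine does each frame is integration plus collision response. The following minimal sketch, which is not the PhysX API, steps a bouncing ball forward in time with semi-implicit Euler integration:

```python
# Minimal rigid-body sketch of a physics-engine time step: semi-implicit
# Euler integration of a ball bouncing on a floor at height 0.
# Illustrative only; this is not the PhysX API.
GRAVITY = -9.81      # gravitational acceleration (m/s^2)
RESTITUTION = 0.5    # fraction of speed kept after each bounce
DT = 0.01            # simulation time step (s)

def step(height, velocity):
    """Advance the ball one time step, bouncing it off the floor."""
    velocity += GRAVITY * DT           # integrate acceleration first...
    height += velocity * DT            # ...then position (semi-implicit Euler)
    if height < 0.0:                   # collision with the floor
        height = 0.0
        velocity = -velocity * RESTITUTION
    return height, velocity

h, v = 1.0, 0.0                        # drop the ball from 1 m at rest
for _ in range(1000):                  # simulate 10 seconds
    h, v = step(h, v)
print(f"Ball has settled near the floor: h = {h:.3f} m")
```

An engine like PhysX runs this kind of loop over thousands of bodies with far more sophisticated solvers, but the step-integrate-resolve structure is the same, and keeping it inside the VR frame budget is what makes virtual objects feel solid.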

While NVIDIA’s latest tools are obviously primed for the gaming market, I don’t think it’ll be too long before engineers begin using these extended sensory modules to make even more immersive VR experiences. With these modules, engineers and design teams might be able to analyze not only a product’s design, but how it might be used by its target audience in the real world.

Taking that idea a little bit further, it's not too hard to imagine how robust virtual models and environments might shift prototyping to a completely virtual experience. In the end, truly convincing virtual spaces might ultimately render many physical products obsolete, moving them from the real to the virtual without losing any of the satisfaction that a user derives from interacting with the physical object. If that reality manifests itself, then it might be time to question what matters most, the physical or the digital.