The Omniverse is Expanding: The Latest From NVIDIA GTC

(Source: NVIDIA.)

NVIDIA’s Fall GTC event kicked off this week, and today the company’s leather-clad CEO Jensen Huang delivered a jam-packed keynote with all the visual panache you’d expect from the graphics giant. NVIDIA remains the brand to beat in professional graphics hardware, with its latest Ampere-based RTX cards topping the lineup for engineering workstations. But graphics is only half the story for modern-day NVIDIA. What you’ll probably hear about most at GTC this week is artificial intelligence.

For the past few years, NVIDIA has been establishing itself as an AI company. And why not? You can’t find a single tech company these days that doesn’t have those two magical letters stamped indiscriminately across their websites. NVIDIA, at least, has some real AI cred, designing its RTX cards as much for deep learning as for rendering computer graphics, if not more.

Bridging both sides of the business is NVIDIA’s new darling, NVIDIA Omniverse, a broad and scalable software platform that ambitiously wants to be the environment that unifies all your 3D data. Omniverse is built on Universal Scene Description (USD), an open-source framework that NVIDIA often likens to the “HTML of 3D.” The idea is to be able to bring in any 3D asset, from any program, and render, simulate, and collaborate with that data in real time. Think of a game engine and then think bigger.
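The “HTML of 3D” analogy holds up because USD, like HTML, has a human-readable text form. Here’s a minimal scene in USD’s .usda format (a hedged sketch; the prim names “World” and “Ball” are my own, not from NVIDIA):

```usda
#usda 1.0
(
    defaultPrim = "World"
    upAxis = "Y"
)

def Xform "World"
{
    def Sphere "Ball"
    {
        double radius = 2
        color3f[] primvars:displayColor = [(0.1, 0.4, 0.8)]
    }
}
```

Any USD-aware application can open, layer, or reference a file like this, which is what lets Omniverse treat assets from different tools as one composable scene.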

“Omniverse is very different than a game engine,” Huang emphasized. “Omniverse is designed to be datacenter scale, and hopefully someday, planet scale.”

What’s New With NVIDIA Omniverse

NVIDIA’s Omniverse is expanding faster than the speed of simulated light, with the company today dropping several new features, updates, and case studies. One new feature that caught my eye was Omniverse VR, the forthcoming ability to create virtual reality apps in Omniverse with—wait for it—real-time ray tracing. NVIDIA has had real-time ray tracing capabilities since 2018, but as far as I know this will be the first time real-time ray tracing is available in VR.

Also new to the platform is Omniverse Avatar, which uses AI to generate interactive 3D avatars for use in customer service and other applications. Here’s what Huang himself looks like as an Omniverse Avatar:

An Omniverse Avatar of NVIDIA CEO Jensen Huang. (Source: NVIDIA.)

That cute little guy is pretty excited about his potential. “The dawn of intelligent virtual assistants has arrived,” Huang proclaimed. “Omniverse Avatar combines NVIDIA’s foundational graphics, simulation and AI technologies to make some of the most complex real-time applications ever created. The use cases of collaborative robots and virtual assistants are incredible and far reaching.”

The fun doesn’t stop there. NVIDIA has also launched Omniverse Replicator, an engine for generating synthetic data to train deep neural networks. Real-world data is hard to come by, especially for infrequent or dangerous edge cases, so the ability to pull synthetic data from thin air vastly opens up the range of scenarios for which a neural network can be trained.
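Replicator’s actual API isn’t shown here, but the core idea behind synthetic data generation is domain randomization: vary scene parameters at random, render each variation, and keep the ground-truth labels the simulator already knows. A minimal sketch in plain Python (all parameter names are hypothetical, chosen for illustration):

```python
import random

def randomize_scene_params(rng: random.Random) -> dict:
    """Sample one randomized scene configuration (hypothetical parameters)."""
    return {
        "light_intensity": rng.uniform(200.0, 2000.0),  # lux
        "object_yaw_deg":  rng.uniform(0.0, 360.0),
        "camera_height_m": rng.uniform(0.5, 3.0),
        "fog_density":     rng.uniform(0.0, 0.2),
    }

def generate_dataset(n_samples: int, seed: int = 0) -> list:
    """Produce n labeled samples; a real pipeline would render an image per sample."""
    rng = random.Random(seed)
    dataset = []
    for i in range(n_samples):
        params = randomize_scene_params(rng)
        # Ground-truth labels come for free in simulation: we placed the object.
        label = {"object_present": True, "yaw_deg": params["object_yaw_deg"]}
        dataset.append({"id": i, "scene": params, "label": label})
    return dataset

samples = generate_dataset(1000)
print(len(samples))  # 1000 randomized, auto-labeled samples
```

The perfect, free labels are the whole appeal: in the real world, annotating a thousand images is expensive; in simulation, the labels fall out of the scene description itself.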

“Synthetic data is essential for the future of AI,” noted Rev Lebaredian, vice president of simulation technology and Omniverse engineering at NVIDIA. “Omniverse Replicator allows us to create diverse, massive, accurate datasets to build high-quality, high-performing and safe AI. While we have built two domain-specific data-generation engines ourselves, we can imagine many companies building their own with Omniverse Replicator.”

The two domain-specific engines Lebaredian mentioned are for NVIDIA DRIVE Sim, a virtual world for training autonomous vehicles, and for NVIDIA Isaac Sim, a virtual world for training industrial robots. For the former, Lebaredian gave the evocative example of simulating a volcanic eruption to train an autonomous vehicle how to cope with that scenario. Unless you live in Iceland or Pompeii, it’s not the kind of data that’s easy to gather in the field.

Synthetic data created in Omniverse Replicator for NVIDIA Isaac Sim (left) and NVIDIA DRIVE Sim (right). (Source: NVIDIA.)

NVIDIA announced several other new features for Omniverse, including Omniverse XR Remote for ray-traced augmented reality in iOS and Android; Omniverse Farm, which harnesses multiple workstations or servers for big jobs; Omniverse Showroom, an app for non-technical users to play with the platform; and NVIDIA CloudXR, an addition to Omniverse Kit that lets users stream Omniverse to mobile AR/VR devices.

Finally, Omniverse has expanded the range of applications it supports with six new connectors. The platform now supports Esri ArcGIS CityEngine, PTC Onshape, Reallusion iClone, Replica AI Voice, Radical AI Pose Estimation, and Lightmap HDR Light Studio. An additional 15 applications have announced support for the USD framework with plans to build Omniverse connectors, according to NVIDIA.

Omniverse Enterprise for Digital Twins

Omniverse Enterprise is now officially out of beta after almost a year of testing by several enterprises. One of them was BMW Group, which played a starring role at the Spring GTC event, where the automaker demonstrated an aspirational vision of a digital factory designed and simulated in Omniverse. This week, NVIDIA is showing off two other impressive demos of Omniverse from partners Ericsson and Lockheed Martin.

Telecommunications company Ericsson is using Omniverse for the trivial task of simulating full-scale digital twins of cities and their cell towers. With this intuitive visualization, Ericsson hopes to optimize the placement of 5G cells for maximum coverage. The digital twins incorporate features of the municipal environment such as buildings, trees, and other obstacles that block line-of-sight to 5G antennas.

“Before Omniverse, coverage and capacity of networks was analyzed by simplifying many aspects of the complex interactions, such as the physical phenomena and mobility aspects,” said Germán Ceballos, a researcher at Ericsson. “Now we’ll be able to simulate network deployments and features in a highly detailed scale using Omniverse.”
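Ericsson’s models are far richer than any back-of-the-envelope math, but the baseline physics of coverage can be illustrated with the standard free-space path-loss formula, which grows with both distance and frequency (one reason dense 5G deployments need careful cell placement):

```python
import math

def free_space_path_loss_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    c = 299_792_458.0  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

# A mid-band 5G cell (3.5 GHz) at 1 km, ignoring every obstruction:
loss = free_space_path_loss_db(1000, 3.5e9)
print(f"{loss:.1f} dB")  # roughly 103 dB
```

This is the idealized case; the buildings and trees the article mentions add further losses on top, which is exactly what a city-scale digital twin lets Ericsson account for.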

Ericsson is using NVIDIA Omniverse for city-scale digital twin simulation of 5G signals. (Source: NVIDIA.)

As cool as Ericsson’s 5G project is, Lockheed Martin’s use of Omniverse may have it beat. The aerospace company has partnered with NVIDIA, the U.S. Department of Agriculture Forest Service, and the Colorado Division of Fire Prevention & Control (DFPC) to simulate wildfires in Omniverse. The organizations will create digital twins of wildfires and use AI to better understand their behavior and how to control them.

“In Omniverse, you’ve got a virtual world that represents a photorealistic digital environment. In it, we can visualize the fires and aspects that affect them, like the terrain, slope, wind and more,” explained Shashi Bhushan, Principal AI Architect at Lockheed Martin. “The combination of Lockheed Martin and NVIDIA technology has the potential to help crews respond more quickly and effectively to wildfires while reducing risk to fire crews and residents.”
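Lockheed Martin’s physics-informed models are far more sophisticated, but the basic notion of simulating fire spread over a landscape can be sketched as a toy cellular automaton, where burning cells probabilistically ignite their neighbors (all parameters here are illustrative, not from Lockheed Martin or NVIDIA):

```python
import random

UNBURNED, BURNING, BURNED = 0, 1, 2

def step(grid, p_spread, rng):
    """Advance the fire one time step: burning cells may ignite 4-neighbors."""
    n = len(grid)
    new = [row[:] for row in grid]
    for r in range(n):
        for c in range(n):
            if grid[r][c] == BURNING:
                new[r][c] = BURNED
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < n and 0 <= cc < n and grid[rr][cc] == UNBURNED:
                        if rng.random() < p_spread:
                            new[rr][cc] = BURNING
    return new

rng = random.Random(42)
n = 20
grid = [[UNBURNED] * n for _ in range(n)]
grid[n // 2][n // 2] = BURNING  # ignition point
for _ in range(30):
    grid = step(grid, p_spread=0.4, rng=rng)
burned = sum(cell == BURNED for row in grid for cell in row)
print(burned)  # cells consumed after 30 steps
```

A real digital twin would replace the uniform spread probability with terrain slope, fuel type, and wind, which is precisely the kind of data Bhushan describes layering into the Omniverse environment.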

A digital twin of a wildfire simulated in NVIDIA Omniverse. (Source: NVIDIA.)

Omniverse Enterprise subscriptions are available through six of NVIDIA’s hardware manufacturing partners: BOXX, Dell, HP, Lenovo, PNY, and Supermicro. Subscriptions start at $9,000 per year.

NVIDIA GTC is ongoing virtually until November 11, 2021, and is free to attend. If you feel like being absolutely bashed over the head with AI presentations, check it out. That’s where we’ll be—at least until we can send an AI avatar instead.