Bentley Systems Brings Construction, Engineering Projects to Life with NVIDIA CloudXR

Bentley Systems has sponsored this post.

Research and development of XR technologies for the construction industry may pave the way for more digitization and efficiency. (Image courtesy of Bentley Systems.)

The architecture, engineering, and construction (AEC) industry is steadily transforming thanks to digital innovations. Bentley Systems aims to lead the next phase with enhanced extended reality (XR) technologies powered by NVIDIA that can be used on large projects without requiring a local workstation. 

NVIDIA GTC, a global AI conference held in March, featured a presentation by Greg Demchak, director of the Digital Innovation Lab at Bentley Systems, and William Cannady, technical product manager for extended reality at NVIDIA. They spoke about how Bentley Systems has been using NVIDIA CloudXR to stream an immersive experience for designers, construction workers, and others collaborating on the massive nuclear fusion project, ITER, in France. 

Seamless AR, VR and MR 

While the AEC industry is no stranger to virtual reality (VR), augmented reality (AR), and mixed reality (MR), adoption has been relatively slow. Much of this lag stems from the fact that, historically, tethered XR was the only feasible option: meeting the established motion-to-photon latency threshold of 20 milliseconds required a direct connection to a workstation. For large projects, being tethered to a computer created limitations that hampered XR’s growth in the industry. 

NVIDIA CloudXR allows for AR, VR and MR to seamlessly be used from any location. (Image courtesy of NVIDIA.)

“It is important to remember that an extended reality experience becomes the most immersive when people are not thinking about the technology,” Cannady said. “Being tetherless, not having to be tied down to a computer, is really powerful. Wireless devices have become very popular, with an expanding market, but it comes with a dramatic tradeoff in visual performance.”

Wireless technology, while popular for 2D XR, has been complicated and difficult to implement for 3D XR. A tetherless device is mobile, so it has limited processing power and memory. This means a mobile device will deliver low-fidelity graphics—a deal-breaker for any organization that demands realism. The more realistic the XR graphics, the more accurately you can evaluate the real environment in which you’ll be working. 

The challenge was to find a better way to deliver high performance while offloading rendering from the mobile device: a remote server renders the scene, sends the rendered frames to the device, and the device returns its pose information to the server so the next frame can be rendered from the correct viewpoint. Enter NVIDIA CloudXR.
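The server side of that loop can be sketched conceptually. The snippet below is illustrative only, not the actual CloudXR implementation: a server takes the headset’s most recent pose, renders a frame for exactly that viewpoint, and streams the encoded frame back.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Simplified 6-DoF head pose: position plus orientation."""
    position: tuple      # (x, y, z) in meters
    orientation: tuple   # quaternion (w, x, y, z)

def render_frame(pose: Pose) -> bytes:
    # Stand-in for the GPU render pass: a real server would rasterize
    # the scene from this viewpoint and encode it (e.g. as H.264).
    return f"frame@{pose.position}".encode()

def serve_one_frame(latest_pose: Pose) -> bytes:
    # One iteration of the streaming loop:
    #   1. the client uploads its most recent pose,
    #   2. the server renders for exactly that pose,
    #   3. the encoded frame is sent back for display.
    return render_frame(latest_pose)

frame = serve_one_frame(Pose((0.0, 1.6, 0.0), (1.0, 0.0, 0.0, 0.0)))
```

The key property is that heavy rendering never happens on the headset; the device only uploads poses and displays decoded frames.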

“This technology allows you to stream your remote application,” Cannady said. “It could be hosted locally, on the cloud, at your server data center, or even from your personal laptop. CloudXR works to optimize the local client device, the transmitting network, and the remote application server. The result enables a seamless experience, whether you are a virtual reality professional or using consumer devices.”

CloudXR tolerates network delays greater than 20 milliseconds, making it possible to stream from any location, even if the server is far away and must be reached over the public internet, or in situations that call for radio networks, connecting to the remote server through a 5G antenna.
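A back-of-the-envelope budget shows why tolerating network delay matters. The numbers below are hypothetical, not measured CloudXR figures: the raw end-to-end delay is roughly the sum of capture, render, encode, transit, decode, and display times, and client-side reprojection against the latest pose is what keeps the perceived motion-to-photon delay near the 20 ms target even when the raw sum exceeds it.

```python
# Hypothetical latency budget for one remote-rendered frame (milliseconds).
budget = {
    "pose upload": 2,
    "render": 5,
    "encode": 4,
    "network transit": 15,   # e.g. a 5G or public-internet hop
    "decode": 3,
    "display scan-out": 4,
}

total = sum(budget.values())
print(f"raw end-to-end delay: {total} ms")  # 33 ms, above the 20 ms threshold

# Client-side reprojection/pose prediction can hide most of the network
# round trip, so the *perceived* motion-to-photon delay is far smaller.
perceived = total - budget["network transit"] - budget["pose upload"]
print(f"perceived delay after reprojection: {perceived} ms")
```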

“Being able to offer that flexibility in various network conditions sets apart the level of experiences you can make available to users when working with CloudXR,” Cannady said. 

The result meets the requirements for an end-to-end solution with two major parts: a server that handles communications, and a client library that receives audio and video data from the server, decodes it, displays the video frames, and plays the audio. 

The client SDK works across different XR media. It lets developers easily build custom clients and applications, gives designers greater agility without the need to worry about device capabilities, and allows end users to take any OpenVR device and have an immersive experience throughout the virtual world. 
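The client half of that architecture can be sketched as a simple loop. This is a hypothetical illustration, not the actual CloudXR SDK API: each step receives encoded audio and video from the server, decodes and presents them, and returns the headset’s current pose so the next frame is rendered for it.

```python
def decode_video(packet: bytes) -> str:
    # Stand-in for a hardware video decoder on the headset.
    return packet.decode()

def decode_audio(packet: bytes) -> str:
    # Stand-in for the audio decode path.
    return packet.decode()

class StreamingClient:
    """Illustrative client: decode and present server data, report pose."""

    def __init__(self):
        self.displayed = []  # frames shown so far
        self.played = []     # audio chunks played so far

    def step(self, video_packet: bytes, audio_packet: bytes, current_pose: tuple) -> tuple:
        self.displayed.append(decode_video(video_packet))
        self.played.append(decode_audio(audio_packet))
        # The pose travels back to the server to drive the next render.
        return current_pose

client = StreamingClient()
pose = client.step(b"frame-0", b"audio-0", (0.0, 1.6, 0.0))
```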

Digital Construction Use Case: ITER 

The ITER project is based in France and involves 35 countries collaborating to build a large-scale electricity generation facility based upon the same principle that powers our sun and stars: nuclear fusion. When completed, this facility will seek to prove the feasibility of fusion as a utility-scale, carbon-free source of energy.

Bentley has been working with the ITER project in France to create a digital twin and immersive XR experience to enhance communications and virtually teleport workers to the worksite prior to physically completing the work. (Image courtesy of Bentley Systems.)

The facility requires construction of the world’s largest magnetic fusion device, called a tokamak. This device is being constructed offsite, then will be assembled by one of the largest overhead cranes in the world—one that is load-tested at 360 tons, equivalent to a fully laden 747. Lifting the tokamak into place is a key part of an enormous design and construction project involving complex engineering models and millions of parts.

Along with the massive tokamak come complex geometries, detailed wiring, and installation work, among many other facets. Considering this, ITER wanted to create a digital experience of this project and simulate how to construct the machine and assemble the pieces.

Bentley Systems set out to make that digital experience happen using its own platforms, such as SYNCHRO 4D and iTwin, along with NVIDIA Omniverse, Unreal Engine for the Oculus Quest 2, and Azure Remote Rendering for the Microsoft HoloLens 2. According to Bentley Systems, this “combination allows engineering-grade, millimeter-accurate digital content to be visualized with physically accurate lighting and environmental effects on multiple devices and form factors such as web browsers, workstations, tablets and virtual and augmented reality headsets from anywhere in the world.”

“We have been working with end users on a daily basis getting early, critical feedback,” Demchak said. “Some of the early experiments didn’t look good. We defined things that could break the system and confronted those head-on.”

A key challenge was making the model interactive and immersive inside a game engine, breaking out of 2D projection and into the scene at a high frame rate. The goal became to bring the 3D models into a globally illuminated scene and conduct a multiuser design review at 90 frames per second without reducing the complexity of the model.
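A 90 fps target leaves a hard time budget per frame. The quick calculation below is illustrative; the stage breakdown uses hypothetical numbers, not measured figures from the ITER app.

```python
TARGET_FPS = 90
frame_budget_ms = 1000 / TARGET_FPS
print(f"per-frame budget: {frame_budget_ms:.1f} ms")  # ~11.1 ms

# Hypothetical breakdown of where that time might go in a globally
# illuminated, multiuser scene (illustrative values only).
stages_ms = {
    "geometry streaming": 2.0,
    "lighting": 5.0,
    "rasterization": 3.0,
    "compositing": 1.0,
}
assert sum(stages_ms.values()) <= frame_budget_ms  # the stages must fit
```

Every millisecond spent on one stage is a millisecond unavailable to the others, which is why offloading rendering to a server with a powerful GPU is so valuable.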

Bentley Systems had the first part of the puzzle solved: because the models were already coming into SYNCHRO, they existed as a layer in the iTwin platform, with the metadata built into its core structure. 

“We developed a ground-breaking plugin for Unreal Engine that could connect with iTwin and request and render at run time with algorithms to get polygons and triangles right when we need it,” Demchak said. “That allowed us to get this high frame rate. We can build that app, which can run locally or in a server and stream graphics over Wi-Fi to the Oculus Quest 2, leveraging the fact that we are streaming everything rather than loading it all onto the device itself.”
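Requesting “polygons and triangles right when we need it” is, in spirit, a streaming level-of-detail scheme. The sketch below is a generic illustration, not Bentley’s actual plugin: tiles of the model are fetched at a detail level chosen by distance from the viewer, so the headset (or streaming server) never has to hold the full-resolution model at once.

```python
import math

def choose_lod(viewer_pos, tile_center, max_lod=4):
    """Pick a level of detail for a tile: nearer tiles get more triangles.
    Purely illustrative; production engines use screen-space error metrics."""
    dist = math.dist(viewer_pos, tile_center)
    # Drop one detail level for every doubling of distance past 1 m.
    lod = max_lod - max(0, int(math.log2(max(dist, 1.0))))
    return max(lod, 0)

def request_visible_tiles(viewer_pos, tiles):
    # Only tiles needed for the current view are requested, at the
    # resolution the view actually requires.
    return {tile_id: choose_lod(viewer_pos, center)
            for tile_id, center in tiles.items()}

# Hypothetical tiles: one close to the viewer, one 64 m away.
tiles = {"pipe_rack": (1.0, 0.0, 0.0), "far_crane": (64.0, 0.0, 0.0)}
lods = request_visible_tiles((0.0, 0.0, 0.0), tiles)
```

Because detail is fetched on demand, frame rate stays high even on a model with millions of parts.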

Demchak provided a demo of the XR app. Rather than a traditional video or rendering, the VR experience lets users step inside the 3D model and gain a more human sense of scale and space, leading to a better understanding of the design. Stainless steel materials were applied to the portions that have already been installed, while vivid colors are used to show the work that is yet to be done—all without having to delete or remove geometry. 

“The plan is to progressively keep adding more detail, take this on site and let construction teams look at it and get their feedback,” Demchak said. “Will this help workers if they can have a digital rehearsal before going on site? Get some muscle memory to be more efficient, aware and conscious of what they need to do, or avoid mistakes and issues in the field by allowing people to have that experience beforehand.

“Because it is immersive, it is truly an experience of one-to-one immersion that you don’t get in a game or a CAD software package.”

The innovation team headed by Demchak recently visited the site and field-tested the solution with engineers and the communication team; the feedback has been very positive.

The Future of XR 

The ITER project shows signs of real-world success, and it indicates that XR may become an integral tool in the AEC industry. XR’s growing capabilities may solve long-standing visualization challenges while giving everyone involved a shared understanding of a project. 

For example, AI is becoming a key part of interfacing with XR applications. 

“It makes a lot of sense to use things like natural language understanding to describe the things you want to happen in the scene,” Cannady said. “You won’t have a keyboard, so your voice becomes a natural extension of the interface. Second, when rendering, local devices and headsets are going to become more powerful. At some point, it makes sense to think about what types of renderings should be done on the device and what should be done remotely. Third, take advantage of AI to bring object awareness to your actual environment. When you look at a scene, it is no longer a collection of polygons but has a real object state.”

Applications that take advantage of XR, including the Bentley iTwin platform and CloudXR, will help users visualize and optimize their groundbreaking projects in ways we could not have imagined just a few years ago. This rapidly evolving medium will change how we work and solve critical problems.

To learn more about XR in the AEC industry, visit Bentley Systems.