Real-Time 3D Engines Launch Aerospace into Immersive Virtual Universe

Unity Technologies has sponsored this post.

(Image courtesy of Varjo.)

Back in the ’90s, NASA built virtual reality (VR) systems to train astronauts on spacewalks around the International Space Station (ISS). These systems employed bulky headsets driven by a large computer, which generated the audio and video and tracked the user’s body movements. While the technology gave users some sense of interacting with a virtual world, these early days of VR had clear flaws. The computers struggled to keep up with human movement, and the resulting visual lag often produced motion sickness. The graphics themselves were not realistic; they resembled cartoons more than anything in the real world.

As if all this wasn’t challenging enough, these early systems cost $30,000 to $40,000 to build and required updates and maintenance from expensive NASA engineers. It was a logical progression, then, when in 2018 ISS residents were supplied with fresh gear: $500 Oculus Rift VR headsets powered by $2,000 HP ZBook laptops. With VR systems becoming increasingly democratized, NASA continued its transition toward real-time 3D platforms to help teams collaboratively design space stations, train crew members on critical safety procedures and more. Rather than spending time and money reinventing the virtual wheel, NASA could now take advantage of rapidly evolving commercial extended reality (XR) technology as soon as it came to market.

Unity is one game engine that has been working with aerospace giants like NASA on a variety of use cases. WorkLink, a mixed reality platform built on Unity and developed by Scope AR, is used by Lockheed Martin technicians to enhance efficiency when building the Orion spacecraft for NASA’s Artemis space program. For example, the platform was used for stress monitoring of the spacecraft shell, where mixed reality (MR) goggles provided engineers with digital instructions and diagrams overlaid on equipment. Using these visualization aids, operators were able to align and attach strain gauges and transducers in a process cut from an entire shift to 45 minutes, a time savings of 91 percent.

Using Microsoft HoloLens 2 during the build of the Orion spacecraft. (Image courtesy of Scope AR.)

Lockheed Martin further used mixed reality to reduce torque application processes for threaded fasteners from six weeks to two weeks, while minimizing damage to fastened materials. The MR technology also came in handy for simplifying drilling tasks that required a high degree of accuracy, while simultaneously reducing ramp-up time by 85 percent.

In engineering.com’s latest white paper, Reaching New Heights in Aerospace with Immersive Technologies, we delve into how real-time 3D technology is taking off within all stages of the aerospace industry. The paper provides valuable insights from aerospace experts and explores real-world case studies for organizations ranging from Airbus to the U.S. Navy.

Streamlining Processes with Real-Time 3D Platforms

Real-time 3D platforms such as Unity are increasingly being used in industrial applications and can integrate with other software tools. Unity, for example, enables users to develop virtual environments at scale and deploy them to different devices and platforms without redeveloping the solution separately for each use case.

Top Uses of Real-Time 3D Technology in Aerospace

When it comes to aerospace, real-time 3D technology has enhanced processes across almost every stage of the product lifecycle—from design to operations and marketing.

In design, real-time 3D platforms enhance the visualization of data and help identify potential issues without the need for multiple physical prototypes. Reductions in design cycles can save millions of dollars where large-scale products like aircraft are concerned.

One key application of real-time 3D technology is immersive learning.

“[Virtual training] is engaging, and leads to better learning retention rates than lectures do,” said Josh Swanson, head of extended reality at Pace Aerospace Engineering and IT. The company has developed the Pacelab WEAVR platform to create immersive training programs for a variety of industries.

By operating simulated systems, users are able to exercise motor control and decision-making skills—ultimately accelerating training and improving their performance when they move on to the actual aircraft. Pacelab WEAVR makes training content accessible on a range of devices, from desktop PCs and tablets to AR/VR headsets. The platform supports free play, guided instruction and even assessment.

The Pacelab WEAVR platform provides feedback to users on how well they performed during training. (Image courtesy of Pacelab WEAVR.)

“You can create ‘if-this/then-that’ type of programming as well,” added Swanson. “It’s extensible in the sense that you can bring in other aspects like fire animations without having to create them inside of the tool. You could buy assets from the Unity Asset Store, for instance.”
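The “if-this/then-that” logic Swanson describes is essentially event-driven branching: each step in a procedure waits for a user action and routes to a different next step depending on what the trainee does. The following is purely an illustrative sketch of that idea in Python—the step names and functions here are hypothetical, and Pacelab WEAVR authors these branches through its own visual tooling, not through code like this.

```python
# Illustrative only: a minimal "if this action, then that step" branch walker
# for a training scenario. All names (run_scenario, the step ids) are
# hypothetical; this is not the Pacelab WEAVR API.

def run_scenario(steps, start, get_user_action):
    """Walk a dict of steps, branching on each user action."""
    current = start
    log = []
    while current is not None:
        step = steps[current]
        log.append(current)
        action = get_user_action(step["prompt"])
        # "if this (action) ... then that (next step)"
        current = step["branches"].get(action, step.get("default"))
    return log

steps = {
    "check_panel": {
        "prompt": "Inspect the panel. Is a fault light on?",
        "branches": {"yes": "trigger_fire_drill", "no": "sign_off"},
        "default": "check_panel",  # repeat the step on unrecognized input
    },
    "trigger_fire_drill": {
        "prompt": "Fault detected: run the fire-suppression procedure.",
        "branches": {"done": "sign_off"},
        "default": None,
    },
    "sign_off": {"prompt": "Procedure complete.", "branches": {}, "default": None},
}

# Scripted user actions for demonstration:
answers = iter(["yes", "done", ""])
path = run_scenario(steps, "check_panel", lambda _prompt: next(answers))
print(path)  # ['check_panel', 'trigger_fire_drill', 'sign_off']
```

An external asset—say, a purchased fire animation—would simply be triggered as a side effect of entering a step like `trigger_fire_drill`, which is what makes the approach extensible.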

Despite the complexity of the technical procedures involved, learning to create these sophisticated end-to-end scenarios takes surprisingly little time.

“Our standard training program is one week of training for people who already have some familiarity with the Unity Editor, and two weeks if they don’t,” said Swanson. “We have customers who are instructional designers—not programmers—and they’re now using [Pacelab WEAVR] along with Unity to create VR training.”

Artificial intelligence (AI) and machine learning (ML) solutions take training one step further through the procedural generation of scenarios for simulations. This enables users to prepare for edge cases such as natural disasters without putting lives at risk. Lockheed Martin is leveraging Unity’s platform to develop a solution called BattleViz, which is used for battle simulations.
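The core idea behind procedural scenario generation is sampling environment parameters—including rare failure events—from a seeded random source, so that any generated edge case can be reproduced exactly for review. The sketch below is a generic illustration of that pattern in Python, not BattleViz code; all names and parameter ranges are invented for the example.

```python
# Generic sketch of procedural scenario generation (not Lockheed Martin's
# BattleViz): sample environment parameters, including rare edge cases,
# from a seed so every generated scenario is reproducible.
import random
from dataclasses import dataclass
from typing import Optional

@dataclass
class Scenario:
    seed: int
    weather: str
    visibility_km: float
    failure_event: Optional[str]

WEATHER = ["clear", "rain", "fog", "sandstorm"]
# None entries make failure-free runs more common than rare failures:
FAILURES = [None, None, None, "engine_fire", "hydraulic_loss", "earthquake"]

def generate_scenario(seed: int) -> Scenario:
    rng = random.Random(seed)  # local RNG: reproducible, no global state
    return Scenario(
        seed=seed,
        weather=rng.choice(WEATHER),
        visibility_km=round(rng.uniform(0.2, 10.0), 1),
        failure_event=rng.choice(FAILURES),
    )

batch = [generate_scenario(s) for s in range(5)]
for sc in batch:
    print(sc)

# The same seed always reproduces the same scenario, so an instructor can
# replay any edge case a trainee struggled with:
assert generate_scenario(3) == generate_scenario(3)
```

Reproducibility is the design point: because the scenario is a pure function of its seed, an after-action review can replay exactly what the trainee saw.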

“BattleViz allows users to view different battle scenarios through real-time data feeds or playback, with the option of conducting a review after the fact,” explained Scott Robertson, senior virtual prototyping engineer for Skunk Works at Lockheed Martin.

BattleViz is supported by Unity’s geospatial capabilities, which deliver Round Earth visualization for the creation of high-fidelity real-time 3D training environments. Unity’s AI/ML software is also being employed by the likes of Boeing to generate synthetic data for training computer vision models. Used in conjunction with real-world data, synthetic data increases the accuracy of computer vision models, since real-world datasets alone aren’t diverse enough to capture every possible scenario, lighting condition and occlusion.
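The reason synthetic data covers lighting and occlusion so well is often called domain randomization: because each frame is rendered, every parameter can be varied deliberately and the ground-truth label is known exactly. The sketch below illustrates the concept only—it is plain Python with invented parameter names, not Unity’s synthetic data tooling.

```python
# Conceptual sketch of domain randomization for synthetic training data:
# vary lighting and occlusion far beyond what a real photo set covers.
# Parameter names and ranges are invented for illustration; this is not
# Unity's computer vision API.
import random

def randomize_render_params(rng):
    """Sample one synthetic frame's rendering parameters with its label."""
    return {
        "sun_elevation_deg": rng.uniform(-10, 90),    # dawn glare to noon
        "light_intensity": rng.uniform(0.1, 2.0),
        "occluder_count": rng.randint(0, 5),          # objects blocking view
        "occlusion_fraction": rng.uniform(0.0, 0.6),  # portion of target hidden
        "camera_yaw_deg": rng.uniform(0, 360),
        "label": "aircraft",  # ground truth is exact -- the frame was rendered
    }

rng = random.Random(42)
dataset = [randomize_render_params(rng) for _ in range(1000)]

# The synthetic set deliberately spans conditions a real photo collection
# rarely captures, such as heavily occluded targets:
heavy_occlusion = sum(1 for f in dataset if f["occlusion_fraction"] > 0.5)
print(f"{heavy_occlusion} of {len(dataset)} frames are >50% occluded")
```

In a real pipeline these sampled parameters would drive the renderer, and the perfectly accurate labels are what make the resulting frames useful alongside hand-annotated real-world images.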

Computer vision applications in aerospace. (Image courtesy of Unity.)

Developing Trends of Real-Time 3D Technology

Due to the efficiencies that real-time 3D technologies are creating within workflows, there is a clear shift toward the adoption of these solutions in industry. As the software becomes ubiquitous, many software development companies are looking to partner with real-time 3D platforms like Unity to build virtual environments.

For more information on real-time 3D technologies currently impacting aerospace, check out our white paper: Reaching New Heights in Aerospace with Immersive Technologies.