VR in CAD: Where Are We Now?

Virtual reality (VR) has come a long way since the offerings of the late 1980s and 1990s.

Gone are the nausea-inducing, low-resolution outings such as Virtuality’s “Dactyl Nightmare” and the monochrome Virtual Boy from Nintendo, and in their place, we all have access to high-resolution VR capabilities in our own pockets.

Yes, assuming you own a smartphone with gyroscope capability (which is most smartphones), you can slide your device into a $15 Google Cardboard headset and have access to VR worlds that wouldn’t have been conceivable 20 years ago.

Figure 1. Though she appears to be laughing, she has crippling motion sickness.

This has all been driven by shrinking processors and sensors and by the increased resolution of modern device screens. No longer do you require a bulky Commodore Amiga 3000 to run your 756x244-pixel first-person shooter. Instead, you can experience full HD glory anywhere you like.

So, with the increase in accessibility of VR systems, we also see an increase in applications—not just in gaming, but in engineering circles.

For this article, we spoke to five engineering CAD companies about how they have adopted AR/VR to give engineers a new dimension in technical communication, design and presentation.

SOLIDWORKS

We have covered quite a lot about SOLIDWORKS Visualize Professional over on our sister site EngineersRule. The latest release of Visualize Professional features AI denoising and even a vehicle physics simulator, allowing not only photorealistic rendering but also rapid, noise-free rendering to boot.

And naturally, such high quality rendering would befit some form of VR capability…and it does.

Figure 2. Sun study in SOLIDWORKS Visualize. (Image courtesy of Brian Hillner.)

“Dassault Systèmes SOLIDWORKS has a 20+-year legacy of providing only the best CAD tools to our customers and as such we are pioneering development of bleeding edge technologies,” said Brian Hillner, SOLIDWORKS product portfolio manager.

“Virtual reality has proved to be much more than a gimmick, being used by aerospace, automotive, product design and even medical industries throughout design development and even sales. Using a new 360-VR camera, SOLIDWORKS Visualize Professional can create jaw-dropping photo-accurate images and animations, even supported for simple playback with any smartphone and a $15 Google Cardboard.”

And if you combine this new 360-VR camera with the AI denoiser, then what you have is the ability to render ultra-high-definition, 360-degree content at up to 10x the pace possible without the denoising magic.

“This allows our customers to develop products faster than ever before, all while reducing physical prototyping costs and expensive manufacturing errors. This results in dramatically shorter development cycles and an overall better final design.”

Figure 3. Rendered and ready for VR. (Image courtesy of the author.)

You can see one of the animations in the video below. Just put your VR headset on, be it a GearVR, Cardboard, or whatever you own, and start it up. And if you don’t yet have a headset, you can still play the video: just play it in your browser as you would any other video and drag your mouse cursor around the playback area to look around.

And if you’d like to know how to create these VR experiences for yourself in SOLIDWORKS Visualize Professional, you can check out our tutorial over at this link.

Onshape

Cloud-based CAD platform Onshape has also been getting involved with VR/AR, recently teaming up with AR headset company Magic Leap to help bring CAD to the quasi-holographic realm. The application has been described as a “spatial computing CAD app,” and it is good not only for viewing CAD designs but also for editing them, which is a considerable step beyond what most other companies are offering at the moment. While other CAD companies are focused on bringing high-end visualization and even a fair amount of interactivity to their VR offerings, those products are still pretty much restricted to viewing VR content rather than creating it in VR.

Onshape for Magic Leap was unveiled at the L.E.A.P Conference in October 2018, where Onshape CEO Jon Hirschtick gave a preview of the system to attendees.

The collaboration between Magic Leap and Onshape will enable the manipulation of parts in a more natural way, with gesture recognition allowing sketching, resizing and various other CAD functions that we are used to performing with a mouse in two dimensions.

Figure 4. Onshape in Magic Leap. (Image courtesy of Onshape.)

“There is something unique here,” said Joe Dunne, Developer Relations at Onshape.

“The full Onshape app is running directly on the Magic Leap device. This is not a new viewer application or something like that. It’s full CAD running on the device. Because of Onshape’s cloud architecture, supporting Magic Leap is no different for us than supporting iPad, or iPhone, or Android.”

And because it is cloud-based and runs in the browser, it has cross-platform functionality too.

“As an example, if three users are all working on the same project, one can be using a browser, another can be on a train using their mobile device, and a third can be in a different office wearing the Magic Leap headset, all working on the exact same model concurrently,” said Dunne.

“Magic Leap is a cloud device, so Onshape supports it. No other CAD company will be able to do this (that I am aware of, anyway).

“I think it's safe to say other devices can also be supported by Onshape in a similar way. We are going to let customer demand dictate how we allocate our development resources.”
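Dunne’s point about the cloud architecture is easy to see from a developer’s perspective: every client, whether a browser, a phone or a headset, talks to the same cloud-hosted model through Onshape’s REST API. Below is a minimal Python sketch that lists the documents visible to an account. It is an illustration only: the /documents endpoint reflects Onshape’s public REST API as we understand it, but the use of API keys over HTTP Basic auth (rather than Onshape’s OAuth2 or signed-request flows) and the placeholder keys are assumptions.

```python
# Minimal sketch: list Onshape documents over the public REST API.
# Assumptions: API keys are accepted via HTTP Basic auth (Onshape also
# offers OAuth2 and HMAC-signed requests); the keys below are placeholders.
import requests

ACCESS_KEY = "your-access-key"   # placeholder
SECRET_KEY = "your-secret-key"   # placeholder
BASE_URL = "https://cad.onshape.com/api"

def list_documents():
    # The same cloud endpoint serves the browser, mobile and Magic Leap
    # clients, which is why one model can be edited concurrently from all
    # of them.
    resp = requests.get(
        f"{BASE_URL}/documents",
        auth=(ACCESS_KEY, SECRET_KEY),
        headers={"Accept": "application/json"},
    )
    resp.raise_for_status()
    for doc in resp.json().get("items", []):
        print(doc.get("id"), doc.get("name"))

if __name__ == "__main__":
    list_documents()
```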

You can see the presentation from Onshape CEO Jon Hirschtick at the L.E.A.P Conference in the video below (his talk begins at around the 2-hour-40-minute mark).

Unreal Engine

We have been taking a look at the real-time rendering video game engine Unreal Engine 4 (UE4) from Epic Games quite a lot recently. And we have also explored the company’s new Unreal Studio offering in a series of articles aimed at showing you, the engineer, how to bring your designs into hyper-realistic life within Unreal Engine.

In case you missed those articles, here is a recap on Unreal Studio.

Unreal Studio is a collection of tools designed to help engineers and architects import technical CAD files into Unreal Engine, which has traditionally been geared toward less technical 3D formats from tools such as Blender, Rhino or Maya. The problem that industrial users of Unreal Engine found was that converting technical CAD data into a format that UE4 could use was a bit of a chore. Technical CAD files, especially those used for manufacturing, can take up a lot of file space due to their accurate curved geometry.

Luckily, Unreal Studio makes the importing workflow easier, so now engineers can import CAD data designed for the shop floor into UE4 with minimum fuss while retaining all the features of the original models.
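To give a flavor of that workflow, Unreal Editor can be scripted in Python once the Python Editor Script Plugin is enabled, and the Datasmith importer that ships with Unreal Studio exposes a scene-import API. The sketch below follows that API as documented for UE4; the file path and destination content folder are placeholders, and exact class and property names may differ between engine versions.

```python
# Minimal sketch: import a Datasmith/CAD export into a UE4 project.
# Run inside the Unreal Editor Python console (requires the Python Editor
# Script Plugin and the Datasmith importer from Unreal Studio).
# Paths are placeholders; property names follow the UE4 Python docs and
# may vary by engine version.
import unreal

UDATASMITH_FILE = "C:/CAD_Exports/gearbox.udatasmith"   # placeholder path
DESTINATION_FOLDER = "/Game/ImportedCAD/Gearbox"        # placeholder content path

# Parse the .udatasmith file into an in-memory scene description.
scene = unreal.DatasmithSceneElement.construct_datasmith_scene_from_file(UDATASMITH_FILE)
if scene is None:
    raise RuntimeError("Could not open the Datasmith file")

# Import the scene (geometry, materials, hierarchy) into the project.
result = scene.import_scene(DESTINATION_FOLDER)
if not result.import_succeed:
    raise RuntimeError("Datasmith import failed")

print(f"Imported {len(result.imported_actors)} actors into {DESTINATION_FOLDER}")
```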

And given that it’s a video game engine, it’s not surprising that UE4 has some VR capability. Scratch that…it has tons of VR and AR capability, and a whole plethora of tools to help port to your chosen VR or AR platform.

Figure 5. One of these cars is real. One is not. (Image courtesy of Epic Games.)

Users can design 3D scenes on their laptops, adjust the settings to fit the destination platform’s requirements, compile an executable, and run it on pretty much any VR or AR system, including Samsung GearVR (we have tested it for ourselves), HTC Vive, Oculus Rift, and even Google Cardboard and other Android-based VR systems. A rough sketch of that packaging step follows below.
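The “compile and run it anywhere” step typically goes through Unreal’s Automation Tool. As a rough illustration, the Python sketch below shells out to the stock RunUAT BuildCookRun command to cook and package a project for Android (for example, for GearVR or Cardboard). The engine install path, project path and output folder are placeholders, and the exact flag set you need will vary by target platform and engine version.

```python
# Rough sketch: package a UE4 project for an Android-based VR headset by
# calling Unreal's Automation Tool (RunUAT) from Python.
# Engine/project/output paths are placeholders; adjust flags per platform.
import subprocess

RUNUAT = r"C:\Program Files\Epic Games\UE_4.21\Engine\Build\BatchFiles\RunUAT.bat"  # placeholder
PROJECT = r"C:\Projects\MyVRDemo\MyVRDemo.uproject"                                  # placeholder
OUTPUT = r"C:\Builds\MyVRDemo"                                                       # placeholder

cmd = [
    RUNUAT, "BuildCookRun",
    f"-project={PROJECT}",
    "-noP4",
    "-platform=Android",        # swap for Win64 when targeting Vive or Rift
    "-clientconfig=Shipping",
    "-cook", "-build", "-stage", "-pak",
    "-archive", f"-archivedirectory={OUTPUT}",
]

# Raises CalledProcessError if cooking or packaging fails.
subprocess.run(cmd, check=True)
print("Packaged build written to", OUTPUT)
```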

Engineers, architects and product designers alike have been using UE4 for technical communications (and more) for a long time…hence the demand to develop a product such as Unreal Studio.

“We’ve seen incredible innovation in the enterprise space with engineers combining the real-time capabilities of Unreal Engine with virtual and augmented reality solutions to solve complex problems and visualize their designs with more detail and clarity,” said Marc Petit, general manager for Unreal Engine, Epic Games.

“From automotive design to subsea exploration, a wide array of engineers are finding that real-time, immersive visualization is a gamechanger for increasing efficiency, quality and safety in their industries."

One such industrial user of Unreal Engine’s VR capabilities is German car manufacturer BMW.

BMW has set up a mixed reality lab, consisting of Unreal Engine-powered VR mixed with real-life, physical hardware, to provide users with an even greater degree of immersion than is possible with VR alone. Its Mixed Reality test rig consists of a simulated car interior, complete with wheel, pedals and reclining car seats, and a VR system running Unreal Engine 4. The computer powering this mock-up is a water-cooled gaming rig running two NVIDIA TITAN X GPUs, enabling real-time rendering at 90 frames per second.

Figure 6. BMW Mixed Reality Lab. (Image courtesy of BMW.)

With the Mixed Reality Lab, engineers at BMW can optimize the aesthetics of a car interior, as well as focus on spatially dependent tasks such as interface design, usability and other human factors-related design problems that would otherwise be impossible to tackle with mere two-dimensional, screen-based CAD workflows.

For example, a designer may wish to know how easy it is to access the in-car entertainment system while he has the steering wheel at full lock…while his seat is reclined. It’s a situation we all take for granted, but a poorly designed user experience can dramatically affect human factor elements ranging from safety to accessibility to comfort. Thanks to the BMW Mixed Reality Lab, tasks such as this, and a huge variety of others, can be simulated visually and physically.

You can see more about how BMW is using mixed reality technologies to develop its systems and communicate designs in the video below.

Another interesting application of AR using Unreal Engine comes from marine engineering company Oceaneering.

Oceaneering manufactures high-end remotely operated vehicles (ROVs) for use in underwater cable and pipe inspection and installation (among other subsea uses). The company is using an advanced 3D Visualization and Operation Management System named Abyssal OS Offshore to help navigate the often murky ocean depths.

One issue ROVs constantly have to deal with is poor visibility when working at depth. Not only does the bottom of the ocean come with its own visibility issues, but working ROVs can also churn up sediment that blinds their cameras. This sediment can reduce visibility for hours (or even days) and can cost ROV operators a lot of money in downtime. ROV operators are not cheap to hire, and neither are ROVs, so this visibility-related downtime carries a real financial penalty.

Abyssal OS has recently switched to Unreal Engine to power its visualizations, allowing engineers to recreate real-world assets from the ocean floor in the game engine, which in turn permits navigation based on a 3D model overlay. You can see an example of this in Figure 7.

Figure 7. Underwater AR. (Image courtesy of Oceaneering.)

If you’d like to read more about how you can import engineering CAD data into Unreal Engine 4, then click this link to see the first of our tutorials and to also find out how to get yourself a copy of Unreal Studio Beta.

Siemens NX

With the release of Siemens NX 12.0.2 last year, the company finally took the plunge and entered into the VR CAD realm.

The release features the VR Design Review feature, which allows users to jump into VR from their NX session at the click of a button.

NX’s VR capabilities require the installation of Steam and Steam VR software, and Siemens is currently supporting the HTC Vive and Vive Pro HMD systems, with wider support for other systems presumably on the way.

While Siemens does state that it is a design review tool and not a full-blown CAD editor, judging by the graphics in the company’s promo videos, it still looks pretty cool and offers a bunch of tools and features for interacting with, inspecting and validating models, as you can see in Figure 8.

Figure 8. Inspecting a large gas turbine in NX. (Image courtesy of Siemens.)

Another Siemens product to recently get in on the VR action is Simcenter STAR-CCM+. Last year the company added VR capability to the multiphysics simulation platform, enabling users to get far deeper insights than were available from a flat screen.

Every simulation engineer knows that a good simulation starts with a good mesh. Or, to put it another way, a bad simulation can certainly begin with a bad mesh.

In STAR-CCM+ VR, engineers can perform their meshing task, and then fly through the component, navigating and checking the mesh quality in ways never before possible. They can now inspect a mesh by moving through the component as if they were there in person, enabling them to turn their heads and fly through all the nooks and crannies where the mesh may be of questionable quality.

Take, for example, the model of the turbine blade mesh shown in Figure 9. Turbine blades have minuscule cooling channels and a myriad of complex internal cavities. Viewing this mesh on a 2D screen, rotating it and slicing it with a mouse could be cumbersome, but not anymore. A fly-through of the turbine blade cavities can reveal the mesh with unprecedented detail and control.

Figure 9. Flying through a turbine blade in STAR-CCM+. (Image courtesy of Siemens.)

“You can get a complete sense of the geometry in just a few minutes,” said Matt Godo of Siemens PLM Software. “It gives you a sense of scale and scope.…This is just not something that you can do with your monitor and mouse.”

You can see the results of a CFD simulation in Figure 10, which shows exhaust gas flowing out from a pipe onto an oil rig. Putting that simulation into the HTC Vive and experiencing it from a real human perspective allows for more intuitive insights, which may be absent from traditional CFD visualization methods.

Figure 10. A CFD simulation in VR! (Image courtesy of Siemens.)

Autodesk

If you take a look at the Autodesk VR pages, you will see that VR/AR is supported in products such as Revit, 3ds Max, VRED, Maya LT, and Forge, which offer a range of VR/AR tools to architects, automotive designers and engineers alike.

Let’s take a look at what VRED is bringing to the VR table. Because cars are cool, and they render pretty well.

VRED, for those who haven’t yet tried it, is a prototyping and 3D visualization platform aimed mostly at those working in the automotive sector. So you can expect high quality rendering, a wide range of photorealistic materials and paint schemes, and all those other things that make automotive designs pop.

For a while now, VRED has included VR capability, even offering fully articulated hands that can be controlled in the virtual world with the appropriate controllers, such as those available for the HTC Vive. VRED also includes extended support for more advanced VR HMDs such as the VRgineers VRHero and XTAL VR.

Figure 11. Glowing hands. (Image courtesy of Autodesk.)

Along with the strange glowing hands, which afford greater levels of interaction, VRED gives users access to a range of tools for manipulating their designs in VR. These include a virtual flashlight, which can help designers visualize lighting conditions that may be problematic in real life, as well as a slicing tool, which allows users to create a cutaway of the virtual model.

Figure 12. Slice your car up in VRED. (Image courtesy of Autodesk.)

But what of Autodesk’s flagship cloud-based CAD product, Fusion 360?

Surely the revolutionary design tool that brought generative design to the masses must have some capacity for VR, right?

Not at the moment.

However, users wishing to explore their designs in VR can do so with the aid of a third-party plug-in named ENTiTi, which is available through the Autodesk App Store.

Final Thoughts

So there you have it. Those are a few examples of VR in use in engineering, but are they just cool novelties and tech demos of a fad technology awaiting obsolescence at some point in the not-too-distant future? Or is there a real financial benefit to using VR?

Clickbait articles like this one or this one or this one here might convince you that VR is indeed dying, but they base their opinions largely on sales of gaming peripherals and other irrelevant stats. While gaming has been a major driver of recent VR development, gaming sales don’t tell the full story.

For example, what these articles aren’t telling you is that defense giant BAE Systems has used VR to significantly reduce the assembly time of battery components and has improved worker training efficiency by 30 to 40 percent.

They’re also not telling you that Lockheed Martin has used VR/AR to train satellite engineers, reducing cable fastener assembly time by a staggering 93 percent. That translates into a saving of $10 million per year.

The list of success stories in high-end manufacturing goes on and on.

So who cares if gaming VR sales occasionally dip beyond expectations?

The defense and automotive industries sure don’t. And that’s where the real big bucks are. Like GPUs, which grew out of the gaming industry and now power science and technology research, VR looks to be on a similar path.

So, it is fair to say that VR is only just getting started as far as manufacturing is concerned, and we expect it to grow a lot more, especially when it comes to training…and even simulation.

The killer app that VR fans have long been waiting for may come in a different form, seemingly outside of the realm of gaming.