From CAD to Digital Twins to XR

Is a continuous digital thread possible or necessary?

Siemens and NVIDIA demonstrated the ability to visualize large industrial datasets in NVIDIA Omniverse at GTC 2024. Image courtesy of Siemens.


NVIDIA’s metaverse platform, NVIDIA Omniverse, is a critical building block in Siemens’ digital twin vision. In line with that vision, Siemens announced in April at the NVIDIA GPU Technology Conference (GTC) in San Jose, CA: “In the next phase of our collaboration with NVIDIA, the company will release a new product later this year—powered by NVIDIA Omniverse Cloud APIs [application programming interfaces]—for Teamcenter X, our industry-leading cloud-based product lifecycle management [PLM] software, part of the Siemens Xcelerator platform.”

NVIDIA released a beta version of Omniverse in 2019 and officially launched it in 2021. This year, the company revealed its plan to offer components of the platform as microservices, or cloud APIs, available to other software vendors. These services address a range of functions applicable to digital twins, from collaborative design review to visualization on XR (extended reality) devices. In this article, we examine whether it’s possible to maintain an unbroken digital thread from computer-aided design (CAD) to metaverse to XR.
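To make the microservice idea concrete, here is a minimal sketch of what calling such a cloud API from Python could look like. NVIDIA’s actual Omniverse Cloud API endpoints are not documented in this article, so the host, route, payload fields and token below are placeholders, not the real interface.

```python
import requests

# Illustrative only: the host, route, payload and token are placeholders,
# not NVIDIA's actual Omniverse Cloud API surface.
BASE_URL = "https://omniverse.example.com/api/v1"

response = requests.post(
    f"{BASE_URL}/render-jobs",
    json={"scene": "bracket.usda", "quality": "interactive"},
    headers={"Authorization": "Bearer <token>"},
    timeout=30,
)
response.raise_for_status()
print(response.json())  # e.g., a job ID to poll for the rendered result
```

The appeal of the microservice model is that a vendor like Siemens can embed such calls inside Teamcenter X without shipping the rendering stack itself.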

CAD to Metaverse

Mainstream CAD programs describe geometry primarily with mathematically precise lines, arcs and surfaces, along with parametric history. Metaverse platforms, on the other hand, tend to favor lighter mesh models. NVIDIA Omniverse, for instance, uses Universal Scene Description (USD), a format pioneered by the animation studio Pixar.
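For a sense of what USD authoring looks like, here is a minimal sketch using OpenUSD’s Python bindings (the pxr module). The file name and prim paths are illustrative; a real CAD-to-USD exporter would emit far richer geometry and material data.

```python
from pxr import Usd, UsdGeom

# Author a new USD stage containing a single quad mesh.
# "bracket.usda" and the prim paths are made-up examples.
stage = Usd.Stage.CreateNew("bracket.usda")
UsdGeom.Xform.Define(stage, "/Bracket")           # a transform prim to group parts
mesh = UsdGeom.Mesh.Define(stage, "/Bracket/Body")

mesh.CreatePointsAttr([(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)])
mesh.CreateFaceVertexCountsAttr([4])              # one face with four vertices
mesh.CreateFaceVertexIndicesAttr([0, 1, 2, 3])

stage.GetRootLayer().Save()                       # writes human-readable .usda
```

Note that, unlike a CAD file, the result holds only tessellated geometry with no parametric history, which is exactly the trade-off discussed below.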

The number of mainstream CAD programs supporting USD is growing. Autodesk, NVIDIA, Pixar, Adobe and Apple are founding members of the Alliance for OpenUSD, which functions as the steering group for the format. General members include Ansys, Siemens and Hexagon.

CAD models imported into NVIDIA Omniverse—or any metaverse platform, for that matter—usually undergo detail reduction as part of the conversion.

“It’s reasonable to reduce these details, if it aligns with your objective. A purist might insist on always looking at the model with full details, but if you don’t need these details for your metaverse usage, you’re carrying a lot of overhead in computing,” says Dale Tutt, vice president of industry strategy for Siemens.
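Detail reduction of this kind is commonly performed with mesh decimation. As a hedged illustration, the sketch below uses quadric decimation from the open-source Open3D library (one common technique, not necessarily what any particular CAD-to-metaverse pipeline uses); the file names are invented.

```python
import open3d as o3d

# Load a tessellated CAD export (file name is illustrative).
mesh = o3d.io.read_triangle_mesh("pump_housing.obj")
print(f"Original: {len(mesh.triangles):,} triangles")

# Quadric-error decimation collapses the edges whose removal distorts
# the shape the least, until the triangle budget is met.
target = max(len(mesh.triangles) // 10, 1)
simplified = mesh.simplify_quadric_decimation(target_number_of_triangles=target)
simplified.compute_vertex_normals()

o3d.io.write_triangle_mesh("pump_housing_lod.obj", simplified)
print(f"Simplified: {len(simplified.triangles):,} triangles")
```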

In the rare cases where a manufacturer needs to visualize, simulate and interact with a large model—a ship or a plane, for instance—with all the engineering details, metaverse platforms will likely be able to accommodate it using graphics processing unit (GPU) acceleration. At GTC, Siemens and NVIDIA demonstrated that it was possible to visualize in real time a massive dataset from Siemens customer HD Hyundai, which develops ammonia- and hydrogen-powered ships.

Game engines are also expected to become metaverse enablers. The makers of the two leading game engines—Epic Games (Unreal Engine) and Unity—address the CAD-to-metaverse connection in their own ways. Starting in late 2022, Epic Games began using Tech Soft 3D’s HOOPS software development kit (SDK) to facilitate the conversion. Unity relies on its PIXYZ plug-in to let CAD users import models.

Spawning More Twins From Digital Twins

Tutt says continuously syncing changes from CAD to the metaverse may not be necessary. However, “when the CAD model changes, there should be some triggers that alert you and ask you if you want to update the downstream models. You may choose not to, if you decide the change is not critical, but there should be a mechanism to maintain these live linkages; that’s the only way to ensure you’re always working with the latest design.”
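A trigger of the kind Tutt describes can be as simple as hashing the exported model and comparing digests on a schedule. The sketch below is a minimal illustration under that assumption; production PLM systems use proper event and versioning services rather than file polling.

```python
import hashlib
from pathlib import Path

def digest(path: Path) -> str:
    """Content hash of a CAD export, used to detect that it changed."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def poll_for_change(export: Path, last_seen: str | None) -> str | None:
    """Return the new digest if the export changed, otherwise None."""
    current = digest(export)
    return current if current != last_seen else None

# Example: a downstream twin decides whether to pull the update.
last_seen = None  # digest recorded the last time the twin was synced
changed = poll_for_change(Path("bracket.usda"), last_seen)
if changed:
    print("CAD export changed; update the downstream digital twin? [y/n]")
```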

Sony and Siemens partnered to develop a way to design and edit in NX using Sony VR headsets. Image courtesy of Sony.

The update alerts—and the choice of when to adopt the changes—are essential features of what Tutt considers “the industrial metaverse.” Such a setup, he feels, allows digital twin operators to spawn various versions of a product, place or process for simulation and testing.

“Let’s say you have a factory up and running, and you’re planning to make changes to it. While version 1.0 of your factory is up and running, you may also be evaluating the changes you want to make in context. So you need to have the ability to manage these versions separately in the metaverse,” says Tutt.
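One way to picture that requirement is a small registry in which the as-running twin and candidate variants coexist, with each candidate recording which version it branched from. The sketch below is purely illustrative; the class names and file paths are invented and do not reflect any Siemens or NVIDIA API.

```python
from dataclasses import dataclass

@dataclass
class TwinVersion:
    """One variant of a digital twin, e.g., the as-running factory or a proposal."""
    name: str
    usd_path: str
    parent: str | None = None  # the version this one branched from

class TwinRegistry:
    """Keeps live and candidate versions addressable side by side."""

    def __init__(self) -> None:
        self.versions: dict[str, TwinVersion] = {}

    def add(self, version: TwinVersion) -> None:
        self.versions[version.name] = version

    def branch(self, base: str, name: str, usd_path: str) -> TwinVersion:
        """Fork a candidate from an existing version for what-if studies."""
        candidate = TwinVersion(name, usd_path, parent=self.versions[base].name)
        self.add(candidate)
        return candidate

registry = TwinRegistry()
registry.add(TwinVersion("factory-v1.0", "factory_v1.usd"))
registry.branch("factory-v1.0", "factory-v1.1-candidate", "factory_v1_1.usd")
```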

CAD to XR

For collaboration around large digital twins, NVIDIA offers various XR SDKs. Among them is CloudXR, for streaming virtual reality (VR) or augmented reality (AR) content from a remote server. NVIDIA VRWorks is a set of APIs and libraries that VR hardware and software developers can use to create applications. At the same time, many CAD vendors are enabling the CAD-to-XR workflow with solutions of their own.

Siemens describes its approach as “one click to VR.” With it, you can “inspect and review designs in full scale using a selection of familiar tools all connected directly through NX to the digital thread,” according to the company.

“What you’re looking at [in VR] is a stripped-down version of the original CAD model,” says Tutt. “You can still see the details, but internal parts and the parametric history may be removed. Considering the computing power available in the headsets today, it’s better to use that approach so you can run VR experiences in real time.”

The built-in memory and computing power of XR headsets are usually limited due to space constraints and the need to keep the devices from overheating. To interact with highly detailed models, users tend to choose headsets driven by workstations, via wired connections or backpack rigs. CAD-to-metaverse and CAD-to-XR workflows are mostly a one-way street: changes made in CAD may be pushed to the metaverse, but a bidirectional connection is not yet the norm. Siemens, however, is making one possible in an application developed for Sony head-mounted display (HMD) users.

“When you’re in the Sony HMD with NX Immersive Designer app, you are effectively designing in Siemens NX,” says Tutt. “There are still some intermediary steps required, but if you make a change in your design in XR in the Sony headset, you’re in fact modifying the NX model.”

Go Through the Metaverse or Take a Shortcut?

Unlike the CAD-to-metaverse-to-XR route, the direct CAD-to-XR route bypasses the metaverse entirely. With one fewer step in the digital thread, this approach arguably gives a more faithful rendition of the original CAD design. But Tutt points out that the path through the metaverse, though circuitous, serves a special purpose.

“With the metaverse approach, you can bring in data from many different sources besides CAD,” he says. “For example, if you’re operating a plant, there may be data analytics you need to incorporate.”

Other industrial metaverses may emerge to address the digital twin operators’ needs, but for Tutt, “With what it has been able to accomplish with GPU computing, NVIDIA is the market leader. That’s why we’re partnering with it.”
