Nvidia has announced several significant upgrades to its scientific computing platform for digital twins and released these capabilities for general availability. Highlights include the general release of Modulus, a physics-informed AI framework; new Omniverse integrations; and support for a new 3D deep learning technique called adaptive Fourier neural operators (AFNO). Both Modulus and Omniverse are available for download today.

Image Credit: Edited version of image by yoggy0 via Flickr

These advances promise to change the way engineers think about simulation, from an occasional offline process to live models baked into ongoing operations, Dion Harris, Nvidia’s lead product manager of accelerated computing, told VentureBeat.

These efforts complement other recent announcements, such as the intention to create Earth-2, ongoing collaborations with climate change researchers, and efforts to simplify engineering design, testing and development within the metaverse. Nvidia has also collaborated with leading climate research supercomputing programs, such as the European Centre for Medium-Range Weather Forecasts (ECMWF) on Destination Earth (DestinE).

Nvidia digital twin announcement highlights

Modulus, which Nvidia announced at GTC last fall, is now generally available. It’s a physics-informed machine learning framework that lets teams train neural network models of complex systems by encoding the governing physics into the training process. This will improve climate simulations and help engineers explore physical, mechanical and electrical tradeoffs in designing products and buildings. It also accelerates the creation of AI-based surrogate models that abstract physics principles from real-world data.
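To make the idea concrete, here is a minimal, hypothetical sketch of a physics-informed neural network in plain PyTorch. It illustrates the general technique behind frameworks like Modulus rather than Modulus’s actual API, and the example equation, network size and training settings are arbitrary choices for illustration. The network is trained to satisfy a simple differential equation by penalizing the equation’s residual at sample points instead of fitting labeled data.

```python
# Illustrative physics-informed neural network (PINN) sketch, not Modulus's API.
# Train u(x) to satisfy u''(x) = -sin(x) on [0, pi] with u(0) = u(pi) = 0.
# The exact solution is u(x) = sin(x).
import torch

torch.manual_seed(0)

# Small fully connected network approximating the solution u(x).
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2000):
    # Collocation points inside the domain where the equation residual is enforced.
    x = torch.rand(128, 1) * torch.pi
    x.requires_grad_(True)
    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    residual = d2u + torch.sin(x)          # u'' + sin(x) should vanish
    pde_loss = (residual ** 2).mean()

    # Boundary conditions u(0) = u(pi) = 0.
    xb = torch.tensor([[0.0], [torch.pi]])
    bc_loss = (net(xb) ** 2).mean()

    loss = pde_loss + bc_loss
    opt.zero_grad()
    loss.backward()
    opt.step()

# Compare the trained network against the exact solution sin(x).
x_test = torch.linspace(0, torch.pi, 5).unsqueeze(1)
print(net(x_test).detach().squeeze(), torch.sin(x_test).squeeze())
```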

The new Omniverse integration allows teams to feed the output of these AI physics models into Omniverse, making it easier to combine better AI models with the visualization tools built into the platform. More significantly, these models are much faster than conventional physics solvers, making it easier to run them in real time or explore more variations as part of scenario planning. “It creates a different operational model for how you would engage with these data sets and simulation workflows,” Harris said.

The integration with Omniverse will also make it much easier for engineers to weave digital twin capabilities into existing workflows. Nvidia is building a variety of connectors that allow engineers to ingest models from existing product engineering, architectural and simulation tools, and Omniverse allows teams to ingest data from AI models as well.

Omniverse provides a centralized hub for collecting data and collaborating interactively across data sets and disciplines. It ingests data from a variety of sources and uses the Universal Scene Description (USD) format to organize data on the platform. For example, a better climate research model may combine atmospheric data, geospatial data and human interaction data. Harris said there is still work to be done in building USD plugins for various platforms, which is one reason Omniverse is free for developers.
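As a small illustration of what Universal Scene Description looks like in practice, the hypothetical snippet below uses the open source USD Python bindings (the pxr module, available via the usd-core package) to create a stage and add a placeholder mesh; the file and prim names are made up for the example and are not tied to any Omniverse connector.

```python
# Illustrative OpenUSD example; assumes `pip install usd-core`.
from pxr import Usd, UsdGeom

# Create a new stage, the top-level USD scene container.
stage = Usd.Stage.CreateNew("climate_site.usda")

# Define a transform as the scene root and a mesh under it that could carry,
# say, terrain geometry ingested from a geospatial tool.
UsdGeom.Xform.Define(stage, "/World")
terrain = UsdGeom.Mesh.Define(stage, "/World/Terrain")

# Arbitrary notes can be attached alongside the geometry.
terrain.GetPrim().SetMetadata("comment", "placeholder for ingested geospatial data")

stage.GetRootLayer().Save()
```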

Another major upgrade is support for adaptive Fourier neural operators (AFNO), a technique for training neural networks that capture 3D spatial states. AFNO is part of a wider class of new approaches that includes Fourier neural operators (FNO) and physics-informed neural operators (PINO). These techniques encode 3D spatial relationships based on partial differential equation models, allowing teams to create more accurate surrogate AI models. Traditional AI models that rely on convolution or other pixel-based approaches encode the arrangement of 3D objects less accurately.
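The following is a minimal, illustrative PyTorch sketch of the spectral-mixing idea at the heart of FNO-family models such as AFNO; it is not Nvidia’s AFNO implementation, and the channel counts and retained mode counts are arbitrary. The layer transforms a gridded field into Fourier space, applies learned weights to the low-frequency modes and transforms back, which is what lets these models capture long-range spatial relationships that pixel-local convolutions miss.

```python
import torch

class SpectralMix2d(torch.nn.Module):
    """Mixes spatial information by weighting low-frequency Fourier modes."""

    def __init__(self, channels: int, modes: int):
        super().__init__()
        self.modes = modes  # low-frequency modes kept per spatial dimension
        scale = 1.0 / channels
        # Learned complex weights: one (in-channel x out-channel) matrix per mode.
        self.weight = torch.nn.Parameter(
            scale * torch.randn(channels, channels, modes, modes, dtype=torch.cfloat)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, height, width) sampled on a regular grid.
        b, c, h, w = x.shape
        x_ft = torch.fft.rfft2(x)  # transform to Fourier space
        out_ft = torch.zeros(b, c, h, w // 2 + 1, dtype=torch.cfloat, device=x.device)
        m = self.modes
        # Mix channels for the lowest m x m modes; higher frequencies are dropped.
        out_ft[:, :, :m, :m] = torch.einsum(
            "bixy,ioxy->boxy", x_ft[:, :, :m, :m], self.weight
        )
        return torch.fft.irfft2(out_ft, s=(h, w))  # back to the spatial grid

# Example: a 4-channel 64x64 field standing in for a coarse atmospheric state.
layer = SpectralMix2d(channels=4, modes=12)
field = torch.randn(1, 4, 64, 64)
print(layer(field).shape)  # torch.Size([1, 4, 64, 64])
```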

Better climate models with AI

Nvidia also announced early results from applying these tools to climate research as part of the FourCastNet project, a collaboration between Nvidia and leading climate researchers at Purdue University, Lawrence Berkeley National Laboratory, the University of Michigan and other institutions. FourCastNet is an AI surrogate model used to perform medium-range forecasts at a global scale. The research paper describes how the team uses AFNO to produce a very fast yet accurate model that could be used for some of these medium-range forecasts.

In climate and weather research, resolution is characterized in terms of grid squares measured in kilometers, which act like pixels; the smaller the squares, the better. State-of-the-art first-principles models, such as ECMWF’s Integrated Forecasting System (IFS), can achieve a 9-km resolution. FourCastNet is faster, but less accurate, than the best models built with traditional first-principles approaches.
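A rough back-of-envelope calculation (not from the article) shows why finer resolution is so expensive: halving the grid spacing roughly quadruples the number of surface cells a global model must handle, and the full data and compute requirements grow even faster once additional vertical levels and shorter time steps are accounted for.

```python
# Back-of-envelope only: approximate surface grid-cell counts at common resolutions.
EARTH_SURFACE_KM2 = 510e6  # approximate surface area of Earth in square kilometers

for spacing_km in (25, 18, 9):
    cells = EARTH_SURFACE_KM2 / spacing_km**2
    print(f"{spacing_km:>2}-km grid ~ {cells / 1e6:5.1f} million surface cells")
```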

Today, FourCastNet can achieve an 18-km resolution 45,000 times faster and with 12,000 times less energy at the same accuracy as IFS. Prior surrogate models maxed out at 25-km resolution. One factor limiting further improvements is the tremendous amount of data required to train surrogate models compared to traditional approaches. For example, enhancing the resolution from 18 km to 9 km will require about 30 times as much data.

There are two scales of weather and climate research centers: about 17 larger climate change centers and about 175 smaller regional weather research groups. The smaller centers have tended to focus on regions with well-defined boundaries, which neglects the impact of adjacent weather phenomena. The new FourCastNet model will enable the smaller weather centers to simulate weather patterns that move across those boundaries.

“This will democratize climate change research,” Harris said.

One caveat is that the model was trained on 40 years of climate data, which required a lot of processing time and energy. Once trained, however, it can run on low-cost computers. For example, the FourCastNet researchers were able to run a simulation on a two-node Nvidia cluster that previously required a 3,060-node supercomputer cluster.

Harris expects that first-principles models and surrogate models will coexist for some time. First-principles approaches will form a sort of ground truth, while surrogate models will allow engineers to iterate on simulation scenarios much faster. Nvidia has been working on ways to improve both. For example, it has tuned its software to accelerate Weather Research and Forecasting (WRF) and Consortium for Small-Scale Modeling (COSMO) models.

An ensemble of earths

This FourCastNet work complements the Earth-2 announcement Nvidia made at GTC last fall. Earth-2 is a dedicated system Nvidia is building to accelerate climate change research, combining Modulus, Omniverse and Nvidia hardware advances into a cohesive platform. Omniverse integration will make it easier to ingest AI models, climate data, satellite data and data from other sources to build more accurate representations from all of these inputs.

“Earth-2 system will integrate everything we are building into a cohesive platform,” Harris said.

This will make it easier to combine a variety of scientific disciplines, research techniques and models into a single source of truth.  The collaborative aspect of Omniverse will help researchers, policy planners, executives and citizens work together to solve some of the world’s most pressing problems.

Discovering new unknowns

Faster simulations also mean that researchers can explore the ramifications of slightly different assumptions within a model. Climate change researchers use the term ensemble to describe the process of running multiple variations of a model with slight changes. For example, they might run a simulation 21 times to explore the impact of minute variations in assumptions on the overall projection. FourCastNet will allow researchers to simulate 1,000-member ensembles, providing much higher confidence in the predictions.
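The sketch below illustrates the ensemble idea with a toy surrogate: the same model is run many times from slightly perturbed initial conditions, and the spread of outcomes is examined. The surrogate_step function here is a made-up placeholder standing in for a trained model such as FourCastNet, not its real interface, and the perturbation scale and field size are arbitrary.

```python
# Toy ensemble forecast: many runs of a fast surrogate from perturbed initial states.
import numpy as np

rng = np.random.default_rng(0)

def surrogate_step(state: np.ndarray) -> np.ndarray:
    # Placeholder dynamics; a real surrogate would be a trained neural network.
    return 0.99 * state + 0.01 * np.sin(state)

def run_ensemble(initial_state: np.ndarray, members: int, steps: int) -> np.ndarray:
    outcomes = []
    for _ in range(members):
        # Each member starts from the analysis plus a small random perturbation.
        state = initial_state + rng.normal(scale=1e-3, size=initial_state.shape)
        for _ in range(steps):
            state = surrogate_step(state)
        outcomes.append(state)
    return np.stack(outcomes)

initial = rng.normal(size=(64, 64))          # toy 2D atmospheric field
ensemble = run_ensemble(initial, members=1000, steps=50)
print("ensemble mean:", ensemble.mean(), "spread:", ensemble.std(axis=0).mean())
```

Because a surrogate makes each member cheap, the ensemble size can grow from a couple dozen runs to a thousand, which is what turns the spread of outcomes into a usable confidence estimate.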

Harris said, “It’s not just about being able to run the models faster.  You can also run it more to get a more accurate estimate of the outcome.  You get a new understanding of how to think about it once you’ve seen this complex system in motion in 3D space.”

Siemens had already been running these kinds of models, but only in the design phase. Faster simulation techniques allow it to run similar models continuously during operations. For example, Siemens has used these techniques to model heat transfer systems in a power plant and the performance of wind turbines more efficiently. A new surrogate wind performance model is expected to lead to optimized wind park layouts capable of producing up to 20% more power than previous designs.

“We see digital twins being adopted in everything from medical to manufacturing, scientific and even entertainment applications,” Harris said.

 

Author: George Lawton

VentureBeat