Mar 28, 2023

Omniverse Just Got a Lot More Powerful and Will Reshape How We Work

By Bobby Carlton

Nvidia introduces new connectors for 3D design, simulation, and collaboration, as well as Microsoft 365 integration through Omniverse.

Through its Omniverse platform, which provides 3D simulation and collaboration capabilities, Nvidia has introduced new connectors that allow developers to link their applications to one another using the Universal Scene Description (USD) framework.
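For readers unfamiliar with USD, it is essentially a shared scene-description format that different tools can read and write, which is what lets connectors exchange data. As a rough sketch only (the file names, prim paths, and referenced asset below are illustrative, not from Nvidia), authoring a stage with the open-source pxr Python bindings might look like this:

```python
# Minimal sketch of building a USD stage that other USD-aware tools
# (and Omniverse Connectors) could open; names and paths are illustrative.
from pxr import Usd, UsdGeom

# Create a new stage (the shared scene-description file).
stage = Usd.Stage.CreateNew("factory_scene.usda")

# Define a root transform and a simple placeholder shape under it.
UsdGeom.Xform.Define(stage, "/World")
placeholder = UsdGeom.Cube.Define(stage, "/World/RobotPlaceholder")
placeholder.GetSizeAttr().Set(1.0)

# Reference an asset authored in another application (for example,
# exported from Blender or Vectorworks) without copying its contents.
asset = stage.DefinePrim("/World/ImportedAsset")
asset.GetReferences().AddReference("./assets/conveyor.usd")

# Save the layer so any connected tool can pick up the changes.
stage.GetRootLayer().Save()
```

Because every application reads and writes the same stage, changes made in one tool show up in the others without a manual export and import step.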

The new connectors are designed to work seamlessly with applications such as Cesium, Unity, Blender, and Vectorworks, and Nvidia's roadmap shows connectors for Blackshark.ai, NavVis, and Azure Digital Twins arriving later. They join the hundreds of connectors already available, including Revit, SketchUp, Archicad, and 3ds Max.

During Nvidia's GTC 2023 keynote, CEO Jensen Huang said, “The world’s largest companies are racing to digitalize every aspect of their business and reinvent themselves into software-defined technology companies.” Huang added, “NVIDIA AI and Omniverse supercharge industrial digitalization. Building NVIDIA Omniverse Cloud within Microsoft Azure brings customers the best of our combined capabilities.”

Through the upcoming release of Omniverse Kit 105, scheduled to arrive in the next couple of months, the company will introduce new rendering features, including a subsurface scattering shader that dynamically distributes light across different kinds of surfaces. According to Richard Kerris, vice president of the Omniverse platform, the new shader lets materials split and refract light realistically.

"When light hits an object, depending on what that object is, a light can be refracted or split or shattered through the different types of surfaces," said Kerris, adding "So when light hits marble or it hits something like skin, it doesn’t just bounce off of it, there’s actually parts where the light goes in, and it scatters around, but it’s very computationally hard to do."

Two years ago, Nvidia became the first company to implement real-time ray tracing, and the new subsurface scattering feature builds on that work to keep its rendering truly real time.

Kit 105 also adds features for working with large 3D models, including the ability to transfer data between different regions and tools for optimizing assets.

Omniverse
Image: Nvidia

Through its partnership with Microsoft, Nvidia has brought its Omniverse Cloud to the Microsoft Azure platform. The next step is to make it available in the Microsoft 365 ecosystem, allowing teams to use the platform to create and manage 3D models. According to Kerris, this will give participants a deeper understanding of what the team is working on.

“Each of them will have their own experience in that 3D environment, collaboratively,” said Kerris.

Integrating the Omniverse platform into Microsoft Teams will let users create and manage 3D representations as easily as they can in a 2D web experience. According to Kerris, this will give participants a better understanding of the virtual world around them, and it eliminates the need for local processing.

Nvidia Siemens
Image: Nvidia

According to Kerris, users will be able to access Omniverse Cloud the same way they would any other service in a browser. The company also announced that Omniverse Cloud will connect to the Microsoft Azure IoT ecosystem, letting users bring real-world sensor inputs into the platform.

Another big announcement was that Nvidia is focused on bringing ChatGPT into the Omniverse experience. Kerris explained that end users will be able to ask ChatGPT to write code and then drop that code into Omniverse, which means everyone can be a developer.

“You’ll have an idea for something, and you’ll just be able to tell it to create something and a platform like Omniverse will allow you to realize it and see your vision come to life,” said Kerris.

Through ChatGPT, developers can now use AI-generated code to build extensions for Omniverse, such as Camera Studio, an extension that generates and customizes cameras.
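To give a sense of what such generated code slots into, an Omniverse Kit extension is ordinarily a small Python class implementing omni.ext.IExt. The sketch below is a hypothetical skeleton: the extension name, window title, and button behavior are illustrative, not Nvidia's Camera Studio code.

```python
# Skeleton of an Omniverse Kit extension, the kind of scaffold that
# AI-generated snippets would be dropped into. The class name, window
# title, and button behavior are hypothetical examples.
import omni.ext
import omni.ui as ui


class ExampleCameraExtension(omni.ext.IExt):
    def on_startup(self, ext_id):
        # Build a minimal UI window when the extension loads.
        self._window = ui.Window("Example Camera Tools", width=300, height=120)
        with self._window.frame:
            with ui.VStack():
                ui.Label("Generated camera-setup code would run from this button.")
                ui.Button("Create Camera", clicked_fn=self._on_create_camera)

    def _on_create_camera(self):
        # Placeholder for generated logic (e.g. creating a USD camera prim).
        print("Camera creation code would run here.")

    def on_shutdown(self):
        # Release UI resources when the extension unloads.
        self._window = None
```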

In addition, Nvidia introduced Nvidia Picasso, a cloud service that lets software developers create AI-powered 3D and image applications. According to Kerris, developers will be able to generate models based on a specific keyword and send them to Omniverse.

The company also introduced its third-generation OVX computing system, which is designed for large-scale digital twins running on the Omniverse Enterprise platform.

During the last moments of his GTC keynote, Huang said "Omniverse can unify the end-to-end workflow and digitalize the 3 trillion dollar and 14 million employee automotive industry."

The impact of all of this will reshape how every industry operates, including food, medical, manufacturing, entertainment, and more, as companies automate and look for ways to streamline their workflows.

“Omniverse is leaping to the cloud. Hosted in Azure, we partnered with Microsoft to bring Omniverse Cloud to the world’s industries,” said Huang.
