What is it?
At GTC 2021, Nvidia officially announced Omniverse Enterprise, a platform that “aims for universal interoperability across different applications and 3D ecosystem vendors.”
Here’s a high-level explanation: Omniverse Enterprise enables teams to “create and share virtual 3D worlds that obey the laws of physics,” worlds where designers, architects, engineers, and other “creators” can collaborate in real time.
Crucially, each trade can use connectors to work from their own software, including flagship applications from Adobe, Epic Games, Blender, Bentley Systems, ESRI, Trimble, Graphisoft, and more. A user makes a change in one application, and collaborators automatically see the results in their own applications.
The tech behind it
The platform exploits Nvidia’s real-time ray tracing technology and its physics engine for realistic visualization and simulation. It also runs on a Pixar technology called Universal Scene Description (USD), an open framework for sharing different kinds of 3D information among multiple parties.
The money problem
To be blunt, Omniverse is not cheap. AEC Mag broke it down in their excellent coverage of the announcement for architecture, engineering, and construction users:
“Nvidia Omniverse Enterprise looks to be focused on large, global firms. For workgroups of about 25 people, prices start at $1,800 per user, per annum, plus $25,000 for the Omniverse Nucleus server, which forms the backbone to the collaborative platform.”
The web browser for 3D
In addition to enabling real-time collaboration, Nvidia says Omniverse may help reduce the time suck of working in multiple 3D applications.
As Nvidia general manager of media and entertainment Richard Kerris explained, professionals often use multiple 3D “packages” in their work, such as 3D modeling packages and 3D painting packages. Since no two applications use the same “language,” professionals spend a lot of time exporting data from one application, bringing it into another, exporting it again, and so on. And that’s not only true for large global teams; it’s a problem even when a person is working on a project alone.
That needs to change in order to build the metaverse, Kerris says. To that end, Omniverse puts these applications on the same platform, so users can move between the applications seamlessly. “You can think of USD as the HTML of 3D,” he says, “[and] think of Omniverse as the browser for that HTML of 3D.”
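To make the “HTML of 3D” analogy concrete, here is a minimal sketch of what a USD scene looks like on disk. USD’s human-readable .usda format describes a scene as a hierarchy of “prims” that any USD-aware application can read and write; the names and values below are purely illustrative, not drawn from any real Omniverse project:

```
#usda 1.0
(
    doc = "Illustrative scene; prim names are hypothetical"
)

def Xform "Factory"
{
    def Sphere "Robot"
    {
        double radius = 0.5
    }
}
```

Just as multiple browsers render the same HTML page, a modeling tool, a painting tool, and a renderer can each open this same file, edit the pieces they care about, and leave the rest intact, which is what removes the export/import round-trips Kerris describes.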
A virtual testing ground
It’s important to note that Nvidia has set its sights on more than design collaboration—they also want enterprises to use Omniverse for virtual simulation.
BMW is already using the platform for R&D. They are viewing and simulating all elements of their 31 factories, “including the associates, the robots, the buildings, and the assembly parts.” This enables applications like predictive maintenance, big data analytics, and even the virtual planning of their next factory.
Volvo is creating an “in-car experience” for potential customers in their online configurator project, and testing cars in virtual environments before committing to a design. Communications company Ericsson is using it to simulate and model 5G networks.
Kerris argues that the uses could be even bigger. You could use Omniverse to simulate virtual worlds for testing and training autonomous vehicles, he says. You would have the car drive from coast to coast, in real time, feeding a virtual world into the car’s actual hardware to give it a VR experience.
VentureBeat notes in their own in-depth coverage of Omniverse, “you can simulate the creation of robots through a tool dubbed Isaac. That lets engineers create variations of robots and see how they would work with realistic physics, so they can simulate what a robot would do in the real world by first making the robot in a virtual world.”