GTC: NVIDIA Omniverse Enables Real-Time, Virtual Collaboration

New Omniverse platform enters open beta; NVIDIA announces DPU products.

Fans of science fiction and comic books are already familiar with the concept of the multiverse (multiple parallel universes). The developers at NVIDIA are bringing another type of parallel universe—the Metaverse—into reality. As CEO Jensen Huang outlined in the Kitchen Keynote that opened the NVIDIA GTC virtual conference this week, “The next 20 years will seem like nothing short of science fiction.”

In his keynote, Huang emphasized how advances in artificial intelligence (AI), simulation, and virtual reality can enable new levels of collaboration and new platforms for developing advanced technology.

The metaverse, as NVIDIA defines it, is a place where human and software agents interact in a virtual reality space. “Gamer inhabitants of these early metaverses are building cities and gathering for concerts,” Huang said. But this will not just be a place for gaming. “We will create the future in these metaverses before downloading the blueprints to be fabbed in the physical world,” he said.

A key part of this vision is the NVIDIA Omniverse platform, which is now in open beta. “This is a platform for collaboration,” Huang said. “It is designed to connect to many worlds like Maya, Blender, Autodesk and others. Omniverse allows designers and creators and AIs using different tools in different worlds to connect in a commons and collaborate.”

Omniverse will allow millions of designers to work together in real time, both on-premises and remotely, in an NVIDIA RTX-based 3D simulation and collaboration platform that “fuses the physical and virtual worlds to simulate reality in real time and with photorealistic detail,” the company said.

Huang also provided a demonstration of the NVIDIA DRIVE Sim autonomous vehicle simulation tool operating within Omniverse. According to the company:

“The video shows a digital twin of a Mercedes-Benz EQS driving a 17-mile route around a recreated version of the NVIDIA campus in Santa Clara, Calif. It includes Highways 101 and 87 and Interstate 280, with traffic lights, on-ramps, off-ramps and merges as well as changes to the time of day, weather and traffic.

“To achieve the real-world replica of the testing loop, the real environment was scanned at 5-cm accuracy and recreated in simulation. The hardware, software, sensors, car displays and human-machine interaction were all implemented in simulation in the exact same way as the real world, enabling bit- and timing-accurate simulation.”

The open beta follows a yearlong early access program that NVIDIA says involved Ericsson, Foster + Partners, ILM and more than 40 other companies evaluating the platform and providing feedback to the NVIDIA engineering team.

“Physical and virtual worlds will increasingly be fused,” Huang said. “Omniverse gives teams of creators spread around the world or just working from home the ability to collaborate on a single design as easily as editing a document. This is the beginning of the Star Trek Holodeck, realized at last.”

Omniverse is based on Pixar’s Universal Scene Description (USD) format for universal interchange between 3D applications. The platform also uses NVIDIA technology such as real-time photorealistic rendering, physics, materials and interactive workflows between industry-leading 3D software products.
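USD’s role as the common interchange layer is easier to see with a small example. The following is a minimal sketch using Pixar’s open-source Python bindings (the pxr package); the file name and prim paths are illustrative, and this is generic USD usage rather than Omniverse-specific code.

```python
# Minimal sketch of authoring a USD scene with Pixar's open-source
# Python bindings. File name and prim paths are illustrative.
from pxr import Usd, UsdGeom

# Create a new USD stage -- the shared scene description that different
# 3D applications (and Omniverse) can read and write.
stage = Usd.Stage.CreateNew("shared_scene.usda")

# Define a root transform and a simple cube primitive under it.
world = UsdGeom.Xform.Define(stage, "/World")
cube = UsdGeom.Cube.Define(stage, "/World/Cube")
cube.GetSizeAttr().Set(2.0)  # edge length in scene units

# Save the layer to disk; any USD-aware tool can open and extend it.
stage.GetRootLayer().Save()
```

Because every tool reads and writes the same scene description, and USD lets edits be composed as non-destructive layers, different applications can contribute to one scene, which is broadly how Omniverse supports the document-style collaboration Huang described.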

“Omniverse represents the platform of the future for all aspects of virtual production,” said Bill Warner, Avid Technology founder and chairman of Lightcraft Technology. “We’ve been actively evaluating this platform from NVIDIA and have made the decision to base our entire product line on this amazing new technology.”

Industrial Light & Magic (a Lucasfilm company) has utilized Omniverse, as have several architectural design and engineering firms. Foster + Partners, a U.K. architectural design and engineering firm, is using Omniverse to help with data exchange workflows and collaborative design processes. Woods Bagot, a global architectural and consulting practice, is exploring the Omniverse platform to have a hybrid cloud workflow for the design of complex models and visualizations of buildings. Telecommunications giant Ericsson is using Omniverse to simulate and visualize the signal propagation of its 5G network deployment using real-world city models.

Omniverse also has support from software partners including Adobe, Autodesk, Bentley Systems, Robert McNeel & Associates and SideFX. Blender is working with NVIDIA to add USD capabilities to enable Omniverse integration with its software. 

“The importance of our two-year collaboration with NVIDIA cannot be overstated,” said Amy Bunszel, senior vice president for Design and Creation Products at Autodesk. “Projects and teams are becoming increasingly complex and we are confident Autodesk users across all industries will share our enthusiasm for Omniverse’s ability to create a more collaborative and immersive experience. This is what the future of work looks like.”

A complete list of software partners is available at nvidia.com/omniverse.

Users can sign up for the Omniverse open beta program at nvidia.com/omniverse. It will be available for download this fall.

Data center on a chip

At GTC, NVIDIA also announced a new kind of processor, the data processing unit (DPU), which is supported by DOCA, a data-center-infrastructure-on-a-chip architecture that enables enhanced networking, storage and security performance.

“The entire data center can now be provisioned as a service using virtual switches and routers,” Huang said. “That is a huge tax on the CPUs. At least 30 percent of CPU cores are consumed running the infrastructure.”

Huang also laid out the company’s three-year DPU roadmap, which includes the release of the new NVIDIA BlueField-2 family of DPUs and the NVIDIA DOCA software development kit for building applications on DPU-accelerated data center infrastructure services.

“The data center has become the new unit of computing,” said Huang. “DPUs are an essential element of modern and secure accelerated data centers in which CPUs, GPUs and DPUs are able to combine into a single computing unit that’s fully programmable, AI-enabled and can deliver levels of security and compute power not previously possible.”

Optimized to offload critical networking, storage and security tasks from CPUs, BlueField-2 DPUs enable organizations to “transform their IT infrastructure into state-of-the-art data centers that are accelerated, fully programmable and armed with ‘zero-trust’ security features to prevent data breaches and cyberattacks,” according to NVIDIA.

According to the company, a single BlueField-2 DPU can deliver the same data center services that could consume up to 125 CPU cores. This frees up CPU cores to run a wide range of other enterprise applications.
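For context, the rough back-of-the-envelope sketch below combines Huang’s estimate that at least 30 percent of CPU cores run infrastructure with the 125-core offload figure. The fleet size and per-server core count are hypothetical assumptions for illustration, not NVIDIA figures.

```python
import math

# Rough sketch of the CPU savings described above. Only the 30 percent
# infrastructure share and the 125-core offload figure come from the
# article; fleet size and cores per server are assumed for illustration.
SERVERS = 1_000          # hypothetical fleet size
CORES_PER_SERVER = 64    # hypothetical cores per server
INFRA_FRACTION = 0.30    # Huang: >= 30% of cores consumed by infrastructure
CORES_PER_DPU = 125      # NVIDIA: one BlueField-2 offloads up to 125 cores

total_cores = SERVERS * CORES_PER_SERVER
infra_cores = total_cores * INFRA_FRACTION            # cores tied up today
dpus_needed = math.ceil(infra_cores / CORES_PER_DPU)  # DPUs to absorb that load

print(f"Cores freed for applications: {infra_cores:,.0f}")
print(f"BlueField-2 DPUs to absorb that load: {dpus_needed}")
# With these assumptions: 19,200 cores freed, handled by 154 DPUs.
```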

BlueField-2 DPUs are sampling now and expected to be featured in new systems from server manufacturers in 2021. BlueField-2X DPUs are under development and are also expected to become available in 2021.

DOCA is available for early access partners now.

About the Author

Brian Albright
Brian Albright is the editorial director of Digital Engineering and a contributor to Robotics 24/7.