Nvidia anoints itself a creator of the metaverse

That's so Meta

Nvidia sees itself as a hardware overlord of the "metaverse" and dropped some hints about the operation of a parallel 3D universe in which our cartoon selves can work, play and interact.

The chip biz has added new plumbing in Omniverse – an underlying hardware and software engine that acts as a planet core fusing together virtual communities in an alternate 3D universe. Omniverse is also being used to create avatars to enhance real-world experiences in cars, hospitals and robots.

"We're not telling people to replace what they do, we are enhancing what they do," said Richard Kerris, vice president of the Omniverse platform, during a press briefing.

The Omniverse announcements came during the company's GPU Technology Conference this week. Nvidia CEO Jensen Huang will talk about many of these announcements on Tuesday.


Jensen as you've never seen him before. Source: Nvidia

One such announcement is Omniverse Avatar, which can generate interactive, intelligent AI avatars for tasks like helping diners order food, or helping a driver self-park or navigate the roads better.

Nvidia gave an example of a conversational avatar to replace servers in restaurants. When ordering food, an AI system – represented by an on-screen avatar – could converse in real time using speech recognition and natural language processing techniques, use computer vision to gauge a person's mood, and recommend dishes based on its knowledge base.

For that, the avatar will need to run several AI models – for example, speech, image recognition and context – simultaneously, which can be a challenge. The company has created the Unified Compute Framework, which models AI as microservices so apps can run on single or hybrid systems.
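Nvidia didn't detail the framework's interfaces during the briefing. As a rough sketch of the "AI models as microservices" idea – with every service name and interface below invented for illustration, not taken from UCF – a pipeline could chain independent model services that only exchange messages, so each could run on a different machine:

```python
# Sketch of composing independent AI "microservices" into one pipeline.
# The classes and methods here are hypothetical, not Nvidia's UCF API.

class SpeechToText:
    def run(self, audio: str) -> str:
        # A real service would invoke an ASR model; here we fake it.
        return audio.replace("[audio]", "").strip()

class IntentModel:
    def run(self, text: str) -> str:
        return "order_food" if "burger" in text else "chitchat"

class Recommender:
    def run(self, intent: str) -> str:
        return "Fries with that?" if intent == "order_food" else "How can I help?"

def pipeline(audio: str) -> str:
    """Each stage could live on a separate host; only messages cross."""
    text = SpeechToText().run(audio)
    intent = IntentModel().run(text)
    return Recommender().run(intent)

print(pipeline("[audio] one burger please"))
```

Because each stage hides behind a narrow interface, a speech model can be swapped or scaled out without touching the vision or recommendation services – the property the microservice framing is meant to buy.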

Nvidia already has underlying AI systems like the Megatron-Turing Natural Language Generation model – a monolithic transformer-based language model jointly developed with Microsoft. The system will now be offered on its DGX AI hardware.

Omniverse Avatar is also the underlying technology in Drive Concierge – an in-car AI assistant that is a "personal concierge in the car that will be on call for you," said Deepu Talla, vice president and general manager of Embedded and Edge Computing.

AI systems in cars, represented by interactive characters, can understand a driver and the car's occupants through their habits, voice and interactions. Accordingly, the AI system can make phone calls or recommend nearby places to eat.

Using cameras and other sensors, the system can also see if a driver is asleep, or alert a rider if they forget something in the car. The AI system's messages are represented through interactive characters or interfaces on screens.

Old dog, new tricks

The metaverse concept isn't new – it has existed through Linden Lab's Second Life, and games like The Sims. Nvidia hopes to break down proprietary walls and create a united metaverse so users can theoretically jump between universes created by different companies.

During the briefing, Nvidia did not make reference to helping Facebook meet its vision of a future around the metaverse, which is at the center of its rebranding to Meta.

But Nvidia is roping other companies into bringing their 3D work to the Omniverse platform through its software connectors. That list includes Esri's ArcGIS CityEngine, which helps create urban environments in 3D, and Replica Studios' AI voice engine, which can simulate real voices for animated characters.

"What makes this all possible is the foundation of USD, or Universal Scene Description. USD is the HTML of 3D – an important element because it allows for all these software products to take advantage of the virtual worlds we are talking about," Kerris said. USD was created by Pixar to share 3D assets in a collaborative way.
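A toy .usda file – USD's human-readable ASCII form – gives a feel for why the "HTML of 3D" comparison works: a scene is a plain-text hierarchy of typed "prims" that any USD-aware tool can open. The contents below are purely illustrative, not drawn from any Nvidia product:

```
#usda 1.0
(
    doc = "A minimal illustrative scene"
)

def Xform "World"
{
    def Sphere "Ball"
    {
        double radius = 2.0
        color3f[] primvars:displayColor = [(0.1, 0.5, 0.8)]
    }
}
```

As with HTML, the value is in the shared vocabulary: a modeling package, a game engine and a renderer can all read the same scene description and layer their own edits on top.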

Nvidia also announced Omniverse Enterprise – a subscription offering with a software stack to help companies create 3D workflows that can be connected to the Omniverse platform. Priced at $9,000 per year, the offering is targeted at industry verticals like engineering and entertainment, and will be available through resellers that include Dell, Lenovo, PNY and Supermicro.

The company is also using the Omniverse platform to generate synthetic data on which to train "digital twins," or virtual simulations of real-world objects. The ISAAC SIM can train robots through synthetic data based on real-world and virtual information. The SIM allows the introduction of new objects, camera views and lighting to create custom data sets on which to train robots.
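The simulator's internals aren't public in this briefing. As a hedged illustration of the domain-randomization idea described above – varying objects, camera views and lighting to mint labeled training samples for free – a sketch might look like this; none of the names come from Nvidia's SDK:

```python
import random

# Hypothetical sketch of domain randomization for synthetic training data.
# Every scene is randomized, and the ground-truth label comes for free
# because we chose the object ourselves.

OBJECTS = ["pallet", "box", "forklift"]

def random_scene(rng: random.Random) -> dict:
    """One synthetic sample: a randomized scene plus its label."""
    return {
        "object": rng.choice(OBJECTS),          # ground truth, no annotator needed
        "camera_azimuth_deg": rng.uniform(0, 360),
        "light_intensity": rng.uniform(0.2, 1.0),
    }

def make_dataset(n: int, seed: int = 0) -> list[dict]:
    rng = random.Random(seed)                    # seeded, so datasets reproduce
    return [random_scene(rng) for _ in range(n)]

data = make_dataset(1000)
```

The payoff is that rare or dangerous situations can be over-sampled at will – something real-world data collection can't easily do.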

An automotive equivalent is Drive SIM, which can create realistic scenes through simulated cameras for autonomous driving. The SIM factors in real-world data to train autonomous driving AI models. The camera lens models are simulated and take in real-world phenomena like motion blur, rolling shutter and LED flicker.
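To make the camera-artifact point concrete, here is a toy sketch of one such effect – horizontal motion blur – approximated by averaging each pixel with its neighbors along the motion direction. This is a deliberately simplified illustration on a tiny grayscale "image", not Drive SIM's lens model:

```python
# Toy horizontal motion blur: average each pixel with its neighbors
# along the motion direction. Real simulators model far more (shutter
# timing, LED flicker, lens distortion); this only shows the principle.

def motion_blur_row(row: list[float], kernel: int = 3) -> list[float]:
    half = kernel // 2
    out = []
    for i in range(len(row)):
        window = row[max(0, i - half): i + half + 1]
        out.append(sum(window) / len(window))   # mean over the window
    return out

def motion_blur(image: list[list[float]], kernel: int = 3) -> list[list[float]]:
    return [motion_blur_row(row, kernel) for row in image]

sharp = [[0.0, 1.0, 0.0, 0.0]]   # one bright pixel
print(motion_blur(sharp))        # the pixel smears across its neighbors
```

Training perception models on imagery with these artifacts baked in is what keeps a simulator-trained network from being surprised by a real camera.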

Nvidia is working closely with sensor makers to replicate Drive SIM data accurately. The camera, radar, lidar and ultrasonic sensor models are all path-traced using RTX graphics technology, according to Danny Shapiro, Nvidia's vice president for automotive.

The company intertwined some hardware announcements in the overall Omniverse narrative.

Join the new generation

The next-generation Jetson AGX Orin developer board will be available to makers in the first quarter next year. It has 12 CPU cores based on Arm Cortex-A78 designs, 32GB of LPDDR5 RAM and delivers 200 TOPS (Tera Operations Per Second) of performance.

The Drive Hyperion 8 is a computing platform for cars which has dual Drive Orin SoCs and delivers performance of up to 500 TOPS. The platform has 12 cameras, nine radars, one lidar, and 12 ultrasonic sensors. It will go into vehicles produced in 2024, and has a modular design so auto makers can use only the features they need. Cars with older Nvidia computers can be upgraded to Drive Hyperion 8.

Nvidia also announced the Quantum-2 InfiniBand switch, which has 57 billion transistors and is being made using Taiwan Semiconductor Manufacturing Co's 7nm process. It can process 66.5 billion packets per second, and has 64 ports for 400Gbit/sec data transfers, or 128 ports for 200Gbit/sec transfers, we're told.
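The two port configurations describe the same aggregate fabric split differently – each 400Gbit/sec port can serve as two 200Gbit/sec ports. A quick sanity check of the arithmetic:

```python
# Aggregate bandwidth is identical in either configuration.
full_rate = 64 * 400    # Gbit/sec across 64 ports at 400G
split_rate = 128 * 200  # Gbit/sec across 128 ports at 200G
assert full_rate == split_rate == 25_600  # 25.6 Tbit/sec either way
print(full_rate)
```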

The company also talked up Morpheus – an AI framework it revealed earlier this year that allows cybersecurity vendors to identify and alert companies to irregular behavior in a network or data center. The framework identifies subtle changes in applications, users or network traffic to identify anomalies and suspicious behavior.
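Morpheus's actual pipeline isn't reproduced here. As a minimal sketch of the underlying idea – baseline normal behavior, then flag observations that deviate sharply from it – consider a z-score detector over packet rates; the function and its names are hypothetical, not Morpheus's API:

```python
import statistics

# Toy anomaly detector in the spirit of baselining network behavior:
# flag any observation more than `threshold` standard deviations away
# from the mean of a normal-traffic baseline.

def find_anomalies(baseline: list[float], observed: list[float],
                   threshold: float = 3.0) -> list[float]:
    mean = statistics.fmean(baseline)
    stdev = statistics.stdev(baseline)
    return [x for x in observed if abs(x - mean) > threshold * stdev]

normal = [100.0, 102.0, 98.0, 101.0, 99.0]   # packets/sec under normal load
print(find_anomalies(normal, [100.0, 350.0, 97.0]))
```

A production system learns a far richer baseline (per user, per app, per flow) and updates it continuously, but the core contrast – model the normal, alert on the abnormal – is the same.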

Morpheus draws the data it needs from Nvidia's BlueField SmartNICs/Data Processing Units, which have been imbued with new powers thanks to an upgrade to the DOCA SDK – which is to BlueField DPUs as CUDA is to Nvidia's GPUs.

The DOCA upgrade – version 1.2 – can also "build metered cloud services that control resource access, validate each application and user [and] isolate potentially compromised machines." DOCA 1.2 also lets BlueField devices authenticate software and hardware, apply line-rate data cryptography, and support distributed firewalls that run on the SmartNIC. Nvidia told The Register that Palo Alto Networks has seen a 5x improvement in firewall performance when running its tools in distributed mode on SmartNICs.

Talking of AI, the GPU giant also expanded its Launchpad program, where it will offer short-term access to AI hardware and software through Equinix data centers in the US, Europe, Japan and Singapore. The latter three locations are Launchpad's first presences outside the USA, giving Nvidia hope that its function as an AI on-ramp might be more widely adopted.

Another new offering is a new cut of the RIVA conversational AI tool that is said to be capable of creating a custom human-like voice in a day, based on just 30 minutes of sample speech. Nvidia thinks that's just the ticket for orgs that want to offer custom speech interfaces. ®
