Nvidia Kairos Demo | Photo Credits: Nvidia

Nvidia has revealed a new technology that enables players to interact with non-player characters (NPCs) in-game by speaking to them directly and receiving suitable responses in real time.

AI Game Development Technology

Computex, one of the largest trade events of the year for the laptop and PC industry, began today in Taipei, and Nvidia’s keynote was one of the most anticipated, considering the company had just reported massive earnings for the first quarter of the year.

During the generative artificial intelligence (AI) presentation, Nvidia’s CEO, Jensen Huang, gave the world a peek at what it could look like when gaming and AI come together, through a demo called Kairos featuring a playable character conversing with an NPC named Jin in a futuristic-looking ramen eatery.

“Generative AI technologies are revolutionizing how games are conceived, produced, and played. Game developers are exploring how these technologies impact 2D and 3D content-creation pipelines during production,” said Nvidia in a blog post.

The demo was developed by Nvidia in collaboration with Convai, an Nvidia Inception startup, and showcases a collection of middleware called Nvidia ACE (Avatar Cloud Engine) for Games. ACE uses Nvidia’s Riva, a speech recognition and text-to-speech technology, along with Omniverse Audio2Face “for instantly creating expressive facial animation of a game character to match any speech track.”
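To make the pipeline concrete, here is a minimal sketch of the speech-to-text step using Riva’s publicly available Python client; the server address, audio format, and file name are assumptions for illustration, not details from the Kairos demo.

```python
# Minimal sketch of the speech-to-text leg of such a pipeline, using the
# nvidia-riva-client Python package. Server address, audio format, and
# file name are illustrative assumptions, not details from the demo.
import riva.client

auth = riva.client.Auth(uri="localhost:50051")  # assumed local Riva server
asr = riva.client.ASRService(auth)

config = riva.client.RecognitionConfig(
    encoding=riva.client.AudioEncoding.LINEAR_PCM,  # must match the file
    sample_rate_hertz=16000,                        # must match the file
    language_code="en-US",
    max_alternatives=1,
    enable_automatic_punctuation=True,
)

# Transcribe a recorded player utterance; in a game, the resulting text
# would be handed to the dialogue model, whose reply then drives the
# character's Audio2Face animation.
with open("player_line.wav", "rb") as f:
    audio_bytes = f.read()

response = asr.offline_recognize(audio_bytes, config)
print(response.results[0].alternatives[0].transcript)
```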

The middleware also includes the company’s NeMo framework, which is used to build and deploy large language models that can be customized with narrative and character backstories. NeMo also provides guardrails to prevent inappropriate conversations, a known challenge with generative AI.
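For a sense of what such guardrails look like in practice, below is a minimal sketch using Nvidia’s open-source NeMo Guardrails library. The Colang rail, the model configuration, and the ramen-chef deflection line are invented for illustration; the demo’s actual rails have not been published.

```python
# Minimal sketch of dialogue guardrails with Nvidia's open-source
# NeMo Guardrails library. The rail below steers an NPC away from
# off-topic questions; all content here is illustrative.
from nemoguardrails import LLMRails, RailsConfig

# Colang rail: recognize off-topic questions and answer in character.
colang_content = """
define user ask off topic
  "what do you think about politics?"
  "can you help me write malware?"

define bot deflect in character
  "Sorry, friend, I just make ramen here. Hungry?"

define flow off topic
  user ask off topic
  bot deflect in character
"""

# Backing model config; assumes an OpenAI API key in the environment.
yaml_content = """
models:
  - type: main
    engine: openai
    model: gpt-3.5-turbo-instruct
"""

config = RailsConfig.from_content(
    colang_content=colang_content, yaml_content=yaml_content
)
rails = LLMRails(config)

response = rails.generate(messages=[
    {"role": "user", "content": "What do you think about politics?"}
])
print(response["content"])  # the NPC deflects instead of engaging
```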

To build the demo, Nvidia used Unreal Engine 5 and MetaHuman to bring the NPC Jin to life. The scene was rendered with RTX Direct Illumination (RTXDI) for ray-traced lighting and shadows, and with Nvidia DLSS 3 for maximum performance, along with other Nvidia GPU technologies.

Nvidia’s Computex Announcements

GeForce RTX 4090 | Photo Credits: Nvidia

It remains to be seen whether other developers will adopt the complete ACE toolset in the way the demo does, but S.T.A.L.K.E.R. 2: Heart of Chornobyl and Fort Solis will use Omniverse Audio2Face, which syncs a 3D character’s facial animation to its voice actor’s speech.

Huang referred to ACE as a large language model, which the company said is “optimized for latency – a critical requirement for immersive, responsive interactions in games.”

“The neural networks enabling Nvidia ACE for Games are optimized for different capabilities, with various size, performance, and quality trade-offs. The ACE for Games foundry service will help developers fine-tune models for their games, then deploy via Nvidia DGX Cloud, GeForce RTX PCs, or on-premises for real-time inferencing,” the company said.

Among the other announcements Nvidia made at Computex is that the GeForce RTX 4080 Ti GPU for gamers is now being produced in “large quantities” with partners in Taiwan. Also in full volume production is the HGX H100 GPU server, which Huang called the world’s first computer with a transformer engine; it is being manufactured by “companies all over Taiwan,” he said.

The company also revealed that the CUDA computing model currently serves over four million developers and more than 3,000 applications. CUDA has seen 40 million downloads, 25 million of which came in the last year alone, potentially owing to the surge in AI application development.
