AI’s Pioneering Role: Crafting the Future of Video Games

November 30, 2023

This article was written by Michael Puscar, Co-Founder and CTO of NPCx.

Artificial Intelligence (AI) is advancing every industry, and the gaming sector is no exception. In fact, according to an a16z study of 243 game studios, 87% of those surveyed already use AI tools, and 99% plan to use AI in the future.

The gaming industry—a playground of innovation and imagination—has embraced AI with open arms, fundamentally transforming the landscape of interactive experiences. In particular, NPCx is leveraging neural networks and biomechanical models to significantly improve the clean-up stage of motion capture (mocap), character movement, and frame selection.

AI is also being used to create more intelligent non-player characters (NPCs) in video games, including through a new technique pioneered by NPCx called behavioral cloning, which clones the behaviors and actions of real-world players.

How exactly is this possible? Let’s dive into how AI technology is powering new and innovative gaming technology and how it’s changing the future of gaming.

Boosting speed and efficiency in the mocap process

To put it simply, mocap is the process of animating in-game characters by recording the movements of actors on a stage. Turning those performances into a finished game is complex. While mocap technology is widely used across the film and gaming industries, most studios encounter the same issues: high production costs, tedious calibration of cameras and markers, and, most time-consuming of all, data clean-up.

Traditionally, mocap performances require painstaking manual ‘cleaning’ to prepare them for the final product. Mocap technicians complete this process using a mouse and keyboard to correct occlusions and anomalies. These can include character movement deformities, such as walking through walls and limbs penetrating other characters and objects. 

This year, NPCx launched a new product that changes how mocap clean-up is done. The technique relies on AI-powered, biomechanically informed motion capture processing tools, disrupting the conventional, labor-intensive process of tracking raw 3D point cloud data. These tools can apply the captured data from any optical or sensor-based motion capture system directly onto the character skeleton without human intervention.

By utilizing neural networks and biomechanical models, developers can reduce the processing time for a character from around 8 hours to 10 minutes, a reduction of nearly two orders of magnitude. The benefits of this approach are staggering. Aside from the cost savings, the new technique allows video game studios to finish AAA (blockbuster) games in record time. Talented motion capture technicians no longer need to spend hours on tedious, monotonous tasks and can instead focus their efforts on other components of game development.
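
To make the idea concrete, here is a minimal, illustrative sketch of how a neural network might map raw optical marker positions onto a character skeleton's joint rotations, with a crude biomechanically inspired clamp on the output. This is not NPCx's actual pipeline; the marker count, skeleton size, network shape, and training data are all assumptions made for the sake of the example.

```python
# Illustrative sketch only: a tiny network that maps raw mocap marker
# positions to skeleton joint rotations. All dimensions are hypothetical.
import torch
import torch.nn as nn

NUM_MARKERS = 53   # optical markers captured per frame (assumed)
NUM_JOINTS = 24    # joints in the target character skeleton (assumed)

class MarkerToSkeleton(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NUM_MARKERS * 3, 512),   # x, y, z per marker
            nn.ReLU(),
            nn.Linear(512, 512),
            nn.ReLU(),
            nn.Linear(512, NUM_JOINTS * 3),    # Euler angles per joint
        )

    def forward(self, markers):
        angles = self.net(markers)
        # Crude biomechanical constraint: keep joint angles in a plausible range.
        return torch.clamp(angles, min=-3.14, max=3.14)

model = MarkerToSkeleton()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# One training step against hand-cleaned reference data
# (random tensors stand in for real captures here).
markers = torch.randn(64, NUM_MARKERS * 3)         # a batch of raw frames
clean_skeleton = torch.randn(64, NUM_JOINTS * 3)   # technician-cleaned targets
loss = loss_fn(model(markers), clean_skeleton)
loss.backward()
optimizer.step()
```

In practice, a production solver would also enforce bone lengths, per-joint angle limits, and temporal smoothness across frames, which is where the biomechanical models come in.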

Leveraging AI for this process removes the laborious clean-up, enabling faster and more cost-effective character development while providing gaming studios with a higher level of precision.

Elevating character motion

Character movement technology has not improved much over the years. Characters in AAA first-person shooters look and move much as they did a decade ago, with movements that feel stiff and unrealistic. The result is characters that do not move naturally, the way a real person does, which is ultimately what game developers aim to imitate. The problem largely stems from developers having to choose from a limited range of animation frames.

With the introduction of cutting-edge AI technology, studios can now use their vast catalogs of motion capture footage to augment the range of movements available to characters and to choose movements, frame by frame, based on how humans actually move. NPCx leverages neural networks, trained on motion capture stage data, to predict the next animation frame, resulting in smoother and more lifelike character movements. The result is in-game characters that move much as their human counterparts do.

Neural networks, a subset of machine learning (ML), use deep learning to recognize, classify, and predict gestures, smoothing out motion transitions and making character movements appear more fluid and human-like. These networks are particularly valuable when transitioning between animation states, for example, when a character goes from sitting to standing or from running to crouching.
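
As a rough sketch of what frame-by-frame motion prediction can look like (the pose encoding, context window, and network size below are assumptions, not NPCx specifics), a small recurrent network can be trained on mocap sequences to predict the next pose:

```python
# Illustrative next-frame pose prediction. The pose encoding, window length,
# and network size are assumptions made for the sake of the example.
import torch
import torch.nn as nn

POSE_DIM = 72   # e.g. 24 joints x 3 rotation values (assumed)
WINDOW = 30     # frames of context fed to the model (assumed)

class NextFramePredictor(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.GRU(input_size=POSE_DIM, hidden_size=256, batch_first=True)
        self.head = nn.Linear(256, POSE_DIM)

    def forward(self, pose_window):
        # pose_window: (batch, WINDOW, POSE_DIM)
        _, hidden = self.encoder(pose_window)
        return self.head(hidden[-1])   # predicted next pose

model = NextFramePredictor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One training step on a batch of mocap clips (random data stands in here).
clips = torch.randn(32, WINDOW + 1, POSE_DIM)
context, target = clips[:, :WINDOW], clips[:, WINDOW]
loss = nn.functional.mse_loss(model(context), target)
loss.backward()
optimizer.step()
```

At runtime, predicted frames like these can be blended with authored animation to smooth out transitions such as sitting-to-standing or running-to-crouching.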

Using AI to assist with character motion leads to more appropriate and accurate animations, improving the overall responsiveness and realism of both player characters and NPCs. Realistic character movement also has a knock-on effect, creating a more immersive and enjoyable experience for players.

Introducing Behavioral Cloning 

In most video games, the behaviors of NPCs are predictable. Characters are governed by decision trees, essentially large if-then-else clauses. Human brains, of course, do not work this way, and it is only a matter of time before players outsmart these decision trees and beat the video game.

In essence, these models represent the decision-making processes in a tree-like structure, with branches leading to different outcomes. Decision trees provide a structured way to control game variables, including the narrative, dialogues, and puzzle solving. However, there are a limited number of outcomes, meaning the more the game is played, the more predictable it becomes. 
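
For contrast, here is roughly what a conventional decision-tree NPC boils down to in code. The guard behavior below is a deliberately simplified, hypothetical example; once players have seen each branch, the character holds no surprises.

```python
# A deliberately simple, hypothetical decision-tree NPC. Every playthrough
# that reaches the same game state produces exactly the same behavior.
def guard_npc_action(distance_to_player: float, player_visible: bool,
                     health: float) -> str:
    if health < 0.2:
        return "retreat_to_cover"
    if player_visible:
        if distance_to_player < 5.0:
            return "melee_attack"
        elif distance_to_player < 30.0:
            return "shoot"
        else:
            return "advance_toward_player"
    return "patrol_route"

print(guard_npc_action(distance_to_player=12.0, player_visible=True, health=0.8))
# -> "shoot", every single time
```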

To combat this, some studios are creating unbeatable ‘godlike’ NPCs. However, their omniscient game knowledge allows them to ‘cheat’, drawing on information a traditional NPC shouldn’t have, such as the layout of the entire game map. The outcome can be equally frustrating for players: whereas decision trees mean the player inevitably defeats the game, godlike AIs create video games that can be impossible to defeat.

NPCx has pioneered a new solution: Using AI to clone human player behavior onto NPCs, including actions, decisions, and interactions. 

Behavioral cloning uses AI to watch humans playing a game for a period of time—depending on the game’s complexity—and analyzes how players respond to the game’s environment. It learns how humans react depending on hundreds of factors, such as their position, available in-game gadgets, and wound level. 

Generative adversarial networks (GANs) are the technology that makes behavioral cloning possible: they’re a type of unsupervised ML in which two neural networks compete with each other to continuously improve their predictions. Developers enhance gaming immersion and engagement by training GANs on real human data and translating it into realistic NPC actions. In fact, 88% of gamers surveyed for The Future of NPCs Report believed that advanced AI NPCs make games more interactive and enjoyable.
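
As a heavily simplified sketch of how such an adversarial setup might be wired together (the state encoding, dimensions, and single training step below are assumptions, and NPCx's production system is certainly more involved), one network proposes actions from the game state while a second network tries to tell recorded human behavior from the imitation:

```python
# Heavily simplified adversarial imitation sketch. A "policy" network maps
# game state to an action; a discriminator tries to tell recorded human
# (state, action) pairs from the policy's output. All dimensions are assumed.
import torch
import torch.nn as nn

STATE_DIM = 128   # position, available gadgets, wound level, etc. (assumed)
ACTION_DIM = 16   # encoded player action (assumed)

policy = nn.Sequential(nn.Linear(STATE_DIM, 256), nn.ReLU(),
                       nn.Linear(256, ACTION_DIM), nn.Tanh())
discriminator = nn.Sequential(nn.Linear(STATE_DIM + ACTION_DIM, 256), nn.ReLU(),
                              nn.Linear(256, 1), nn.Sigmoid())

opt_policy = torch.optim.Adam(policy.parameters(), lr=1e-4)
opt_disc = torch.optim.Adam(discriminator.parameters(), lr=1e-4)
bce = nn.BCELoss()

# One adversarial step on a batch of logged human play (random stand-ins here).
states = torch.randn(64, STATE_DIM)
human_actions = torch.randn(64, ACTION_DIM)

# 1) The discriminator learns to separate human pairs from policy pairs.
fake_actions = policy(states).detach()
d_loss = (bce(discriminator(torch.cat([states, human_actions], dim=1)),
              torch.ones(64, 1)) +
          bce(discriminator(torch.cat([states, fake_actions], dim=1)),
              torch.zeros(64, 1)))
opt_disc.zero_grad()
d_loss.backward()
opt_disc.step()

# 2) The policy learns to produce actions the discriminator mistakes for human.
g_loss = bce(discriminator(torch.cat([states, policy(states)], dim=1)),
             torch.ones(64, 1))
opt_policy.zero_grad()
g_loss.backward()
opt_policy.step()
```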

Cloning players to this extent makes distinguishing between an NPC and a human player nearly impossible. Cloned characters can be accessed when their human player is absent, making it easier for friends to keep playing together. Additionally, esports players can monetize their clones by selling them, offering diverse gaming experiences and new revenue streams for individuals and gaming companies.

Likewise, behavioral cloning has notable implications for the metaverse: for virtual worlds to be worth visiting, they need to be populated. Through behavioral cloning, the metaverse can be inhabited by all kinds of NPCs, from the guy walking in the park to the barista at the coffee shop. As the distinction between human and AI interactions blurs, this offers users a dynamic and captivating experience.

However, as with all uses of AI, there are consent issues around data privacy and monitoring, and the White House recently released a Blueprint for an AI Bill of Rights. So, for gaming studios to offer and benefit from this use of AI, they need to ensure they’re following the established procedures and recommendations.

AI’s influence within video game creation extends from revolutionizing the mocap cleaning process to enhancing character animations and cloning intelligent and adaptive NPCs. NPCx’s impressive technology is not only enriching the gaming experience for players but is also overhauling the game-making process, saving significant amounts of time and money for studios. As technology advances, the synergy between AI and video game development holds the promise of even more stimulating and interactive gaming landscapes in the future.

Michael Puscar is the co-founder and CTO of NPCx, a company using AI to make non-player characters in video games more lifelike.

Disclosure: This article mentions a client of an Espacio portfolio company.
