With rapid developments in computer vision technology, machine learning is being used across the video game industry, especially in virtual reality. Developments in VR games are quickly changing the way we game and the way we socialize. It wouldn’t be much of a stretch to say that VR is the future of gaming.
At the same time, if you’re a fan of classic console games, there are numerous ways that AI can make the console games we love even more interactive and immersive. In this article, we will give five predictions for the future of AI in gaming and how developments in machine learning can help developers build better games.
Note: This article is a collaborative effort between Limarc Ambalina and Boon Tan (McIntire, UVA). Sections 1 to 4 were written by me (Limarc), while section 5 and the accompanying research were provided by Boon.
A big area of game development that takes a lot of time and effort is voice recording for spoken dialogue. Game devs have to hold auditions for voice actors, record scenes, and process the audio. Some studios even go the extra mile and record voice acting in multiple languages, which takes an enormous amount of time. Personally, I love voice acting in video games; real human actors give the game a personal touch that really resonates. However, not all development studios have the budget to hire voice actors.
Until now, indie developers had no easy way to get spoken dialogue into their games at scale. Luckily, a synthetic voice company called Replica Studios may have created the perfect solution. In a nutshell, Replica Studios uses sophisticated neural networks trained to mimic human voices. On the front end, you simply type in the text you want the synthetic voice to say and can even specify an emotion for how the line is performed.
Below is a sample of the currently available voice quality. This dialogue has been created to show what synthetic voices could sound like in a real game.
“Agartha” — From Replica Studios
Needless to say, most people wouldn’t even be able to tell the difference between these voices and real human voices. With that said, synthetic voices are surely going to become a large part of the game development industry in the near future.
While this may be further in the future, I imagine and hope that virtual reality games, or even regular console games, will become more immersive and dynamic. At the moment, most game developers employ writers to create storylines and dialogue.
In RPGs like Skyrim or The Witcher, the player is given agency in the form of pre-written dialogue options, but I imagine future games will employ dialogue systems that mimic real life.
Screenshot from The Elder Scrolls V: Skyrim, image via IGDB.com
Instead of walking up to an NPC and choosing from an arbitrary list of dialogue options, players would be able to walk up to an NPC and say anything they want. Using a speech-to-text model, the system would convert the player’s speech into text. That text would then be analyzed for its semantics and sentiment.
After that, a generative neural network would create the most appropriate NPC response, based on information it has been given or even personality types it has been programmed to mimic. The NPC response text would be converted into spoken dialogue using a text-to-speech synthetic voice system and played back to you (just like a normal conversation). In a way, every NPC would act like a chatbot, only much more intuitive and reactive.
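A toy sketch of that loop might look like the following. Every stage here is a deterministic placeholder: the function names, the keyword-based mood check, and the Guard persona are all invented for illustration, standing in for real speech-to-text, language-understanding, generative, and text-to-speech models.

```python
# Hypothetical sketch of the NPC conversation loop described above.
# Every function body is an invented placeholder for a real model.

def speech_to_text(audio: bytes) -> str:
    # Placeholder: a real STT model would transcribe the audio here.
    return audio.decode("utf-8")

def analyze(text: str) -> dict:
    # Placeholder semantics/sentiment: a naive keyword check standing
    # in for a trained language-understanding model.
    friendly = {"hello", "thanks", "friend"}
    mood = "friendly" if friendly & set(text.lower().split()) else "neutral"
    return {"text": text, "mood": mood}

def generate_reply(analysis: dict, persona: dict) -> str:
    # Placeholder generative model conditioned on the NPC's persona.
    if analysis["mood"] == "friendly":
        return f"{persona['name']}: Well met, traveler!"
    return f"{persona['name']}: State your business."

def text_to_speech(text: str) -> bytes:
    # Placeholder: a TTS engine would synthesize audio here.
    return text.encode("utf-8")

def npc_turn(player_audio: bytes, persona: dict) -> bytes:
    # The full pipeline: STT -> analysis -> generation -> TTS.
    transcript = speech_to_text(player_audio)
    analysis = analyze(transcript)
    reply = generate_reply(analysis, persona)
    return text_to_speech(reply)

guard = {"name": "Guard"}
print(npc_turn(b"Hello friend", guard).decode("utf-8"))
```

In a shipped game, each placeholder would be swapped for a trained model, and latency across the four stages, not just accuracy, would be the hard constraint.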
Screenshot from The Witcher 3: Wild Hunt, image via IGDB.com
You may be thinking that this sounds complicated and nearly impossible, given that all of those processes normally take a lot of time. However, Replica Studios (mentioned in section 1) is already working on a TTS API that can convert in-game dialogue into synthetic speech in a matter of seconds.
While the prototype and demos for Replica Studios’ TTS API are still confidential, I was given special access to see what the technology could do. Trust me when I say it is incredibly fast and impressive. Once launched, it will have a major impact on the gaming industry, especially for indie game developers working on tight budgets.
To learn more about Replica Studios, visit their website or read this interview with the CEO.
Another way that machine learning is improving video games is in the advancement of hand gesture recognition or hand tracking. For our purposes in this article, hand tracking refers to a VR headset’s ability to recognize and track your hand movements via the headset cameras. Earlier this year, Oculus released full hand tracking capabilities on their wireless VR headset, the Oculus Quest. While the tech is still in beta, it is quite accurate and breathtaking when you first experience it. We will very likely see more and more VR games adopt this technology.
Screenshot from Arizona Sunshine on the Oculus Quest, image via IGDB.com
However, one major issue with games that use hand tracking is that you lose the ability to feel tension or vibration. In games like the zombie shooter Arizona Sunshine, where you hold a weapon, motion controllers make the game feel more realistic, since you can simulate the gripping of a gun and the recoil of the bullet through controller vibration.
The best scenario would be the combination of hand tracking and thin haptic gloves. These are gloves that you can wear while playing the game that can simulate vibration and tension, making you feel like you are holding an object in real life. We will likely see more development of both haptic glove and hand tracking technology in the future of VR gaming.
Fortnite: Battle Royale, image via epicgames.com
Since the video game industry has adopted a culture of updates and patches, video games can change drastically even after they are officially released. Whether or not this culture is beneficial is widely debated. However, thanks to big data, game developers are able to collect insights and adjust their games for their audience.
One great example of this is Fortnite, the battle royale game that shook the world. While developer Epic Games has never publicly stated how they use data analytics to inform their decisions, their system has the ability to track details like how often players land in certain areas of the map, what character skins (avatars) are being used the most, or even what weapons and items are being used most frequently. Since the Fortnite map is constantly changing with old areas being demolished and weapons being removed from the game, it is almost certain that data analytics of player behavior plays an important role in the developer’s decisions.
Animal Crossing: New Horizons
One practical use of text analytics in the video game industry would be analyzing reviews. We could collect reviews from review websites like Metacritic and GameSpot and run various analyses to get a general sense of what users think about a particular game. Word frequency analysis, for example, tallies how often each word appears across the entire collection of reviews and surfaces the most common terms.
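A minimal word frequency sketch, using made-up review snippets and a tiny hand-rolled stopword list (a real pipeline would use proper tokenization and stemming, so that "limit" and "limits" would merge into one term):

```python
import re
from collections import Counter

# Invented review snippets, standing in for scraped Metacritic text.
reviews = [
    "Great game but the one-island limit is restrictive.",
    "Tom Nook is a dictator; the limits ruin multiplayer.",
    "Relaxing and charming, though crafting limits get old.",
]

def word_frequencies(texts, stopwords=frozenset(
        {"the", "is", "a", "and", "but", "though"})):
    # Lowercase, split into words, drop stopwords, and tally the rest.
    words = []
    for text in texts:
        words.extend(w for w in re.findall(r"[a-z']+", text.lower())
                     if w not in stopwords)
    return Counter(words)

freq = word_frequencies(reviews)
print(freq.most_common(3))
```

Even on three fake reviews, the complaint vocabulary ("limit"/"limits") floats to the top, which is exactly the signal a word cloud visualizes.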
This can be combined with sentiment analysis to see which words are associated with positive or negative feedback, allowing developers to pinpoint which areas to strengthen. For example, in my sentiment analysis of Metacritic reviews of Animal Crossing: New Horizons (ACNH), negative reviews are associated with words like limits, dictator, and restricted, suggesting these users are unhappy with the game design (see Figure 1). Using this information, Nintendo could perhaps launch different marketing campaigns, or even different games, for this particular group of players.
Figure 1: Negative sentiment word cloud
In addition, text analytics on reviews becomes more useful when complemented with other dimensions, such as ratings. By combining each review’s sentiment score with its associated rating, we can see more precisely what users think about the game, and whether they are truly happy or disappointed with it. Google search interest can also serve as a good starting point for gauging general interest in a game and building a more holistic view of user feedback. In the case of ACNH, search interest has been climbing since January 2020 (Figure 2), yet the ratings on Metacritic are low.
Figure 2: Google search interest of Animal Crossing
This can be a prompt to look more closely at who is writing the reviews, and whether this group of players is a true representation of the larger ACNH player base.
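One way to flag reviews whose written sentiment disagrees with their numeric rating is a simple consistency check. This sketch assumes sentiment scores in [-1, 1] from some sentiment model and Metacritic-style 0-10 ratings; all numbers are invented.

```python
# Invented example data: each review pairs a 0-10 rating with a
# text-sentiment score in [-1, 1].
reviews = [
    {"rating": 2, "sentiment": 0.6},   # positive text, very low rating
    {"rating": 9, "sentiment": 0.7},   # consistent
    {"rating": 1, "sentiment": -0.8},  # consistent
]

def is_mismatched(review, tolerance=1.0):
    # Map the 0-10 rating onto [-1, 1], then flag reviews where the
    # written sentiment and the numeric rating strongly disagree.
    expected = (review["rating"] / 10) * 2 - 1
    return abs(expected - review["sentiment"]) > tolerance

flags = [is_mismatched(r) for r in reviews]
print(flags)
```

A cluster of mismatched reviews (positive text, minimum rating, or vice versa) is one concrete signal that the reviewers may not represent the wider player base.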
From the perspective of game review websites, text analytics can be used to identify keywords associated with a game, particularly words that most frequently appeared in user reviews. Search engines in these websites can then be enhanced with these keyword associations, allowing website users to search for games using words that are a more accurate representation of other users’ feedback about the game.
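A hand-rolled TF-IDF sketch of that idea scores words that are frequent in one game’s reviews but rare across the rest of the catalog; those become the keywords a search engine could associate with the game. The games and review text below are invented, and a production system would more likely use a library such as scikit-learn.

```python
import math
import re
from collections import Counter

# Invented per-game review snippets (illustrative only).
game_reviews = {
    "Animal Crossing": [
        "relaxing island life with charming villagers",
        "relaxing crafting and decorating",
    ],
    "DOOM Eternal": [
        "fast brutal demon shooting",
        "brutal combat and fast movement",
    ],
}

def keywords(game, corpus, top_n=2):
    # TF-IDF: weight words that appear often in this game's reviews
    # (term frequency) but in few other games' reviews (inverse
    # document frequency, smoothed).
    docs = {
        g: Counter(re.findall(r"[a-z]+", " ".join(texts).lower()))
        for g, texts in corpus.items()
    }
    n_docs = len(docs)
    scores = {}
    for word, tf in docs[game].items():
        df = sum(1 for d in docs.values() if word in d)
        scores[word] = tf * math.log((1 + n_docs) / (1 + df))
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:top_n]

print(keywords("Animal Crossing", game_reviews))
```

Note how a word shared by every game ("and") scores zero, while "relaxing" dominates for Animal Crossing: that is the property that makes TF-IDF keywords useful for search.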
Those were our five predictions for the future of AI in the gaming industry. New advancements in TTS, computer vision, and sentiment analysis are happening rapidly. Furthermore, we are seeing great things coming out of augmented reality, virtual reality, and mixed reality technologies. Be sure to sign up for the Hacker Noon newsletter to keep up with all the latest in AI and the gaming industry.
Previously published on: https://www.aitimejournal.com/@limarc.ambalina/5-predictions-for-the-future-of-ai-in-the-gaming-industry