How AI is changing gaming tech in 2025

AI-Generated Summary

The 2025 Game Developers Conference in San Francisco highlights AI's growing role in gaming, from tools enabling developers to craft worlds via text prompts to in-game assistants like Tencent's AI companion in FPS games. AI aims to enhance player experiences and streamline development processes, though concerns about job displacement and data sources persist. Indie creators showcase innovative control methods, such as emotion-sensing bands that adapt gameplay based on player feelings, promising ultra-personalized experiences. Despite AI advances, the industry remains fundamentally driven by human creativity and ideas, emphasizing that technological innovation complements rather than replaces human ingenuity in game development.

📜 Full Transcript

[Music] San Francisco, home to the annual Game Developers Conference. GDC, as it's known, isn't like other game shows; it's not really about showing off new games. "I think every type of developer is here, you know, whether you're a writer, whether you're working in narrative design, game design, level design, and this is the one chance that everyone can get together."

Trends come and go in video games, and the big one this year is everywhere you look on the expo floor: there's an AI solution to this or an AI solution to that. But the big question is, what difference will all of this AI make to the games that we actually play?

AI's presence is being felt in a number of different ways, from tools that allow developers to create elements of video game worlds just by using text prompts ("Forgotten your own company, have you?" "No. No. I'm... I'm with Fazbear Financial." "Interesting. Because I could swear you told me Emberstone Enterprises before.") to in-game characters that react in a more realistic way to the player ("Bloom says you're the one to help us tweak the plan.").

"AI and gaming particularly relates to how AI could improve gamers' experience, by helping them play the games, or how it could enable developers to make games faster and cheaper. There's definitely a tension, and firstly it's around jobs, where AI could take developers' jobs, and secondly around training data: where is the AI getting all the stuff that it uses to be able to generate, say, a new skin or a new character outfit? And if you imagine the traditional game developer in the middle, they're squeezed by AI on one side, the robots making stuff, and user-generated content on the other, the people making stuff. So traditional game development is going through a very challenging time."

"Bravo 2, lead the way." "Copy." One company embracing AI is Chinese tech behemoth Tencent. It's introduced an in-game assistant called Backle, the "first FPS AI companion who understands human language". That trips off the tongue. "In position." The AI companion will be introduced into first-person shooter Arena Breakout. Its developers say that, unlike regular in-game teammates, this one can understand complex spoken instructions. "Run to those sandbags up front for cover, then to the car ahead." "Copy that." "I think in the future we can put this tech in maybe open-world RPGs. So imagine you can walk around the city and talk to everybody by using natural language." "Search for a green box." "Copy that." We'll find out if this AI makes a good teammate when the feature launches later this year. "Found it."

The GDC show floor is packed with indie developers with out-of-the-ordinary ideas for games and ways of controlling them ("Stand up. Stand up. Stand up."), including patting a giant kitty, umbrellas as controllers, as well as seesaws.

And they're not alone. Keyboard and mouse, gamepad: these are the things that we use to control video games, and anybody that's played a game will be familiar with them. But what if there was a different method of interacting with the action on the screen? That's what this band on my wrist is doing. It's fitted with a variety of different sensors which are measuring my emotions; it actually has more sensors in it than you would find in a smartwatch. The onscreen action will alter depending on how I feel. Now, this is a horror game demo. What happens if I get frightened? Called Oroide, it can be used to create a relaxing experience or, as we see here, something designed to do the opposite.
And I can see the emotions that I'm feeling on this screen here. Spooky doors. Oh, a mannequin. Okay, he's not doing anything. An even creepier factory space. Whoa. "If you want to survive, stay calm." If I don't want this mannequin to get me, I've got to try and calm down. And there's another one here. Okay, I'm going to make a run for it. Let's go for the door. Open. Open. Open.

On top of heart rate and skin temperature, the band measures galvanic skin response, which its developers say allows it to assess the emotional state of the player. Combined with generative AI, changing gameplay elements based on how the player is feeling could lead to new types of player experience. "Because the idea is to be able, in the near future, to have a really ultra-personalized game that measures your emotion and adapts itself depending on what the game design is: if the designer wants to stress you more, or to relax you, or to maintain you in a certain spectrum of emotions." The band is currently being pitched to developers.

The breadth of creativity on show at GDC 2025 demonstrates that even in the age of AI, the industry is still driven by ideas and human invention.
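The closed-loop idea described above (biosignals in, gameplay adjustments out) can be sketched in a few lines. The sketch below is purely illustrative and is not how Oroide or its developers actually implement it: the sensor ranges, weights, and the `estimate_arousal` / `adjust_intensity` helpers are all hypothetical assumptions, shown only to make the "keep the player in a target emotional band" concept concrete.

```python
from dataclasses import dataclass


@dataclass
class BiosignalSample:
    """One reading from a hypothetical wrist band (names and units are illustrative)."""
    heart_rate_bpm: float       # beats per minute
    skin_temp_c: float          # skin temperature, degrees Celsius
    gsr_microsiemens: float     # galvanic skin response (electrodermal activity)


def estimate_arousal(sample: BiosignalSample) -> float:
    """Fuse the raw signals into a rough 0..1 arousal score.

    A real product would use calibrated, per-player models; this simple
    linear blend only illustrates the idea of combining several sensors.
    """
    hr = min(max((sample.heart_rate_bpm - 60.0) / 60.0, 0.0), 1.0)   # 60-120 bpm -> 0..1
    gsr = min(max((sample.gsr_microsiemens - 1.0) / 9.0, 0.0), 1.0)  # 1-10 uS -> 0..1
    temp = min(max((34.0 - sample.skin_temp_c) / 4.0, 0.0), 1.0)     # cooler skin -> more stress
    return 0.5 * hr + 0.35 * gsr + 0.15 * temp


def adjust_intensity(current_intensity: float, arousal: float,
                     target: float = 0.6, gain: float = 0.3) -> float:
    """Nudge the game's scare intensity toward a designer-chosen arousal target.

    If the player is calmer than the target, ramp the horror up; if they are
    already too stressed, back off. The result is clamped to 0..1.
    """
    new_intensity = current_intensity + gain * (target - arousal)
    return min(max(new_intensity, 0.0), 1.0)


# Example loop with invented readings: a calm player, then a frightened one.
intensity = 0.5
for sample in [BiosignalSample(72, 33.5, 2.0), BiosignalSample(105, 32.2, 7.5)]:
    arousal = estimate_arousal(sample)
    intensity = adjust_intensity(intensity, arousal)
    print(f"arousal={arousal:.2f} -> intensity={intensity:.2f}")
```

In this toy version the calm reading pushes the intensity up and the frightened reading pulls it back down, which is the same feedback behaviour the demo describes: the mannequins close in when you panic, and the designer can instead tune the loop to relax the player or hold them in a chosen emotional range.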
