It doesn’t take a particularly keen eye to notice that AI is the star of CES 2024. At a certain point, every single conversation I had at the giant technology show led back to AI and how it’s going to change the way we use our computers and even play our games.
The core of that trend traces back to the Intel Core Ultra processors and how Team Blue has worked an NPU, or Neural Processing Unit, into its laptop CPUs. This isn’t necessarily anything new, but given the sheer number of laptops powered by Intel, we’re about to enter a new phase of AI technology that really is going to change how pretty much every computer is built and used.
However, as gamers, we’ve kind of been here a while now; at least since 2018, when the Nvidia GeForce RTX 2080 first hit the market.
What AI Means For Games
AI has become a kind of shorthand for “anything that gets better over time,” but the real conversation is about generative AI. Basically, there are several AI models out there that can take a bunch of data and generate content from it, whether that’s writing, images, videos or, as we saw from Nvidia, conversations with an NPC in a game.
Generative AI has been a thing for a while, but ChatGPT blew up last year by letting people ask a bot questions and have it generate entire paragraphs of text. None of that is original content, but it did make AI a mainstream conversation. Generative AI, or GenAI, is capable of so much more than stealing content from artists and writers, though.
For instance, just take a look at DLSS, or Deep Learning Super Sampling. Originally, DLSS was an AI algorithm trained on a game-by-game basis so that Nvidia’s model could upscale each title from 1080p to 4K. That’s no longer quite how DLSS works – it’s now much less labor-intensive for game developers to implement – but the idea is the same. DLSS takes a whole bunch of data and uses it to generate image and motion information, which makes it just as much generative AI as ChatGPT.
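To make the contrast concrete, here’s a minimal sketch (not DLSS – Nvidia’s actual network isn’t public) of the “dumb” scaling that AI upscalers improve on: naive nearest-neighbor upscaling, which repeats pixels instead of generating new detail.

```python
# Toy illustration: nearest-neighbor upscaling with no AI involved.
# DLSS instead uses a trained neural network plus motion vectors to
# reconstruct detail the lower-resolution frame never contained; this
# just repeats each pixel, which is why naive upscales look blocky.

def nearest_neighbor_upscale(frame, scale):
    """Upscale a 2D grid of pixel values by an integer factor."""
    out = []
    for row in frame:
        # Repeat every pixel horizontally...
        scaled_row = [px for px in row for _ in range(scale)]
        # ...then repeat the whole row vertically.
        for _ in range(scale):
            out.append(list(scaled_row))
    return out

frame = [[10, 20],
         [30, 40]]
# A 2x2 frame becomes 4x4: higher resolution, zero new information.
print(nearest_neighbor_upscale(frame, 2))
```

Generating plausible new detail rather than copying old pixels is the part that makes DLSS a generative model.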
AI In Gaming Is Going to Get Bigger
At CES 2024 I spent a lot of time talking to Intel in an attempt to wrap my head around the appeal “AI PCs” would have for gamers. Because while AI PCs are going to be an absolute game changer for creative professionals and anyone who uses spreadsheets, it’s hard to imagine why a gamer would want an AI gaming PC.
Think about it this way. You’re playing Baldur’s Gate 3 and your inventory is constantly a mess. A CPU with a dedicated NPU could allow the developers to code in a way for the inventory to manage itself. You could have NPCs that answer questions you type or speak into your microphone, rather than making you choose from a handful of pre-written questions.
In fact, I saw the latter in action at CES, albeit a very early version. The Nvidia AI demo stuck out like a sore thumb because while it was technically dialogue, it sounded like I was talking to a text-to-speech bot. To be fair, this wasn’t a game that’s coming out this year, but rather a sneak peek at what generative AI could do for games like Cyberpunk 2077.
We asked Nvidia’s AI platform to review Cyberpunk 2077 a score out of 10 and uhhhh… #ces #ces2024 pic.twitter.com/lScWr0O4lR
— IGN (@IGN) January 12, 2024
Rather than having random NPCs that repeat the same line every time you click on them, you could strike up natural conversations about the game world with any passerby, because the model is trained on a compendium of the game’s universe. Imagine, for instance, that in-game books and lore entries aren’t just for you to read, but also train a large language model (LLM) that lets you have dynamic conversations with literally any NPC you encounter.
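The idea above can be sketched in a few lines. This is purely hypothetical – a real implementation would feed the lore entries to an LLM, while here a simple keyword lookup stands in for the model so the example stays self-contained; the lore text and the `npc_reply` function are made up for illustration.

```python
# A minimal sketch of lore-grounded NPC dialogue. In a shipping game
# these entries would be retrieved and passed to an LLM as context;
# a keyword match substitutes for the model here.

LORE = {
    "netwatch": "NetWatch polices the old Net and hunts rogue AIs.",
    "blackwall": "The Blackwall is the barrier sealing off the old Net.",
}

def npc_reply(question: str) -> str:
    """Return a reply grounded in the game's lore, or a fallback line."""
    q = question.lower()
    for topic, fact in LORE.items():
        if topic in q:
            return fact
    return "Never heard of it, choom."

print(npc_reply("What do you know about the Blackwall?"))
```

The key design point survives the simplification: the NPC’s answers come from the game’s own lore database rather than canned dialogue trees, so any passerby can respond to any topic the world documents.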
One of the benefits of AMD and Intel’s latest processors is that their neural processors can boost performance by letting the CPU prioritize tasks more intelligently. But imagine what that could do for gigantic open worlds, which are still the thing that pressures processors more than anything else. Picture a GenAI model running on your PC that creates dynamic, custom content depending on your choices in the game world.
There are a lot of ways for game developers to essentially fake this kind of thing already, but sometimes it can be hit and miss – just look at Mass Effect 3 or Starfield.
Imagine how AI could help populate Starfield’s thousands of barren planets…
Beyond the content of a game, AI has already made games look much better, and the improvements are going to keep coming. Nvidia’s graphics cards have had Tensor Cores for going on six years now – technically the first AI processors in mainstream computing. The benefits are already apparent if you look for them: games like Alan Wake 2 and Cyberpunk 2077 wouldn’t look nearly as good if their developers didn’t have a tool like DLSS to boost performance.
I don’t know if you’ve tried to run Cyberpunk 2077 with path tracing enabled but DLSS disabled, but even on the RTX 4090 it doesn’t run well. AMD and Intel have their own upscaling tech, but neither of them is as smooth as DLSS because they’re not using GenAI models in the same way. Team Green realized a while ago that AI was going to be necessary to make games look better, and DLSS has benefited as a result.
Everything Is Going to Get More Expensive, And That Sucks
It’s easy to look at the potential of AI in the best PC games and be awe-struck by the possibilities, but the fact is that AI models are extremely hard to run. Hell, that’s why every company that makes processors now has dedicated hardware to help with AI workloads built in. Apple has done it since the M1, Nvidia has been doing it since the RTX 2080, and Intel and AMD have joined the fray with Intel Core Ultra and Ryzen 8000 Mobility, respectively.
But even with that dedicated processor baked into the CPU and GPU packages, system requirements could see massive jumps. Anyone who plays with Stable Diffusion can tell you that 16GB of RAM simply won’t cut it. We’re going to see another generational jump in PC requirements where 32GB could very well end up being the new standard.
There’s nothing inherently wrong with that – after all, technology getting better is good – but the problem is that none of this is going to get cheaper. You can already see this with graphics cards. The RTX 4080 Super just came out at $999 and that was a price cut. It wasn’t that long ago that the GTX 980 launched for $549. The price for a high-end graphics card has doubled since 2014. That’s just a decade.
Meanwhile, people aren’t making double the money they were making back in 2014; I know I’m not. So as AI models get worked into mainstream games, there’s a very real danger that PC gaming is going to get even more expensive and exclusionary than it’s already grown in the last five years.
This is something I’ve discussed at length with both AMD and Nvidia and they both have the same answer – making graphics cards more powerful is getting more expensive and that cost is inevitably being passed down to the consumer.
There are some options out there. For instance, you could pick up the RTX 2060, which has Tensor Cores to help with AI, but it doesn’t exactly feel great to have to buy a five-year-old graphics card just because you can’t afford a new one.
I don’t know what the solution to this is, short of dismantling capitalism, but it’s going to be rough building a new PC for a long time.
The Silver Lining
AI has been set loose in the game industry and there’s really no stopping it now. As these new technologies work their way through games, the companies that make gaming hardware are going to need to adapt. Graphics cards, game consoles, and the games themselves are going to have to get better or risk being left in the dust.
We’re already seeing that happen; it’s why AMD and Intel saw what Apple was doing with the M1 and realized they needed to get on board. And it’s why they both created competitors to DLSS, even if it took both companies a while.
Six years later, upscaling tech is a necessity if you want to play AAA games. All three graphics card manufacturers now have it, basically after gamers saw what DLSS could do and understandably wanted that on their graphics cards too. And while FSR and XeSS can’t really hold a candle to DLSS right now, there’s no reason to suspect things will stay that way.
One of the less talked-about hardware features of the AMD Radeon RX 7900 XT and the rest of that lineup is that those cards do, in fact, have AI processors. It’s just that FSR doesn’t really use them. Instead, AMD is pitching them as a way to help with the workloads of creative professionals – stuff like smart fill in Photoshop or Stable Diffusion. But if you just want a graphics card for gaming, there’s a pretty large part of it sitting there doing essentially nothing a majority of the time.
AMD wouldn’t comment on whether it’s working on a new version of FSR that runs on a GenAI model rather than a glorified sharpening filter, but I wouldn’t be surprised if it were. And now that Team Red has worked AI tech into its CPUs, we could see the company start developing AI models to help with upscaling and frame generation.
This Is All Coming Sooner Than We Think
Whether I was talking to Nvidia, Intel or AMD, I kept hearing that AI is coming to games within a couple of years. That was a little hard to believe at first – I saw the Nvidia demo, after all, and how bad the voice acting was – but when I sit back and think about it, that seems like something that will be pretty easy to fix, especially with a game publisher like Ubisoft funding it.
And, well, it is. Nvidia told me it was working closely with Ubisoft to bring something like that Nvidia Ace demo into upcoming games, and while it’s probably not coming this year, we can expect it to start showing up in the very near future.
That’s without taking indie games into consideration too. Indie games on PC have been a huge source of experimentation with ray tracing and there’s no doubt in my mind that indie games using AI in some way are going to start showing up everywhere in the coming year or so. There’s going to be a lot of garbage, there always is, but hidden among the trash will be small glimpses of the future of AAA games, and I can’t wait to dig through and find them.
Copyright for syndicated content belongs to the linked Source : IGN – https://www.ign.com/articles/for-better-or-worse-ai-was-the-star-of-ces-2024