this post was submitted on 10 Sep 2024
212 points (92.7% liked)

$700, and the side-by-sides look barely different from my perspective. The chat seemed to have the same opinion.

[–] ech@lemm.ee 6 points 2 months ago (2 children)

Is there an ELI5 on how "AI upscaling" is less (or even equally) technologically demanding compared to just putting in better hardware?

[–] Thorry84@feddit.nl 13 points 2 months ago (2 children)

The game is rendered at a lower resolution, which saves a lot of resources. This isn't linear: lowering the resolution reduces the performance needed by a lot more than you might think, not just in processing power but also in bandwidth and memory requirements. Then dedicated AI cores, or even special AI scaler chips, are used to upscale the image back to the requested resolution. This is a fixed cost and can be done with little power, since the components are designed for exactly this task.
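
As a rough back-of-the-envelope sketch of the workload difference (assuming, as a simplification, that render cost scales with pixel count; real cost depends on much more than that):

```python
# Compare the raw pixel workload at common resolutions. This is a
# simplification: it assumes per-frame cost scales with pixel count.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

base_pixels = 1920 * 1080

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base_pixels:.2f}x 1080p)")

# 1080p: 2,073,600 pixels (1.00x 1080p)
# 1440p: 3,686,400 pixels (1.78x 1080p)
# 4K: 8,294,400 pixels (4.00x 1080p)
```

Rendering at 1080p instead of 4K means shading a quarter of the pixels per frame, before you even count the bandwidth and memory savings.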

My TV, for example, has an AI scaler chip which is pretty nice (especially after tuning) for showing old content on a large high-res screen. For games, applying AI upscaling to old textures also does wonders.

Now, even though this gets the AI label slapped on, it's nothing like LLMs such as ChatGPT. These are expert systems trained and designed to do exactly one thing. This is the good kind of AI that's actually useful instead of the BS AI like LLMs. These systems have their limitations, but for games the trade-off between detail and framerate can be worth it, especially since our bad eyes and mediocre screens wouldn't really show the difference anyway.

[–] chicken@lemmy.dbzer0.com 5 points 2 months ago

This is the good kind of AI that’s actually useful instead of the BS AI like LLMs

lol, trying to hedge against downvotes from the anti-AI crowd?

[–] ech@lemm.ee 2 points 2 months ago (1 children)

The game is rendered at a lower resolution, which saves a lot of resources.

Then dedicated AI cores, or even special AI scaler chips, are used to upscale the image back to the requested resolution.

I get that much. Or at least, I get that's the intention.

This is a fixed cost and can be done with little power, since the components are designed for exactly this task.

This is the part I struggle to believe/understand. I'm roughly aware of how resource-intensive upscaling is on locally hosted models. The tech/resources necessary to do that at 4K+ in real time (120+ fps) seem at least equivalent to, if not more expensive than, just rendering it that way in the first place. Are these "scaler chips" really that much more advanced/efficient?

Further questions aside, I appreciate the explanation. Thanks!

[–] Thorry84@feddit.nl 4 points 2 months ago (1 children)

Rendering a 3D scene is much more intensive and complicated than running a simple scaler. The scaler isn't advanced at all; it's actually very simple, and it can't be compared to running a large model locally. These are expert systems, not large models. They are very good at one thing and can do only that thing.

Like I said, the cost is fixed: if the scaler can handle upscaling 1080p to 2K at 120 fps, then it can always handle that. It doesn't matter how complex or simple the image is; it will always use the same amount of power. It reads the image, does the calculation, and outputs the resulting image.
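
A toy way to see that content independence (this uses plain nearest-neighbor upscaling in NumPy as a stand-in for a real scaler chip; a real AI scaler runs a small fixed set of operations instead, but the same property holds):

```python
import time
import numpy as np

def upscale_2x(frame):
    # Nearest-neighbor 2x upscale: duplicate every pixel along both
    # axes. The work depends only on the frame's dimensions, never
    # on what the frame contains.
    return np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)

flat = np.zeros((1080, 1920, 3), dtype=np.uint8)  # trivially simple frame
noisy = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)  # maximally busy frame

for name, frame in [("flat", flat), ("noisy", noisy)]:
    start = time.perf_counter()
    upscale_2x(frame)
    print(f"{name}: {(time.perf_counter() - start) * 1000:.1f} ms")

# Both frames take essentially the same time: the scaler's cost is a
# function of resolution alone, unlike rendering, where a busy scene
# costs far more than an empty one.
```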

Rendering a 3D scene is much, much more complex and power-intensive. The amount of power depends heavily on the complexity of the scene, and there is a lot more involved: it needs the GPU, CPU, memory, and sometimes even storage, plus all the bandwidth and latency in between.

Upscaling isn't like that; it's a lot simpler. So if the hardware is there, like the AI cores on a GPU or a dedicated upscaler chip, it will always work. And since that hardware normally isn't heavily used, the rest of the components are still available for the game. A dedicated scaler is the most efficient, but the cores on the GPU aren't bad either. That's why something like DLSS doesn't just work on any hardware; it needs specialized components, and different generations and parts have different limitations.

Say your system can render a game at 1080p at a solid 120 fps, but you have a 2K monitor, so you want the game to run at 2K. That demands a lot more from the system, so the computer struggles to hold 60 fps and has annoying dips in demanding parts. With upscaling, you run the game at 1080p at 120 fps and the upscaler takes that image stream and converts it into 2K at a smooth 120 fps. Now, the scaler may not get all the details right compared to running at native 2K, and it may make some small mistakes. But our eyes are pretty bad, and when we're playing games our brains aren't looking for those details; they're focused on gameplay. So the output is probably pretty good, and unless you compared it with native 2K side by side, you probably wouldn't even notice the difference. It's a way of getting that excellent performance without shelling out a thousand bucks for better hardware.
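
To put some illustrative numbers on that scenario (all timings here are invented to show the budget math, not benchmarks of any real GPU):

```python
# Frame-time budget math for the example above. All timings are
# made up for illustration; real numbers vary per game and GPU.
TARGET_FPS = 120
budget_ms = 1000 / TARGET_FPS  # 8.33 ms per frame at 120 fps

render_native_2k_ms = 14.0     # too slow: ~71 fps at best
render_1080p_ms = 7.0          # fits the budget
upscale_to_2k_ms = 1.0         # fixed cost of the scaler

total_upscaled_ms = render_1080p_ms + upscale_to_2k_ms
print(f"budget per frame: {budget_ms:.2f} ms")
print(f"native 2K:        {render_native_2k_ms:.1f} ms -> misses 120 fps")
print(f"1080p + upscale:  {total_upscaled_ms:.1f} ms -> holds 120 fps")
```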

There are limitations, of course. Not all games play to what the scaler is good at: it usually does well with realistic scenes but can struggle with more abstract stuff, producing annoying halos and weird artifacts. There are also limits to the bandwidth it can push, so for example not all GPUs can do 4K at a high framerate. If the game also uses the AI cores for other things, that can become an issue. And if the difference in resolution is too big, it becomes very noticeable and unplayable. Often there's also the option to use previous frames to generate intermediate frames, boosting the framerate at little cost. In my experience this doesn't work well and just makes the game feel like it's ghosting and smearing.
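
The ghosting makes sense if you look at the crudest possible version of frame generation: blending two real frames. Actual implementations use motion vectors and are much smarter, so this naive average is just a sketch of where the smearing comes from:

```python
import numpy as np

def naive_intermediate(frame_a, frame_b):
    # Crudest possible "generated" frame: a 50/50 blend of two real
    # frames. Anything that moved between them shows up twice at
    # half brightness -- i.e., ghosting.
    return ((frame_a.astype(np.uint16) + frame_b.astype(np.uint16)) // 2).astype(np.uint8)

# A white square that moves 40 px to the right between frames:
frame_a = np.zeros((100, 200), dtype=np.uint8)
frame_b = np.zeros((100, 200), dtype=np.uint8)
frame_a[40:60, 20:40] = 255
frame_b[40:60, 60:80] = 255

mid = naive_intermediate(frame_a, frame_b)
print(mid[50, 30], mid[50, 70])  # 127 127: two half-bright ghosts,
                                 # not one square halfway between
```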

But when used properly, it can give a nice boost basically for free. I've even seen cases where a game could run at native resolution with lower quality settings and a high framerate, but looked better rendered at a lower resolution with higher quality settings and then upscaled: the extra effects outweighed the small loss of fidelity.

[–] ech@lemm.ee 2 points 2 months ago

That is interesting. Thanks for the extra info!

[–] yamanii@lemmy.world 3 points 2 months ago (1 children)

It started as good tech to make GPUs last longer, but now it's a crutch that even top-notch hardware like a 4090 needs to achieve playable performance with ray tracing at high resolutions. And that hardware is already way overpriced; imagine the price of something that could do it natively.

[–] ech@lemm.ee 1 points 2 months ago

Huh, I wasn't aware that 4090s use similar tech. That sheds light on a few things. Thanks!