this post was submitted on 10 Dec 2024

PC Gaming

[–] RightHandOfIkaros@lemmy.world 17 points 1 week ago (1 children)

This is great, and I hope this technology can be implemented on older hardware that just barely misses today's high system requirements.

I hope this is not used by developers as a crutch to hide really bad optimization and performance, as they have already been doing with upscalers like FSR/DLSS.

[–] vrighter@discuss.tchncs.de 22 points 1 week ago (2 children)

no, I fucking hope not. Older games rendered an actual frame. Modern engines render a noisy, extremely ugly mess and rely on temporal denoising and frame generation (which is why most modern games only show off scenes with static scenery and a very slow-moving camera).

Just render the damn thing properly in the first place!
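
A minimal sketch of the temporal accumulation being criticized, using hypothetical numpy arrays as frames; real engines also reproject the history buffer with motion vectors before blending, which this omits:

```python
import numpy as np

def temporal_accumulate(noisy: np.ndarray, history: np.ndarray,
                        alpha: float = 0.1) -> np.ndarray:
    """Exponential moving average over successive frames.

    A low alpha smooths the noise away, but any camera or object
    motion smears stale history into ghosting -- which is why demos
    favor static scenery and slow camera moves.
    """
    return alpha * noisy + (1.0 - alpha) * history

# usage: the history buffer is fed back in every frame
# history = temporal_accumulate(new_noisy_frame, history)
```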

[–] Lemming6969@lemmy.world 4 points 1 week ago (1 children)

Depends on what you want to render. A high-FPS target combined with fast movement, where the human eye is the bottleneck, is a perfect case for interpolation: in that scenario the bad frames aren't really seen.
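
To put a rough number on "the human eye is the bottleneck", a back-of-the-envelope sketch with illustrative framerates, not measurements:

```python
# One generated frame inserted between each pair of rendered
# frames doubles the output rate (illustrative numbers).
rendered_fps = 60
output_fps = 2 * rendered_fps
frame_time_ms = 1000 / output_fps
print(f"each interpolated frame is visible for {frame_time_ms:.1f} ms")
# -> 8.3 ms: during fast motion, too brief for most single-frame
# artifacts to register.
```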

[–] vrighter@discuss.tchncs.de 4 points 1 week ago* (last edited 1 week ago) (1 children)

no, it depends on how you want to render it. Older games still had most of today's effects. It's just that everyone is switching to Unreal, whose focus isn't games anymore and which, imo, looks really bad on anything except a 4090, if that. Nobody is putting in the work for an optimized engine; there is no "one size fits all". They do this to save money in development, not because it's better.

ffs even the noisy image isn't always at native resolution anymore.

[–] Lemming6969@lemmy.world 3 points 1 week ago (1 children)

Context-aware interpolation with less overhead is a cool technology compared to context-unaware averaging. How it ends up implemented in various engines is a different topic.
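
A rough sketch of that contrast; the function names and the nearest-pixel motion-vector warp are hypothetical simplifications, and real context-aware interpolation adds optical flow, occlusion handling, and blending:

```python
import numpy as np

def context_unaware_average(frame_a: np.ndarray, frame_b: np.ndarray):
    # Blends pixels regardless of motion, so moving edges
    # turn into double images (ghosting).
    return 0.5 * (frame_a + frame_b)

def context_aware_midframe(frame_a: np.ndarray, motion: np.ndarray):
    # Samples frame_a half a step back along each pixel's motion
    # vector (shape (h, w, 2)), so moving content lands roughly
    # where it belongs at t = 0.5.
    h, w = frame_a.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    sy = np.clip(np.round(ys - 0.5 * motion[..., 1]).astype(int), 0, h - 1)
    sx = np.clip(np.round(xs - 0.5 * motion[..., 0]).astype(int), 0, w - 1)
    return frame_a[sy, sx]
```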

[–] vrighter@discuss.tchncs.de 7 points 1 week ago

there shouldn't be any averaging! Just render the damn frame!

You can't tell me we could get something like MGSV on previous-gen hardware at 60 fps, and that hardware with 9 times the processing power can only render a lower-resolution, noisy image which is then upscaled and denoised... at 30 fps.

"But raytracing!!!"

If these are the compromises that need to be made just to shoehorn that in, then current hardware isn't really capable of realtime raytracing in the first place.
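
Taking the comment's figures at face value, the arithmetic is stark. MGSV did run 1920x1080 at 60 fps on PS4; the 9x factor is the comment's claim, and the 1080p-internal modern render is an assumption:

```python
# Shaded pixels per second, old case vs. new case.
old = 1920 * 1080 * 60   # MGSV on PS4: native 1080p at 60 fps
new = 1920 * 1080 * 30   # assumed 1080p internal render at 30 fps
hw_factor = 9            # claimed processing-power ratio
per_pixel = hw_factor * old / new
print(f"~{per_pixel:.0f}x more compute spent per rendered pixel")
# -> ~18x, before the upscale-and-denoise pass even runs.
```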

[–] RightHandOfIkaros@lemmy.world 1 points 1 week ago

I think you are misunderstanding, because I agree with you when the game's minimum hardware requirements are met.

I am saying I hope this technology can be used so that hardware below a game's minimum requirements could still get decently playable framerates on newer titles, with the obvious drawback of decreased visual quality. I agree that upscaling, particularly TAA and its related effects, should not be used to lower system requirements as cover for poor game design or ugly effects. But I think this could be useful for old systems, or perhaps for integrated graphics chips, depending on how the technology works. That was what I meant. Sorry I was not clear enough initially.