this post was submitted on 10 Jan 2025
18 points (87.5% liked)
PC Gaming
8877 readers
606 users here now
For PC gaming news and discussion. PCGamingWiki
Rules:
- Be Respectful.
- No Spam or Porn.
- No Advertising.
- No Memes.
- No Tech Support.
- No questions about buying/building computers.
- No game suggestions, friend requests, surveys, or begging.
- No Let's Plays, streams, highlight reels/montages, random videos or shorts.
- No off-topic posts/comments, within reason.
- Use the original source, no clickbait titles, no duplicates. (Submissions should be from the original source if possible, unless from paywalled or non-English sources. If the title is clickbait or lacks context, you may lightly edit the title.)
founded 2 years ago
MODERATORS
you are viewing a single comment's thread
How is it an admission of failure? They probably could design a GPU for that, but would you want to pay hundreds of thousands because the chip uses a full silicon wafer?
Do you think NVIDIA or AMD should have sat on that technology for decades until it was good enough for 4K 144 fps? Then you'd probably say it's not good enough because it can't do 8K 144 fps. Also, why 4K as your arbitrary limit? Most people are still on 1080p, so why not just say the hardware is good enough once it can do 1080p 60 fps?
Definitely not, since it can't even do 30 fps at 4K with all the bells and whistles and no DLSS. 1440p is probably not even going to hit 60 fps.
What? I'm pretty sure the technology they're using, frame warping, has been around for years, and it's used in VR, so you can just look it up and see what it does.
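For anyone curious what frame warping actually does: the core idea in VR timewarp is to take the last rendered frame and shift it to match a slightly newer head rotation, instead of rendering a whole new frame. Here's a minimal, hypothetical sketch of a yaw-only rotational warp (a small-angle, pinhole-camera approximation I wrote for illustration, not any vendor's actual implementation):

```python
import numpy as np

def timewarp(frame, yaw_delta_rad, fov_rad):
    """Approximate a small head rotation by shifting a rendered frame
    horizontally (rotational-only timewarp, small-angle approximation).

    frame: H x W (x C) image array
    yaw_delta_rad: how far the head has turned since the frame was rendered
    fov_rad: horizontal field of view used to render the frame
    """
    h, w = frame.shape[:2]
    # Pinhole model: focal length in pixels from the horizontal FOV.
    focal_px = (w / 2) / np.tan(fov_rad / 2)
    # Small-angle approximation: rotation ~ horizontal pixel shift.
    shift = int(round(yaw_delta_rad * focal_px))
    warped = np.zeros_like(frame)  # newly exposed edge stays black
    if shift >= 0:
        warped[:, shift:] = frame[:, :w - shift] if shift else frame
    else:
        warped[:, :w + shift] = frame[:, -shift:]
    return warped
```

Real implementations warp per-pixel on the GPU with full 3D rotation (and, with depth, positional reprojection), but the principle is the same: reuse the last frame, corrected for the newest pose.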