[–] MHLoppy@fedia.io 23 points 21 hours ago (3 children)

It covers the breadth of problems pretty well, but I feel compelled to point out that there are a few places where things are misrepresented in this post, e.g.:

Newegg selling the ASUS ROG Astral GeForce RTX 5090 for $3,359 (MSRP: $1,999)

eBay Germany offering the same ASUS ROG Astral RTX 5090 for €3,349.95 (MSRP: €2,229)

The MSRP for a base 5090 is $2k, but the MSRP for the 5090 Astral -- a top-end card used for overclocking world records -- is $2.8k. I couldn't quickly find the European MSRP, but my money's on it being more than €2.2k.
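As a quick sanity check on the markup math (a sketch; the $2,799 Astral MSRP is my assumption based on launch reporting, and the $3,359 listing price comes from the post):

```python
# Markup of the cited Newegg listing over two different MSRPs.
# The $2,799 Astral MSRP is an assumption from launch reporting;
# $1,999 is the base RTX 5090 MSRP the post compared against instead.
listed_price = 3359
base_5090_msrp = 1999
astral_msrp = 2799

def markup(price: float, msrp: float) -> float:
    """Percent markup of a listing price over its MSRP."""
    return (price / msrp - 1) * 100

print(f"vs base 5090 MSRP ($1,999): {markup(listed_price, base_5090_msrp):.0f}% over")  # ~68%
print(f"vs Astral MSRP ($2,799):    {markup(listed_price, astral_msrp):.0f}% over")     # ~20%
```

A ~20% scalper premium on a halo card is a very different story from the ~68% the post's comparison implies.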

If you’re a creator, CUDA and NVENC are pretty much indispensable, or editing and exporting videos in Adobe Premiere or DaVinci Resolve will take you a lot longer[3]. Same for live streaming, as using NVENC in OBS offloads video rendering to the GPU for smooth frame rates while streaming high-quality video.

NVENC isn't much of a moat right now, as both Intel's and AMD's encoders are roughly comparable in quality these days (including in Intel's iGPUs!). There are cases where NVENC might do something specific better (like 4:2:2 support for prosumer/professional use cases) or have better software support in a specific program, but for common use cases like streaming/recording gameplay the alternatives should be roughly equivalent for most users.
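If you want to verify that yourself, a sketch like this encodes the same clip with each vendor's HEVC hardware encoder at the same bitrate so the outputs can be compared side by side (assumes an ffmpeg build with all three encoders available, suitable hardware for each, and a placeholder input.mp4):

```python
# Encode one source clip with each vendor's HEVC hardware encoder so the
# outputs can be compared (visually, or scored with a tool like VMAF).
import subprocess

ENCODERS = {
    "nvidia": "hevc_nvenc",  # Nvidia NVENC
    "intel":  "hevc_qsv",    # Intel Quick Sync (including recent iGPUs)
    "amd":    "hevc_amf",    # AMD AMF/VCN
}

for vendor, encoder in ENCODERS.items():
    # Same source and same target bitrate for a fair-ish comparison.
    subprocess.run(
        ["ffmpeg", "-y", "-i", "input.mp4",
         "-c:v", encoder, "-b:v", "8M",
         f"out_{vendor}.mp4"],
        check=True,
    )
```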

as recently as May 2025 and I wasn’t surprised to find even RTX 40 series are still very much overpriced

Production of these apparently stopped for several months leading up to the 50-series launch; it seems unreasonable to harshly judge the pricing of a product that hasn't had new stock for an extended period of time (of course, you can still judge either the decision to stop production or the still-elevated pricing of the 50 series).


DLSS is, and always was, snake oil

I personally find this take crazy given that DLSS2+ / FSR4+, when quality-biased, average visual quality comparable to native for most users in most situations -- and that was with DLSS2 in 2023, not even DLSS3, let alone DLSS4 (which is markedly better on average). I don't really care how a frame is generated if it looks good enough (and doesn't come with other notable downsides like latency). This almost feels like complaining about screen space reflections being "fake" reflections. Like yeah, it's fake, but if the average player experience is consistently better with it than without it, then what does it matter?

Increasingly complex manufacturing nodes are getting expensive as all fuck. If it's more cost-efficient to spend some of that die area on specialized cores that do high-quality upscaling, rather than using all of it to render everything natively, then that's fine by me. I don't think branding DLSS (and its equivalents like FSR and XeSS) as "snake oil" is the right takeaway. If the options are (1) spend $X on a card that outputs 60 FPS natively or (2) spend $X on a card that outputs upscaled 80 FPS at quality good enough that I can't tell it's not native, then sign me the fuck up for option #2. People less fussy about static image quality and more invested in smoothness can be perfectly happy with 100 FPS and marginally worse image quality. Not everyone is as sweaty about static image quality as some of us in the enthusiast crowd are.
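To put rough numbers on that trade-off, here's a sketch of how much shading work upscaling saves at 4K output (the per-axis scale factors are the commonly cited ones for DLSS's modes; treat them as approximate):

```python
# Pixels actually rendered per frame at 4K output for common DLSS modes.
OUT_W, OUT_H = 3840, 2160  # 4K output resolution
MODES = {
    "Native":           1.0,
    "DLSS Quality":     2 / 3,   # commonly cited per-axis scale factors;
    "DLSS Balanced":    0.58,    # approximate values
    "DLSS Performance": 0.5,
}

native_pixels = OUT_W * OUT_H
for mode, scale in MODES.items():
    rendered = int(OUT_W * scale) * int(OUT_H * scale)
    share = rendered / native_pixels
    print(f"{mode:17s} {rendered / 1e6:4.1f} MP rendered ({share:.0%} of native)")
```

Quality mode shades roughly 44% of the pixels per frame, which is where that "80 FPS instead of 60" headroom comes from.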

There are some fair points here about RT (though I find exclusively using path tracing for RT performance testing a little disingenuous given the performance gap), but if RT performance is the main complaint, then why is the sub-heading "DLSS is, and always was, snake oil"?


obligatory: disagreeing with some of the author's points is not the same as saying "Nvidia is great"

[–] poopkins@lemmy.world 6 points 11 hours ago

Thanks for providing insights and inviting a more nuanced discussion. I find it extremely frustrating that in communities like Lemmy it's risky to write comments like this because people assume you're "taking sides."

The entire point of the community should be to have discourse about a topic and go into depth, yet most comments and indeed entire threads are just "Nvidia bad!" with more words.

Obligatory disclaimer that I, too, don't necessarily side with Nvidia.

[–] JuxtaposedJaguar@lemmy.ml 13 points 18 hours ago (1 children)

I don’t really care how a frame is generated if it looks good enough (and doesn’t come with other notable downsides like latency). This almost feels like complaining about screen space reflections being “fake” reflections. Like yeah, it’s fake, but if the average player experience is consistently better with it than without it then what does it matter?

But it does come with increased latency. It also disrupts the artistic vision of games. With MFG you're seeing more fake frames than real frames. It's deceptive, and like snake oil in that Nvidia isn't distinguishing between fake frames and real frames. I forget the exact comparison, but when they say "the RTX 5070 has the same performance as the RTX 4090" and that claim rests on 3 fake frames for every real frame, that's incredibly deceptive.
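The frame accounting behind that objection is easy to spell out (a sketch using the 4x multiplier, i.e. 3 generated frames per rendered frame, as described above):

```python
# With 4x Multi Frame Generation, 3 of every 4 displayed frames are
# interpolated: the FPS counter quadruples, but input is only sampled
# on rendered frames, so responsiveness still tracks the base rate
# (and frame generation adds a bit of buffering delay on top).
base_fps = 60       # frames the game actually simulates and renders
mfg_factor = 4      # 4x MFG: 1 rendered frame + 3 generated frames

displayed_fps = base_fps * mfg_factor            # what the FPS counter shows
generated_share = (mfg_factor - 1) / mfg_factor  # fraction of "fake" frames
input_interval_ms = 1000 / base_fps              # input sampling interval

print(f"FPS counter:   {displayed_fps}")                   # 240
print(f"fake frames:   {generated_share:.0%} of output")   # 75%
print(f"input sampled: every {input_interval_ms:.1f} ms")  # 16.7 ms, as at plain 60 FPS
```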

[–] FreedomAdvocate@lemmy.net.au 3 points 6 hours ago* (last edited 5 hours ago)

He’s talking about DLSS upscaling, not DLSS Frame Generation; upscaling doesn’t add latency.

[–] CheeseNoodle@lemmy.world 12 points 19 hours ago (1 children)

I think DLSS (and FSR and so on) are great value propositions, but they become a problem when developers use them as a crutch. At the very least, your game should not need them at all to run on high-end hardware at max settings; they should then be options for people on lower-end hardware to either lower settings or combine higher settings with upscaling. When they become mandatory, they stop being a value proposition, since the benefit stops being a benefit and starts just being necessary for baseline performance.

[–] FreedomAdvocate@lemmy.net.au 1 points 5 hours ago (1 children)

They’re never mandatory. What are you talking about? Which games can’t run on a 5090 or even 5070 without DLSS?

[–] KokoSabreScruffy@lemmy.world 1 points 5 hours ago

Correct me if I'm wrong, but maybe they meant when publishers/devs list hardware requirements for their games and include DLSS in the calculations. IIRC AssCreed Shadows and MH Wilds did that.