This post was submitted to the Technology community on 21 Jan 2024
321 points (97.1% liked)


Do PC gamers feel 12GB of VRAM is simply not enough for the money in 2024?

top 50 comments
[–] caseyweederman@lemmy.ca 243 points 11 months ago (2 children)

Remember when EVGA decided they would rather leave the market entirely than spend one more day working with Nvidia?

[–] downhomechunk@midwest.social 25 points 11 months ago (1 children)

I wish they had started putting out AMD products. PowerColor just doesn't feel like a flagship partner the way EVGA was to Nvidia.

[–] shasta@lemm.ee 14 points 11 months ago

I would've actually switched to AMD if EVGA did

[–] dependencyinjection@discuss.tchncs.de 25 points 11 months ago (1 children)
[–] frezik@midwest.social 72 points 11 months ago (9 children)

Yup. It was something like 90% of their revenue, but 25% of their profit.

[–] FiskFisk33@startrek.website 126 points 11 months ago (2 children)

GPUs haven't been reasonably priced since the 1000 series.

And now there's no coin mining promising to earn some of the money back.

[–] 9488fcea02a9@sh.itjust.works 18 points 11 months ago (1 children)

The new mining is AI... TSMC is at max capacity. They're not going to waste too many wafers making gaming GPUs when AI accelerators are selling for $30k each.

[–] Sibbo@sopuli.xyz 18 points 11 months ago (7 children)

You mean Nvidia GPUs? I got my 6750XT for 500€, and I think it's a good price for the performance I get.

[–] 2xar@lemmy.world 56 points 11 months ago (3 children)

That's still overpriced, I think, though much less egregious than what Nvidia is doing. The launch MSRP of the HD 7850, which was in the same category as the 6700 XT today (upper mid-tier), was $250. A few years prior, the 4850 started at $200. Even the RX 480 started at only $230. And those were all very decent cards in their time.

[–] Rakonat@lemmy.world 94 points 11 months ago (2 children)

Nvidia is overpricing their cards and limiting stock, acting like there's still a GPU shortage from all the crypto bros sucking everything up.

Right now, their competitors are beating them at hundreds of dollars below Nvidia's MSRP like for like; the only true advantages Nvidia has are ray tracing and arguably VR.

It's possible we're approaching another shortage with the AI bubble, though for the moment that seems to be pretty far off.

TL;DR: Nvidia is trying to sell a card at twice its value because of greed.

[–] Evilcoleslaw@lemmy.world 34 points 11 months ago* (last edited 11 months ago) (8 children)

They're beating AMD at ray tracing, upscaling (DLSS vs FSR), VR, and especially streaming (NVENC). For the latter, look at the newly announced beta partnership with Twitch and OBS, which will bring higher-quality transcoding and easier setup to Nvidia only, for now, plus AV1 encoding that's Nvidia-only at first.

The raw performance is mostly there for AMD, with the exception of RT, and FSR has gotten better. But Nvidia is doing Nvidia shit, using the software ecosystem to entrench itself despite the insane pricing.

[–] genie@lemmy.world 16 points 11 months ago

Couldn't agree more! Abstracting to the general economic case: those hundreds of dollars are a double-digit percentage of the overall cost! A double-digit % cost increase for a single-digit % performance gain doesn't quite add up, @nvidia :)

Especially with Google going with TPUs for its AI monstrosities, it makes less and less sense at large scale for consumers to pay the Nvidia tax just for CUDA compatibility, particularly with the entrance of things like SYCL that help programmers avoid vendor lock-in.
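To put rough numbers on that point, here's a tiny illustrative calculation (the prices and performance figures below are hypothetical, not benchmarks of any specific cards):

```python
# Hypothetical cards, purely to illustrate the cost-vs-performance point.
base = {"price": 500, "perf": 1.00}   # assumed baseline card
rival = {"price": 600, "perf": 1.05}  # assumed alternative: +$100 for +5% perf

cost_increase = (rival["price"] / base["price"] - 1) * 100  # 20.0
perf_increase = (rival["perf"] / base["perf"] - 1) * 100    # 5.0

print(f"+{cost_increase:.0f}% cost for +{perf_increase:.0f}% performance")
# -> +20% cost for +5% performance
```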

[–] UnfortunateShort@lemmy.world 81 points 11 months ago (10 children)

Why people no buy our GPU anymore?

Because I can get a whole fucking console for the price of a lower midrange GPU. My only hope is Intel's Battlemage at this point.

[–] darkkite@lemmy.ml 22 points 11 months ago

Yeah, but then you have to play on a console, without mods or cheap games.

Try buying a used GPU and gaming on a 1080p monitor; you'll get great graphics without spending a lot of money.

[–] GhostlyPixel@lemmy.world 12 points 11 months ago

I will hold onto my $700 3080 until it spits fire; I can't believe I was lucky enough to get it at launch.

[–] PlasmaDistortion@lemm.ee 76 points 11 months ago* (last edited 11 months ago) (9 children)

My RTX 4060 Ti has 16GB of VRAM. What on earth makes them think people would go for 12GB?

[–] lapommedeterre@lemmy.world 35 points 11 months ago (1 children)

Not being a power of 2 gives me displeasure.

[–] LOLjoeWTF@lemmy.world 62 points 11 months ago (1 children)

My Nvidia 1070 with 8GB of VRAM is still playing all of my games. Not everything gets Ultra, and my monitor isn't 4K. Forever I am the "value buyer". It's hard to put money into something that's only marginally better, though. I thought 16GB would be a no-brainer.

[–] MeatsOfRage@lemmy.world 66 points 11 months ago (5 children)

Exactly. People get too caught up in the Digital Foundry-ification of ultra max settings running at a perfect ~120 unlocked frames. Relax, my dudes, and remember the best games of your life were Perfect Dark with your friends running at 9 FPS.

1080p is fine, medium settings are fine. If the game is good you won't sweat the details.

[–] ABCDE@lemmy.world 26 points 11 months ago

> remember the best games of your life were Perfect Dark with your friends running at 9 FPS

The frame rate was shat on at the time, and with good reason: it was unplayable for me. Best times were 4-16 player local Halo multiplayer.

[–] Ragdoll_X@lemmy.world 16 points 11 months ago (3 children)

As someone who really doesn't care much for game graphics I feel that a comment I wrote a few months ago also fits here:

> I've never really cared much about graphics in video games, and a game can still be great with even the simplest of graphics - see the Faith series, for example. Interesting story, and it still has some good scares despite the 8-bit graphics.
>
> To me, many of these games with retro aesthetics (either because they're actually retro or because the dev decided to go with a retro style) don't really feel dated, but rather nostalgic and charming in their own special way.
>
> And many other people also don't seem to care much about graphics. Minecraft and Roblox are very popular despite having very simplistic graphics, and every now and then a new gameplay video about some horror game with a retro aesthetic will pop up in my recommendations; so far I've never seen anyone complain about the graphics, only compliments about them being interesting, nostalgic, and charming.
>
> Also, I have a potato PC, and it can't run these modern 8K FPS games anyway, so having games with simpler graphics that I can actually run is nice. But maybe that's just me.

[–] umbrella@lemmy.ml 12 points 11 months ago* (last edited 11 months ago) (2 children)

30fps is fine too in most games.

A friend of mine makes do with a GTX 960 @ 720p and is perfectly fine with it; the fun games run, even new ones.

Maybe an upgrade to Digital Foundry-perfect 120fps would be worth it if it weren't so damn expensive nowadays outside the US.

[–] Kazumara@feddit.de 52 points 11 months ago (10 children)

$600 for a card without 16 GB of VRAM is a big ask. I think getting an RX 7800 XT for $500 will serve you well for longer.

[–] DoctorButts@kbin.melroy.org 31 points 11 months ago

4070 is $600. That seems like total shit to me. That's why.

[–] Binthinkin@kbin.social 30 points 11 months ago (1 children)

You all should check prices comparing dual-fan 3070s to 4070s; there's a $40 difference on Amazon. Crazy to see. They completely borked their pricing scheme trying to get whales and crypto miners to suck their 40 series dry, and wound up getting blue-balled hard.

Aren't they taking the 4080 completely off the market too?

[–] TheGrandNagus@lemmy.world 15 points 11 months ago

> Aren't they taking the 4080 completely off the market too?

Apparently they stopped production of it months ago. Whatever still exists on shelves is only there because nobody has been buying them.

Honestly, this has been the worst 80-class Nvidia card ever. The GTX 480 was a complete joke, but even that managed to sell OK.

[–] altima_neo@lemmy.zip 30 points 11 months ago

The RAM is so lame. It really needed more.

Performance exceeding the 3090, but limited by 12 gigs of VRAM.

[–] AlexisFR@jlai.lu 25 points 11 months ago (3 children)

Wait, they didn't put the 4070 Super at 16 GB?

[–] AProfessional@lemmy.world 27 points 11 months ago

They clearly believe customers will always buy Nvidia over AMD, so why bother competing? Just make an annoyingly segmented lineup.

[–] Kbobabob@lemmy.world 21 points 11 months ago (1 children)

Nope. Even my 3080 Ti has 12GB. I was waiting for the 4000-series refresh, but I think I'll just wait and see what the 5000 series looks like.

[–] CosmoNova@lemmy.world 24 points 11 months ago (8 children)

I mean, yeah. When I'm searching for GPUs I specifically filter out anything with less than 16GB of VRAM. I wouldn't even consider buying this card, for that reason alone.

[–] the_q@lemmy.world 23 points 11 months ago

laughs in 6800XT

[–] Dra@lemmy.zip 21 points 11 months ago* (last edited 11 months ago) (8 children)

I haven't paid attention to GPUs since I got my 3080 on release day back in Covid.

Why has the acceptable amount of VRAM suddenly doubled vs. 4 years ago? I don't struggle to run a single game on max settings at high frame rates @ 1440p. What's the benefit that justifies the cost of 20GB of VRAM outside of AI workloads?

[–] Eccitaze@yiffit.net 27 points 11 months ago (1 children)

An actual technical answer: apparently it's because, while the PS5 and Xbox Series X are technically regular x86-64 architecture, they're designed so the GPU and CPU share a single pool of memory with no loss in performance. That makes it easy to quickly allocate a shitload of RAM for the GPU to store textures in. But it also means that as the games industry shifts from developing first for the PS4/Xbox One (which kept separate memory pools for the CPU and GPU) to the PS5/XSX, VRAM requirements are spiking, because it's a lot easier to port to PC if you just keep the assumption that the GPU can hold 10-15 GB of texture data at once instead of refactoring your code to reduce VRAM usage.
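To make that concrete, here's a back-of-the-envelope sketch of how fast textures alone can eat VRAM (every figure below is an illustrative assumption, not a measurement from any real game or engine):

```python
# Back-of-the-envelope texture-memory estimate. All figures are
# illustrative assumptions, not numbers from a real game.

def texture_bytes(width, height, bytes_per_texel, mip_overhead=4 / 3):
    """One texture plus its full mip chain (mips add roughly 33%)."""
    return width * height * bytes_per_texel * mip_overhead

# Assume a scene keeps 600 unique 2K materials resident, each with
# albedo, normal, and roughness maps (3 textures per material),
# block-compressed at ~1 byte per texel (e.g. BC7).
per_material = 3 * texture_bytes(2048, 2048, 1)
total_gib = 600 * per_material / 1024**3

print(f"~{total_gib:.1f} GiB of textures")  # ~9.4 GiB, before buffers,
# render targets, and geometry are even counted
```

On a unified-memory console that whole pool just lives in shared RAM; on PC it all has to fit in dedicated VRAM, which is why 12GB starts to pinch.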

[–] Asafum@feddit.nl 24 points 11 months ago (1 children)

Lmao.

We have your comment: "What am I doing with 20GB of VRAM?"

And one comment down: "It's actually criminal there is only 20GB of VRAM."

[–] Shirasho@lemmings.world 19 points 11 months ago (10 children)

I don't know about everyone else, but I still play at 1080p. It looks fine to me, and I care more about frames than fidelity. More VRAM isn't going to help me there, so it's not a factor when I'm looking at video cards. Ignoring the fact that I just bought a 4070, I wouldn't skip over a 4070 Super just because it has 12GB of VRAM.

This is a card that targets 1440p. It can pull its weight at 4K, but I'm not sure that's justification to slam it for not having the memory for 4K.

[–] miss_brainfart@lemmy.ml 17 points 11 months ago

> It can pull its weight at 4K, but I'm not sure that's justification to slam it for not having the memory for 4K.

There are many games that cut it awfully close with 12GB at 1440p; for some it's actually not enough. And when Nvidia pushes ray tracing as hard as they do, not giving us the little extra memory we need for that is just a dick move.

Whatever this card costs, 12GB of VRAM is simply not appropriate.

[–] mlg@lemmy.world 15 points 11 months ago

insert linus torvalds nvidia clip here

[–] hark@lemmy.world 13 points 11 months ago (3 children)

So many options, with small differences between them, all overpriced to the high heavens. I'm sticking with my GTX 1070 since it serves my needs, and I'll likely keep using it a few years beyond that out of spite. It cost $340 when I bought it in 2016, and I thought that was somewhat overpriced even then. According to an inflation calculator, that's about $430 in today's dollars.
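(For anyone who wants to check that figure, it's just cumulative CPI scaling; the rate below is an approximation chosen to match the commenter's ~$430.)

```python
# Approximate cumulative US inflation, 2016 -> early 2024 (~27% is an
# assumption that reproduces the ~$430 figure above).
price_2016 = 340
cumulative_inflation = 0.27

print(f"${price_2016 * (1 + cumulative_inflation):.0f}")  # -> $432
```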

[–] DingoBilly@lemmy.world 11 points 11 months ago* (last edited 11 months ago) (2 children)

What's going on? It's overpriced and completely unnecessary for most people. There's also a cost-of-living crisis.

I play every game I want to on high settings with my old 1070. Unless you're working on very graphically intensive apps or you're a PC-master-race moron, there's no need for new cards.
