this post was submitted on 21 Jan 2024
321 points (97.1% liked)
Technology
I haven't paid attention to GPUs since I got my 3080 on release day back during Covid.
Why has the acceptable level of VRAM suddenly doubled compared with four years ago? I don't struggle to run a single game on max settings at high frame rates at 1440p, so what's the benefit that justifies the cost of 20 GB of VRAM outside of AI workloads?
An actual technical answer: apparently it's because, while the PS5 and Xbox Series X are technically regular x86-64 architecture, their design lets the GPU and CPU share a single pool of memory with no loss in performance. That makes it easy to allocate a shitload of RAM for the GPU to store textures very quickly. But it also means that as the games industry shifts from developing first for the PS4/Xbox One X (both of which have separate pools of memory for CPU and GPU) to developing first for the PS5/XSX, VRAM requirements are spiking: it's a lot easier to port to PC if you just keep the assumption that the GPU can hold 10-15 GB of texture data at once, instead of refactoring your code to reduce VRAM usage.
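To give a rough sense of how quickly texture data eats VRAM, here's a back-of-the-envelope sketch. The texture size, format, and count are illustrative assumptions (uncompressed 4K RGBA8 textures with a full mip chain), not figures from any particular game engine:

```python
def texture_bytes(width, height, bytes_per_texel, mipmaps=True):
    """Approximate GPU memory for one 2D texture.

    A full mip chain (half-resolution copies down to 1x1) adds
    roughly one third on top of the base level.
    """
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mipmaps else base

# One uncompressed 4K RGBA8 texture (4 bytes/texel) with mips: ~85 MiB
one_4k = texture_bytes(4096, 4096, 4)

# A scene streaming ~150 such textures lands in the 12-13 GiB range,
# right in the 10-15 GB ballpark mentioned above.
scene = 150 * one_4k
print(f"{one_4k / 2**20:.1f} MiB per texture, {scene / 2**30:.1f} GiB per scene")
```

Real engines use block compression (BC7 is 1 byte/texel instead of 4), which is exactly the kind of VRAM-reduction work the comment says ports skip when the console's unified pool makes it unnecessary.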
Perfect answer thank you!
Lmao
We have your comment: "what am I doing with 20 GB of VRAM?"
And one comment down: "it's actually criminal there is only 20 GB of VRAM"
Lol
Current gen consoles becoming the baseline is probably it.
As games running on last gen hardware drop away, and expectations for games rise above 1080p, those Recommended specs quickly become an Absolute Minimum. Plus I think RAM prices have tumbled as well, meaning it's almost Scrooge-like not to offer 16GB on a £579 GPU.
That said, I think the pricing is still much more of an issue than the RAM. People just don't want to pay these ludicrous prices for a GPU.
I'm maxed out on VRAM in VR for the most part with a 3080. It's my main bottleneck.
If only game developers optimized their games...
The newest hardware is getting powerful enough that devs are banking on people just buying better cards to play their games.
GPU rendering and AI.
Perhaps not the biggest market, but consumer cards (especially Nvidia's) have been the preferred hardware in the offline rendering space (i.e. animation and VFX) for a good few years now. They're the most logical investment for freelancers and small-to-mid studios thanks to hardware ray tracing. CUDA and later OptiX may be anecdotal on the gaming front, but they completely changed the game over here.
Personally I need it for video editing & 3D work but I get that's a niche case compared to the gaming market.