this post was submitted on 05 Nov 2025
562 points (99.1% liked)

"I've been saving for months to get the Corsair Dominator 64GB CL30 kit," one beleagured PC builder wrote on Reddit. "It was about $280 when I looked," said u/RaidriarT, "Fast forward today on PCPartPicker, they want $547 for the same kit? A nearly 100% increase in a couple months?"

[–] brucethemoose@lemmy.world 26 points 1 day ago* (last edited 1 day ago) (19 children)

I just got a 2x64GB 6000 kit before its price skyrocketed by like $130. I saw other kits going up, but had no clue I timed it so well.

...Also, why does "AI" need so much CPU RAM?

In actual server deployments, pretty much all inference work is done in VRAM (read: HBM/GDDR); they could get by with almost no system RAM. And honestly most businesses are too dumb to train anything that extensively. ASICs that would use, say, LPDDR are super rare, and stuff like Hybrid/IGP inference is the realm of a few random folks with homelabs... Like me.
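
For a sense of the scale involved in homelab hybrid inference, here's a back-of-envelope sketch of the weight footprint at different quantizations (the 70B model size is an illustrative assumption, not a figure from the thread):

```python
# Rough weight-footprint estimate for a local LLM, where anything that
# doesn't fit in VRAM spills over into system RAM during hybrid inference.
# The model size and precisions below are illustrative assumptions.

def weight_gib(params_billion: float, bits_per_weight: int) -> float:
    """Approximate GiB needed to hold the model weights alone (no KV cache)."""
    return params_billion * 1e9 * bits_per_weight / 8 / 2**30

params = 70  # e.g. a 70B-parameter model (assumption)
for label, bits in [("FP16", 16), ("8-bit quant", 8), ("4-bit quant", 4)]:
    print(f"{label}: ~{weight_gib(params, bits):.0f} GiB of weights")

# FP16: ~130 GiB of weights
# 8-bit quant: ~65 GiB of weights
# 4-bit quant: ~33 GiB of weights
```

On numbers like these, a 2x64GB kit is roughly the entry point for holding a large quantized model once it outgrows consumer VRAM.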

I think 'AI' might be an overly broad term for general server buildout.

[–] humanspiral@lemmy.ca 1 points 22 hours ago

> why does “AI” need so much CPU RAM?

It doesn't, really, though CPU inference is possible (if slow) at 256+ GB. The problem is that memory makers are prioritizing HBM (for AI) over DDR4/5 production.
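
To put rough numbers on "possible/slow": token generation is largely memory-bandwidth bound, since each generated token reads roughly the whole set of weights. A crude ceiling, using ballpark bandwidth figures as assumptions:

```python
# Crude upper bound on generation speed: one full pass over the weights per
# token, limited by memory bandwidth. Bandwidth figures are ballpark
# assumptions, not vendor specs.

def tokens_per_sec_ceiling(weight_gb: float, bandwidth_gb_s: float) -> float:
    return bandwidth_gb_s / weight_gb

weights_gb = 35  # e.g. a 70B model quantized to 4 bits (assumption)
platforms = [
    ("dual-channel DDR5 desktop", 80),
    ("multi-channel server DDR5", 460),
    ("HBM3 accelerator", 3000),
]
for name, bw in platforms:
    print(f"{name}: ~{tokens_per_sec_ceiling(weights_gb, bw):.0f} tokens/s ceiling")

# dual-channel DDR5 desktop: ~2 tokens/s ceiling
# multi-channel server DDR5: ~13 tokens/s ceiling
# HBM3 accelerator: ~86 tokens/s ceiling
```

That bandwidth gap is why the big buildouts chase HBM rather than commodity DDR, and why the shift in fab capacity is what's squeezing DDR4/5 prices.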
