this post was submitted on 03 Nov 2024
269 points (98.9% liked)


Panther Lake and Nova Lake laptops will return to traditional RAM sticks

[–] riskable@programming.dev 96 points 3 weeks ago (4 children)

Gelsinger said the market will have less demand for dedicated graphics cards in the future.

No wonder Intel is in such rough shape! Gelsinger is an idiot.

Does he think that the demand for AI-accelerating hardware is just going to go away? That the requirement for fast, dedicated memory attached to a parallel processing/matrix multiplying unit (aka a discrete GPU) is just going to disappear in the next five years‽

The board needs to fire his ass ASAP and replace him with someone who has a grip on reality. Or at least someone who has some imagination of how the future could be.
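For anyone who hasn't touched this stuff: the workload he's waving away is basically huge matrix multiplies running out of fast, GPU-local memory. A minimal sketch of that (assuming PyTorch is installed and a CUDA-capable card is present; it falls back to the CPU otherwise):

```python
# Minimal sketch: the core work an "AI accelerator" does is large matrix
# multiplies on fast, device-local memory. Assumes PyTorch is installed;
# uses a CUDA GPU if one is present, otherwise falls back to the CPU.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

# Two large matrices allocated in the GPU's dedicated memory (VRAM) when available.
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)

c = a @ b  # one big GEMM -- the thing dedicated GPUs and their memory exist for

print(f"4096x4096 matmul ran on: {device}")
```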

[–] TheGrandNagus@lemmy.world 71 points 3 weeks ago* (last edited 3 weeks ago) (2 children)

Gelsinger said the market will have less demand for dedicated graphics cards in the future.

Reminds me of decades ago when Intel didn't bother getting into graphics because they said pretty soon CPUs would be powerful enough for high-performance graphics rendering lmao

The short-sightedness of Intel absolutely staggers me.

[–] Buffalox@lemmy.world 49 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

CPUs would be powerful enough for high-performance graphics rendering lmao

And then they kept making 4-core desktop CPUs, even after phones had gone deca-core. 🤣🤣🤣

[–] Trainguyrom@reddthat.com 11 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

To be fair, the ARM SoCs in phones use big.LITTLE cores: the kernel enables/disables cores on the fly and moves work around so it's running either on the big high-performance cores or on the little low-power cores, depending on the power budget at that second. So effectively not all of those 6+ cores would be available and in use at the same time on a phone.
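You can actually watch the kernel do this on a Linux-based device: it reports which cores exist versus which are online right now (a minimal sketch, assuming the standard Linux sysfs layout; paths may vary by device):

```python
# Minimal sketch (Linux-only, assumes standard sysfs layout): compare the cores
# that physically exist with the cores the kernel currently has online.
# On big.LITTLE phones the "online" set can shrink and grow as cores get parked.
from pathlib import Path

base = Path("/sys/devices/system/cpu")

present = (base / "present").read_text().strip()  # e.g. "0-7"
online = (base / "online").read_text().strip()    # e.g. "0-3,6-7"

print(f"cores present: {present}")
print(f"cores online:  {online}")
```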

[–] Buffalox@lemmy.world 13 points 3 weeks ago* (last edited 3 weeks ago)

True, but I use the phone reference to show how ridiculous it is that Intel stayed on 4 cores for almost 8 years.
Even the Phenom II was available with 6 good cores in 2010, yet Intel stuck with 4 until Coffee Lake finally arrived in late 2017, and even then with only 6 cores against Ryzen's 8.
Intel pumped money from its near-monopoly for 7 years, letting the PC die a slow death of irrelevancy, just because AMD's FX line was so horrible that its 8 Bulldozer cores were worse than 4 Intel Core 2 cores. They were even worse than AMD's own previous-gen Phenom.
It was pretty obvious when Ryzen came out that the market wanted more powerful desktop processors.

[–] ChicoSuave@lemmy.world 27 points 3 weeks ago

It's been the same "vision" since the late 90s - the CPU is the computer and everything else is just a peripheral.

[–] T156@lemmy.world 6 points 3 weeks ago (1 children)

Does he think that the demand for AI-accelerating hardware is just going to go away? That the requirement for fast, dedicated memory attached to a parallel processing/matrix multiplying unit (aka a discrete GPU) is just going to disappear in the next five years‽

Maybe the idea is to put it on the CPU/NPU instead? Hence them going so hard on AI processors in the CPU, even though basically nothing uses them yet.

[–] bruhduh@lemmy.world 11 points 3 weeks ago (1 children)

But if he wants an NPU, then why not buff the iGPU too? Memory dedicated to the iGPU is a real boost: look up the Intel Core i7-8709G, where they paired an AMD Radeon Vega iGPU with 4 GB of HBM memory exclusive to the graphics, and it did wonders. Now that AMD is winning in the APU sector, Intel could reuse the same ideas it had in the past.

[–] Trainguyrom@reddthat.com 7 points 3 weeks ago

Seriously, putting a couple gigs of on-package graphics memory would completely change the game, especially if it did some intelligent caching and spilled over into system RAM for additional capacity as needed.

I want to see what happens if Intel or AMD seriously lets a generation rip with on-package graphics memory for the iGPU. The only real drawback I can see is if the power/thermal budget just isn't sufficient and it ends up with wonky performance. I've seen that on an overly thin-and-light laptop in my personal fleet: it's got a Ryzen 2600 if I remember correctly, and it's so thermally limited that it leaves a ton of performance on the table.
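For a rough sense of why that memory matters, here's the kind of bandwidth gap an on-package HBM stack has over ordinary dual-channel system RAM (a back-of-the-envelope sketch using approximate spec-sheet numbers, not measurements):

```python
# Back-of-the-envelope sketch with approximate spec-sheet numbers (not measurements):
# dual-channel DDR4 system RAM vs. a single on-package HBM2 stack (the kind of
# setup Kaby Lake-G used for its iGPU memory).
ddr4_gbps = 2 * 3200 * 8 / 1000   # 2 channels x 64-bit wide x DDR4-3200 ≈ 51 GB/s
hbm2_gbps = 1024 * 2.0 / 8        # 1024-bit stack x 2.0 Gb/s per pin ≈ 256 GB/s

print(f"dual-channel DDR4-3200: ~{ddr4_gbps:.0f} GB/s")
print(f"one HBM2 stack:         ~{hbm2_gbps:.0f} GB/s")
print(f"ratio:                  ~{hbm2_gbps / ddr4_gbps:.1f}x")
```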

[–] ColeSloth@discuss.tchncs.de 5 points 3 weeks ago

Probably because APUs are getting better and more PC gamers are going for handhelds and APU laptops instead of dedicated desktops. PC gaming has gotten really expensive.

This isn't a real comparison for at least the next 5 years. A dedicated GPU is still hands down the better choice for gaming. Even on a lower-end build, an older GPU will beat the current best APU by a good amount, though in 10 years it may no longer be necessary to have a GPU over an APU. GPUs are getting too power-hungry and expensive. Gamers gonna game, but they won't all want to spend an ever-increasing amount of money for better graphics, and Arc would need at least another 5 years to be competitive enough to claim a worthwhile market share from AMD or Nvidia, and even that's wishful thinking. That's a long time to bleed money on a maybe.

[–] linearchaos@lemmy.world 3 points 3 weeks ago (1 children)

Unless he thinks he's going to serve all that from the die in the next 5 years.

[–] ms_lane@lemmy.world 4 points 3 weeks ago

You think Intel is going to have 500-850 mm² dies?

That's what they need to compete in the GPU space.