this post was submitted on 15 Oct 2025
13 points (93.3% liked)
PC Gaming
you are viewing a single comment's thread
view the rest of the comments
Unless they’ve come up with some truly novel architecture that makes training and/or inference MUCH more efficient… why? The generational creep is obviously unsustainable, and for a hardware manufacturer these days, building current-style GPUs for ML applications is a recipe for obsolescence.