That's why I play using a PC and not a console. Though PC components have also been overpriced for years.
Game graphics and design peaked in 2008. The N64 was more optimized than anything that came after. I'm so over current gen, and last gen, and the gen before that too. Let it all burn. :)
Edit: Furthermore,
https://lemmy.blahaj.zone/pictrs/image/222c26df-19d9-4fce-9ce3-2f3dcffefc60.webp
Was about to say this too. Can't tell a difference between most games made in 2013 vs 2023.
Battlefield 1 still beats 99% of games releasing now
So now we can finally go back to good old code optimization, right? Right? (Padme.jpg)
We'll ask AI to make it performant, and when it breaks, we'll just go back to the old version. No way in hell are we paying someone.
Damn. I hate how it hurts to know that's what will happen
Ironic that the image is of a Switch, like Nintendo has been on the cutting edge at all in the last 20+ years.
It’s not that they’re not improving like they used to, it’s that the die can’t shrink any more.
Price cuts and “slim” models used to be possible due to die shrinks. A console might have released on 100nm, and then a process improvement comes out that means it can be made on 50nm. Half the linear feature size means a quarter of the die area, so roughly 4x as many chips per wafer, plus big cuts to power usage and heat generation. This allowed smaller and cheaper revisions.
Now that the current ones are already on like 4nm, there’s just nowhere to shrink to.
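To put rough numbers on the wafer math above, here's a toy sketch; the die sizes and the ideal-scaling assumption are made up for illustration, and real shrinks involve yield and layout complications:

```python
# Toy model of why a full die shrink enabled cheap "slim" console revisions.
import math

WAFER_DIAMETER_MM = 300  # standard wafer size

def dies_per_wafer(die_area_mm2: float) -> int:
    """Crude estimate ignoring edge loss and defects."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(wafer_area / die_area_mm2)

old_die = 200.0          # hypothetical die on the old node, mm^2
new_die = old_die / 4    # halving feature size quarters the area

print(dies_per_wafer(old_die))   # ~353 dies per wafer
print(dies_per_wafer(new_die))   # ~1413 dies: roughly 4x as many
```

The same wafer cost spread over 4x the chips is exactly the kind of saving that funded those slim revisions and price drops.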
This is absolutely right. We are getting to the point where circuit pathways are hundreds or even just dozens of atoms wide. The fact that we can even make circuits that small in quantity is fucking amazing. But we are rapidly approaching laws-of-physics type limits on how much smaller we can go.
Plus let's not forget an awful lot of the super high-end production is being gobbled up by AI training farms and GPU clusters. Companies that will buy 10,000 chips at a time are absolutely the preferred customers.
Not to mention that even when some components do shrink, the shrink isn't uniform across all components on the chip, so they can't just do 1:1 layout shrinks like in the past; they pretty much need to start the physical design portion all over with a new layout and timings (which then cascades into many other required changes).
Porting to a new process node (even at the same foundry company) isn't quite as much work as a new project, but it's close.
Same thing applies to changing to a new foundry company, for all of those wondering why chip designers don't just switch some production from TSMC to Samsung or Intel since TSMC's production is sold out. It's almost as much work as just making a new chip, plus performance and efficiency would be very different depending on where the chip was made.
Which is itself a gimmick: they've basically just made the gates taller, since electron leakage would happen otherwise.
"nm" has been a marketing gimmick since Intel launched their long-standing 14nm node. Actual transistor density is all over the place depending on which fab you compare.
It's now just the name of a process, not a measure of how small the transistors actually are.
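The mismatch is easy to see in the commonly cited density estimates; the figures below are approximate public numbers (they vary by cell library and source), shown only to illustrate the point:

```python
# "nm" marketing labels vs. commonly cited logic transistor densities.
# Approximate public estimates, in millions of transistors per mm^2.
nodes = {
    "Intel 14nm":  37.5,
    "Intel 10nm": 100.8,
    "TSMC N7":     91.2,
    "TSMC N5":    138.2,
}

for name, density in nodes.items():
    print(f"{name}: ~{density} MTr/mm^2")

# Intel "10nm" lands at roughly the same density as TSMC "7nm",
# which is why the nanometer label alone tells you very little.
```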
I've not paid for a CPU upgrade since 2020, and before that I was using a 22nm CPU from 2014. The market isn't exciting to me anymore, and I don't even want to talk about the GPUs.
Back in the late 90s or early 2000s upgrades felt substantial and exciting, now it's all same-same with some minor power efficiency gains.
Now, maybe, but like I said - in the past this WAS what let consoles get big price cuts and size revisions. We’re not talking about since 2020, we’re talking about things like the PS1 -> PSOne and PS2 -> PS2 Slim.
This is why I'm more than happy with my 5800X3D/7900XTX; I know they'll perform like a dream for years to come. The games I play run beautifully on this hardware under Linux (BeamNG.Drive runs faster than on Windows 10), and I have no interest in upgrading the hardware any time soon.
Hell, the 4790k/750Ti system I built back in 2015 was still a beast in 2021, and if my ex hadn't gotten it in the divorce (I built it specifically for her, so I didn't lose any sleep over it), a 1080Ti upgrade would have made it a solid machine for 2025. But here we are - my PC now was a post-divorce gift for myself. Worth every penny, both the PC and the divorce.
There’s no world in which a 750Ti or even 1080Ti is a “solid machine” for gaming in 2025 lol.
Depends on your expectations. If you play mainly eSports titles at 1080p, it would've probably still been quite sufficient.
But I agree it's a stretch as an all-rounder system in 2025. My 3090 is already showing signs of its age, and a card that's two generations older would certainly be struggling today.
For what I do? It would be perfectly fine. Maybe not for AAA games, but for regular shit at ~40fps and 1080p, it would be perfectly fine.
Gotta remember that some of us are reaching 40 years old, with kids, and don't really give a shit about maxing out the 1% lows.
but for regular shit at ~40fps and 1080p, it would be perfectly fine
That's not "perfectly fine" to most people, especially PC players.
Gotta remember that some of us are reaching 40 years old, with kids, and don’t really give a shit about maxing out the 1% lows.
Already there myself. I don't care about maxing out the 1% lows, but I care about reaching a minimum of 60fps average at the bare minimum, preferably closer to 100 - and definitely higher than 1080p. Us oldies need more p's than that with our bad eyesight haha
This article doesn't factor in the new demand that is gobbling up all the CPU and GPU production: AI server farms. For example, Nvidia, which once made graphics cards mainly for gamers, has been struggling to keep up with global demand for AI. The whole market is different; then toss tariffs and the rest on top.
I wouldn't blame the death of Moore's law; technology is still advancing, but, per usual, based on demand.
technology is still advancing
Actually, not really: performance per watt of the high-end stuff has been stagnating since the Ampere generation. Nvidia hides it by changing which models it compares in its benchmarks or by advertising raw performance without power figures.
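If you want to check the stagnation claim yourself, the trick is to normalize by board power instead of trusting headline FPS; here's a minimal sketch where the fps and wattage values are placeholders, not real benchmark results:

```python
# Perf per watt = measured frame rate / measured board power.
# The fps and wattage numbers below are placeholders for illustration.
cards = [
    ("RTX 3090 (Ampere)", 100, 350),   # (name, fps, watts)
    ("RTX 4090 (Ada)",    170, 450),
    ("RTX 5090",          200, 575),
]

for name, fps, watts in cards:
    print(f"{name}: {fps / watts:.2f} fps/W")
```

Raw FPS climbs every generation, but if fps/W barely moves, the "gains" are mostly just a bigger power budget.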
Idk, seems like Germany is making progress.
AI has nothing to do with it. Die shrinks were the reason for “slim” consoles and big price drops in the past. Die shrinks are basically a thing of the past now.
Consoles are just increasingly bad value for consumers compared to PCs.
Are they tho? Have you seen graphics card prices?
I can get PS5 graphics with a $280 video card, games are often way cheaper, I can hook the PC up to my TV, and I can still play with a PS5 or Xbox controller, or mouse and keyboard.
I suspect next gen there will be a PS6, and Xbox will make a cheap cloud gaming box and just go subscription-only.
My 4070 cost $300 and runs everything.
The whole PC cost around $1000, and I have had it since the Xbox One released.
You can get similar performance from a $400 steam deck which is a computer.
You don't need a top-end card to match console specs; something like a 6650XT or 6700XT is probably enough. Your initial PC build will cost about 2x a console if you're matching specs (maybe 3x if you need a monitor, keyboard, etc.), but you'll make it up through access to cheaper games and being able to upgrade the PC without replacing it (see the rough break-even sketch below), not to mention the added utility a PC provides.
So yeah, think of PC vs console as an investment into a platform.
If you only want to play 1-2 games, console may be a better option. But if you're interested in older or indie games, a PC is essential.
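As a rough break-even sketch for the 2x-upfront-cost point above; every number here is an assumption for illustration, not real market data:

```python
# Back-of-the-envelope: pricier PC upfront vs cheaper games over time.
console_price = 500      # assumed console cost
pc_price = 1000          # ~2x console, matching the estimate above
games_per_year = 6       # assumed purchase rate
console_game = 70        # assumed full-price console release
pc_game = 45             # assumed average PC price for the same title

yearly_savings = games_per_year * (console_game - pc_game)
breakeven_years = (pc_price - console_price) / yearly_savings
print(f"PC recoups the price gap in ~{breakeven_years:.1f} years")  # ~3.3
```

Shift the assumed game prices or purchase rate and the break-even point moves a lot, which is why the comparison depends so heavily on how many games you actually buy.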