I would help out their GPU sales if they were ever in stock.
The last time they had plenty of stock and cards people wanted to buy at the same time was the RX 200 series. They sold lots of cards, but part of the appeal was that they were priced fairly low, with thin margins, so they didn't make a huge amount of money from them; enough to help subsidise their CPU division while it was making a loss, but not much more.
Shortly after that generation launched, Litecoin ASIC mining hardware became available, so the used market was suddenly flooded with current-generation cards and it made little sense to buy a new one at RRP. Towards the end of the generation the cards were being sold new at a loss just to clear space. That meant AMD needed the next generation to convince people to buy, but since it was just a refresh (basically the same GPUs clocked higher, with lower model numbers, and only the top-end Fury card being new silicon), it was hard to sell 300-series cards when they cost more than the equivalent 200-series ones.
That meant they had less money to develop Polaris and Vega than they wanted, so both ended up delayed. Polaris sold okay, but was only available as low-margin low-end cards, so it didn't make a huge amount of money. Vega was delayed so long that Nvidia got an entire extra generation out, so AMD's intended GTX 980 competitor ended up being an ineffective GTX 1070 competitor and had to be sold for much less than planned, so again, it didn't make much money.
That problem compounded for years until Nvidia ran into their own problems recently.
It's not unreasonable to claim that AMD graphics cards being in stock at the wrong time caused them a decade of problems.
It's a huge gamble for manufacturers to order a large allocation of wafers a year in advance of actual retail sales. The market can shift considerably in that time. They probably didn't expect Nvidia to shit the bed so badly.
Wait, are you saying if they had more product, they could sell more product?
Sounds like voodoo economics to me!
I had luck with Microcenter last week (if you have one near you): checked their website for my preferred location, they had 9070 XTs in stock, went after work, and I got one.
I have one about an hour away and no luck so far at that location.
Edit: oh damn, they are in stock today!
Edit2: it was one and now it's gone :(
I got mine on a Thursday FWIW. The employees didn't even know they had them in stock, they had to pull it out of a case in the back; had to have been an afternoon truck delivery
Also it is goddamned wonderful as a GPU 😍 I've been replaying Far Cry 6 with maxed out... everything, at 4K, and I've got roughly 100 FPS stable. It's absolutely gorgeous
If you really wanna get one and it's in stock, reserve it for pick up, they'll hold it for like 3 days and you don't have to pay until you actually go get it, in case you change your mind or can't make it in that time.
My ryzen 7 3700x is several years old now. It was a little finicky with what memory it liked but since working that out, it's been great. No complaints. I expect this system to last me at least another 5 years.
3800x here, I've been very happy with it. I don't see a need to upgrade. My 2070S, however... does not hold up very well with my ultra-wide monitor when playing AAA (or even AAAA!..) games lol.
I upgraded to a 5700X from a 3600 this year to take advantage of some sales, no regrets. Wish I had the spare cash for a 9070XT, maybe next gen.
Wish my 5800x had lasted :( It seems to have died after only 4 years.
The only hardware issues I've ever had were due to poor thermal management.
If you want hardware longevity, use a high quality PSU, don't overclock, and provide excessive cooling (so that several years from now, when you neglect your system and it's full of dust, you'll still be OK).
Thermal problems are much less likely to kill hardware than they used to be. CPU manufacturers have got much better at avoiding microfractures caused by thermal stress (e.g. by making sure that everything in the CPU expands at the same rate when heated) and failures from electromigration (where the atoms in the CPU migrate under the applied voltage and stop being parts of transistors and traces, which happens faster at higher temperatures).

Ten or twenty years ago it was really bad for chips to swing between low and high temperatures a lot (thermal stress), and bad for them to sit above 90°C for a long time (electromigration), but now heat makes so little difference that modern CPUs dynamically adjust their frequency to sit right at their temperature limit (roughly 95-100°C depending on the chip) under load by default.

The main benefit of extra cooling these days is that you can stay at a higher frequency for longer without exceeding that limit, so you get better average performance; but unless your cooling solution is seriously overspecced, the CPU will be pinned near its limit under load a lot of the time either way, and the motherboard just won't ramp the fans up to maximum.
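To make that "runs at its limit by design" behaviour concrete, here's a toy sketch of a boost loop (not any vendor's actual algorithm; every constant below is made up) that keeps raising the clock until it hits an assumed thermal limit and then hovers just under it:

```python
# Toy model of thermally-limited boosting, NOT any real firmware algorithm.
# All constants are invented purely for illustration.

TEMP_LIMIT = 95.0    # assumed throttle point in °C (varies by chip)
AMBIENT = 30.0       # case/ambient temperature, °C
BASE_CLOCK = 3.0     # GHz
MAX_BOOST = 5.5      # GHz
COOLING = 0.04       # fraction of excess heat removed per step (arbitrary)
HEAT_PER_GHZ = 0.6   # °C added per GHz of clock per step (arbitrary)

freq = BASE_CLOCK
temp = AMBIENT

for step in range(200):
    # Boost while there's thermal headroom, back off once past the limit.
    if temp < TEMP_LIMIT:
        freq = min(freq + 0.05, MAX_BOOST)
    else:
        freq = max(freq - 0.10, BASE_CLOCK)

    # Crude thermal model: heating proportional to clock, cooling toward ambient.
    temp += freq * HEAT_PER_GHZ - (temp - AMBIENT) * COOLING

    if step % 25 == 0:
        print(f"step {step:3d}: {freq:.2f} GHz @ {temp:5.1f} °C")
```

Real boost algorithms are obviously far more sophisticated (power limits, current limits, per-core boosting), but the shape is the same: the clock rides the temperature limit rather than avoiding it.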
I had all of that. Ran into intermittent random crashes about a year ago. After a year of not being able to find a cause I found a thread of other 5800x users running into the same problem. (For the record this was with a high quality PSU, a very very light overclock, and temps were fine throughout that time. Also while I'm not a true IT professional I do know my way around a computer, and the most in depth error logs I could find, which there were very few of, pointed to really low level calculation errors.)
After finally giving up and just buying a 9800x3d I gave the system to my friend for a huge discount, but after reinstalling everything the CPU never booted again.
While what you say is generally true, it's also sometimes the case that parts are just slightly defective, and those defects can show up with age. It's the first CPU I've ever had die on me (other than a 9800x3d, but that was an MSI mobo that killed it), so I don't really hold it against them. And I'm very happy with the 9800x3d. It's amazing the difference it's made in games.
That's a bummer that it failed on you.
I've been wondering if it would be worth it to replace my 3700x with a 5800x3d but I'm not sure the modest performance improvement would be worth the price.
I've always been an AMD CPU guy; my first PC had an AMD GPU, then I moved away to Nvidia, but due to costs I've now moved back to AMD and I've got zero complaints.
Hell, they finally got me too... They can thank Intel for royally fucking up 13th/14th gen and then releasing a new chip that wasn't enough of an improvement over the previous ones to warrant the price.
I always built Intel PCs out of habit mostly, but I just got a 9800x3d last week for my rebuild.
Been AMD for years but went Intel for a media server due to the encoder and better idling power. I wish AMD would improve their video encoding.
The Radeon RX 9000 series is amazing. Apparently sales are doing great, but obviously NVIDIA has a monopoly and people are brainwashed to just buy NVIDIA no matter what.
Seemed like a marginal improvement with a focus on upscaling/framegen, which doesn't really interest me. I'm still really happy with my 6900 XT. Although NVIDIA has also been offering marginal improvements with significant TDP (💀) and price increases for several generations now, so whatever 🤷
The 30 series to 40 series jump was pretty tangible IMO (the 4090 gets something like 30-50% more performance in most tasks than the 3090 Ti at the same TDP), in part thanks to the much larger L2 cache plus the newer process node.
50 series was very poor though, probably because it's the same process node.
I was surprised to see the 9070 XT at about double the 6800 XT's performance once benchmarks that included both cards started coming out.
I got it because I also see that if China does follow through with an attack on Taiwan, PC components are going to become very hard to find and very expensive while all of that production capacity is replaced. And depending on how things go after that, this might be the last GPU I ever buy.
A huge factor is rendering resolution. I only render at most <1080p (1024x768 or 1600x1200). 2x performance improvement over 6800XT in general sounds very incorrect if the benchmarks are run at 1080p, unless they are using upscaling and frame gen to cheat the performance numbers. Do you have a link to these benchmarks? I'd be less skeptical about a significant performance improvement over 6800XT if the benchmarks were done specifically at 4k resolution though as both AMD and NVIDIA have further optimized GPUs for 4k rendering each passing generation.
Upscaling/framegen and 4k are completely irrelevant to me, so counting that out, it is marginal improvement based on the numbers I've seen. I'd like to be wrong though, and I could be
I do care about upscaling and ray tracing, which is why I didn't go with AMD for the last few generations. The RX 9070 XT felt like the right time, as they made huge improvements. FSR4 especially is easily comparable to DLSS, and I use it as an antialiasing replacement while boosting performance. FSR2, while it works, turns into a pixelated mess during fast movement and has a lot of ghosting. FSR4 is near perfect.
What I also love is how AMD's Fluid Motion Frames just works in all games with minimal artifacting, and Radeon Chill is what I especially love with summer coming in. It decreases power consumption dramatically, and thus heat output, to levels an RTX 5070 Ti could never achieve despite being more efficient in raw power-consumption tests, all without affecting the experience. It's so good I'm using it in Overwatch 2 and Marvel Rivals and I can't really tell a difference; it controls the framerate that seamlessly.
Ray tracing just still isn't there yet. Even during the manicured ray tracing demo at the AMD announcement event for the 9000 series, it's nothing but surface boil. Looks like analog white static overlaid on all the surfaces.
That's not the experience in actual games.
Are you kidding...?? I wish that was true. The worst I've seen it is in Marvel Rivals. It's pretty bad in S.T.A.L.K.E.R. Heart of Chernobyl as well
That's not down to the graphics card. It's the game. I had horrible boiling in Marvel Rivals on an RTX 3080, to the point I preferred screen-space reflections over ray-traced Lumen reflections. Still do on Radeon. Surprisingly, Oblivion Remaster running Unreal Engine 5 doesn't have this issue even on RX 9070 XT.
> That's not down to the graphics card.
Yeah. That's literally my point. Ray tracing just isn't there yet. Has nothing to do with GPUs.
> Surprisingly, Oblivion Remaster running Unreal Engine 5 doesn't have this issue even on RX 9070 XT.
Because you have aggressive upscaling and frame gen enabled, so you've blurred your screen to the point that details like boiling are lost and then artificially resharpened your screen with the details that an AI is guessing were there.
Disable these and set to render natively and enjoy the analog static
That's not how it works. With low quality upscaling you'd just amplify the noise because it's internally processed at lower resolution.
Nope! It actually is mathematically how it works. Upscaling does not amplify NOISE, like e.g. surface boiling, although it does introduce many other artifacts. Noise, specifically, gets smoothed. The problem with upscaling is actually not noise but oversmoothing, which is why it's paired with sharpening. You can just look at an upsampled signal to see how noise is affected: boosting gain would increase noise; interpolating samples does not.
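If you want to see the "interpolation doesn't amplify noise" point numerically, here's a quick sketch (plain linear interpolation as a stand-in for an upscaler's resampling step, ignoring sharpening and any temporal accumulation; the numbers are arbitrary):

```python
import numpy as np

# Sketch only: linear interpolation standing in for an upscaler's resampling
# step; sharpening and temporal accumulation are ignored.
rng = np.random.default_rng(0)

n = 10_000
noisy = rng.normal(0.0, 1.0, n)   # flat signal plus unit-variance noise ("boiling")

# Upsample 2x by linear interpolation.
x_low = np.arange(n)
x_high = np.linspace(0, n - 1, 2 * n)
upscaled = np.interp(x_high, x_low, noisy)

print(f"noise std before upscaling: {noisy.std():.3f}")    # ~1.00
print(f"noise std after upscaling:  {upscaled.std():.3f}")  # lower, not higher
```

The interpolated samples are weighted averages of their neighbours, so the per-sample noise goes down, not up; what upscaling adds is softness (hence the sharpening pass) and its own artifacts like shimmer and ghosting.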
You can test it yourself and see, just go ahead and disable the FSR and frame gen gimmicks entirely while keeping ray tracing on. Hell, disable all AA and motion blur while you're at it, and really take a gander at what actual, unblurred ray tracing looks like.
Edit: also, "with low quality upscaling" lmao I'd love to hear what the implied "high quality upscaling" does differently 😂 something right? It's totally different!!
It's called the internal upscaling resolution; that's what makes something "low quality" upscaling. Some effects are rendered at full or half resolution regardless of which upscaler you use, while others run at the upscaler's internal resolution, adding shimmering from the low resolution on top of the boiling, and it looks worse.
Glad i already built a system and don't have to worry for a few years about upgrades unless something breaks. Also i'm Canadian so our tariff dispositions may be different.
Once again gamers are oppressed
Strong CPU ~~sales~~, but GPUs ... trail behind