this post was submitted on 16 Oct 2025
179 points (96.9% liked)

Technology


When the AI bubble pops, what will remain? Cheap GPUs at firesale prices, skilled applied statisticians looking for work, and open source models that already do impressive things, but will grow far more impressive after being optimized:

top 50 comments
[–] mistermodal@lemmy.ml 7 points 16 hours ago
[–] SlartyBartFast@sh.itjust.works 40 points 1 day ago (2 children)

There were supposed to be cheap GPUs after the crypto bubble burst

[–] boonhet@sopuli.xyz 1 points 14 hours ago

Also, the AI GPUs probably won't be great for gaming. And "cheap" could mean anything when they go for $20k apiece.

[–] frezik@lemmy.blahaj.zone 24 points 1 day ago (2 children)

There were when the first Ethereum bubble burst. That one was easier for the average person to get into with gamer GPUs, and they flooded the market on eBay as soon as mining was no longer profitable.

Bitcoin won't do that, because it hasn't been based on GPUs for a long, long time. Ethereum doesn't even work like that anymore.

The AI bubble popping will only flood the market with GPUs that are useful for running AI models, not for gaming. The GPUs in AI datacenters often don't even have a display output connector. I think Cory is overstating his case on that one. Most likely, those GPUs are headed to the landfill.

[–] humanspiral@lemmy.ca 5 points 16 hours ago

The AI bubble doesn't mean AI/LLMs aren't useful. It means datacenter speculation can't make money.

those GPUs are headed to the landfill.

They'll just be sold at a discount similar to what followed the Ethereum switch.

[–] jim3692@discuss.online 7 points 1 day ago (1 children)

You can still use such a GPU as an accelerator, either for running AI or for gaming. In either case, given that your workload is Vulkan-based on Linux, you can use vkdevicechooser.

Of course, you will need a second GPU (even the CPU's integrated one) to connect your display(s).
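
For anyone curious what that looks like in practice, here is a minimal sketch. It assumes the aejsmith/vkdevicechooser implicit layer is installed and that its usual ENABLE_DEVICE_CHOOSER_LAYER / VULKAN_DEVICE_INDEX variables apply; the device index and the launched program are placeholders, not a verified setup:

```python
import os
import subprocess

# Rough sketch: force a Vulkan app onto a specific (display-less) GPU via the
# vkdevicechooser implicit layer. Variable names assume the aejsmith/vkdevicechooser
# project; check its README and `vulkaninfo --summary` for the right device index.
env = dict(os.environ)
env["ENABLE_DEVICE_CHOOSER_LAYER"] = "1"  # turn the layer on
env["VULKAN_DEVICE_INDEX"] = "1"          # hypothetical index of the headless GPU

# "vkcube" is just a stand-in for any Vulkan game or workload.
subprocess.run(["vkcube"], env=env, check=True)
```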

[–] frezik@lemmy.blahaj.zone 5 points 1 day ago (1 children)

That has never worked well. It might give high average framerates on paper, but it introduces jitter that produces a worse overall experience. In fact, Gamers Nexus just came out with a video on a better way to measure this, and it touches on the problems with multi-GPU setups:

https://youtu.be/qDnXe6N8h_c

[–] jim3692@discuss.online 6 points 1 day ago

I think that you misunderstood my comment.

The video shows how SLI makes the frame pacing more inconsistent, which is a known issue when multiple GPUs work together to solve the same problem.

What I am talking about is more like Nvidia Optimus. This is a common technology on laptops, where the display is connected to the low-power iGPU, while games can use the dedicated Nvidia GPU.

I don't know about potential frame pacing issues with these technologies, and they don't seem to have been addressed in the video either. However, I know that newer laptops have a switching chip (a MUX) that can connect the display directly to the dedicated GPU, which, I think, aims at lowering latency.

[–] Alphane_Moon@lemmy.world 22 points 1 day ago (1 children)

I can strongly recommend the article linked from the OP blog post about market dynamics and the use of what is essentially accounting fraud by the major companies involved in AI:

Lifespan of AI Chips: The $300 Billion Question

I am looking forward to reading the research paper they are working on.

While the author takes a relatively neutral tone, the analysis is brutal in its portrayal of the major market players (Nvidia, Microsoft, Amazon, Google); they come off more as oligopolists who are happy to engage in what is de facto accounting fraud in an attempt to undermine true market competition.

[–] humanspiral@lemmy.ca 2 points 7 hours ago (1 children)

OP's post is largely right, but it doesn't require that link to be true. Also, whether these $3m+ systems are warrantied is a relevant question. It's hard to know the exact lifespan from one person saying their GPU failed quickly. The paper still holds up well.

Because of power constraints, I'd expect them to replace GPUs every 2 years with new generations, and so there will be big write-offs.

[–] Alphane_Moon@lemmy.world 2 points 5 hours ago (1 children)

A 6 year depreciation schedule seems unrealistically long for a GPU.

Even in gaming terms (I know this is a completely different use case), a 2080S from 2019, a high-end SKU, would struggle with many modern games at 1440p and higher. A professional streamer would be unlikely to use a 2080S.

Then there is the question of incentives. An objective look at American technology and VC suggests they are far closer to criminal organizations than their treatment by media and US institutions would imply. They very much can be expected to engage in what is essentially accounting fraud.

[–] humanspiral@lemmy.ca 2 points 3 hours ago

a 2080S from 2019, a high-end SKU, would struggle with many modern games at 1440p and higher. A professional streamer would be unlikely to use a 2080S.

On one hand, a 2080S would still be good at doing what it was doing 6 years ago. If there are new needs and unlimited power availability, then it makes sense to buy a new card and keep the 6-year-old GPU running whatever AI workload it can still handle alongside it... if that card still works. Selling your 2080S or whatever old card does mean a fairly steep loss compared to the original price, but a 6-year depreciation schedule is OK... IF the cards are still working 6 years later.

$3m NVL72 systems are a bit different: one of the 72 cards burning out can screw up the whole system, and the datacenter power infrastructure and expertise requirements mean low resale value, though I assume the cards can be ripped out and sold individually.

They very much can be expected to engage in what is essentially accounting fraud.

Oracle this week "proudly boasted" that they get 30% margins on their datacenters, and the stock went up. This is not enough, as it is just 30% over electricity costs. Maintenance/supervision and GPU costs/rentals aren't counted, so it is unlikely they are actually profitable, though it's not so much accounting fraud as accounting PR.
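
To see why "30% over electricity" says so little, here is a back-of-envelope sketch with entirely hypothetical numbers (system price, power bill, and lifespan are illustrative, not Oracle's actual figures):

```python
# Hypothetical numbers only: what a "30% margin over electricity" headline
# looks like once hardware depreciation enters the picture.
system_cost = 3_000_000          # ballpark NVL72-class system price from the thread
lifespan_years = 2               # aggressive replacement cycle discussed above
electricity_per_year = 400_000   # made-up annual power bill for the system

revenue_per_year = electricity_per_year * 1.30   # "30% over electricity costs"
depreciation_per_year = system_cost / lifespan_years

profit = revenue_per_year - electricity_per_year - depreciation_per_year
print(f"Gross margin over power: ${revenue_per_year - electricity_per_year:,.0f}")
print(f"After hardware depreciation: ${profit:,.0f}")  # deeply negative here
```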

[–] nightlily@leminal.space 7 points 1 day ago

One thing I worry about is that there's going to be a fire sale on the polluting crap that is powering these GPU farms. It'll likely end up in poorer countries because it'll be cheaper than new renewables.

[–] salacious_coaster@infosec.pub 41 points 1 day ago (4 children)

The Internet has already been mostly destroyed, drowned in AI slop. Is all that shit gonna be taken down? Are search engines going to go back to working again?

[–] prole@lemmy.blahaj.zone 5 points 1 day ago* (last edited 1 day ago)

They will ruin it, and then move onto the next thing to subsume and destroy forever.

[–] muusemuuse@sh.itjust.works 27 points 1 day ago (2 children)

Dude, this. Looking up how to pull off PCI passthrough on an SBC I have, as well as answers to a few lingering filesystem questions, I get nothing but slop. The useful shit isn't even visible anymore. And if I ask ChatGPT to sift through it all, it can't do it either, instead regurgitating all the slop it can't make sense of.

We are looking at the destruction of the greatest library in mankind’s history. Because NVIDIA’s line must go up.

[–] HarkMahlberg@kbin.earth 15 points 1 day ago (1 children)

I was going to reply "at least the burning of Alexandria was an accident," and then I thought to look that up. Seems egotists destroying public collections of knowledge is just baked into humanity. We'll never be free of its scourge.

[–] frezik@lemmy.blahaj.zone 8 points 1 day ago

Local archives of Wikipedia and Project Gutenberg have never been a better idea.

load more comments (1 replies)
[–] Tollana1234567@lemmy.today 8 points 1 day ago

Nope, it's still useful for propaganda, especially for dimwits like conservatives that can't tell the difference.

load more comments (1 replies)
[–] db2@lemmy.world 51 points 1 day ago (3 children)

Please let the pop take the tech bros with it.

[–] frongt@lemmy.zip 48 points 1 day ago (4 children)

They'll move on to the next big thing, just like they did after bitcoin.

[–] oxysis@lemmy.blahaj.zone 33 points 1 day ago (9 children)

And after NFTs, blockchain, the metaverse and so on

[–] NotSteve_@piefed.ca 1 points 1 day ago

The metaverse was/is the stupidest thing ever. They couldn't even sell it to the average person. I'm into tech but even I can barely describe what it's supposed to be. Is it just VR? VR office work? 🤷

load more comments (8 replies)
[–] prole@lemmy.blahaj.zone 1 points 1 day ago* (last edited 1 day ago)

What if we were proactive and trapped them by creating some bullshit "next big thing", then just pocketed all of their cash when they invest?

"You thought your money was going towards a space elevator, but little did you know, you've been single-handedly housing and feeding the population of six US cities!! Muahahahahahah!"

load more comments (2 replies)
[–] bursaar@lemmy.world 1 points 1 day ago

I wish. I suspect the mega-billionaires will be absolutely fine.

load more comments (1 replies)
[–] prole@lemmy.blahaj.zone 3 points 1 day ago

Someone think of the "prompt engineers"!

[–] Zkuld@lemmy.world 17 points 1 day ago (14 children)

Hmm and what about the everyday user who needs to ask AI how long to cook potatoes? What will they do after the pop?

[–] rem26_art@fedia.io 28 points 1 day ago

completely uncharted territory. No one tried to cook a potato until Sam Altman graced us plebs with ChatGPT

[–] cubism_pitta@lemmy.world 21 points 1 day ago (1 children)

Local models are actually pretty great! They are not great at everything... but for what most people are using LLMs for they do a fine job.

That's from llama3.1:8b, and the answer is decent; it took about 20 seconds to generate and used no more power than if I were to play a video game for the same amount of time.

Boiling time isn't related to original potato size, it's related to the size of pieces you cut. So the first half is irrelevant and the second half is overly verbose.
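
As an illustration of how simple asking a local model can be, here is a minimal sketch assuming the model is served by Ollama on its default local port (the endpoint and model name mirror the llama3.1:8b setup mentioned above; adjust to whatever you actually run):

```python
import requests

# Minimal sketch: query a locally hosted llama3.1:8b through Ollama's HTTP API.
# Assumes `ollama serve` is running on the default port 11434 with the model pulled.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.1:8b",
        "prompt": "How long should I boil potatoes cut into 2 cm cubes?",
        "stream": False,  # return one complete response instead of a token stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```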

load more comments (12 replies)