Technology
There were supposed to be cheap GPUs after the crypto bubble burst
Also, the AI GPUs probably won't be great for gaming. And "cheap" could mean anything when they go for $20k apiece.
There were when the first Ethereum bubble burst. That one was easier for the average person to get into with gamer GPUs, and they flooded the market on eBay as soon as mining was no longer profitable.
Bitcoin won't do that, because it hasn't been based on GPUs for a long, long time. Ethereum doesn't even work like that anymore.
The AI bubble popping will only flood the market with GPUs that are useful for running AI models. The GPUs in AI datacenters often don't even have a display output connector. I think Corey is overstating his case on that one. Most likely, those GPUs are headed to the landfill.
The AI bubble doesn't mean AI/LLMs aren't useful. It means datacenter speculation can't make money.
those GPUs are headed to the landfill.
They'll just have a similar discount to the Ethereum switch.
You can still use such a GPU as an accelerator, either for running AI or for gaming. In either case, given that your workload is Vulkan-based on Linux, you can use vkdevicechooser.
Of course, you will need a second GPU (even the CPU's integrated one) to connect your display(s).
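A rough sketch of what that launch could look like, assuming vkdevicechooser is installed as an implicit Vulkan layer and is controlled through the ENABLE_DEVICE_CHOOSER_LAYER and VULKAN_DEVICE_INDEX environment variables (that's how I recall its README; double-check the project page). The game command and device index below are placeholders:

```python
import os
import subprocess

# Hypothetical launcher: pin a Vulkan app to a specific (e.g. headless) GPU via
# the vkdevicechooser layer, while the display stays connected to the iGPU.
env = dict(os.environ)
env["ENABLE_DEVICE_CHOOSER_LAYER"] = "1"  # enable the layer for this process (assumed variable name)
env["VULKAN_DEVICE_INDEX"] = "1"          # index of the GPU to render on; 0 is often the iGPU

# "vkcube" is just a stand-in demo binary; replace it with your actual game or launcher.
subprocess.run(["vkcube"], env=env, check=True)
```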
That has never worked well. It might give high average framerates on paper, but it introduces jitter that produces a worse overall experience. In fact, Gamers Nexus just came out with a video on a better way to measure this, and it touches on the problems with multi-GPU setups:
I think that you misunderstood my comment.
The video shows how SLI makes the frame pacing more inconsistent, which is a known issue when multiple GPUs work together to solve the same problem.
What I am talking about is more like Nvidia Optimus. This is a common technology on laptops, where the display is connected to the low-power iGPU, while games can use the dedicated Nvidia GPU.
I don't know about potential frame pacing issues with these technologies, and they don't seem to be addressed in the video either. However, I know that newer laptops have a switching chip (a MUX switch) that connects the display directly to the dedicated GPU, which, I think, aims at lowering the latency.
I can strongly recommend the article linked from the OP blog post about market dynamics and the use of what is essentially accounting fraud by the major companies involved in AI:
Lifespan of AI Chips: The $300 Billion Question
I am looking forward to reading the research paper they are working on.
While the author takes a relatively neutral tone, the analysis is brutal in its portrayal of the major market players (Nvidia, Microsoft, Amazon, Google); they come off more as oligopolists who are happy to engage in what is de facto accounting fraud in an attempt to undermine true market competition.
OP's post is largely right, but it doesn't require that link to be true. Also, whether these $3M+ systems are warrantied is a relevant question. It's hard to know the exact lifespan from one person saying their GPU failed quickly. The paper still stands well.
Because of power constraints, I'd expect them to replace GPUs every 2 years with new generations, and so there will be big write-offs.
A 6-year depreciation schedule seems unrealistically long for a GPU.
Even in gaming terms (I know this is a completely different use case), a 2080S from 2019, a high-end SKU, would struggle with many modern games at 1440p and higher. A professional streamer would be unlikely to use a 2080S.
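To put rough numbers on how much the schedule choice matters, here is a straight-line sketch; the $30,000 purchase price is a made-up illustrative figure, not a quoted one:

```python
# Straight-line depreciation of a single accelerator under two schedules.
price = 30_000  # USD, hypothetical per-GPU cost

for schedule_years in (2, 6):
    annual_write_off = price / schedule_years
    print(f"{schedule_years}-year schedule: ${annual_write_off:,.0f} written off per year")

# If the card is actually replaced after 2 years but was booked on a 6-year
# schedule, two thirds of its cost is still sitting on the books at replacement.
remaining_book_value = price * (1 - 2 / 6)
print(f"Book value left after 2 years on a 6-year schedule: ${remaining_book_value:,.0f}")
```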
Then there is the question of incentives. An objective look at American technology and VC suggests they are far closer to criminal organizations than their treatment by media and US institutions would imply. They very much can be expected to engage in what is essentially accounting fraud.
a 2080S from 2019, a high-end SKU, would struggle with many modern games at 1440p and higher. A professional streamer would be unlikely to use a 2080S.
On one hand, a 2080S would still be good at doing what it was doing 6 years ago. If there are new needs and unlimited power availability, then buying a new card while the 6-year-old GPU keeps handling whatever AI workload it can alongside it makes sense... if that card still works. Selling your 2080S or whatever old card does mean a fairly steep loss compared to the original price, but a 6-year depreciation schedule is OK... IF the cards are still working 6 years later.
$3M NVL72 systems are a bit different: one of the 72 cards burning out can screw up the whole system, and the datacenter power and expertise requirements mean low resale value, though I assume the cards can be ripped out and sold individually.
They very much can be expected to engage in what is essentially accounting fraud.
Oracle this week "proudly boasted" that they get 30% margins on their datacenters, and the stock went up. That is not enough, as it is just 30% over electricity costs. Maintenance/supervision and GPU costs/rentals aren't counted, so it is unlikely that they are actually profitable, though it's not so much accounting fraud as it is accounting PR.
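A back-of-the-envelope sketch of why a margin measured only against electricity says little about real profitability; every figure below is an illustrative assumption, not Oracle's actual numbers:

```python
# Rough per-GPU annual numbers; all values are assumptions for illustration only.
power_kw = 1.2            # average draw including cooling overhead (assumed)
price_per_kwh = 0.08      # USD per kWh (assumed)
gpu_cost = 30_000         # USD purchase price (assumed)
amortization_years = 4    # assumed useful life

electricity_per_year = power_kw * 24 * 365 * price_per_kwh
revenue = electricity_per_year * 1.3          # "30% margin over electricity"
profit_over_power = revenue - electricity_per_year
annual_gpu_cost = gpu_cost / amortization_years

print(f"Electricity per GPU-year:      ${electricity_per_year:,.0f}")
print(f"Profit over electricity alone: ${profit_over_power:,.0f}")
print(f"Amortized GPU cost per year:   ${annual_gpu_cost:,.0f}")
# The amortized hardware cost dwarfs the margin earned over power alone,
# so the headline 30% figure says nothing about overall profitability.
```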
One thing I worry about is that there's going to be a fire sale on the polluting crap that is powering these GPU farms. It'll likely end up in poorer countries because it'll be cheaper than new renewables.
The Internet has already been mostly destroyed, drowned in AI slop. Is all that shit gonna be taken down? Are search engines going to go back to working again?
They will ruin it, and then move onto the next thing to subsume and destroy forever.
Dude, this. Looking up how to pull off PCI passthrough on an SBC I have, as well as answers to a few lingering filesystem questions, I get nothing but slop. The useful shit isn't even visible anymore. And if I ask ChatGPT to sift through it all, it can't do it either; it just regurgitates the slop it can't make sense of.
We are looking at the destruction of the greatest library in mankind’s history. Because NVIDIA’s line must go up.
I was going to reply "at least the burning of Alexandria was an accident," and then I thought to look that up. Seems egotists destroying public collections of knowledge is just baked into humanity. We'll never be free of its scourge.
Local archives of Wikipedia and Project Gutenberg have never been a better idea.
Nope, it's still useful for propaganda, especially for dimwits like conservatives who can't tell the difference.
Please let the pop take the tech bros with it.
They'll move on to the next big thing, just like they did after bitcoin.
And after NFTs, blockchain, the metaverse, and so on.
The metaverse was/is the stupidest thing ever. They couldn't even sell it to the average person. I'm into tech, but even I can barely describe what it's supposed to be. Is it just VR? VR office work? 🤷
What if we were proactive and trapped them by creating some bullshit "next big thing," then just pocketed all of their cash when they invest?
"You thought your money was going towards a space elevator, but little did you know, you've been single-handedly housing and feeding the population of six US cities!! Muahahahahahah!"
I wish. I suspect the mega-billionaires will be absolutely fine.
Someone think of the "prompt engineers"!
Hmm and what about the everyday user who needs to ask AI how long to cook potatoes? What will they do after the pop?
Completely uncharted territory. No one tried to cook a potato until Sam Altman graced us plebs with ChatGPT.
Local models are actually pretty great! They are not great at everything... but for what most people are using LLMs for, they do a fine job.
That's from llama3.1:8b, and the answer is decent; it took about 20 seconds to generate and used no more power than playing a video game for the same amount of time.
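For anyone curious what that looks like in practice, here is a minimal sketch that asks a local model the same potato question through Ollama's HTTP API, assuming Ollama is running on its default port and llama3.1:8b has already been pulled (the prompt wording is my own, not the one used above):

```python
import requests

# Query a locally hosted model through Ollama's default endpoint.
# Assumes `ollama pull llama3.1:8b` has been run and the server is up.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.1:8b",
        "prompt": "How long should I boil potato pieces cut into 2 cm cubes?",
        "stream": False,  # return the whole answer at once instead of streaming tokens
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```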
Boiling time isn't related to the original potato size; it's related to the size of the pieces you cut. So the first half of the answer is irrelevant, and the second half is overly verbose.