this post was submitted on 11 Oct 2023
502 points (92.8% liked)

Technology

[–] TWeaK@lemm.ee 100 points 1 year ago (2 children)

Sounds like the internet in the 90s.

[–] 1bluepixel@lemmy.world 67 points 1 year ago* (last edited 1 year ago) (4 children)

It also reminds me of crypto. Lots of people made money from it, but the reason the technology persists has more to do with its perceived potential than with its actual usefulness today.

There are a lot of challenges with AI (or, more accurately, LLMs) that may or may not be inherent to the technology. And if those issues cannot be solved, we may end up with a flawed technology that, we are told, is always just about to mature enough for mainstream use. Just like crypto.

To be fair, though, AI already has some very clear use cases, while crypto is still mostly looking for a problem to fix.

[–] thecrotch@sh.itjust.works 19 points 1 year ago

Let's combine AI and crypto, and migrate it to the cloud. Imagine the PowerPoints middle managers will make about that!

[–] p03locke@lemmy.dbzer0.com 17 points 1 year ago* (last edited 1 year ago) (2 children)

No, this isn't crypto. Crypto and NFTs were worse solutions to problems that already had solutions, and hidden in the messaging was that rich people wanted to get poor people to freely gamble away their money in an unregulated market.

AI has real, tangible benefits that are already being realized by people who aren't part of the emotion-driven ragebait engine. Stock images are going to be extinct within several years. Anyone can now make at least a baseline image of what they want, regardless of artistic ability. Musicians are starting to use AI tools. ChatGPT makes it easy to generate the low-stakes but time-consuming writing, like item descriptions, HR responses, and other common drafts. Code AI engines let programmers produce reviewable solutions in real time, or at least something to generate and tweak. None of this is perfect, but it's good enough for 80% of the work, and it can be refined after the initial pass.
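
As a concrete illustration of that drafting use case, here is a minimal sketch using the OpenAI Python SDK (v1+); the model name, prompt, and wording are assumptions for illustration, not anything from the article or this thread:

```python
# Hypothetical sketch: drafting a routine item description with an LLM.
# Assumes the OpenAI Python SDK (v1+) and an API key in OPENAI_API_KEY;
# the model name is just an example.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You write short, factual e-commerce item descriptions."},
        {"role": "user", "content": "Draft a three-sentence description for a 1200-lumen rechargeable bike light."},
    ],
)

print(response.choices[0].message.content)  # a first draft for a human to edit
```

The point isn't that the output is final copy; it's that the 80%-good first pass arrives in seconds.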

Things like chess AI have existed for decades, and LLMs are just extensions of existing generative AI technology. I dare you to tell Chess.com that "AI is a money pit that isn't paying off"; they would laugh their fucking asses off, because they are actively pouring even more money and resources into Torch.

The author here is a fucking idiot. And he didn't even bother to change the HTML title ("Microsoft's Github Copilot is Losing Huge Amounts of Money") from its original focus of just Github Copilot. Clickbait bullshit.

[–] Revonult@lemmy.world 20 points 1 year ago (3 children)

I totally agree. However, I do feel like the market around AI is inflated, like it was with NFTs and crypto. AI isn't a bust; there will be steady progress at universities, research labs, and companies. But there is too much hype right now, with companies slapping AI on random products and overpromising the current state of the technology.

[–] iopq@lemmy.world 15 points 1 year ago (6 children)

I'm still trying to transfer $100 from Kazakhstan to myself here. By far the lowest-fee option is actually crypto, since the biggest cost is the currency conversion. If you have to convert anyway, you might as well only pay 0.30% on each end.
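
For scale, a rough back-of-the-envelope comparison (the 0.30% figure is from this comment; the flat bank-wire fee is an assumed placeholder, not a quoted rate):

```python
# Rough fee comparison for the $100 transfer described above.
amount = 100.00

crypto_fees = amount * 0.003 * 2   # 0.30% converting in + 0.30% converting out = $0.60
bank_wire_fee = 25.00              # assumed flat international wire fee, for comparison only

print(f"crypto route fees:   ${crypto_fees:.2f}")
print(f"bank wire fee (est): ${bank_wire_fee:.2f}")
```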

[–] Lmaydev@programming.dev 10 points 1 year ago (14 children)

Or computers decades before that.

Many of these advances are incredibly recent.

And many of the things we use day to day are AI-powered without people even realising it.

[–] bappity@lemmy.world 77 points 1 year ago (1 children)

if A.I. dies out because capitalism I will wheeze

[–] WrittenWeird@lemmy.world 80 points 1 year ago* (last edited 1 year ago) (16 children)

The current breed of generative "AI" won't 'die out'. It's here to stay. We are just in the early Wild-West days of it, where everyone's rushing to grab a piece of the pie, but the shine is starting to wear off and the hype is juuuuust past its peak.

What you'll see soon is the "enshittification" of services like ChatGPT as the financial reckoning comes, startup variants shut down by the truckload, and the big names put more and more features behind paywalls. We've gone past the "just make it work" phase, now we are moving into the "just make it sustainable/profitable" phase.

In a few generations of chips, the silicon will have made progress in catching up with the compute workload, and cost per task will drop. That's the innovation to watch out for now, who will de-throne Nvidia and its H100?

[–] GenderNeutralBro@lemmy.sdf.org 32 points 1 year ago (6 children)

This is why I, as a user, am far more interested in open-source projects that can be run locally on pro/consumer hardware. All of these cloud services are headed down the crapper.

My prediction is that in the next couple years we'll see a move away from monolithic LLMs like ChatGPT and toward programs that integrate smaller, more specialized models. Apple and even Google are pushing for more locally-run AI, and designing their own silicon to run it. It's faster, cheaper, and private. We will not be able to run something as big as ChatGPT on consumer hardware for decades (it takes hundreds of gigabytes of memory at minimum), but we can get a lot of the functionality with smaller, faster, cheaper models.
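
As a sketch of what "smaller, local model" means in practice, something like this already runs on ordinary consumer hardware, assuming the Hugging Face transformers library (the model name is just an example of a small checkpoint):

```python
# A minimal local text-generation example with a small model.
# distilgpt2 is only a few hundred MB and runs on CPU; larger local models
# (7B-class, quantized) need a decent GPU or lots of RAM, but the idea is the same.
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")

result = generator("Local models are useful because", max_new_tokens=40)
print(result[0]["generated_text"])
```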

[–] WrittenWeird@lemmy.world 10 points 1 year ago

Definitely. I have experimented with image generation on my own mid-range RX GPU and though it was slow, it worked. I have not tried the latest driver update that's supposed to accelerate those tools dramatically, but local AI workstations with dedicated silicon are the future. CPU, GPU, AIPU?
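
That kind of local image generation is already just a few lines of code, assuming the diffusers library and a Stable Diffusion checkpoint (the model name is illustrative; on an RX card this needs a ROCm build of PyTorch):

```python
# Rough sketch of local image generation on a consumer GPU.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,   # halves VRAM use on mid-range cards
)
pipe = pipe.to("cuda")  # ROCm builds of PyTorch also expose the "cuda" device name

image = pipe("a watercolor painting of a lighthouse at dusk").images[0]
image.save("lighthouse.png")
```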

[–] FaceDeer@kbin.social 9 points 1 year ago (1 children)

Hundreds of gigabytes of memory in consumer PCs is not decades away. There are already motherboards that accept 128 GB.

[–] MargotRobbie@lemmy.world 65 points 1 year ago* (last edited 1 year ago) (14 children)

Oh surprise surprise, looks like generative AI isn't going to fulfill Silicon Valley and Hollywood studios' dream of replacing artists, writers, and programmers with computers to maximize value for the poor, poor shareholders. Oh no!

As I said here before, generative AIs are not the universal solution to everything that has ever existed, as they are hyped up to be, but neither are they useless. At the end of the day, they are ultimately tools. Complex, powerful, useful tools, but tools nonetheless. A good artist can create better work faster with the help of a diffusion model, the same way LLM code generation can help a good programmer finish their project faster and better. (I think.) All of these AI models are trained on data from everyone on the Internet, which is why I think it's reasonable that everyone should have access to these generative AI models for the benefit of humanity rather than profit, and not just those who took other people's work for free to train the models. In other words, these generative AI models should belong to everyone.

And here lies my distaste for Sam Altman: OpenAI was founded as a nonprofit for the benefit of humanity, but at the first chance of money he immediately started venture capitalisting and put everything from GPT-2 onwards under lock and key for money. Now it looks like they are being crushed under the weight of their own operating costs while groups like Facebook and Stability catch up with actual open models. I will not be sad if "Open"AI fails.

(For as much crap as I give Zuck for the other awful things they do, I do admire their commitment to open source.)

I have to admit, playing with these generative models is pretty fun.

[–] atetulo@lemm.ee 15 points 1 year ago (1 children)

Hm. I think you should zoom out a bit and try to recognize that AI isn't stagnant.

Voice recognition and translation programs took years before they were appropriate for real-world applications. AI is also going to require years before it's ready. But that time is coming. We haven't reached a 'ceiling' for AI's capabilities.

[–] MargotRobbie@lemmy.world 10 points 1 year ago (3 children)

Breakthrough technological development can usually be described as a sigmoid function (an S-shaped curve): there is near-exponential progress at the beginning, but it eventually hits an inflection point, slows down, and plateaus until the next breakthrough.
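
A quick numerical illustration of that S-curve (plain logistic function, parameters chosen arbitrarily):

```python
# Logistic growth looks exponential early on, then flattens near its ceiling.
import math

def sigmoid(x, ceiling=1.0, steepness=1.0, midpoint=0.0):
    return ceiling / (1.0 + math.exp(-steepness * (x - midpoint)))

for x in range(-6, 7, 2):
    print(f"x={x:+d}  progress={sigmoid(x):.3f}")
# Early steps multiply progress several-fold; later steps barely move it.
```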

There are certain problems that cannot be solved at the current level of technology and where development progress has slowed to a crawl, such as level 5 autonomous driving (by the way, better public transport is a far less complex solution). And I think we are hitting the limit of what transformer-based generative AI can do: training has become more and more expensive for smaller and smaller gains, while hallucination seems to be an inherent problem that is ultimately unfixable at the current level of technology.

[–] batmangrundies@lemmy.world 10 points 1 year ago* (last edited 1 year ago) (1 children)

There was a smallish VFX group here that was attached to a volume screen company. They employed something like 20 people I think? So pretty small.

But the volume screen company instead employed a guy who could do an adequate enough job with generative tools, and the VFX group folded. The larger VFX company they partnered with had 200 employees; they recently cut to 50.

In my field, a team leader in 2018 could earn about 180,000 AUD p.a. Now those jobs are advertised at 130,000 AUD, because new models can do ~80% of the analysis with human accuracy.

AI is already folding companies and cutting jobs. It's not in the news maybe, but as industries shift to compete with smaller firms leveraging AI it will cascade.

I had/have my own company; we were attached to Metropolis, which unfortunately folded. I think that had a role to play in the job cuts as well. Luckily for me I wasn't overleveraged, but I am packing up and changing careers for sure.

[–] MargotRobbie@lemmy.world 10 points 1 year ago (1 children)

Generative AI can make each individual artist/writer/programmer much more efficient at their job, but if the shareholders and executives get their way and only big companies have access to this technology, this increased productivity will instead be used to reduce headcount and make the remaining people do more work on tighter deadlines, instead of helping everyone work less, do better work, and be happier.

This is the reason I think democratizing generative AI via local models is important, because as your example shows, it levels the playing field between small and big players, and helps people work less while making more cool stuff.

[–] batmangrundies@lemmy.world 8 points 1 year ago (3 children)

A big problem in Aus is the industry culture. They don't care about using technology to improve results. They only care about cutting costs, even if the final product doesn't meet the previous standard.

And we've seen that with VFX across the globe: the overall quality has dropped drastically, because studios play silly buggers to weasel out of paying VFX companies what they are due.

From what I hear, even DNEG is in trouble, and were even before the strike.

It's a race to the bottom it seems.

My honest hope for the film industry is likely the same as yours: that we get smaller productions with access to better post, thanks to improvements in AI-driven compositing software and so on.

But it's likely that a role that was earning $$$ before will be devalued significantly. And while I'm an unabashed anti-capitalist, I think a lot of folks misunderstand what this sudden downward pressure on income can do. Cost of living increasing while wages shrink is an awful combination.

I'm 35, left a six-figure job, and am folding my company and starting an electrician's apprenticeship, to give you an idea of where my views on AI sit. And of course this is as an Australian; we have a garbage white-collar work culture anyway.

I think there will be a net improvement. But I worry that others will fail to adapt quickly. Too many are writing off AI as a thing that already came and went, but the tools have only just landed, and we don't yet have workflows that correctly implement and leverage them.

[–] nickwitha_k@lemmy.sdf.org 9 points 1 year ago

Oh surprise surprise, looks like generative AI isn't going to fulfill Silicon Valley and Hollywood studios' dream of replacing artists, writers, and programmers with computers to maximize value for the poor, poor shareholders. Oh no!

It really is incredible how much this rhymes with the crypto hype. To be fair, the technology does actually have uses, but, as someone in the latter category, I quickly felt less worried about my job prospects after I saw it in action.

Fortunately, enough people in charge of staffing seem to have listened to people with technical knowledge, so my earlier prediction (mass layoffs directly due to LLMs, followed by mass, panicked re-hirings when said LLMs ruined the business) hasn't come true. But the worry itself, along with the RTO pushes (not to mention the exploitation of contractors and H1B holders), really underscores how desperately the industry needs to get organized. Hopefully, what's going on in the games industry with IATSE gets more traction and gets more of my colleagues on the same page, but that's one area where I'm not as optimistic as I'd like to be. I'll just have to cheer on SAG, WGA, and UAW for the time being.

(For as much crap as I give Zuck for the other awful things they do, I do admire their commitment to open source.)

Absolutely agreed. There's a surprising amount of good in the open source world that has come from otherwise ethically devoid companies. Even Intuit donated the Argo project, which has evolved from a cool workflow tool to a toolkit with far more. There is always the danger of EEE, however, so, we've got to stay vigilant.

[–] macallik@kbin.social 53 points 1 year ago (1 children)

What I don't like about the article is that the phrase 'paying off' can apply to making investors money OR to having worthwhile use cases. AI has created plenty of use cases, from language learning to code correction to companionship to brainstorming, etc.

It seems ironic that a consumer-facing website is framing things from a skeptical "But is it making rich people richer?" perspective.

[–] xantoxis@lemmy.world 8 points 1 year ago

In my case, I still want to know if it's not making rich people richer, because a) fuck rich people, and b) I don't want to buy into things that will disappear in a year when the hype dies down. As a "consumer" my purchasing decisions impact my life, and the actions of the wealthy affect that more than you'd like.

[–] mojo@lemm.ee 52 points 1 year ago (4 children)

Silicon Valley, as usual, thinks these things are as big an invention as the internet, and is trying to get its money in there first. AI was and still is a massive game changer, but nothing can live up to the hype they throw a stupid amount of money at. They didn't learn their lesson after crypto or the "metaverse" either, lol. I see AI as a tool, an incredibly useful one. That also means there are a lot of jobs it simply can't do. It can't replace artists, but artists can use it as a tool to work off of.

[–] snek@lemmy.world 16 points 1 year ago (5 children)

So far I've only seen AI being used to fire employees that a company totally, absolutely still needs but just doesn't want to pay wages to. Companies are dumb as fuck, that's my conclusion, but what else can you expect from organizations run by ladder-climbing CEO figures?

[–] guacupado@lemmy.world 8 points 1 year ago (1 children)

What I'm curious about is what's going to happen to all these companies that went all-in on building data centers when they weren't doing it previously. Places like Meta and Amazon are huge enough that it's always been a sound investment, but with this hype there are other companies trying to set up server farms with no real prize in sight.

[–] Smacks@lemmy.world 37 points 1 year ago (4 children)

AI is a tool to assist creators, not a full-on replacement. It won't be long until they start shoving ads into Bard and ChatGPT.

[–] Kanda@reddthat.com 35 points 1 year ago

Wait, R&D doesn't research and develop dollar bills into existence?

[–] RanchOnPancakes@lemmy.world 29 points 1 year ago

That's how this works. Blow through VC money to try and "strike gold", fail, change the model to "become profitable", move on to the next scam.

[–] jimbo@lemmy.world 28 points 1 year ago

Have they not tried simply asking the AI how to make it profitable?

[–] aesthelete@lemmy.world 27 points 1 year ago* (last edited 1 year ago) (4 children)

You'd think at this point that investors would wait for a thing to fill out the question mark second step in their business plan before investing in it, but you'd be way, way wrong.

Every new tech company comes to the investor panel with:

  1. build an expensive-to-run new tool and give it away to end users for free

  2. ??????

  3. profit!

And somehow they keep falling for it.

[–] punkwalrus@lemmy.world 20 points 1 year ago

Because people assume all these investors know what they are doing. They don't. Now, some investors are good, but they usually don't go for shit like this. A lot of investors are VCs, rich upper-class twits who can afford to lose money. Pure and simple. It's like a bunch of lotto winners telling people they know how to pick numbers: they place outside bets once in a while, get lucky, and have selective bias.

Plus, they have enough money to hedge their bets. For example, say you invest $1 mil each in companies A, B, C, D, E, and F. All lose everything except A and B, which earn you $3 mil each. You put in $6 mil and got back $6 mil. You broke even, tell people you knew what you were doing because you picked A and B, and conveniently never mention the rest. Then rich twits invest in what YOU invest in. So you invest in H, others invest in H because you did, and that drives up the value. Now magnify this by a lot of investors and hundreds of letters, and it's all like some weird game of luck and timing.
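
Spelling out that arithmetic (same numbers as above):

```python
# Six $1M bets; only A and B return anything ($3M each).
stakes  = {"A": 1.0, "B": 1.0, "C": 1.0, "D": 1.0, "E": 1.0, "F": 1.0}  # $ millions in
payouts = {"A": 3.0, "B": 3.0, "C": 0.0, "D": 0.0, "E": 0.0, "F": 0.0}  # $ millions out

print(f"invested ${sum(stakes.values()):.0f}M, recovered ${sum(payouts.values()):.0f}M")
# Break even overall -- but the pitch deck only ever mentions A and B.
```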

But a snapshot in time leads to your "2) ??????" point. Many know this is a confidence game based on luck, charm, and timing. Some just stumble through it, and others are fleeced, but who cares? Daddy's got money.

Money works different for rich people. It's truly puzzling.

[–] alienanimals@lemmy.world 25 points 1 year ago (2 children)

AI isn't paying off if you're too dumb to figure out how to use the many amazing tools that have come about.

[–] BolexForSoup@kbin.social 22 points 1 year ago* (last edited 1 year ago) (8 children)

I was going to say... I use AI transcription tools for video editing and AI upscaling, and Resolve dropped an incredible AI green-screen tool that makes keying effortless. I also started using AI to repair audio about six months ago, lol. I don't think I've gone more than 48 hours without using an AI tool professionally.
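
For anyone curious what that transcription step looks like, here is a minimal sketch assuming the open-source openai-whisper package and ffmpeg on the PATH (the file name is a placeholder):

```python
# Rough transcription pass over a video/audio clip for the edit.
import whisper

model = whisper.load_model("base")               # small model; runs on CPU or GPU
result = model.transcribe("interview_take3.mov")
print(result["text"])                            # rough transcript to search and mark up
```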

[–] NegativeLookBehind@kbin.social 32 points 1 year ago (7 children)

I wonder if “AI not paying off” in the context of this article actually means “Companies haven’t been able to lay off a bunch of their staff yet, like they’re hoping to do”

[–] Semi-Hemi-Demigod@kbin.social 8 points 1 year ago

AI is a lot more like the Internet than it is like Facebook. It's a set of techniques you can use to create tools. These are incredibly useful tools, but you're not going to make Facebook money off of them because the techniques are pretty easy to replicate and the genie is out of the bottle.

What the tech bros are looking for is a way to control access to AI so they can be a chokepoint. Like if Craftsman could charge for every single time you used their tool to make something. For one very recent example, see what happened to Unity. Creating chokepoints and then collecting rent is the modern corporate feudal strategy, but that won't work if everybody with an AWS account and enough money can spin up an LLM and start training it.

[–] Potatos_are_not_friends@lemmy.world 21 points 1 year ago (2 children)

These are the same kind of people who go, "We spent money on Timmy's clothes for over two years and it's not paying off."

Bro, AI is an investment.

[–] Lugh 12 points 1 year ago

It should also worry investors that open-source AI is only months behind the big tech leaders. I looked into AI voice cloning lately. There are a few really pricey options, like $25 a month for a couple of hours of voice cloning.

However, there's already an open-source version of what they're selling.
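
For example, a best-effort sketch of open-source voice cloning assuming the Coqui TTS package and its XTTS model; the model name, method signature, and file names here are assumptions, so check the library's documentation before relying on them:

```python
# Hypothetical open-source voice cloning sketch (Coqui TTS / XTTS).
from TTS.api import TTS

tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

tts.tts_to_file(
    text="This sentence is spoken in the cloned voice.",
    speaker_wav="my_voice_sample.wav",   # a short, clean recording of the target voice
    language="en",
    file_path="cloned_output.wav",
)
```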
