this post was submitted on 22 Dec 2024
381 points (95.5% liked)

[–] iAvicenna@lemmy.world 5 points 13 hours ago

oh wow, who would have guessed that business consultancy companies are generally built on top of bullshitting about things they don't really have a grasp of

[–] LenielJerron@lemmy.world 111 points 1 day ago* (last edited 1 day ago) (2 children)

A big issue that a lot of these tech companies seem to have is that they don't understand what people want; they come up with an idea and then shove it into everything. There are services that I have actively stopped using because they started cramming AI into things; for example I stopped dual-booting with Windows and became Linux-only.

AI is legitimately interesting technology which definitely has specialized use-cases, e.g. sorting large amounts of data, or optimizing strategies within highly constrained circumstances (like chess or Go). However, 99% of what people are pushing as AI to the general public these days just seems like garbage: bad art, bad translations, and incorrect answers to questions.

I do not understand all the hype around AI. I can understand the danger; people who don't see that it's bad are using it in place of people who know how to do things. But in my teaching, for example, I've never had any issues with students cheating using ChatGPT; I semi-regularly run the problems I assign through ChatGPT, and it gets enough of them wrong that I can't imagine any student would be inclined to keep using it to cheat after their first grade comes in. (In this sense, it's actually impressive technology - we've had computers that can do advanced math highly accurately for a while, but we've finally developed one that's worse at math than the average undergrad in a gen-ed class!)

[–] Voroxpete@sh.itjust.works 46 points 23 hours ago (13 children)

The answer is that it's all about "growth". The fetishization of shareholders has reached its logical conclusion, and now the only value companies have is in growth. Not profit, not stability, not a reliable customer base or a product people will want. The only thing that matters is if you can make your share price increase faster than the interest on a bond (which is pretty high right now).

To make share price go up like that, you have to do one of two things: show that you're bringing in new customers, or show that you can make your existing customers pay more.

For the big tech companies, there are no new customers left. The whole planet is online. Everyone who wants to use their services is using their services. So they have to find new things to sell instead.

And that's what "AI" looked like it was going to be. LLMs burst onto the scene promising to replace entire industries, entire workforces. Huge new opportunities for growth. Lacking anything else, big tech went in HARD on this, throwing untold billions at partnerships, acquisitions, and infrastructure.

And now they have to show investors that it was worth it. Which means they have to produce metrics that show people are paying for, or might pay for, AI flavoured products. That's why they're shoving it into everything they can. If they put AI in Notepad, then they can claim that every time you open Notepad you're "engaging" with one of their AI products. If they put Recall on your PC, every Windows user becomes an AI user. Google can now claim that every search is an AI interaction because of the bad summary that no one reads. The point is to show "engagement" and "interest", which they can then use to promise that, down the line, huge piles of money will fall out of this piñata.

The hype is all artificial. They need to hype these products so that people will pay attention to them, because they need to keep pretending that their massive investments got them in on the ground floor of a trillion dollar industry, and weren't just them setting huge piles of money on fire.

[–] Brodysseus@lemmy.dbzer0.com 8 points 17 hours ago

I've run some college homework through 4o just to see, and it's remarkably good at generating proofs for math and algorithms. Sometimes it's not quite right, but it's usually on the right track to get started.

In some of the busier classes I'm almost certain students do this, because my homework grades would be lower than the mean while my exam grades would be well above it.

[–] eleitl@lemm.ee 4 points 16 hours ago (1 children)

Page doesn't render properly.

[–] Joker@sh.itjust.works 4 points 16 hours ago (1 children)
[–] eleitl@lemm.ee 1 points 14 hours ago

Thanks -- it has been clear enough that another AI winter is coming, likely at the latest when Global Financial Crisis 2 arrives.

[–] einlander@lemmy.world 28 points 1 day ago (5 children)