this post was submitted on 12 Jan 2025
227 points (94.5% liked)

Technology
[–] gencha@lemm.ee 2 points 21 hours ago

There are few reports of this directly from the industry, because nobody wants to admit a talent shortage. It's a much better sell to claim that you're pivoting towards AI.

I'm an enterprise consultant for technology executives, and work mostly as a platform architect for a global enterprise. The scale of this issue is invisible to most people.

I know this is basically "trust me, bro", and I wish I had more to show, but this evolution is happening in plain sight. And it's not like AI introduced this problem either. I'm old. Still, take my Internet connection away, and watch me struggle to remember whether I want .includes() or .contains() on a JS array. There is a scale to this.
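(For what it's worth, that particular confusion is understandable: arrays only have .includes(), while .contains() lives on DOMTokenList, i.e. element.classList. The array name below is just illustrative.)

```javascript
// Array.prototype.includes is the method that exists on JS arrays.
// .contains() is NOT an array method; it belongs to DOMTokenList
// (e.g. element.classList.contains), which is easy to mix up.
const frameworks = ["react", "vue", "svelte"];

console.log(frameworks.includes("vue"));  // true
console.log(typeof frameworks.contains); // "undefined" -- no such array method
```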

The problem is that we've reached a point where it's easier to generate a convenient result that communicates well than to deliver the "correct" solution that your executives don't understand. Decision makers today will literally take the technical concept from your presentation and have it explained to them by an LLM afterwards. They will then challenge you and your concept based on their interactions with the LLM.

LLMs are continuously tuned towards customer-pleasing behavior; they are commercial products. If you ask them for something, they are likely to produce a response that is as widely understood as possible. If you, as a supposed expert, can't match those "communication skills", AI-based work will defeat you. Nobody likes a solution that points out unaddressed security issues. A concept that doesn't mention them goes down a lot easier. This is accelerated by people also using AI to automate their review work. The AI prefers work that is similar to its own. Your exceptional work does not align with the lowest common denominator.

You can't "just Google it" anymore; all results are LLM garbage (and Google was always biased to begin with). All source information pools are poisoned by LLM garbage at this point. If you read a stack of books and create something original, it's either not generally understood or dismissed as unnecessarily complicated. So if you can ask an AI for a solution, and it actually provides one, and everyone can ask their LLM whether it's good stuff, and everyone is instantly happy, what incentive do developers have to resist? Even if you just let an LLM rewrite your original concept, it will still reach higher acceptance.

You must also step outside your own perspective to fully evaluate this. Ignore for a moment what you believe about LLMs helping you personally. There are millions of people out there using this technology. I attended seminars with 100+ people where they were instructed on "prompting" to generate technical documentation and compliance correspondence. You have no chance of winning a popularity contest against an LLM.

So why would I need you, if the LLM already makes me happier than explanations of yours that I don't understand, and you yourself are inherently motivated to just use LLM output to meet expectations?

Yes, I know, because my entire enterprise will crumble long-term if I buy into the AI bullshit and can't attract actual talent. But who will admit it first, while there is so much money to be made with snake oil?