[–] jjjalljs@ttrpg.network 50 points 11 hours ago (3 children)

I feel like the current AI stuff has been a net negative. It prompted layoffs and hiring freezes, but then didn't produce quality results.

[–] megopie@beehaw.org 34 points 10 hours ago* (last edited 6 hours ago) (1 children)

It gave CEOs an excuse to do layoffs even though they knew it would hurt their human capital long term, and that they would probably have to hire back a lot of those positions later at higher wages. In the short term it gave them a few quarters of increased profits. It also let them push out blatantly unfinished products on the promise of improbable future improvements. That will hurt companies' reputations long term, but in the short term it let them juice the stock price.

They needed the increased profit and the pie-in-the-sky growth promises to game the stock market, say all the right buzzwords, and show an improving price-to-earnings ratio.

Sure, they made the companies worse and less sustainable long term, but they got huge compensation packages right now thanks to the markets, and they probably won’t be running these companies long enough to see the true fallout.

[–] Geodad@beehaw.org 11 points 9 hours ago (1 children)

I hope the stock market craters.

We need to do away with capitalism completely, or put it on a very short leash.

[–] Krauerking@lemy.lol 4 points 7 hours ago (1 children)

I wish governments still believed in regulations instead of whatever this shit is.

[–] Geodad@beehaw.org 2 points 6 hours ago

Yeah, we need socialism/communism. Either would be better than this.

[–] HobbitFoot@thelemmy.club 3 points 8 hours ago

But it isn't about creating quality results. It's about creating good-enough results, where the cost of AI's failures is lower than the cost of paying humans instead.

[–] Peanutbjelly@sopuli.xyz -1 points 8 hours ago (1 children)

I think it's a framing issue, and AI development is catching a lot of flak for the general failures of our current socio-economic hierarchy. Also, people have been shouting "superintelligence or bust" for decades now. I just keep watching it get better much more quickly than most people's estimates, and I understand the implications of that. I do appreciate discouraging idiot business people from shunting AI into everything that doesn't need it, whether because it's a buzzword or because they can use it to exploit something. Some likely just used it as an excuse to fire people, but again, that's not the AI's fault; that's this shitty system. I guess my issue is that people keep framing this as "AI bad" instead of "corpos bad".

If the loom had never been invented, we would still live in an oppressive society sliding towards fascism. People tend to miss the forest for the trees when looking at tech tools politically. They're also blind to the surrounding environment, which is often more important than the thing itself. And the loom is still useful.

There's a lot worth understanding here: compression and polysemy, and how they grow your dimensions of understanding in a high-dimensional environment that is itself changing shape; comprehension growing as your blindspots are erased; collective intelligence, and how diversity helps cover more blindspots; predictive processing, and how we should embrace a lack of confidence while recognizing the strength of properly weighted predictions, even though a single blindspot can shift the entire landscape, so no framework is flawless or perfectly reliable; and the recognition that everything we know is just the best map of the territory we've figured out so far. If you want to judge how subtle but in-your-face blindspots can be, look up how to test your literal blindspot: thirty seconds and a paper with two small dots will show you how blind we are to our blindspots.

More than fighting the new tools we can use, we need to claim them, and the rest of the world, back from those who ensure that all tools exist only to exploit us.

Am I shouting into the void? Wasting the breath of my digits? Will humanity ever learn to stop acting like dumb angry monkeys?

[–] jjjalljs@ttrpg.network 3 points 7 hours ago

"Will humanity ever learn to stop acting like dumb angry monkeys?"

Seems unlikely.

As to your broader point about the tools themselves not being bad: the root problem remains capitalism, or "a few people have unaccountable power over many."