this post was submitted on 13 Aug 2025
442 points (95.3% liked)

Technology

[–] mienshao@lemmy.world 185 points 2 weeks ago (5 children)

“Fixing” social media is like “fixing” capitalism. Any manmade system can be changed, destroyed, or rebuilt. It’s not an impossible task but will require a fundamental shift in the way we see/talk to/value each other as people.

The one thing I know for sure is that social media won’t ever improve if we all accept the narrative that it can’t be improved.

We live in capitalism. Its power seems inescapable. So did the divine right of kings. Any human power can be resisted and changed by human beings. Resistance and change often begin in art, and very often in our art, the art of words.

- Ursula K. Le Guin

[–] harribert@lemmy.world 39 points 2 weeks ago (2 children)

Seriously, read her books. I looooove "The Dispossessed"

[–] floofloof@lemmy.ca 19 points 2 weeks ago* (last edited 2 weeks ago)

The Left Hand of Darkness is excellent too. Sci-fi from the 1960s about a planet whose people have no fixed sex or gender, and a man from Earth who struggles to understand and function in this society. That description makes it sound very worthy, but it's actually gripping and moving.

[–] kalkulat@lemmy.world 8 points 2 weeks ago

LeGuin is a treasure.

[–] BreadstickNinja@lemmy.world 20 points 2 weeks ago* (last edited 2 weeks ago)

Particularly apt given that many of the biggest problems with social media are problems of capitalism. Social media platforms have found it most profitable to monetize conflict and division, the low self-esteem of teenagers, lies and misinformation, envy over the curated simulacrum of a life presented by a parasocial figure.

These things drive engagement. Engagement drives clicks. Clicks drive ad revenue. Revenue pleases shareholders. And all that feeds back into a system that trades negativity in the real world for positivity on a balance sheet.

[–] masterspace@lemmy.ca 12 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

Yeah, this author is the pop-sci / sci-fi media writer at Ars Technica, not one of the actual science reporters who stick to their area of expertise, and you can tell from the overly broad, clickbait headline, which is not actually supported by the research at hand.

The actual research uses limited LLM agents and explores only a very small number of interventions. It does not come remotely close to answering whether social media can be fixed, which is itself a different question from harm reduction.

[–] 9point6@lemmy.world 61 points 2 weeks ago (4 children)

If Meta and Twitter ceased to exist tomorrow, 99% of the issues would be solved IMO

The fediverse is social media and it doesn't have anything close to the same kinds of harmful patterns

[–] Nougat@fedia.io 60 points 2 weeks ago (4 children)

It's almost like the problem isn't social media, but the algorithms that put content in front of your eyeballs to keep your engagement in order to monetize you. Like a casino.

[–] Jason2357@lemmy.ca 9 points 2 weeks ago

Facebook was pretty boring before they tried to make money. Still ick, but mostly just people posting pictures of activities with family or friends.

[–] chaosCruiser 12 points 2 weeks ago* (last edited 2 weeks ago) (3 children)

Amazon, Google and Microsoft would still be there, so the Internet seems to be suffering from a metastatic cancer at this point. Cutting off two revolting lumps helps, but the prognosis doesn’t look that great.

[–] 9point6@lemmy.world 5 points 2 weeks ago (1 children)

None of those have had much success in creating social networks that suck people in quite like the others

Not to say they don't have their own problems, but the bulk of the problems with social media come squarely from Meta and Twitter.

[–] imetators@lemmy.dbzer0.com 41 points 2 weeks ago (9 children)

The number of comments insisting that Lemmy is totally not like typical social media is absurd.

Guys, the only thing we don't have here is major tracking of users. That's it! Everything else is the same fucking shit you'd see on Facebook. The moment Lemmy gets a couple tens of millions of users, we're gonna become a second Facebook.

[–] hansolo@lemmy.today 21 points 1 week ago (4 children)

It's that there's no incentive to have 80 million bots manipulate everything. Our user base is too small, and likely too jaded about fake internet points, to be a target for scammers, AI slop bots, or advertisers.

Or at least that's what I thought when I drink a refreshing Pepsi! hiss-crack! glugg glugg Aaaah!! PEPSI! The brown fizz that satisfies! Pepsi!

[–] someguy3@lemmy.world 6 points 1 week ago (3 children)

... If there are people to mislead with misinformation, or people with money to buy things, there will be incentive. I learned about this in this great book called

[–] PotatoesFall@discuss.tchncs.de 9 points 1 week ago (1 children)

Lemmy doesn't have a neural net prediction/recommendation engine. This is a HUGE difference.

[–] Alphane_Moon@lemmy.world 6 points 2 weeks ago (2 children)

I haven't used FB in half a decade, but at least with respect to reddit, there are definitely more good "features" in the threadiverse than just lack of tracking.

Not saying there aren't any issues or that scaling to 10 M MAUs won't create new problems, but lack of tracking isn't the only differentiating factor.

[–] samus12345@sh.itjust.works 6 points 1 week ago* (last edited 1 week ago)

It's not a typical social media because it's decentralized, but it's not immune to all the problems of social media by any means. I'm not sure why you're using Facebook as an example rather than reddit.

[–] bigbabybilly@lemmy.world 32 points 1 week ago (1 children)

Social media isn’t broken. It’s working exactly how it was meant to. We just need to break free of it.

[–] gandalf_der_12te@discuss.tchncs.de 8 points 1 week ago (6 children)

First of all, it's a broad overgeneralization to assume that all social media was created with the intention of manipulating people. There were honest people running social media, but that's long past (in the corporate domain).

  • Social media can be useful if it presents non-emotional, non-brigading content. Rational discourse is one of the valuable things it can enable. Throwing away the whole internet because Xitter sucks is throwing the baby out with the bathwater.

  • But yes, social media is the new Volksempfänger, and it manipulates people (social engineering).

[–] DSTGU@sopuli.xyz 6 points 1 week ago (2 children)

Social media wasn't created to manipulate people. (Most) social media is a business, optimised to make money. You make money by showing people ads. You can show people more ads if they stay on the platform longer. You can make people stay longer by engaging them emotionally. End of conspiracy...

[–] stickly@lemmy.world 5 points 1 week ago* (last edited 1 week ago) (1 children)

But it's not possible to get unbiased content on the internet. Everything exists with an agenda behind it, if for no other reason than that hosting anything costs money, continuously.

This wasn't a huge deal when individuals were paying to host and share content with a small audience; it was a small amount of money, and you could see their motives clearly (a forum for a hobby, a passion project, an online store, etc.).

Social media is different because it presents itself as a public forum where anything can be shared and hosted (for free) to as many people as you want. But they're still footing a very large bill and the wide net of content makes their motives completely opaque. Nobody cares that much about the headaches of maintaining a free and open public forum, and any profit motive is just another way to sell manipulation.

[–] Xanthrax@lemmy.world 29 points 2 weeks ago

We're on the solution right now, lmao

[–] kalkulat@lemmy.world 27 points 2 weeks ago

Of course -corporate- social media can't be fixed... it already works exactly the way they want it to...

[–] moseschrute@lemmy.world 21 points 1 week ago (1 children)
[–] fyzzlefry@retrolemmy.com 7 points 1 week ago (1 children)

As long as you know you're in an echo chamber there's nothing wrong with it. Everything is an echo chamber of varying sizes.

[–] mctoasterson@reddthat.com 21 points 2 weeks ago (3 children)

I think just going back to internet forums circa early 2000s is probably a better way to engage honestly. They're still around, just not as "smartphone friendly" and doomscroll-enabled, due to the format.

I'm talking stuff like SomethingAwful, GaiaOnline, Fark, Newgrounds forum, GlockTalk, Slashdot, vBulletin etc.

These types of forums let you discuss timely issues and news if you wanted to. You could go a thousand miles deep on some bizarre subculture or stick to general discussion. They also had proto-meme culture before that was a thing - aka "embedded image macros".

[–] Jason2357@lemmy.ca 8 points 2 weeks ago (1 children)

Anything that is topic-focused rather than built around following individuals is a big difference; then take away the engagement algorithm and it's much better.


That's what I've been hoping for with Reddit and now Lemmy. I don't care about individuals, I care about topic based discussion.

My problem with forums is they're more like a club, where you get lots of off-topic discussion by people who happen to share an interest. I don't care what tech nerds think about medicine on a tech nerd forum, and joining dozens of forums to get the right discussion is a huge pain.

Forums are cool, and I use a few, but I really want a place that connects different subjects.

[–] avidamoeba@lemmy.ca 15 points 2 weeks ago* (last edited 2 weeks ago)

Uhm, I seem to recall that social media was actually pretty good in the late 2000s and early 2010s. The authors used AI models as the users. Could it be that their models have internalized the effects of the algorithms that fundamentally changed social media from what it used to be over a decade ago, and are now reproducing those effects in their experiments? Sounds like they're treating models as if they're humans, and they are not. Especially when it comes to changing behaviour based on changes in the environment, which is what they were testing by trying different algorithms and mitigation strategies.

[–] uberdroog@lemmy.world 15 points 1 week ago

It's performing as expected.

[–] bridgeenjoyer@sh.itjust.works 14 points 1 week ago (2 children)

I've also noticed something in my friend group: no one makes anything. It's all share, share, share. I'm the only one taking original photos or videos or making jokes. It's kind of sad. And it's not like their lives are boring either; they'd just rather consume other people's stuff.

Are most people like that?

[–] tempest@lemmy.ca 7 points 1 week ago (2 children)

I've started asking people what they have created lately... They seem to take it as an insult when it isn't meant to be.

The reality is consuming is easier than producing. You can see it with the usage of phones and tablets vs laptops. It's hard to create on a touch screen but it's easy to consume.

[–] Zak@lemmy.world 14 points 2 weeks ago

The study is based on having LLMs decide to amplify one of the top ten posts on their timeline or share a news headline. LLMs aren't people, and the authors have not convinced me that they will behave like people in this context.

The behavioral options are restricted to posting news headlines, reposting news headlines, or being passive. There's no option to create original content, and no interventions centered on discouraging reposting. Facebook has experimented with limits to reposting and found such limits discouraged the spread of divisive content and misinformation.

I mostly use social media to share pictures of birds. This contributes to some of the problems the source article discusses. It causes fragmentation; people who don't like bird photos won't follow me. It leads to disparity of influence; I think I have more followers than the average Mastodon account. I sometimes even amplify conflict.
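
As an aside on the reposting limits mentioned above: such a limit can be as simple as capping how deep a repost chain is allowed to grow. A minimal, hypothetical sketch (the threshold and field names are made up and are not Facebook's actual mechanism):

```python
MAX_RESHARE_DEPTH = 2   # illustrative threshold, not any platform's real value

def can_repost(post):
    # "reshare_depth" is 0 for original content and grows by 1 per repost-of-a-repost
    return post["reshare_depth"] < MAX_RESHARE_DEPTH

def repost(post, author):
    if not can_repost(post):
        return None   # nudge the user to write an original post or share a link instead
    return {"author": author, "reshare_depth": post["reshare_depth"] + 1}

original = {"author": "alice", "reshare_depth": 0}
first = repost(original, "bob")     # allowed
second = repost(first, "carol")     # allowed
third = repost(second, "dave")      # blocked: returns None
```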

[–] AI_toothbrush@lemmy.zip 12 points 1 week ago* (last edited 1 week ago) (14 children)

I mean, Lemmy is pretty fucking neat, I love it here, no need to fix anything.


Social media was a mistake, tbh

[–] TankovayaDiviziya@lemmy.world 11 points 1 week ago (1 children)

No shit. Unless the internet becomes democratised and publicly funded, like public media such as the BBC or France24 in other countries, social media will always be toxic. These platforms thrive on provocation, there are studies to prove it, and the social media moguls know it. Hell, there are people who make a living triggering others to gain attention and maintain engagement, which leads to advertising revenue and promotions.

As long as the profit motive exists, social media as we know it can never truly be fixed.

[–] Perspectivist@feddit.uk 10 points 2 weeks ago (4 children)

Of course not. The issue with social media is the people. Algorithms just bring out the worst in us, but they didn't make us like this; we already were.

[–] tacosanonymous@mander.xyz 9 points 2 weeks ago

Neat.

Release the Epstein files, then burn it all down.

[–] General_Effort@lemmy.world 9 points 2 weeks ago (1 children)

The original source is here:

https://arxiv.org/abs/2508.03385

Social media platforms have been widely linked to societal harms, including rising polarization and the erosion of constructive debate. Can these problems be mitigated through prosocial interventions? We address this question using a novel method – generative social simulation – that embeds Large Language Models within Agent-Based Models to create socially rich synthetic platforms. We create a minimal platform where agents can post, repost, and follow others. We find that the resulting following-networks reproduce three well-documented dysfunctions: (1) partisan echo chambers; (2) concentrated influence among a small elite; and (3) the amplification of polarized voices – creating a “social media prism” that distorts political discourse. We test six proposed interventions, from chronological feeds to bridging algorithms, finding only modest improvements – and in some cases, worsened outcomes. These results suggest that core dysfunctions may be rooted in the feedback between reactive engagement and network growth, raising the possibility that meaningful reform will require rethinking the foundational dynamics of platform architecture.
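
For readers wondering what "embedding Large Language Models within Agent-Based Models" looks like in practice, here is a minimal sketch of that kind of simulation loop, assuming a toy platform where agents can post, repost, and follow, and the feed surfaces the most-engaged posts. The class and function names are illustrative and the LLM call is replaced by a random stub; this is not the authors' code.

```python
import random

# Minimal sketch of a "generative social simulation": agents embedded in a toy
# platform where they can post, repost, and follow; the feed is engagement-ranked.

class Agent:
    def __init__(self, name, persona):
        self.name = name
        self.persona = persona          # e.g. a short political/interest profile
        self.following = set()

def decide(agent, feed):
    """Stand-in for the LLM call: given the agent's persona and its feed,
    pick an action. A random stub keeps the sketch self-contained."""
    if feed and random.random() < 0.5:
        post = random.choice(feed)
        return ("repost", post) if random.random() < 0.7 else ("follow", post["author"])
    if random.random() < 0.5:
        return ("post", f"{agent.persona} take #{random.randint(1, 999)}")
    return ("pass", None)

class Platform:
    def __init__(self, agents, feed_size=10):
        self.agents = agents
        self.posts = []                 # each post: {"author", "text", "reposts"}
        self.feed_size = feed_size

    def feed(self):
        # Engagement-ranked feed: most-reposted posts first (one possible ranking).
        return sorted(self.posts, key=lambda p: p["reposts"], reverse=True)[: self.feed_size]

    def step(self):
        for agent in self.agents:
            action, arg = decide(agent, self.feed())
            if action == "post":
                self.posts.append({"author": agent.name, "text": arg, "reposts": 0})
            elif action == "repost":
                arg["reposts"] += 1
            elif action == "follow" and arg != agent.name:
                agent.following.add(arg)

agents = [Agent(f"user{i}", random.choice(["left-leaning", "right-leaning"]))
          for i in range(100)]
sim = Platform(agents)
for _ in range(50):
    sim.step()

# On the resulting following-network one would then measure echo chambers,
# influence concentration, and amplification of extreme voices, as the paper does.
print(len(sim.posts), "posts,", sum(len(a.following) for a in agents), "follow edges")
```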

[–] 3dcadmin@lemmy.relayeasy.com 7 points 1 week ago

Should just be "people can't be fixed"...

[–] roguetrick@lemmy.world 7 points 2 weeks ago* (last edited 2 weeks ago)

Preprint journalism fucking bugs me, because the journalists themselves can't actually judge whether anything is worth discussing, so they just look for clickbait shit.

This methodology for discovering what interventions do in human environments seems particularly deranged to me, though:

We address this question using a novel method – generative social simulation – that embeds Large Language Models within Agent-Based Models to create socially rich synthetic platforms.

LLM agents trained on social media dysfunction recreate it unfailingly. No shit. I understand they gave them personas to adopt as prompts, but prompts cannot and do not override training data, as we've seen over and over. LLMs fundamentally cannot maintain an identity from a prompt. They are context engines.

Particularly concerning are the silo claims. LLMs riffing on a theme over extended interactions because the tokens keep coming up that way is expected behavior. LLMs are fundamentally incurious and even more prone than humans to locking into one line of text, since the lengthening conversation keeps reinforcing it.

Validating what the authors describe as a novel approach might be more warranted than drawing conclusions from it.

[–] TAG@lemmy.world 7 points 1 week ago (4 children)

The article argues that extremist views and echo chambers are inherent in public social networks where everyone is trying to talk to everyone else. That includes Fediverse networks like Lemmy and Mastodon.

They argue for smaller, more intimate networks like group chats among friends. I agree with the notion, but I am not sure how someone can build these sorts of environments without just inviting a group of friends and making an echo chamber.

[–] General_Effort@lemmy.world 6 points 2 weeks ago (1 children)

I'm not surprised. I am surprised that the researchers were surprised, though.

Bridging algorithms seem promising.

The results were far from encouraging. Only some interventions showed modest improvements. None were able to fully disrupt the fundamental mechanisms producing the dysfunctional effects. In fact, some interventions actually made the problems worse. For example, chronological ordering had the strongest effect on reducing attention inequality, but there was a tradeoff: It also intensified the amplification of extreme content. Bridging algorithms significantly weakened the link between partisanship and engagement and modestly improved viewpoint diversity, but it also increased attention inequality. Boosting viewpoint diversity had no significant impact at all.
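
For concreteness, two of the interventions named in that passage could look roughly like the following, expressed as alternative feed rankings. This is a hedged sketch of plausible scoring functions, not the paper's implementation.

```python
# Hypothetical sketch of two feed-level interventions: chronological ordering
# and a "bridging" ranking that rewards cross-partisan engagement.

def chronological_feed(posts, size=10):
    # Newest first, ignoring engagement entirely.
    return sorted(posts, key=lambda p: p["timestamp"], reverse=True)[:size]

def bridging_feed(posts, size=10):
    # Prefer posts whose reposts are balanced across partisan groups
    # rather than concentrated on one side.
    def bridging_score(post):
        left = post["reposts_by_group"].get("left", 0)
        right = post["reposts_by_group"].get("right", 0)
        total = left + right
        balance = (1 - abs(left - right) / total) if total else 0.0
        return total * balance          # engaged AND cross-partisan
    return sorted(posts, key=bridging_score, reverse=True)[:size]
```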

[–] SoftestSapphic@lemmy.world 6 points 1 week ago (3 children)

Social spaces aren't something that needs fixing.

We blame the problems caused by wealth inequality on technology as a way to avoid even discussing making the rich contribute to society.

[–] kibiz0r@midwest.social 5 points 2 weeks ago (1 children)

Because how to use it is baked into what it is. Like many big tech products, it’s not just a tool but also a philosophy. To use it is also to see the world through its (digital) eyes.
