this post was submitted on 23 Jan 2025
1116 points (97.1% liked)

Technology


TLDR if you don't wanna watch the whole thing: Benaminute (the YouTuber here) creates a fresh YouTube account and watches all recommended shorts without skipping. He repeats this 5 times, each time changing his location to a random city in the US.

Below is the number of shorts after which alt-right content was recommended. Left wing/liberal content was never recommended first.

  1. Houston: 88 shorts
  2. Chicago: 98 shorts
  3. Atlanta: 109 shorts
  4. NYC: 247 shorts
  5. San Francisco: never (Benaminute stopped after 250 shorts)

There was, however, a certain pattern to this. First, non-political shorts were recommended. Then AI Jesus shorts started to appear (with either AI Jesus talking to you, or an AI narrator reading verses from the Bible). After that came non-political shorts by alt-right personalities (Jordan Peterson, Joe Rogan, Ben Shapiro, etc.). Finally, explicitly alt-right shorts started to be recommended.

What I personally found both disturbing and kinda hilarious was in the case of Chicago. The non-political content in the beginning was a lot of Gen Alpha brainrot. Benaminute said that this seemed to be the norm for Chicago, as they had observed this in another similar experiment (which dealt with long-form content instead of shorts). After some shorts, there came a short where AI Gru (the main character from Despicable Me) was telling you to vote for Trump. He was going on about how voting for "Kamilia" would lose you "10000 rizz", and how voting for Trump would get you "1 million rizz".

In the end, Benaminute and Miniminuteman propose a hypothesis to explain this phenomenon: alt-right content might incite more emotion, and thus rank higher in the algorithm. They argue the algorithm isn't necessarily left wing or right wing, but that alt-right creators have better understood how to use it to capture and grow an audience.
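Their engagement hypothesis can be sketched as a toy model. Everything below (field names, weights, numbers) is made up for illustration; it is not YouTube's actual ranking:

```python
# Toy engagement-weighted ranker: content that provokes more reactions
# (comments, shares) scores higher regardless of its politics.
# Weights and field names are invented for this sketch.

def engagement_score(video: dict) -> float:
    # Comments and shares are weighted far above passive watch time,
    # which is what rewards emotionally provocative content.
    return (
        1.0 * video["watch_seconds"]
        + 5.0 * video["likes"]
        + 10.0 * video["comments"]
        + 20.0 * video["shares"]
    )

def rank(videos: list[dict]) -> list[dict]:
    return sorted(videos, key=engagement_score, reverse=True)

calm = {"title": "calm howto", "watch_seconds": 40, "likes": 8, "comments": 1, "shares": 0}
outrage = {"title": "outrage clip", "watch_seconds": 35, "likes": 6, "comments": 12, "shares": 4}

# The outrage clip wins despite lower watch time and fewer likes.
print([v["title"] for v in rank([calm, outrage])])  # ['outrage clip', 'calm howto']
```

Under this kind of weighting, whichever side is better at provoking comments and shares gets surfaced first, which is exactly the dynamic the video hypothesizes.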

(page 3) 50 comments
[–] pennomi@lemmy.world 200 points 1 week ago (9 children)

I think the explanation might be even simpler - right wing content is the lowest common denominator, and mindlessly watching every recommended short drives you downward in quality.

[–] Plebcouncilman@sh.itjust.works 64 points 1 week ago* (last edited 1 week ago) (2 children)

I was gonna say this. There's very little liberal or left-leaning media being made, and what there is is mostly made for a female or LGBTQ audience. Not saying men can't watch those, but there's not a lot of "testosterone"-infused content with a liberal leaning (one of the reasons Trump won), so by sheer volume you're bound to see more right-leaning content. Especially if you are a cisgender male.

Been considering creating content myself to at least stem the tide a little.

[–] BassTurd@lemmy.world 40 points 1 week ago (3 children)

I think some of it is that liberal media is more artsy and creative, which is more difficult to just pump out. Creation is a lot more difficult than destruction.

[–] Plebcouncilman@sh.itjust.works 15 points 1 week ago

Not necessarily. For example, a lot of "manosphere" guys have taken hold of philosophy, health, and fitness topics; a liberal influencer can give a liberal view on these same subjects. In philosophy, say, explain how Nietzsche was not just saying that you can do whatever the fuck you want, or how stoicism is actually a philosophy of tolerance, not of superiority, etc. There's really a lot of space that can be covered.

[–] credo@lemmy.world 15 points 1 week ago (2 children)

I refuse to watch those shit shorts; I think your theory has legs. Unfortunately there doesn’t seem to be a way to turn them off.

[–] JPAKx4@lemmy.blahaj.zone 13 points 1 week ago (1 children)

I use YouTube revanced to disable them.

[–] victorz@lemmy.world 73 points 1 week ago (2 children)

I keep getting recommendations for content like "this woke person got DESTROYED by logic" on YouTube. Even though I click "not interested", and even "don't recommend channel", I keep getting the same channel, AND video recommendation(s). It's pretty obvious bullshit.

[–] SaharaMaleikuhm@feddit.org 22 points 1 week ago (2 children)

Anything but the subscriptions page is absolute garbage on that site. Ideally get an app to track your subs without having to have an account. NewPipe, FreeTube etc.

[–] victorz@lemmy.world 7 points 1 week ago* (last edited 1 week ago) (4 children)

Are those available on PC/Linux? On my TV? 😭 I have them on my phone but I feel like there's too much hassle to do on my main viewing devices.

[–] lennivelkant@discuss.tchncs.de 19 points 1 week ago (6 children)

You'd think a recommendation algorithm should take your preferences into account - that's the whole justification for tracking your usage in the first place: recommending relevant content for you...

[–] andallthat@lemmy.world 11 points 1 week ago* (last edited 6 days ago) (2 children)

It is. But who said you get to decide what's relevant for you? Welcome, and learn to trust your algorithmic overlords.

Thanks, I hate it

[–] victorz@lemmy.world 10 points 1 week ago

YOU'D THINK THAT YES. [caps intended]

[–] HawlSera@lemm.ee 60 points 1 week ago (1 children)

I hate the double standards

On a true crime video: "This PDF-File game ended himself after he was caught SAing this individual.... Sorry Youtube forces me to talk like that or I might get demonetized" Flagged for discussing Suicide

On PragerU: "The Transgender Agenda is full of rapists and freaks who will sexually assault your children, they are pedophiles who must be dealt with via final solution!" Completely fucking acceptable!

[–] SamuelRJankis@lemmy.world 52 points 1 week ago (3 children)

Instagram is probably notably worse. I have a very established account that should be very anti that sort of thing, and it keeps serving up idiotic guru garbage.

Tiktok is by far the best in this aspect, at least before recent weeks.

[–] IllNess@infosec.pub 17 points 1 week ago (1 children)

A couple of years ago, I started two other Instagram accounts besides my personal one. I needed to organize and have more control of what content I want to see at times I choose. One was mostly for combat sports, other sports, and fitness. The second one was just food.

The first one, right off the bat, showed me girls with OnlyFans accounts on the discovery page. Then after a few days, they began showing me right-wing content and alpha-male garbage.

The second one, the food account, showed alternative holistic solutions. Stuff like showing me 10 different accounts of people suggesting I consume raw milk. They started sending me a mix of people who just eat meat and vegans.

It's really wild what these companies show you to complete your profile.

[–] SamuelRJankis@lemmy.world 14 points 1 week ago

I saw a TikTok video about how Instagram starts the redpill/incel stuff early for young people; then, when they become failures in life, it pushes the guru stuff for "guidance".

The EU and even China have at least made an attempt at holding these companies accountable for their algorithms, but the US and Canadian governments just sat there and did nothing.

[–] ragebutt@lemmy.dbzer0.com 46 points 1 week ago* (last edited 1 week ago)

Do these companies put their fingers on the scale? Almost certainly

But it's exactly what he said that brought us here. These companies have not particularly given a shit about politics (aside from "no taxes" and "let me do whatever I want all the time"). The algorithms, however, consistently reward engagement. Engagement doesn't care about "good" or "bad"; it just cares about eyes on it, clicks, comments. And who wins that? Controversial bullshit. Joe Rogan getting Elon to smoke weed. Someone talking about trans people playing sports. Etc.

This is a natural extension of human behavior. Behavior occurs because it serves a function: I do X to obtain reinforcement, whether that's attention, access to something, escape, or automatic (internal) reward.

Attention maintained behaviors are tricky because people are shitty at removing attention and attention is a powerful reinforcer. You tell everyone involved “this person feeds off of your attention, ignore them”. Everyone agrees. The problematic person pulls their bullshit and then someone goes “stop it”. They call it negative reinforcement (this is not negative reinforcement. it’s probably positive reinforcement. It’s maybe positive punishment, arguably, because it’s questionable how aversive it is).

You get people to finally shut up and they still make eye contact, or non verbal gestures, or whatever. Attention is attention is attention. The problematic person continues to be reinforced and the behavior stays. You finally get everyone to truly ignore it and then someone new enters the mix who doesn’t get what’s going on.

This is the complexity behind all of this. This is the complexity behind “don’t feed the trolls”. You can teach every single person on Lemmy or reddit or whoever to simply block a malicious user but tomorrow a dozen or more new and naive people will register who will fuck it all up

The complexity behind the algorithms is similar. The algorithms aren’t people but they work in a similar way. If bad behavior is given attention the content is weighted and given more importance. The more we, as a society, can’t resist commenting, clicking, and sharing trump, rogan, peterson, transphobic, misogynist, racist, homophobic, etc content the more the algorithms will weight this as “meaningful”

This of course doesn't mean these companies are without fault. This is where content moderation comes into play, and where the many studies finding that social media leads to higher irritability, more passive-aggressive behavior, and lower empathy could have led us to regulate these monsters into protecting their users against the negative effects of their products.

If we survive and move forward, in 100 years social media will likely be seen the way we look at tobacco now: an absolutely dangerous thing that it was absurd to allow to exist in a completely unregulated state with zero transparency as to its inner workings.
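The attention feedback loop described above, where any interaction (even a hostile one) reinforces the behavior, can be sketched as a toy model. The topics, interaction kinds, and bump values are all invented for illustration:

```python
# Toy feedback loop: every interaction bumps a topic's weight,
# so "dunking on" outrage content in the comments still trains
# the system to show more of it. Numbers are illustrative only.

weights = {"trebuchets": 1.0, "outrage": 1.0}

def interact(topic: str, kind: str) -> None:
    # Even a hostile comment counts as engagement.
    bump = {"view": 0.1, "comment": 0.5, "share": 1.0}[kind]
    weights[topic] += bump

# Ten people angrily comment on an outrage clip...
for _ in range(10):
    interact("outrage", "comment")

# ...and the feed now weights outrage 6x over trebuchets.
print(weights)  # {'trebuchets': 1.0, 'outrage': 6.0}
```

This is the algorithmic version of "don't feed the trolls": the system has no channel for distinguishing approval from outrage, only for counting interactions.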

[–] Blackmist@feddit.uk 36 points 1 week ago (1 children)

From my anecdotal experiences, it's "manly" videos that seem to lead directly to right wing nonsense.

Watch something about how a trebuchet is the superior siege machine, and the next video recommended is like "how DEI DESTROYED Dragon Age Veilguard!"

[–] Valmond@lemmy.world 22 points 1 week ago

Or "how to make ANY woman OBEY you!"

Check out a short about knife sharpening or just some cringe shit and you're all polluted.

[–] shalafi@lemmy.world 25 points 1 week ago (9 children)

I'll get downvoted for this, with no explanation, because it's happened here and on reddit.

I'm a liberal gun nut. Most of my limited YouTube is watching gun related news and such. You would think I'd be overrun with right-wing bullshit, but I am not. I have no idea why this is. Can anyone explain? Maybe because I stick to the non-political, mainstream guntubers?

The only thing I've seen start to push me to the right was watching survival videos. Not some, "dems gonna kill us all" bullshit, simply normal, factual stuff about how to survive without society. That got weird fast.

[–] JackbyDev@programming.dev 11 points 1 week ago

Their algorithms are probably good enough to know you're interested in guns but not right wing stuff. Simple as that.

[–] x00z@lemmy.world 24 points 1 week ago (4 children)

Filter bubbles are the strongest form of propaganda.

[–] GhostlyPixel@lemmy.world 21 points 1 week ago* (last edited 1 week ago) (2 children)

The view farming in shorts makes it even harder to avoid as well. Sure, I can block the JRE channel, for example, but that doesn’t stop me from getting JRE clips from probably day-old accounts which just have some shitty music thrown on top. If you can somehow block those channels, there’s new ones the next day, ad infinitum.

It’s too bad you can’t just disable the tab entirely, I feel like I get sucked in more than I should. I’ve tried browser extensions on mobile which remove the tab, but I haven’t had much luck with PiPing videos from the mobile website, so I can’t fully stop the app.

[–] doortodeath@lemmy.world 19 points 1 week ago (1 children)

I don't know if any of you still look at memes on 9gag. It once felt like a relatively neutral place, but the site has slowly pushed right-wing content in recent years and is now infested with alt-right and even blatantly racist "memes" and comment sections. Feels to me like astroturfing to push viewers and posters in some political direction. As an example: during the US election, the war in Palestine suddenly became a recurring theme depicting the Biden admin and Jews as "bad actors" and calling for Trump; after the election it became a flood of content about how Muslims are bad people and we shouldn't intervene in Palestine...

[–] jared@mander.xyz 16 points 1 week ago

Don't let the algorithm feed you!

[–] ohlaph@lemmy.world 16 points 1 week ago (1 children)

If I see any alt-right content, I immediately block the account and report it. I don't see any now. I go to YouTube for entertainment only. I don't want that trash propaganda.

[–] FireTower@lemmy.world 15 points 1 week ago (10 children)

Saying it disproportionately promotes any type of content is hard to prove without first establishing how much of the whole is made up by that type.

The existence of proportionately more "right" leaning content than "left" leaning content could adequately explain the outcomes.
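That base-rate argument can be checked with a quick simulation. Assuming a hypothetical 70/30 split in the candidate pool, even a perfectly unbiased uniform sampler surfaces the majority side first about 70% of the time:

```python
import random

random.seed(0)  # make the simulation repeatable

# Hypothetical pool: 70% of candidate shorts lean one way.
pool = ["right"] * 70 + ["left"] * 30

trials = 10_000
first_right = sum(random.choice(pool) == "right" for _ in range(trials))
print(first_right / trials)  # close to 0.7 with no algorithmic bias at all
```

So observing mostly right-leaning first recommendations is consistent with either a biased ranker or an unbiased one drawing from a lopsided pool; the experiment in the video can't distinguish the two without knowing the pool's composition.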

[–] Valmond@lemmy.world 14 points 1 week ago

I bet those right-wing shorts are proposed and shoehorned in everywhere because someone pays for the visibility. Simple as that.

[–] thezeesystem@lemmy.blahaj.zone 12 points 1 week ago

All platforms are now excessively catering to Elon Nazi trump America. It's pretty much propaganda. And it's extreme and excessive.

[–] bulwark@lemmy.world 11 points 1 week ago

I noticed my feed almost immediately changed after Trump was elected. I didn't change my viewing habits. I'm positive YouTube tweaked the algorithm to lean more right.

[–] ohellidk@sh.itjust.works 10 points 1 week ago

Crazy stuff. So not only does YouTube make you generally dumber, it's now pushing the audience toward more conservative viewpoints because of the "emotional engagement" that keeps 'em watching. And YouTube probably sells more premium subscriptions that way. Fuck Google!

[–] Subtracty@lemmy.world 9 points 1 week ago

With Milo (Miniminuteman) in the thumbnail, I thought the video was going to insinuate that his content was part of the alt-right stuff. Was confused and terrified. Happily, that was not the case.

[–] LandedGentry@lemmy.zip 9 points 1 week ago

This is basically the central thesis of The Social Dilemma.

[–] TankovayaDiviziya@lemmy.world 9 points 1 week ago (1 children)

Yeah, I've gotten more right wing video recommendations on YouTube, even though I have turned off my history. And even if I turned on my history, I typically watch left wing videos.

[–] Hope@lemmy.world 8 points 1 week ago* (last edited 1 week ago) (1 children)

Just scrolling through shorts on a given day, I'm usually recommended at least one short by someone infamously hostile to the LGBTQIA+ community. I get that it could be from people with my interests hate-watching, but I don't want to be shown any of it. (Nearly all of my YouTube subscriptions are to LGBTQIA+ creators. I'm not subscribed to anyone who has ever even mentioned support for right leaning policies on anything.)
