TLDR if you don't wanna watch the whole thing: Benaminute (the YouTuber here) creates a fresh YouTube account and watches all recommended shorts without skipping. They repeat this 5 times, each time changing their location to a random city in the US.
Below is the number of shorts after which alt-right content was recommended. Left wing/liberal content was never recommended first.
- Houston: 88 shorts
- Chicago: 98 shorts
- Atlanta: 109 shorts
- NYC: 247 shorts
- San Francisco: never (Benaminute stopped after 250 shorts)
There was, however, a certain pattern to this. First, non-political shorts were recommended. After that, AI Jesus shorts started to be recommended (with either AI Jesus talking to you, or an AI narrator narrating verses from the Bible). After this, non-political shorts by alt-right personalities (Jordan Peterson, Joe Rogan, Ben Shapiro, etc.) started to be recommended. Finally, explicitly alt-right shorts started to be recommended.
What I personally found both disturbing and kinda hilarious was in the case of Chicago. The non-political content in the beginning was a lot of Gen Alpha brainrot. Benaminute said that this seemed to be the norm for Chicago, as they had observed this in another similar experiment (which dealt with long-form content instead of shorts). After some shorts, there came a short where AI Gru (the main character from Despicable Me) was telling you to vote for Trump. He was going on about how voting for “Kamilia” would lose you “10000 rizz”, and how voting for Trump would get you “1 million rizz”.
In the end, Benaminute along with Miniminuteman propose a hypothesis trying to explain this phenomenon. They propose that alt-right content might be inciting more emotion, thus ranking high up in the algorithm. They say the algorithm isn’t necessarily left wing or right wing, but that alt-right wingers have understood the methodology of how to capture and grow their audience better.
Do these companies put their fingers on the scale? Almost certainly
But it's exactly what he said that brought us here. They have not particularly given a shit about politics (aside from no taxes and let me do whatever I want all the time). However, the algorithms will consistently reward engagement. Engagement doesn't care about "good" or "bad", it just cares about eyes on it, clicks, comments. And who wins that? Controversial bullshit. Joe Rogan getting elon to smoke weed. Someone talking about trans people playing sports. Etc.
This is a natural extension of human behavior. Human behavior occurs because it serves a function: I do x to get reinforcement, and that reinforcement comes down to attention, access to something, escape, or automatic (sensory) reinforcement.
Attention-maintained behaviors are tricky because people are shitty at removing attention and attention is a powerful reinforcer. You tell everyone involved "this person feeds off of your attention, ignore them". Everyone agrees. The problematic person pulls their bullshit and then someone goes "stop it". They call that negative reinforcement (it isn't; it's probably positive reinforcement, or arguably positive punishment, though it's questionable how aversive it really is).
You get people to finally shut up and they still make eye contact, or non verbal gestures, or whatever. Attention is attention is attention. The problematic person continues to be reinforced and the behavior stays. You finally get everyone to truly ignore it and then someone new enters the mix who doesn’t get what’s going on.
This is the complexity behind all of this. This is the complexity behind "don't feed the trolls". You can teach every single person on Lemmy or reddit or wherever to simply block a malicious user, but tomorrow a dozen or more new and naive people will register who will fuck it all up.
The complexity behind the algorithms is similar. The algorithms aren’t people but they work in a similar way. If bad behavior is given attention the content is weighted and given more importance. The more we, as a society, can’t resist commenting, clicking, and sharing trump, rogan, peterson, transphobic, misogynist, racist, homophobic, etc content the more the algorithms will weight this as “meaningful”
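To make that concrete, here's a toy sketch in Python of an engagement-weighted ranker. The weights and the `Video`/`engagement_score` names are invented for illustration, not anything YouTube has published; the point is just that a score built from clicks, comments, shares, and watch time has no concept of *why* people engaged:

```python
# Minimal sketch of engagement-weighted ranking (illustrative only, not
# YouTube's actual scoring). Every reaction counts as a positive signal,
# so outrage-driven clicks, comments, and shares all push a video up.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    clicks: int
    comments: int
    shares: int
    watch_seconds: float

def engagement_score(v: Video) -> float:
    # Hypothetical weights: the ranker has no notion of "good" or "bad"
    # engagement, only that more interaction means more weight.
    return 1.0 * v.clicks + 3.0 * v.comments + 5.0 * v.shares + 0.01 * v.watch_seconds

def rank(feed: list[Video]) -> list[Video]:
    return sorted(feed, key=engagement_score, reverse=True)

feed = [
    Video("calm explainer", clicks=900, comments=40, shares=20, watch_seconds=50_000),
    Video("rage bait", clicks=800, comments=600, shares=300, watch_seconds=45_000),
]
for v in rank(feed):
    print(v.title, round(engagement_score(v)))
# The rage bait wins despite fewer clicks, because comments and shares
# (including angry ones) are weighted as "meaningful" interaction.
```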
This of course doesn't mean these companies are without fault. This is where content moderation comes into play. This is where the many studies finding that social media leads to higher irritability, more passive-aggressive behavior, and lower empathy could have led us to regulate these monsters into doing something to protect their users against the negative effects of their products.
If we survive and move forward, in 100 years social media will likely be seen the way we look at tobacco now: an absolutely dangerous thing that was absurdly allowed to exist in a completely unregulated state with zero transparency as to its inner workings.
So… in the US then ?
It’s 100% not just the US where the algorithm favours this stuff.
Agreed 100%. Whenever I’m in India, I get Hindu nationalist content A LOT. I briefly attempted the dislike/don’t recommend thing, but nope! I was getting absolutely spammed with stuff like this regardless. I just disabled shorts after that.
I use YouTube and don’t get much far-right content. My guess is it’s because I don’t watch much political content. I use a podcatcher and websites for that. If I watched political content, it might show me some lurid videos promoting politics I disagree with because that tends to keep viewers engaged with the site/app longer than if they just showed videos consistent with the ideology I seek out. That gives people the feeling they’re trying to push an ideology.
I made that up without any evidence. It’s just my guess. I’m a moderate libertarian who leans Democratic because Republicans have not even been pretending to care about liberty, and for whatever reason it doesn’t recommend the far-right crap to me.
Good. Now the leftists can get a taste of what conservatives have experienced for years.
There is no war but the class war.
Don’t get me wrong, I prefer if platforms don’t take a political stance at all. That’s the reason why I use platforms like Lemmy.
I am simply just pointing out that conservative ideologies have been oppressed online and in the media for the greater part of a decade. Funny to see how the left are losing their minds now that they get a little taste of it themselves.
Ideologues getting mad that their ideology is becoming less popular is both predictable and uninteresting. It's mostly astroturfing. Left vs right is a divide-and-conquer tactic to get the workers to not rise up against the oligarchs by convincing them to blame other workers for their problems.
Must’ve been hard for you in the 90’s with Xena Warrior Princess being a DEI gay female superhero.
deleted by creator
I noticed my feed almost immediately changed after Trump was elected. I didn’t change my viewing habits. I’m positive YouTube tweaked the algorithm to lean more right.
Didn’t watch, why is milo pictured?
Don’t be lazy
I’m reading this while my daughter warms up for a chorus. Don’t be a dick.
You could always save it and watch it later if you actually give a shit.
I dont.
So don’t comment at all.
Ok
The presenter chats with Milo.
Thank you
If I see any alt-right content, I immediately block the account and report it. I don't see any now. I go to YouTube for entertainment only. I don't want that trash propaganda.
Same. I watched one Rogan video in like, 2019, and it was like opening a flood gate. Almost immediately almost every other recommendation was some right-wing personality's opinion about "cancel culture" or "political correctness." It eventually calmed down once I started blocking those channels and anything that looked like it might lead to that kind of content. I can only imagine what would pop up now.
I think the explanation might be even simpler - right wing content is the lowest common denominator, and mindlessly watching every recommended short drives you downward in quality.
Isn't the simpler explanation that YouTube has always promoted, and always will promote, the alt-right? Also, it's no longer the alt-right, it's just the right.
No, the explanation that involves conspiracy is not the simpler explanation.
Was it a conspiracy in 2016? Was it a conspiracy that elon bought x to control the narrative? Was it a conspiracy that TikTok has avoided shutdown by glazing trump? Was it a conspiracy when zuck just changed the way they did moderation on Facebook and then showed up at trumps inauguration?
Youtube cucks are the worst.
I refuse to watch those shit shorts; I think your theory has legs. Unfortunately there doesn’t seem to be a way to turn them off.
FreeTube on PC and Revanced on phone
I use YouTube revanced to disable them.
Thanks!
Removed by mod
yeah i created a new youtube account in a container once and just watched all the popular/drama suggestions. that account turned into a shitstorm immediately
these days i curate my youtube accounts making liberal use of Not interested/Do not recommend channel/Editing my history and even test watching in a container before watching it on my curated account
this is just how “the algorithm” works. shovel more of what you watch in your face.
the fact that they initially will give you right-wing, conspiracy-fueled, populist trash right off the bat is the concern
Man, that seems like a lot of work just to preserve a shitty algorithm that clearly isn't working for you… Just get a third party app and watch without logging in.
oddly enough it seems to be working, if i don’t login at all youtube just offers up the usual dross
Oh no, I only watch videos from channels I subscribe to. Just not through YouTube. It’s very easy.
I was gonna say this. There's very little liberal or left-leaning media being made, and what there is is mostly made for a female or LGBTQ audience. Not saying that men can't watch those, but there's not a lot of "testosterone"-infused content with a liberal leaning (this was one of the reasons Trump won), so by sheer volume you're bound to see more right-leaning content. Especially if you are a cisgender male.
Been considering creating content myself to at least stem the tide a little.
I think some of it is that liberal media is more artsy and creative, which is more difficult to just pump out. Creation is a lot more difficult than destruction.
Plus fact based videos require research, sourcing and editing.
Emotional fiction only takes as long to create as a daydream.
Not necessarily. For example, a lot of "manosphere" guys have taken hold of philosophy, health, and fitness topics, and a liberal influencer can give a liberal view on these subjects. For example, in philosophy: explain how Nietzsche was not just saying that you can do whatever the fuck you want, or how stoicism is actually a philosophy of tolerance, not of superiority, etc. There's really a lot of space that can be covered.
Creation is a lot more difficult than destruction.
Yup. A lesson that I fear we will be learning over and over and over in the coming years.
Really? As someone who dislikes both mainstream extremes (I consider myself libertarian), I see a lot more left-leaning content than right-leaning content. I wouldn't be surprised if >75% of the content I watch comes from a left-leaning creator, not because I seek it out, but because young people into tech tend to lean left, and I'm into tech.
I keep getting recommendations for content like “this woke person got DESTROYED by logic” on YouTube. Even though I click “not interested”, and even “don’t recommend channel”, I keep getting the same channel, AND video recommendation(s). It’s pretty obvious bullshit.
You’d think a recommendation algorithm should take your preferences into account - that’s the whole justification for tracking your usage in the first place: recommending relevant content for you…
YOU’D THINK THAT YES. [caps intended]
deleted by creator
Even in the best-intentioned recommender system, trained on the content you watch to estimate what you're interested in and recommend similar things, that would be the drift of things. You can't really mathematically judge the emotions viewers might feel unless they express them in a measurable way, so the system falls back to observing their behaviour and recommending similar content by whatever heuristic. And if they keep clicking on rageposts, that's what the system has to go on.
But at least giving the explicit indication “I don’t want to see this” should be heavily weighted in that calculation. Just straight up ignoring that is an extra layer of awful.
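A toy sketch of what "heavily weighted" could look like in practice: implicit signals (watching, clicking) nudge a per-topic interest score up a little, while an explicit "I don't want to see this" gets a weight big enough to override them. Every name and number below is invented for illustration:

```python
from collections import defaultdict

# Hypothetical weights: watching and clicking nudge interest up a little,
# while an explicit "not interested" should swamp those implicit signals.
WEIGHTS = {"watched": 0.2, "clicked": 0.5, "not_interested": -5.0}

def update_interest(scores: dict, topic: str, signal: str) -> None:
    scores[topic] += WEIGHTS[signal]

scores = defaultdict(float)
# A user rage-watches a few "anti-woke" shorts, then hits "not interested".
for _ in range(4):
    update_interest(scores, "anti-woke rants", "watched")
update_interest(scores, "anti-woke rants", "clicked")
update_interest(scores, "anti-woke rants", "not_interested")
update_interest(scores, "cat videos", "clicked")

print(dict(scores), "->", max(scores, key=scores.get))
# With the heavy negative weight, cats win. Weight the explicit signal near
# zero (or ignore it) and the rage-watching alone keeps the rants on top.
```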
I feel like YouTube at least used to pretend that it was doing this.
I can’t say for recently as I use a third party client these days and do not log in.
Wrong, the whole purpose of tracking your usage is to identify what kind of consumer you are so they can sell your views to advertisers. Recommendations are based on what category of consumer you’ve been identified as. Maintaining your viewership is secondary to the process of selling your views.
I said justification, not purpose. They claim they want to track usage to tailor your experience to you.
They don’t actually believe that, of course, but respecting your explicit expression of interest ought to be the minimum perfunctory concession to that pretense. By this we can see just how thin a pretense it is.
it is. But who said that you get to decide what’s relevant for you? Welcome and learn to trust your algorithmic overlords
Thanks, I hate it
The algorithms are always trying to poke you in the id.
Anything but the subscriptions page is absolute garbage on that site. Ideally get an app to track your subs without having to have an account. NewPipe, FreeTube etc.
And if you don't want to deal with those breaking (because google is actively shooting them in the face), they DO provide RSS feeds for your creators.
Just add the channel URL to your feed reader (ex. https://www.youtube.com/@LinusTechTips) and forget the YouTube web UI exists.
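For anyone who'd rather script this than rely on a reader's auto-discovery, here's a minimal sketch with the Python `feedparser` library. The per-channel feed lives at `https://www.youtube.com/feeds/videos.xml?channel_id=...` and takes the channel's UC… ID rather than the @handle; `CHANNEL_ID` below is a placeholder you'd have to fill in yourself:

```python
# Sketch: pull a channel's uploads from YouTube's per-channel RSS feed and
# skip the recommendation UI entirely. Requires: pip install feedparser
import feedparser

# Feeds use the UC... channel ID, not the @handle. "CHANNEL_ID" is a
# placeholder; the real ID is in the channel page's source, or a feed
# reader can auto-discover it from the channel URL.
FEED_URL = "https://www.youtube.com/feeds/videos.xml?channel_id=CHANNEL_ID"

feed = feedparser.parse(FEED_URL)
for entry in feed.entries[:10]:
    print(entry.title, "-", entry.link)
```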
Are those available on PC/Linux? On my TV? 😭 I have them on my phone but I feel like there’s too much hassle to do on my main viewing devices.
I use FreeTube on Linux. I think it's Chromium based, so some people don't like it, and it's usually one of the bigger resource hogs when I have it open, but it's worth it for the ad-free, subscriptions-only experience imo…
Though lately it hasn’t been behaving well with the vpn…
I mean, on PC I’m not really having much issue. I don’t fall for “recommendations”, and I run ublock origin in Firefox so I have zero ads. All good there. The TV is the worst though…
Well you asked if it worked on PC so… lol
I know, sorry. I realized it wasn’t an issue on PC after the fact.
But the TV… It’s brutal, the amount of long, unskippable ads. It’s worse than on regular/linear television.
Filter bubbles are the strongest form of propaganda.
Does shadow-banning create filter bubbles? In a way it demonstrates the power these platforms hold over their users: https://en.wikipedia.org/wiki/Shadow_ban
Commenting on stuff definitely strengthens it, but I wouldn’t know if a shadow ban changes that. I don’t think there’s much difference if you are shadowbanned or not, you’re still interacting with the content.
In my view, Instagram blocking the search term "democrat" for short periods of time is kind of a shadowban … on all Democrats.
That’s not what a shadowban is. A shadow ban is where the user does not know they are banned. These search terms were very obviously censorship and not a shadowban.
If it were a shadowban then you would still get results and be able to interact with it. But some results might have been hidden and your interactions would be hidden to others too. A shadowban is meant to make you believe you were not censored.
I don't know if any of you still look at memes on 9gag. It once felt like a relatively neutral place, but the site has slowly pushed right-wing content in the last years and is now infested with alt-right and even blatantly racist "memes" and comment sections. Feels to me like astroturfing on the site to push viewers and posters in some political direction. As an example: during the US election, the war on Palestine suddenly became a recurring theme depicting the Biden admin and Jews as "bad actors" and calling for Trump; after the election it became a flood of content about how Muslims are bad people and we shouldn't intervene in Palestine…
From what I heard, the site was astroturfing long before it took a right turn. But my only sources are online rumors…
I’ve been happy with BlockTube for blocking channels or single videos. I also use YouTube Shorts Redirect for automatically converting shorts into regular videos.
You get what you usually click?
I didn't watch the video, but these are YT shorts, you just swipe like TikTok. The few ways to curate the algorithm are to either swipe away quickly, click the "not interested" button, downvote, or delete watched shorts from your history. If you don't interact with any of this and watch the full length of the video, the algorithm is gonna assume you like this kind of content. They also introduce content you've never watched before to gauge your interest, a lot of the time it's not even related to what you currently watch, and if you don't do any curation, they're gonna feed you that exact type for a while. I don't know how they manage the curation, but that's the gist of it from my experience. My feed has 0 politics, mostly cats. I control the feed strictly so I get what I want.
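A toy model of the behaviour described above, assuming nothing about YouTube's real internals: full watches raise a topic's score, and the feed occasionally "explores" a topic you've never watched to gauge interest. All topic names and weights are invented:

```python
import random

# Toy model: invented topics and weights, not YouTube's real logic.
scores = {"cats": 0.0, "cooking": 0.0, "politics": 0.0}
EXPLORE_RATE = 0.1  # hypothetical: ~1 in 10 shorts is an experiment

def record_watch(topic: str, watched_fraction: float) -> None:
    # Watching most of a short reads as approval, swiping away quickly as mild disinterest.
    scores[topic] += watched_fraction - 0.3

def next_short() -> str:
    if random.random() < EXPLORE_RATE:
        return random.choice(list(scores))   # probe something, possibly brand new to you
    return max(scores, key=scores.get)       # otherwise serve the current favourite

random.seed(1)
for _ in range(20):
    topic = next_short()
    # A viewer who never curates and watches every short to the end
    # "approves" of whatever gets served, experiments included.
    record_watch(topic, watched_fraction=1.0)
print(scores)
```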
Does this mean youtube preferentially selects alt-right shorts, or alt-right people make more shorts? Or some other thing entirely? Jump to your own conclusion.
YouTube selects what gives YouTube the most views for the longest time. If that’s right wing shorts, they don’t care.
If the channel is popular, those videos will get recommended.
If it has engagement on top of that, you are fucked, it will definitely get recommended to you.
Either block the channel, the user, or use incognito. Or don't.