How to become a brainwashed A-hole online:
.....
"Over years of reporting on internet culture, I’ve heard countless versions of Mr. Cain’s story: an aimless young man — usually white, frequently interested in video games — visits YouTube looking for direction or distraction and is seduced by a community of far-right far-wrong creators.
Some young men discover far-right far-wrong videos by accident, while others seek them out. Some travel all the way to neo-Nazism, while others stop at milder forms of bigotry.
The common thread in many of these stories is YouTube and its recommendation algorithm, the software that determines which videos appear on users’ home pages and inside the “Up Next” sidebar next to a video that is playing. The algorithm is responsible for more than 70 percent of all time spent on the site.
The radicalization of young men is driven by a complex stew of emotional, economic and political elements, many having nothing to do with social media. But critics and independent researchers say YouTube has inadvertently created a dangerous on-ramp to extremism by combining two things: a business model that rewards provocative videos with exposure and advertising dollars, and an algorithm that guides users down personalized paths meant to keep them glued to their screens.
.....
In recent years, social media platforms have grappled with the growth of extremism on their services, and many have barred a handful of prominent far-right figures. YouTube, whose rules prohibit hate speech and harassment, took a more laissez-faire approach to enforcement for years. This past week, the company announced that it was updating its policy to ban videos espousing neo-Nazism, white supremacy and other bigoted views. The company also said it was changing its recommendation algorithm to reduce the spread of misinformation and conspiracy theories." [About effing time!]
.......
In reality, YouTube has been a godsend for hyper-partisans on all sides. It has allowed them to bypass traditional gatekeepers and broadcast their views to mainstream audiences, and has helped once-obscure commentators build lucrative media businesses.
It has also been a useful recruiting tool for far-right extremist groups. Bellingcat, an investigative news site, analyzed messages from far-right far-wrong chat rooms and found that YouTube was cited as the most frequent cause of members’ “red-pilling” — an internet slang term for converting to far-right far-wrong beliefs. A European research group, VOX-Pol, conducted a separate analysis of nearly 30,000 Twitter accounts affiliated with the alt-right Ultra-Wrong. It found that the accounts linked to YouTube more often than to any other site.
https://www.nytimes.com/interactive/2019/06/08/technology/youtube-radical.html