If YouTube’s algorithms radicalize people, it’s hard to tell from the data
Image: YouTube’s recommendation algorithm probably didn’t send them to Washington, DC. (credit: Brent Stirton / Getty Images)

We’ve all seen it happen: Watch one video on YouTube and your recommendations shift, as if Google’s algorithms think the video’s subject is your life’s passion. Suddenly, all the recommended videos—and probably many ads—you’re presented with are on the topic.

Mostly, the results are comical. But there has been a steady stream of stories about how the process has radicalized people, sending them down an ever-deepening rabbit hole until all their viewing is dominated by fringe ideas and conspiracy theories.

A new study released on Monday looks at whether these stories represent a larger trend or are just a collection of anecdotes. While the data can’t rule out the existence of online radicalization, it definitely suggests that it’s not the most common experience. Instead, it seems like fringe ideas are simply part of a larger self-reinforcing community.
