We've all seen it happen: Watch one video on YouTube and your recommendations shift, as if Google's algorithms think the video's subject is your life's passion. Suddenly, the recommended videos, and probably many of the ads, are all on that topic.
Mostly, the results are comical. But there has been a steady stream of stories about how the process has radicalized people, sending them down ever-deepening rabbit holes until all their viewing is dominated by fringe ideas and conspiracy theories.
A new study released on Monday looks at whether these stories represent a larger trend or are just a collection of anecdotes. While the data can't rule out the existence of online radicalization, it definitely suggests that it's not the most common experience. Instead, it seems like fringe ideas are simply part of a larger self-reinforcing community.
Normally, the challenge of doing a study like this is getting data on people's video-watching habits without their knowledge, since knowing they're being observed might change their behavior. The researchers worked around this issue by getting data from Nielsen, which tracks the viewing habits of people who have agreed to be monitored and anonymizes the resulting data. For this study, the researchers obtained records from over 300,000 viewers who collectively watched over 21 million YouTube videos during a period that ran from 2016 through the end of 2019.
Most of these videos had nothing to do with politics, so the authors turned to the literature to identify a large collection of channels that previous research had labeled according to their political slant, ranging from far left through centrist to far right. To that list, the researchers added a category they termed "anti-woke": a growing collection of channels that, while not always overtly political, focus on "opposition to progressive social justice movements." Those channels tend to align with right-wing interests, but their hosts often don't present the ideas that way.
All told, the channels the researchers categorized (just under 1,000 of them) accounted for only 3.3 percent of total video views during this period. And those who viewed them tended to stick with a single type of content; if you started out watching left-leaning content in 2016, you were likely to still be watching it when the study period wrapped up at the end of 2019. In fact, based on time spent per video, you were very likely to be watching more of it by then, perhaps as a product of the contentiousness of the Trump years.
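To make that classification and tracking concrete, here is a minimal sketch in Python of how one might compute the labeled-content share and the viewer persistence described above. The column names (viewer_id, channel, year) and the CHANNEL_LABELS map are hypothetical stand-ins for illustration, not the study's actual schema.

```python
import pandas as pd

# Hypothetical label map (channel name -> political category); the real
# study classified just under 1,000 channels along these lines.
CHANNEL_LABELS = {
    "channel_a": "mainstream left",
    "channel_b": "far right",
    "channel_c": "anti-woke",
}

def labeled_share(views: pd.DataFrame) -> float:
    """Fraction of all views that land on politically labeled channels
    (the study reports roughly 3.3 percent)."""
    return views["channel"].isin(set(CHANNEL_LABELS)).mean()

def viewer_persistence(views: pd.DataFrame) -> float:
    """Fraction of viewers whose dominant category in their first year
    matches their dominant category in their last year."""
    labeled = views.assign(category=views["channel"].map(CHANNEL_LABELS))
    labeled = labeled.dropna(subset=["category"])

    def dominant_first_last(group: pd.DataFrame) -> pd.Series:
        first = group[group["year"] == group["year"].min()]
        last = group[group["year"] == group["year"].max()]
        return pd.Series({
            "start": first["category"].mode().iat[0],
            "end": last["category"].mode().iat[0],
        })

    per_viewer = labeled.groupby("viewer_id").apply(dominant_first_last)
    return (per_viewer["start"] == per_viewer["end"]).mean()
```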
(The exception to this is far-left content, which was viewed so infrequently that it was impossible to pick out statistically significant trends in most cases.)
Almost every category of content also saw growth over this period, both in total viewers and in the amount of time spent watching videos on these channels (the exceptions being far-left and far-right content). This finding suggests that at least some of the trends reflect a growing use of YouTube as a substitute for more traditional broadcast media.
Since viewers mostly watched a single type of content, it's easiest to think of them as forming distinct groups. The researchers tracked the number of people belonging to each group, as well as the time they spent watching videos during the four-year period.
Throughout that time, the mainstream left was about as big as all the other groups combined, followed by the centrists. The mainstream right and the anti-woke category started the period at about the same level as the far right, but the three showed different trends. The total number of far-right viewers stayed flat, while the amount of time they spent watching videos climbed. By contrast, the total number of mainstream-right viewers rose, but the time those viewers spent watching didn't differ much from that of the far right.
The anti-woke viewers showed the highest rate of growth of any group. By the end of the period, they were spending more time watching videos than the centrists, even though their population remained smaller.
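Tallying those group-level trends amounts to counting distinct viewers and summing watch time per category per year. A minimal sketch, under the same assumed schema (with hypothetical category, year, and watch_seconds columns):

```python
import pandas as pd

def group_trends(views: pd.DataFrame) -> pd.DataFrame:
    """Distinct viewers and total watch time per category and year.
    Assumes one row per view with hypothetical columns viewer_id,
    category, year, and watch_seconds."""
    return (
        views.groupby(["category", "year"])
        .agg(
            viewers=("viewer_id", "nunique"),
            hours_watched=("watch_seconds", lambda s: s.sum() / 3600),
        )
        .reset_index()
    )
```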
Does any of this represent radicalization? The lack of significant growth at the two extremes suggests there's no major trend of YouTube viewing pushing people toward the far left or far right. In fact, the researchers found evidence that many people on the far right were using YouTube as just one part of an ecosystem of sites they engaged with. (Again, the far left was too small to analyze.) Viewers of far-right videos were more likely to arrive at them via links from right-wing websites than from another video.
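A referrer check of the kind that supports that last observation might be sketched as follows; the RIGHT_WING_SITES list and the referrer_url field are hypothetical placeholders, not the study's actual data.

```python
from urllib.parse import urlparse

# Hypothetical stand-in; the study worked from its own curated domain lists.
RIGHT_WING_SITES = {"example-right-news.com", "example-right-blog.org"}

def classify_arrival(referrer_url: str) -> str:
    """Classify how a viewer reached a video: from elsewhere on YouTube
    (e.g., a recommendation), from an off-platform site, or directly."""
    if not referrer_url:
        return "direct"
    host = urlparse(referrer_url).netloc.lower()
    host = host.removeprefix("www.")
    if host == "youtube.com" or host.endswith(".youtube.com"):
        return "on-platform"
    if host in RIGHT_WING_SITES:
        return "right-wing site"
    return "other external"
```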
In addition, there was no sign of any sort of acceleration. If YouTube's algorithms were steadily directing people to more extreme videos, the frequency of far-right videos should rise toward the end of a viewing session. That didn't happen; in fact, the opposite did.
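One way to test for that kind of acceleration is to track the share of far-right views as a function of position within a viewing session; a curve that rises with position would indicate escalation. A minimal sketch, again under an assumed schema:

```python
import pandas as pd

def far_right_share_by_position(views: pd.DataFrame) -> pd.Series:
    """Share of views landing on far-right channels at each step of a
    viewing session. Assumes hypothetical columns session_id and
    category, with rows already sorted in viewing order."""
    views = views.copy()
    # 1-based position of each view within its session.
    views["position"] = views.groupby("session_id").cumcount() + 1
    views["is_far_right"] = views["category"].eq("far right")
    return views.groupby("position")["is_far_right"].mean()
```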
The researchers note, however, that far-right content was a bit stickier, with viewers spending more time on it, even though the community of far-right viewers didn't grow significantly. Anti-woke material was stickier still and saw the largest growth of viewership. In addition, people who viewed several anti-woke videos in one session were more likely to continue watching them in the future.
Although the anti-woke videos didn't present themselves as overtly political, their viewers could be considered right-wing based on how thoroughly they were integrated into the larger ecosystem of right-wing websites. That didn't drive radicalization, though; having more anti-woke viewers didn't ultimately produce more far-right viewers.
Although the researchers found no evidence that YouTube is driving radicalization, the work has some clear limitations. For one, it tracked only desktop browser use, so it missed mobile viewing. The researchers also couldn't determine what YouTube's algorithms actually recommended, so they could only infer viewers' responses to recommendations from overall behavior. And, as always, the average behavior of users can obscure some dramatic exceptions.
"On a platform with almost 2 billion users, it is possible to find examples of almost any type of behavior," as the researchers put it.