Reddit, once billed as the ‘front page of the internet,’ has come to be seen in recent years as a politically polarized platform. More and more people rely on it for news, making it an extremely popular venue for political discussion.
Following the launch of Donald Trump’s 2015 presidential campaign, the political subreddit r/The_Donald formed on the site. Other pages, such as r/democrats and r/conservative, also rapidly gained popularity, creating echo chambers and bouts of brigading between right- and left-wing communities. These changes amplified a vocal, extreme minority, and many assumed that Reddit’s existing user base became heavily polarized during the elections.
However, a U of T study — led by Isaac Waller, a PhD student in computer science, and Ashton Anderson, an assistant professor in the Department of Computer and Mathematical Sciences at UTSC — found that the increased polarization of Reddit is not a product of its preexisting users, but rather of the influx of new users who joined the platform during and after the 2016 US elections.
The shift to belligerence
This phenomenon isn’t new. Since the popularization of the internet, online culture has repeatedly shifted as waves of new users overwhelm and reshape the cultures of existing forums and platforms. The archetypal example came in September 1993, in what came to be known as the ‘Eternal September,’ when AOL began offering its subscribers access to Usenet. The resulting influx flooded the small pool of existing forums, fundamentally changing the social status quo. Since then, the internet has seen a constant stream of new users across platforms, which is exactly what happened to Reddit in 2016.
To study polarization across the platform, Waller and Anderson designed a machine learning model that processed over 5.1 billion comments to create community embeddings: vector representations that quantify how similar communities are, based on which users are active in each. These embeddings were then used to map divisions across the platform along several dimensions, including political polarization.
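The study’s actual model learns dense embeddings from billions of comments, but the core intuition, that two communities are similar when the same users are active in both, can be sketched with plain co-membership overlap. The subreddit names and comment log below are illustrative, not the study’s data:

```python
from collections import defaultdict
from math import sqrt

# Toy comment log of (user, subreddit) pairs, standing in for the
# 5.1 billion comments the study analyzed. Purely illustrative.
comments = [
    ("alice", "r/politics"), ("alice", "r/democrats"),
    ("bob",   "r/politics"), ("bob",   "r/democrats"),
    ("carol", "r/conservative"), ("carol", "r/The_Donald"),
    ("dave",  "r/conservative"), ("dave",  "r/The_Donald"),
    ("erin",  "r/politics"), ("erin",  "r/conservative"),
]

# Represent each community by the set of users who comment in it.
members = defaultdict(set)
for user, sub in comments:
    members[sub].add(user)

def similarity(a, b):
    """Cosine similarity between two communities' membership sets."""
    overlap = len(members[a] & members[b])
    return overlap / sqrt(len(members[a]) * len(members[b]))

print(similarity("r/democrats", "r/politics"))    # high: shared commenters
print(similarity("r/democrats", "r/The_Donald"))  # zero: disjoint audiences
```

In this toy log, r/democrats and r/politics share commenters and score near 0.82, while r/democrats and r/The_Donald share none and score 0. The real embeddings generalize this idea, letting communities be placed along continuous social dimensions such as a left–right axis.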
Prior to 2016, political activity in far-left- and far-right-wing communities was fairly muted: these communities accounted for only 2.8 per cent of all political discussion in January 2015. That share rose rapidly through 2016 and peaked in November, when nearly 25 per cent of all political discussion was taking place in these subreddits, and Reddit’s polarization has stayed near that level ever since.
The study showed that while only eight per cent of all political discussion occurred in far-left-wing subreddits, the users who contributed to these communities concentrated almost half of their activity there. A similar pattern held on the right: only 16 per cent of political discussion occurred in far-right-wing subreddits, but these communities accounted for around 62 per cent of right-wing users’ activity.
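The statistic above is a concentration measure: of all the comments made by users who participate in extreme communities, what fraction lands inside those communities? A minimal sketch of that calculation, with made-up users and counts rather than the study’s data:

```python
# Toy activity log of (user, subreddit, comment_count). Illustrative only.
activity = [
    ("u1", "r/ExtremeExample", 40), ("u1", "r/news", 10),
    ("u2", "r/ExtremeExample", 50), ("u2", "r/pics", 30),
    ("u3", "r/news", 60),
]

extreme = {"r/ExtremeExample"}  # hypothetical far-wing subreddit

# The user group: anyone who commented in an extreme community.
group = {u for u, s, n in activity if s in extreme}

# Fraction of that group's total activity inside extreme subreddits.
in_extreme = sum(n for u, s, n in activity if u in group and s in extreme)
total = sum(n for u, s, n in activity if u in group)
print(in_extreme / total)  # ≈ 0.69 for this toy data
```

Here u1 and u2 form the group, and 90 of their 130 comments fall inside the extreme community, so about 69 per cent of their activity is concentrated there even though u3’s comments keep the platform-wide share much lower. The study’s 62 per cent figure for right-wing users reflects the same kind of ratio at scale.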
The authors of the study mentioned that “changes in polarization over time on Reddit are not associated with previous activity on the platform but rather are synchronously aligned with external events, and are disproportionately driven by new users.”
Surprisingly, close to no ideological polarization was observed in left-leaning communities or user bases. In every month from 2012 to 2018, right-wing communities were more polarized than their left-wing counterparts. Right-wing communities were also the biggest contributors to the increased polarization scores observed in the study, despite being among the smallest political communities on the platform.
An opinionated internet
This study was also vital in creating a new means of analysis for online platforms. Sociologists are constantly looking for ways to better quantify and understand social connections and group identities, and harnessing machine learning for this task has proven to be exceptionally useful.
Reddit is an ideal candidate for this kind of analysis since, at the time of the study, the platform had not implemented recommendation algorithms that shape user activity by suggesting new content and communities. This lack of algorithmic curation allows users to have more organic interactions in the communities they care about and gives them more control over what they consume.
Content algorithms on other popular platforms, however, shape the communities and circles that users interact in, taking control over the content they consume away from the user. While there have been prior studies in this area, they have mostly focused on platforms such as Twitter or Facebook, whose algorithms create interactions that would not exist if users had more control over what they saw.
The study also offered some support for the echo chamber hypothesis, which argues that a group of people online with similar views can often develop a sort of tunnel vision. People end up participating in only the communities where they know their views will be accepted, which leads them to believe their perspective is shared by the majority, even if it is not.
As the effects of the Eternal September continue, and as ‘netiquette’ changes with the arrival of new, unaccustomed users, these analytic methods will only get stronger and better at understanding online communities — and, hopefully, give us a better idea of how social media can be affected by real-world events.