On October 31, The Independent reported that more than 126 million Americans may have been exposed to Facebook posts “disseminated by Russian-linked agents seeking to influence the 2016 presidential election.” This astounding figure, representing more than half of eligible American voters, speaks not only to the serious effects that foreign agents may have had on the 2016 American election, but also to a larger trend in the way that people access news.

Accessing news from social media instead of from more traditional providers like television and newspapers is becoming increasingly popular. According to Pew Research Center data from August, two-thirds of Americans report that they get at least some of their news directly from social media sites, with 20 per cent confessing that they do so “often.”

Where people access their news has an important effect on the information they receive. And although it is easy enough to mandate that social media platforms regulate themselves by blocking or labelling misinformation, this may prove far easier said than done.

While we should be concerned that news providers — in this case, Facebook and Twitter especially — are motivated by profit instead of by truth, the problem is far more nuanced than that. Media sources have long been businesses first and foremost. The first news program to be broadcast in colour was the Camel News Caravan, brought to you by Camel cigarettes. Walter Cronkite, long known as “the most trusted man in America,” uttered slogans for Winston cigarettes between segments.

It is not the profit motive that makes getting news from social media so dangerous; rather, it is what profit motivates these platforms to be. Whereas concern for the bottom line prompts traditional media to be fair and balanced, the effects it has on social media are far more nefarious.

Before the ubiquity of social media, a lack of options made the average consumer occasionally frustrated but generally informed — and on the same page as their neighbour, who, regardless of political affiliation, ultimately got the same set of facts.

This is because, perhaps paradoxically, the business side of traditional news outlets actually incentivizes balance and a parity of viewpoints. As long as the information cannot be tailored to suit the preferences and biases of each individual viewer, fostering a sense of fairness and impartiality is simply the best way to maximize viewership. The left-leaning viewer and the right-leaning viewer are forced, for want of alternatives, to get their news from the same source. To avoid losing half of the market, then, traditional news outlets have had to be balanced enough to keep people of all political stripes tuning in.

Today, the algorithms that determine our news feeds are not hindered by lack of options. It turns out that people prefer confirmation to truth, agreeability to variation, and corroboration of previously held views over new, challenging evidence. Within Facebook’s incessantly shifting network are innumerable echo chambers, enclosed by a barrier that is impenetrable to dissenting views: profit. Now that the news provider can tailor the information it provides to the exact preferences of the viewer, the profit motive — which seeks only to ensure eyeballs on advertisements — no longer values impartiality, but rather the continued confirmation and exacerbation of those preferences.

As long as we prefer to return to sources that confirm our views, it is difficult to foresee how getting news from social media could be anything but divisive. Many have called for the platforms themselves to clearly distinguish disreputable information on their sites; Facebook has begun to do so by designing a new banner that will alert viewers to posts that are disputed by the requisite number of sources.

However, these measures can only address a small part of the larger problem. We need to begin by distinguishing two issues: the proliferation of false information and the entirely different issue of the inaccessibility of dissenting views.

The first issue seems, at least at first glance, far easier to fix: social media platforms should clearly indicate when false information is being presented. However, this solution is not as simple as it seems. For starters, it’s one thing to remove an unfounded news piece from the site, and quite another to censor the contributions of actual individual users.

Using social media as a news source blurs the line between news providers and news consumers. This is troubling because, while there is a long tradition of holding news providers accountable if their content is manifestly false, the rest of us are not usually held to the same standard. But social media is built around the contributions of individual users, and there is a big difference between fact-checking content submitted by third-party sources or corporations and censoring the views of regular people.

This applies just as much to opinion as it does to news. For example, I might write a status about how the Star Wars prequels are better than the originals. Most would think this claim obviously false, but is it Facebook’s responsibility to correct me?

Once social media sites begin marking the submissions of individuals as plainly false or fallacious, considerable backlash seems inevitable, even when the labelling is accurate. And if the last 18 months have taught us anything, it’s that people will doubt the credibility of news outlets long before they will doubt their own views. If Facebook positions itself as one of those authorities, it will lose eyeballs and then profits, which will seriously test its resolve.

It is not clear that the problems presented by making social media our primary news source can be solved by intervention from those platforms. This is especially true of the second issue: the inaccessibility of challenges to our views. Indeed, the only real solution may be cognizance. Only awareness of our vulnerability to bias will make us less susceptible to misinformation; only consciousness of our inherent hostility toward dissent will make us more accepting of it. If we can learn to question our own biases, to pause for a moment before hitting the ‘share’ button to consider our own motivations, then perhaps we can begin to undo the damage that has been done. One thing, however, is abundantly clear: whatever we’re doing now is not working.

Zach Rosen is a second-year student at Trinity College studying History and Philosophy. He is The Varsity’s Current Affairs Columnist.