The Facebook corporation's recent attempts to address problems on its site should leave users skeptical. STEVEN LEE/THE VARSITY

In November of last year, I wrote a piece called “Fact-checking Facebook.” In that piece, I identified two distinct ways that social media threatens our democracy and our discourse: the way it can facilitate the proliferation of false information and the way it can minimize dissenting views or make them inaccessible. I was skeptical that Facebook’s ultimate motivations as a business would lead them to make the substantive changes needed to confront these issues.

Last week, the social media behemoth provided us with an opportunity to re-evaluate that prediction. In a somewhat confounding series of posts called Hard Questions, Facebook reflected that social media had originally “seemed like a positive” when it came to democracy. They conceded that the 2016 American election changed that impression. You don’t say.

Alongside this acknowledgement of the problem came two attempts at a solution. Though these represent a step in the right direction, we ought to be skeptical of the extent to which Facebook will be willing to enforce them. As a corporation, Facebook is transparently driven by profit, not by goodwill — meaning it is unlikely to pursue solutions that will ultimately hurt its bottom line, even if doing so would be in the best interests of its users.

First, there is the issue of the proliferation of fake news. Facebook’s solution is to double down on incorporating a fact-checking mechanism into the way posts and stories are shared and viewed on the site. The new feature, a partnership with PolitiFact, flags content when enough users have tagged it as potentially unverified. While a commitment to fact-checking represents an important step in the right direction, it does seem antithetical to Facebook’s current model as a place where views can be shared and discussed, even if those views are based on falsehoods. If Facebook becomes inhospitable for those of certain political orientations, it seems almost inevitable that it will lose users.

For this reason, stringent fact-checking may be bad for the bottom line. The events of the last few years have demonstrated that people will enthusiastically contest the credibility of long-established and eminently respected sources in the service of confirming previously held beliefs. If Facebook positions itself as the arbiter of truth, this might provoke a backlash from those who find the truth incompatible with their point of view. While this might not be a bad thing for democracy, it will push people off the site, and it’s not clear whether that’s a result Facebook will tolerate.

Another pressing issue is the way in which Facebook facilitates and sustains online echo chambers, which seriously hinder constructive dialogue. Echo chambers are the result of social media’s proclivity to confirm existing views instead of presenting challenging new evidence or dissenting opinions. Instead, sites like Facebook siphon conversation into self-affirming silos, which thwarts discourse.

Facebook’s proposed solution in this regard — offering a more varied selection of sources in the Related Articles tab associated with a link — is totally impotent. If Facebook were serious about fixing this problem, it wouldn’t focus on a rarely visited and isolated feature on the site — instead, it would take the radical step of diversifying the content served up on the News Feed. But Facebook has so far neglected to do so because the need for civil and constructive discourse is substantially less compelling than the financial incentive to keep users’ eyes on advertisements.

This incongruity is made plain given that, in June 2016, Facebook announced changes to the News Feed, reaffirming that it would continue to tailor its content to suit the preferences of each user. Facebook’s Vice-President of Product Management confirmed that the organization’s objective “is to deliver the types of stories… an individual person most wants to see” because doing so “is good for [Facebook’s] business.” Put simply, providing users with content that confirms their existing views and shuts out dissent is part of Facebook’s business model. According to that same vice-president, “When people see content they are interested in, they are more likely to spend time on News Feed and enjoy their experience.”

I’m not optimistic that this basic premise will change any time soon. And unless it does, Facebook won’t be changing either. Solving the problem of the destructive effect of social media on democracy won’t be achieved by waiting around for Facebook, a profit-driven entity, to make itself less divisive. Rather, if we wish to dig ourselves out of our respective confirmation feedback loops, we need to take steps to fundamentally change our relationship to social media.

First, it is important that we consciously reduce our reliance on social media for information. Pew Research Center data from last August showed that two thirds of respondents get at least some of their news from social media. Using social media as a news source fundamentally affects the type of information received. While copy editors and fact-checkers used to be in a position to prevent untrue information from being widely disseminated, social media does not provide that kind of filter. Moreover, while it is established practice to hold traditional news sources accountable for making unverifiable claims, the same rules do not apply online. Changing our sources of information will go a long way toward repairing our discourse.

Second, and perhaps more importantly, echo chambers are only effective so long as they remain covert — or at least passively unacknowledged. They work because it’s one thing to understand, in the abstract, that Facebook feeds you exactly what you want to see, and quite another to internalize that fact and allow it to tarnish your experience with the site. The most important measure we can take to de-silo our discourse is to recognize the isolation and to understand that what we see is not all that there is to see. If we can learn to take what Facebook shows us with an ocean’s worth of grains of salt, then Facebook may actually be prompted to make the substantive changes that would be most effective. If last week is any indication, waiting for it to take the initiative won’t do much good.

Zach Rosen is a second-year student at Trinity College studying History and Philosophy. He is The Varsity’s Current Affairs Columnist.