Facebook CEO Mark Zuckerberg once said, “A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa.” As a company that trades on relevance, Facebook has a vested interest in showing its users what they want to see, and its News Feed algorithms are built to ensure that they do.
Time Magazine has reported that each week, a group of approximately 20 engineers and data scientists gathers in Facebook’s Silicon Valley headquarters to analyze the millions of likes, comments, and clicks made daily by Facebook users. On the other side of the country, in Knoxville, Tennessee, a group of 30 contract workers is paid to surf Facebook. These hires scroll through their personal News Feeds to evaluate how well the stories they see are ranked relative to their interests.
Although Facebook does not explicitly disclose how its algorithm works, some of its effects are fairly obvious. Facebook gauges how close you are to a person by evaluating how often you like their posts, write on their Timeline, look through their photos, or chat with them on Messenger. The algorithm then shows the posts of your “closest” Facebook friends more frequently in your feed. It also assesses what kinds of posts you tend to engage with, showing more videos to people who frequently like videos and more links to people who like links.
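Since Facebook keeps the real algorithm secret, the best anyone outside can offer is a toy model. The sketch below is a purely hypothetical illustration of the two signals described above, affinity to an author and preference for a content type; the weights, field names, and scoring formula are all assumptions for illustration, not Facebook’s actual system.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    kind: str  # e.g. "video", "link", "photo" (illustrative categories)

def affinity(interactions: dict, author: str) -> float:
    """Closeness to an author: a count of the viewer's likes, comments,
    and messages involving that person (hypothetical signal)."""
    return float(interactions.get(author, 0))

def type_preference(engagement_by_type: dict, kind: str) -> float:
    """Fraction of the viewer's past engagement that involved this
    kind of post (hypothetical signal)."""
    total = sum(engagement_by_type.values()) or 1
    return engagement_by_type.get(kind, 0) / total

def rank_feed(posts, interactions, engagement_by_type):
    """Order posts so that close friends and favoured content types
    come first. The multiplicative formula is an assumption."""
    def score(post: Post) -> float:
        return affinity(interactions, post.author) * (
            1 + type_preference(engagement_by_type, post.kind)
        )
    return sorted(posts, key=score, reverse=True)
```

In this toy model, a viewer who interacts heavily with one friend and mostly likes videos would see that friend’s videos pushed to the top of the feed, which is the feedback loop the article goes on to describe.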
According to Caleb Gardner, a former adviser to President Obama on social media, 44 per cent of adults and 61 per cent of millennials in the US get their news through Facebook.
Considering its ubiquity, this is no surprise, but many of the details of how the system itself operates remain under wraps. In fact, in October, German Chancellor Angela Merkel called on major Internet corporations like Facebook to reveal the secrets of their algorithms. Addressing a media conference in Munich, Merkel said, “I’m of the opinion that algorithms must be made more transparent, so that one can inform oneself as an interested citizen about questions like ‘what influences my behaviour on the Internet and that of others?’”
Over the past few months, my News Feed has been dominated by articles — and memes — pertaining to the recent American presidential election. This shouldn’t be surprising, since I constantly find myself clicking and liking these types of posts. The day after the election, my timeline was flooded with articles about Donald Trump’s “shocking” win. Apparently, for the majority of my Facebook friends liking and reading the same articles that I was, “shocking” best described the result.
In the days following, I began to wonder what other people’s News Feeds looked like. Was my feed shielding me from articles that could have foreshadowed the election’s outcome? Was I subconsciously deciding who and what I was influenced by on social media?
According to Steve Hoselton, a senior lecturer in the Book and Media Studies department at the University of Toronto, people are constantly and subconsciously deciding what they are influenced by on social media.
“I think the fear of selective awareness is a real one. Both consciously and unconsciously – and through algorithms imposed from outside, we engage the overwhelming volume of data by editing it,” Hoselton says. Referencing the work of Marshall McLuhan, a prominent media theorist and former University of Toronto professor, Hoselton added, “Those of us who use media to reinforce our fears and desires will end up suffering from our ignorance. Those of us who use media to challenge our assumptions and expand our understanding will be wiser for it.”
Because Facebook’s algorithms seem to be contributing to the company’s success from a business perspective, its modus operandi is unlikely to change. It’s up to the informed consumer to make the conscious decision to look for more diverse sources of information. While technology exerts a historically unprecedented influence over us, we still have the power to decide its extent.