This past November, Oxford Dictionaries announced its word of the year for 2016: ‘post-truth.’ It is defined as “relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.”

This announcement couldn’t have come at a more apt time, with fake news stories gaining traction across the World Wide Web. No, Denzel Washington did not endorse President-elect Donald Trump — at least not ‘in the most epic way’ possible — and no, Trump was not born in Pakistan.

Profiteers of these stories are people like Paris Wade and Ben Goldman, who, on a daily basis, churn out headlines like “OBAMA BIRTH SECRETS REVEALED! The Letters From His Dad Reveal Something SINISTER…” on their Liberty Writers News website. Other beneficiaries include a group of teenagers in the town of Veles, Macedonia, who spread similar falsehoods, such as the story alleging that Pope Francis forbade Catholics from voting for Hillary Clinton in the recent US election.

The dissemination of blatant lies is troubling enough on its own, but these individuals also accumulate thousands of dollars in ad revenue when people view and share the fabricated stories.

Made-up stories are not a new phenomenon. When The War of the Worlds radio program was aired by the Columbia Broadcasting System in 1938, it allegedly caused mass panic in the US. In Canada, CBC Radio’s satirical news program This is That has been receiving phone calls for years from listeners who take the content of the show seriously.

But the motivations behind more recent fabrications differ greatly from these examples, which are intentionally fictional and meant to entertain. The intent of some fake news sites isn’t to make you laugh or smile; it is to get you to share and proliferate false messages.

Fake news easily pervades sources deemed credible and competent. When individuals share fake news articles via Silicon Valley empires like Facebook and Google, these fabrications become legitimized by proxy.

For many people, Facebook and Google are their main gateways to the Internet: both websites are easily accessible sources of information about the world. A recent study shows that 62 per cent of Americans get news from social media networks, and 44 per cent derive their news from Facebook in particular. Social media provides lucrative audiences for fake news purveyors to reach.

It certainly doesn’t help that 2016 has been an emotionally driven year. Prominent musicians such as Leonard Cohen and David Bowie passed away, Syria and Crimea continue to be plagued by conflict, and the US election was arguably the most divisive one in decades. All of this has added up to a rather stressful year.

According to a 2009 study, stories framed from a human-interest point of view elicit stronger emotional responses and are ranked higher in terms of how well they communicate their intended message. The study also found no perceived differences in the objectivity of the story or the credibility of the source across the different frames used.

Knowing how easy it is to pull at our heartstrings regardless of the validity of the evidence, fake news sites and other sensationalist outlets craft headlines and imagery designed to reel us in.

Facebook and Google are aware of this issue. While Facebook CEO Mark Zuckerberg initially downplayed the matter — stating that “only a very small amount is fake news and hoaxes” — he and his company have since taken action, developing several strategies to uproot fake news and fix their news algorithms. Meanwhile, Google has blocked fake news sites from accessing its advertising network.

However, this isn’t to say that these corporations should be the only ones held accountable. Just as they have the option of serving interests beyond their shareholders’ dividends, we have the option of not using their services.

Within this context, we must also try to become more diligent, well-versed readers. This can be accomplished if we constantly challenge what we read and what we think we know. People will always be driven to find evidence that fits their beliefs, to prove that they are in the right and their opponents are in the wrong. This basic drive is, in some ways, a defining characteristic of human beings: it is in our nature to defend our own perceptions of the world.

However, there is a difference between one’s perception of the world and a counterfeited ideal. Fake news capitalizes on shares and clicks; just ask the Macedonian news site Meta, which has identified approximately 140 fake US political websites profiting from your views through their mix of fake and plagiarized ‘articles.’

By catering specifically to one’s uncompromising views of the world, fake news sites make conflict viciously easy, reducing social media debates to nothing more than heated, fundamentally misinformed slap fights. And although sharing inaccurate tales may bolster your ego and give you the sense that you are winning the battle, only these news simulators come out the victors. It might take a bit of humility to restrain ourselves from those ‘I-told-you-so’ moments and not share a fake news link.

Companies like Facebook and Google will need to continue to take the threat of fake news seriously, for shareholders and customers alike. However, fighting fake news is a team effort. In order to distance ourselves from the ‘post-truth’ paradigm, all of us must think before we click and share.

Ian T. D. Thomson is a Master of Public Policy candidate at the School of Public Policy and Governance. He holds an Honours Bachelor of Science in Psychology and a Bachelor of Arts in Philosophy from the University of Manitoba.