What comes to mind when you think of artificial intelligence (AI)? Maybe dangerous fake videos, ChatGPT essays that students submit just before midnight, or the decapitator of your post-grad employment dreams. I don’t think it’s too far-fetched to say that AI generally has a rather controversial reputation, and the integration of AI into newsrooms is increasingly ringing alarm bells across the world.
However, I believe the current discussion of AI is saturated with misunderstandings of the potential that AI holds for newsrooms. In fact, AI has assisted journalists at major news companies for years now, alleviating the tedious aspects of their jobs.
What exactly is AI in the newsroom?
Currently, AI in the newsroom is mostly used for web scraping and extracting information from large data sets. For instance, the Organized Crime and Corruption Reporting Project uses pattern recognition models to track organized crime globally.
There are also financial uses for AI: The Globe and Mail utilizes Sophi — AI software that uses behavioural data — to calculate the potential subscription revenue an article would bring in when deciding whether or not to paywall it. This has allowed The Globe to increase its subscription revenue and boost reader engagement in the process.
In its guidelines, The Globe stressed that AI language tools can be a good starting point for researching and brainstorming, but every source should be approached with skepticism. Language tools cannot be used to “condense, summarize, or produce writing for publication.” Additionally, photo illustrations and videos that were entirely produced with AI tools should not be published without a label that explicitly credits them as an “AI-generated image” or “AI-generated illustration.”
In an interview, The Globe’s head of visual journalism, Matt Frehner, said: “We don’t put anything into a language model that is pre-publication… We keep that out of any kind of third-party tool because we don’t know where that information is going and how it can be used in the future.” Instead, its newsroom uses tools developed internally at The Globe.
There are valid concerns, however
The first concern is newspapers’ growing reliance on tech companies. In light of Bill C-18, it’s reasonable to feel hesitant about putting news companies further at the mercy of tech companies by depending on external AI providers. These external providers can also undercut the autonomy of news companies, as journalists are limited by the tasks the AI software was trained to perform and the biases it may possess, which influence the stories journalists tell. News companies, therefore, have begun to develop in-house software.
But therein lies a growing discrepancy between big news companies that can afford their own AI systems and smaller local newspapers that cannot. This is where we should raise the question of tech providers ‘democratizing’ AI — making their software open-source and easily accessible.
Of course, the democratization of AI should be treated with caution. The Columbia Journalism Review warns that tech companies of all sizes have commercial incentives to “democratize” their software to influence market competition and shape standards to improve their corporate brand.
However, local newspapers probably have less need for AI software such as web-scraping tools since they’re unlikely to conduct large-scale investigative projects like national papers do. Their stories tend to spotlight human lives — something that computers can never articulate in the same way another person can.
Furthermore, given the mass layoffs in the news industry recently, there are justified concerns about how AI may replace journalists. It’s likely that AI will be increasingly employed for more data-focused articles that simply aggregate pre-existing information.
But stories that emphasize experience, commentary, and humanity cannot, or at least should not, be replicated by machines. AI is being employed in investigative newsrooms, but the investigative team still needs to be there to give meaning to the cold data. As Frehner said in my interview, “If there’s insider trading in a company that’s showing a massive stock movement, AI can tell you that the stock moves in a weird way. But it can’t tell you why. It can’t tell you about the email that the CEO sent to his buddy who owns stocks and told him to dump them.”
Changing our perception of AI
AI did not write this article. However, I did employ AI tools. Specifically, I used Otter.ai to transcribe my interview with Frehner. The transcript isn’t perfect — I had to edit the quotes to make them more comprehensible — but it saved me a whole afternoon of painstakingly transcribing each word of the audio second by second. In fact, The Varsity uses Otter.ai to transcribe nearly all recorded interviews.
It’s at this point where we must contend with how we want to define AI. Should we talk about transcription tools when we discuss AI regulation? What about Google Translate or Grammarly? These tools have become so ingrained in our regular lives that some of us may not even think about them when talking about AI anymore because they do not align with our futuristic perception of AI.
Ultimately, it boils down to the guidelines that newsrooms set. Journalists must recognize the limitations and biases of certain AI models, and treat these systems as assisting tools rather than as sources in themselves. Although the risks of AI can easily spiral out of hand, banning all forms of AI from the newsroom may harm journalists more than benefit them, making their jobs harder and less efficient. Perhaps this is also where we should question the role that legislation should play.
Nonetheless, it’s time we retire the sci-fi-esque perception of technology when evaluating AI in the newsroom. The tools currently employed are rather mundane and anticlimactic to the fear-mongering eye, but these very tools keep the wheels turning in the newsroom.
Charmaine Yu is a third-year student studying political science and English. She is an editor-in-chief of The Trinity Review and a features editor at The Strand. She is the What’s New In News columnist for The Varsity’s Comment section.