When consulting Google on what to do when the cheese keeps sliding off a freshly made pizza, beware: Google’s artificial intelligence (AI) Overview might advise you to “add some glue.”

As bizarre as this seems, it is but one example in the widening sea of AI miscalculations. A Stanford University misinformation expert was recently caught using AI hallucinations — inaccurate AI-generated information — in legal documents intended to support a law against the use of ‘deepfakes’ to influence elections. Deepfakes are artificially generated imagery of real or non-existent people saying or doing things that they have not actually said or done.

It seems as though, in the transformation of the digital information landscape, we’re losing the excitement we once had for engaging with the internet while gaining artificial intelligence that is wrongly marketed as innovative and useful. While AI can be used to fight the spread of false information, the battle doesn’t appear to be going very well if we’re all still drowning in misinformation. In 2022, nearly three out of four Canadians reported “having seen content online that they suspected to be false or inaccurate.”

The irony of these cases, from Google’s AI Overview to a misinformation expert’s reliance on AI hallucinations, reflects the broader mess that the digital information landscape has become. I believe this mess is causing mass fatigue and disillusionment with the quality of information the internet has to offer, which is ultimately impeding progress on important environmental goals.

A dangerous apathy toward digital information

These stories of AI technology running wild are dizzying, astonishing, and incredibly disillusioning. When our feeds are constantly full of LinkedIn posts clearly made by ChatGPT and YouTube comments all but guaranteed to be pleading to the highest powers for you to invest in the latest cryptocurrency, this endless plague of AI-generated misinformation becomes both appalling and mind-numbing.

Achieving sustainability goals is increasingly threatened by the muddying of the digital information landscape. The production of AI-generated misinformation is a system of cultural and technological processes that breeds a culture of apathy toward digital information, the opposite of what we need if we’re going to use emotional reasoning to achieve the United Nations’ Sustainable Development Goals (SDGs).

In the Sustainable Development Goals at University of Toronto Student Advisory Committee (SDGs@UofT SAC), we care about collectively winning the battle against AI-generated misinformation, mitigating the apathy it has produced, and progressing toward a set of goals that ensure a sustainable environment in which humans can thrive for generations to come.

The concept of emotional investment is fundamental to SDGs@UofT SAC because we recognize that our emotions are important for engaging with information. If you love a song, it’s memorized in minutes. If you hate a textbook, none of the content inside it sticks.

The sheer quantity of social media’s AI-produced content — or as I see it, garbage — makes us presume that everything else we see will be garbage. It’s no wonder that social media has increasingly made us feel like garbage, as excessive social media use has been tied to poor sleeping patterns, depression, academic underachievement, and memory loss.

If we feel a sense of mistrust and even hatred toward the sources that provide us information, we can’t register the fact that AI is eating the environment, that we’re failing the United Nations’ SDGs, and that the world is quite literally on fire. We’ll presume that the places where we could have gotten these critical pieces of information are unreliable, and thus won’t even bother to consult them in the first place.

Events, resources, and opportunities that work toward the SDGs all require participants who truly care about the role of digital information in achieving sustainability.

Cleaning up the digital mess

In 2023, writer Cory Doctorow coined the term ‘enshittification’ to describe how platforms like TikTok die: “First, they are good to their users; then they abuse their users to make things better for their business customers; finally, they abuse those business customers to claw back all value for themselves. Then, they die.” These platforms’ use of AI to satisfy their big business customers has significantly contributed to the internet’s ‘enshittification.’

I can attest to the plague of AI-generated falsities not only on TikTok but on other popular internet platforms. My family’s WhatsApp group chat was once used exclusively for exchanging messages, but it is now an endless feed of posts containing things like questionable ‘cures’ for diseases, speeches no politician ever gave, and magical spells that turmeric and ginger will cast for you.

That’s why I’m so enthusiastic about SDGs@UofT SAC. We are nine students from across U of T’s three campuses who harness the power of emotional reasoning and connection in our work toward getting more people to affiliate with the SDGs Scholars Academy — which works to advance the United Nations’ SDGs — and to pursue environmental sustainability in their own lives.

What’s next? 

But how do we start caring about information again? 

Our first step toward regenerating the digital information landscape is calling on the U of T community to recognize that some of the deepest cuts made by AI misinformation lie in the emotional void we feel when our screens are endlessly full of mind-numbing AI content. U of T is a gigantic community that relies on digital information to keep people connected; feeling sapped of emotional connection to information only serves to fracture that community by thinning the ties that bind us together.

Our second step is forming real connections with people — a cornerstone of successful partnerships for the United Nations’ SDGs. To that end, we’re supporting the Munk School Undergraduate Research Symposium in “Rethinking the UN’s Sustainable Development Goals,” which will tackle the pressing question of how the SDGs can be reimagined to address the challenges of today’s digital age.

If you’re wondering how you can help us take steps toward cleaning up the mess of AI-generated misinformation, make sure you’re engaging with information you care about. 

Write for a news outlet like The Varsity. Put your heart into researching something you care about. Every small outpouring of genuine care and concern is an important ingredient in revitalizing the information landscape where important pursuits — like the SDGs — presently find their home. 

Noah Khan is a third-year PhD student at the Ontario Institute for Studies in Education, studying Social Justice Education. He is a Chair of the SDGs@UofT Student Advisory Committee.