Mental health apps are a booming business despite scientific and privacy concerns

Lack of regulation in the marketplace could lead to data breaches and false marketing
SKYLAR CHEUNG/THE VARSITY

In the age of wellness and self-care, apps that claim to improve mental health are proliferating in app stores.

Health apps, including apps that address mental health, were valued at eight billion USD in 2018. This growing market is expected to be valued at around 111 billion USD by 2025.

However, research is still needed to validate some of these apps’ promises with scientific evidence, particularly as privacy breaches could pose a risk.

On the one hand, medical devices undergo stringent testing. For example, Apple spent years working to meet the Food and Drug Administration’s standards for recognition as a medical device before releasing certain features of its Apple Watch. Since then, there have been reported cases of the watch detecting irregular heartbeats and even saving someone’s life.

But mental health apps are not necessarily held to the same standards as traditional health devices.

Are mental health apps rooted in science?

In a 2019 review, researchers used search terms such as depression, self-harm, substance use, anxiety, and schizophrenia to find apps in both the Apple and Android app stores. Of the 73 apps they analyzed, over 60 per cent claimed to be able to diagnose mental health conditions or improve symptoms.

But unlike medical devices, apps claiming to diagnose and manage mental health do not require government approval, and do not undergo years of testing. The same review reported that many such apps had little scientific backing and might even be using scientific jargon to mislead consumers.

Dr. Andrea Levinson, a psychiatrist and assistant professor at U of T, has found that her patients will tell her about the mental health apps that they are using or that they have found to be helpful. When interacting with patients, Levinson might recommend an app in conjunction with clinical care if the patient finds it helpful in managing their symptoms.

In an interview with The Varsity, Levinson explained that in a session with a patient, she may review how an app has been used, ask about their experience using it, and go over the generated data with them.

The Anxiety and Depression Association of America reviews apps based on several criteria, including scientific backing. Of the 19 apps that the website has reviewed, most were backed by little to no research evidence.

“The question is not to dismiss [mental health apps], not to say [that they’re] good or bad, but to really use evidence-based evaluation to determine the efficacy of these apps, [their] safety, [and] issues around privacy,” said Levinson.

Mental health apps spark privacy concerns

While regulation of health apps may create a standardized approval system, the current market remains unregulated, and as such, consumers are still vulnerable to having their personal data sold and their privacy breached.

In a 2019 study, researchers found that out of 36 top-ranked apps, including ones that aim to manage symptoms of depression, 29 shared data with third parties. However, only 12 of them disclosed this in their privacy policies.

In fact, one mental health app called Moodpath shares user data with both Facebook and Google — a fact which it discloses in its privacy policy.

While not an app, LinkMentalHealth is an online service that connects patients with mental health services that they might not be able to access otherwise.

According to co-founder Radwan Al-Nachawati, LinkMentalHealth was created as a response to long wait times for therapists, which its founders have experienced firsthand.

LinkMentalHealth seeks to streamline scheduling between patients and providers, while also recommending therapists based on a short survey that includes multicultural considerations.

“[You are] able to choose all types of therapists for ethnicity, religion, sexual orientation, all those other factors,” said Al-Nachawati in an interview with The Varsity. “That’s a really empowering thing to know.”

Because the website handles sensitive personal information, the startup consulted health privacy lawyers to protect user data, according to Al-Nachawati.

Protecting this information is important “not just from a legal perspective, but from a moral perspective as well,” said Al-Nachawati.

LinkMentalHealth’s website has a privacy policy and terms and conditions for both clients and therapists.

Ultimately, more research is needed to evaluate the claims of mental health apps and keep personal data secure, and users should read privacy policies in full before using such apps.

Editor’s Note (11:52 am, February 8): This article has been updated to correct that LinkMentalHealth does not suggest the burden of keeping personal information secure falls on users.
