The Rise of Mental Health Apps: What Are the Consequences?

by Zac Amos, September 23rd, 2024

Too Long; Didn't Read

Mental health apps help users enhance their emotional, psychological, or social well-being, and they exploded in popularity during the pandemic. This gave more people access to mental health care but also encouraged competing companies to engage in unethical practices. Researchers remain undecided on whether mHealth apps work, and the proliferation of apps increases the risk of data breaches. As mobile mental health grows in popularity, companies must prioritize privacy, security, and efficacy.

Mental health apps typically promise to alleviate symptoms, provide helpful information, or enable self-directed treatment. Their services are often reasonably priced — compared to professional help — or even free, contributing to their rising popularity. All of this sounds too good to be true. Is it?

What Are Mental Health Apps?

Mental health apps are mobile or desktop applications users can use to enhance their emotional, psychological, or social well-being. They typically gamify treatments to keep users coming back. There are several categories, each targeting unique conditions or behaviors.

Professional Support

Many mental health platforms offer one-on-one telemeetings with handpicked therapists, psychologists, or instructors. These platforms are often considered substitutes for traditional avenues of care because they make professional help more accessible and affordable.

Self-Care

Wellness apps fall under the mental health umbrella. They focus on mood monitoring, mindfulness, stress management, and sleep tracking. Their features often include activities like guided meditation, cognitive behavioral therapy exercises, or journaling.

Symptom Tracking

Symptom-tracking apps let patients log their moods, outlook, symptoms, or thoughts. Providers can use these records to see how much a patient is psychologically or socially affected by their stress levels, physical conditions, or mental health episodes.

Social Support

Sometimes, people who feel their mental health is suffering prefer a network of people in the same position over a professional. Social support platforms give them a place to gather, enabling them to vent their frustrations or discuss effective treatments.

While mental health platforms have existed for years, their popularity exponentially increased during the COVID-19 pandemic. People were cut off from professional help when lockdowns forced them inside and medical facilities were overrun. Of course, many were navigating loss and chronic anxiety for the first time. Mobile apps were a port in the storm.


The destigmatization of mental health has also contributed to these apps’ popularity. Many continued performing well after the pandemic because the general public is more accepting — individuals are no longer afraid to be frank about their emotional or psychological struggles.


These platforms have become popular alternatives to traditional avenues of professional care, and experts consider their market potential promising. They expect the mobile health (mHealth) app market to reach $105.9 billion by 2030, a compound annual growth rate of 11.8% from 2021 to 2030.
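That growth figure is easy to sanity-check with the standard compound-growth formula. The short Python sketch below back-calculates the 2021 market size implied by a $105.9 billion endpoint and an 11.8% CAGR; the starting value is derived here purely for illustration rather than taken from the report.

# Sanity check on the forecast: given an 11.8% CAGR over the nine years
# from 2021 to 2030, what 2021 market size does a $105.9B endpoint imply?
cagr = 0.118
years = 2030 - 2021            # nine compounding periods
value_2030 = 105.9             # billions of USD, per the cited forecast

implied_2021 = value_2030 / (1 + cagr) ** years
print(f"Implied 2021 market size: ${implied_2021:.1f}B")  # roughly $38.8B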

The Implications of Popularity

The surge in mental health app popularity has changed the mental health space for the better in many ways. In 2021, just under half of the approximately 20% of people in the U.S. with a mental illness received professional help (roughly one in ten people overall). Since most people have a smartphone or computer, apps are one of the most effective ways to make care accessible.


The platforms that have dominated the mHealth market spend millions of dollars advertising on sites like Facebook, Spotify, YouTube, Snapchat, and Instagram. They reach tens of millions of people, simultaneously raising awareness and further destigmatizing mental health. As a result, they expose more people to therapists, support groups, and informational resources.


That said, the recent surge in mHealth popularity also has negative implications. When companies are competing for market dominance, they may engage in unethical practices or make unsubstantiated claims. Moreover, the rapid market growth has outpaced regulatory action, creating room for dishonest businesses or even scammers.


Out of the tens of thousands of mental health apps available online, only five have been formally approved by the U.S. Food and Drug Administration to date. Unlike most platforms, which offer the same care to all users, these apps cover particular age groups, mental health conditions, and treatment strategies.

Are Mental Health Apps Effective?

Although mobile mental health is a relatively new concept, various studies on its efficacy exist. While these apps can be effective, their clinical efficacy has yet to be proven. Further, research suggests they may not outperform alternatives.

Mental Health App Efficacy Claims

Many apps promise improvements regardless of users' mental states, goals, or moods. However, just 5% of mental health apps have been studied for efficacy. While larger companies produce their own studies, they have a clear conflict of interest, and self-reports are often inaccurate or biased indicators.


At best, such unsubstantiated claims are misleading. At worst, they are harmful: they may worsen people's mental well-being or keep them from seeking professional care. For those with complex problems like post-traumatic stress disorder, insomnia, or suicidal ideation, delaying real treatment could have serious consequences.

How Mental Health Apps Compare

Research suggests the treatment only works when users believe in it. One study compared Headspace (a self-described all-in-one mental health app) to a sham wellness app over six weeks. Both groups reported increases in mindfulness and critical thinking, but the researchers found no objective evidence to support those self-reports. The placebo effect was likely at play.


Does it matter that the placebo effect may be why users feel like they get something out of these platforms? To some extent, it doesn’t. However, telling people they’ll get better when no proof exists — or evidence suggests otherwise — is misleading at best and deceptive at worst. People with complex conditions who need genuine intervention are adversely affected.

The Consequences of mHealth

This industry skirts traditional regulations and guidelines the medical sector must follow. For instance, most mental health apps are not covered entities (providers, insurers, or clearinghouses) under the Health Insurance Portability and Accountability Act (HIPAA). As a result, many companies operating in this space don't preserve the privacy or security of protected health information (PHI).


Even some of the most well-known names in the industry make these mistakes. The U.S. Federal Trade Commission recently fined BetterHelp, a mental health platform that provides remote therapy sessions, $7.8 million for exposing users' PHI after promising not to. The company also sold names, IP addresses, birth dates, and contact details.


Crucially, BetterHelp didn’t create a clause requiring the buyers — which included Facebook, Pinterest, and Snapchat — to only use PHI and personally identifiable information (PII) for advertising. They could sell, use, or analyze that data however they wanted. At no point was any of this disclosed to users. In fact, the company lied to regulators when it was initially confronted.


While few situations are as significant or egregious as BetterHelp's case, such incidents are more common than app users think. This is an issue because PHI sells on the black market for about 10 times more than credit or debit card details. A person can't change their medical history the way they could get a new card issued; that data remains marketable for a lifetime.

What Should Developers Do?

Mental health platforms are generally effective when used to supplement professional help. Even if people use an app as a stand-alone intervention, it is better than going without care altogether. However, the complete disregard for privacy and data security these businesses show — coupled with their dubious claims — is a problem.


One report from the Mozilla Foundation — a nonprofit dedicated to keeping the internet open and free — revealed that 59% of the mental health apps it investigated in 2023 participated in dishonest data practices despite dealing with highly sensitive PII and PHI.


Crucially, the Mozilla Foundation noted its 2022 report motivated some companies to do better. Almost 33% improved their practices year over year. Frankly, if the tech industry is going to mingle with the mental health space, it must do better. Until regulators catch up, developers must listen to criticism from nonprofits like these or create their own governance strategies.


Adhering to HIPAA is ideal for data security and patient privacy. As mHealth's popularity rises and more people turn to mobile mental health, agencies like the U.S. Department of Health & Human Services may even require it. Developers should also safeguard their platforms against man-in-the-middle attacks with strong transport encryption and network protections like firewalls, as in the sketch below.
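As one concrete illustration of the transport-security point, this Python sketch configures an HTTP client that refuses anything weaker than TLS 1.2 and verifies server certificates before any PHI leaves the device. It uses the requests and ssl libraries; the endpoint URL and payload are hypothetical, not a real service or API.

import ssl
import requests
from requests.adapters import HTTPAdapter

class TLS12Adapter(HTTPAdapter):
    """Transport adapter that rejects connections below TLS 1.2."""
    def init_poolmanager(self, *args, **kwargs):
        ctx = ssl.create_default_context()            # verifies server certificates by default
        ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse older, MITM-prone protocol versions
        kwargs["ssl_context"] = ctx
        return super().init_poolmanager(*args, **kwargs)

session = requests.Session()
session.mount("https://", TLS12Adapter())

# Hypothetical endpoint for submitting a journal entry; PHI should only ever
# travel over the hardened session above, never plain HTTP.
response = session.post(
    "https://api.example-mhealth.com/v1/mood-entries",
    json={"mood": "anxious", "note": "journal entry"},
    timeout=10,
)
response.raise_for_status()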

The Future of Mental Health Apps

Mobile mental health’s popularity is growing exponentially, outpacing regulatory action and skirting rules that apply to the medical sector. For this industry to remain stable and profitable, companies must prioritize privacy and security. Protecting PHI and PII is essential, especially when caring for vulnerable groups.