Mozilla Warns 28 of 32 Mental Health Apps Have Privacy Issues

In a recent study, the Mozilla Foundation found that 28 out of 32 mental health apps have critical data privacy issues and other security concerns. The Foundation issued a warning about the potential dangers of these apps, which address topics such as depression, suicide, domestic violence, and PTSD. Many of them collect sensitive information such as user location, contact lists, and health data without adequate security measures in place. Third parties could access this information without the user’s knowledge or consent.

The Mozilla Foundation is urging users to exercise caution when downloading mental health apps. Mozilla recommends only downloading apps that have been reviewed by a trusted source and carefully reading an app’s privacy policy before sharing any personal information. Weak privacy practices are a problem for any app, but they are an even more significant issue in apps meant to support people with their mental health.

If the people responsible at these companies are aware of these practices, and if data is actively and willingly sold to third parties without any mention in the service’s terms and conditions, this might not even be legal in many jurisdictions.

Which mental health apps should you worry about?

But who are the offenders? Are they unknown startups hailing from exotic, unregulated places? Some might be; others you might be more familiar with than you think. Mozilla sorted the apps into six distinct categories. We adapted their labels and classification for transparency.

The worst offenders

It’s the Wild West in this category: the worst privacy and security practices are common here.

  • BetterHelp (“incredibly vague and messy privacy policies”)
  • Youper (“sharing personal information with third parties”)
  • Woebot (“sharing personal information with third parties”)
  • Better Stop Suicide (“incredibly vague and messy privacy policies”)
  • Pray.com (“sharing personal information with third parties”)
  • Talkspace (“collecting chat transcripts”)

The companies that responded to questions

These are the organizations that responded when Mozilla researchers asked questions about privacy and security, unlike all the others.

  • Hallow
  • Calm
  • Wysa
Mental Health Apps privacy overview (Screenshot: Mozilla Foundation)

The short “best of”

Of course, when assessing privacy and security quality, there are also some companies doing a good job.

  • PTSD Coach (U.S. Department of Veterans Affairs)
  • Wysa (AI chatbot)

Data harvesting

In short, Mozilla states that the mental health app space is a “data harvesting bonanza”: nearly all reviewed apps collect and then share or sell data with third parties, even when the datasets are highly sensitive.

Laughable security

In its report, Mozilla raises concerns about the security of most reviewed apps, ranging from weak password requirements to unpatched vulnerabilities. The mood and symptom journaling app Moodfit was singled out because it allows passwords as simple as a single letter or digit, which is indeed laughable for software that tracks this kind of data.
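To illustrate what a reasonable baseline looks like, here is a minimal sketch of a password policy check in Python. The function name and thresholds are illustrative assumptions on our part, not code from Moodfit or any other reviewed app; a password consisting of a single letter or digit fails every rule below.

import re

MIN_LENGTH = 8  # illustrative minimum length; not taken from any reviewed app

def is_acceptable_password(password: str) -> bool:
    """Return True only if the password meets basic strength rules."""
    if len(password) < MIN_LENGTH:
        return False
    # Require at least one letter and one digit to reject trivial passwords.
    has_letter = re.search(r"[A-Za-z]", password) is not None
    has_digit = re.search(r"\d", password) is not None
    return has_letter and has_digit

print(is_acceptable_password("a"))             # False: single letter
print(is_acceptable_password("sunny-day-42"))  # True

Even a simple gate like this would rule out the single-character passwords the report describes; real services would typically go further, for example by checking candidates against lists of known breached passwords.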

Don’t forget young people

Whether intentionally or not, some mental health and prayer apps specifically target teenagers and young users with mental health issues. While targeted and personalized ads might seem like a lesser issue to some, they could also become part of a “keep them sad” mechanic outside the app that nudges users to keep paying for premium mental health features inside the app.

What does Mozilla have to say about this?

Mozilla invested a total of 255 hours in this research, which was conducted to introduce its “*Privacy Not Included” label. Project lead Jen Caltrider shares, “The vast majority of mental health and prayer apps are exceptionally creepy. They track, share, and capitalize on users’ most intimate personal thoughts and feelings, like moods, mental state, and biometric data. Turns out, researching mental health apps is not good for your mental health, as it reveals how negligent and craven these companies can be with our most intimate personal information.”

“Hundreds of millions of dollars are being invested in these apps despite their flaws. In some cases, they operate like data-sucking machines with a mental health app veneer. In other words: a wolf in sheep’s clothing,” adds Mozilla researcher Misha Rykov.

The big list

Who’s included, and how did they fare in the Mozilla check on privacy and security? Here’s what the researchers came up with for mental health apps. The list starts with the best results and ends with the worst.

  • Wysa by Touchkin
  • PTSD Coach by United States Department of Veterans Affairs
  • Bearable by Bearable Ltd
  • Woebot by Woebot Health
  • Recovery Record: Eating Disorder Management by Recovery Record, Inc
  • Glorify by Tupoe Ltd
  • Headspace by Headspace, Inc
  • Modern Health by Modern Life, Inc
  • Mindshift CBT by Anxiety Canada Association
  • Moodfit by Roble Ridge Software LLC
  • Superbetter by SuperBetter, LLC
  • RAINN by Rape, Abuse & Incest National Network
  • Breathe, Think, Do with Sesame by Sesame Workshop
  • The Mighty by Mighty Proud Media Inc
  • Pray.com by Pray Inc.
  • Sanity & Self by Moov Inc
  • Liberate by Zen Compass, Inc.
  • NOCD by NOCD INC
  • MindDoc by MindDoc Health
  • Happify by Happify, Inc.
  • Abide by Carpenters Code Inc.
  • 7 Cups by 7 Cups of Tea
  • Shine by Shine, Inc
  • Calm by Calm.com, Inc.
  • Sanvello by Sanvello Health, Inc.
  • Hallow by Hallow, Inc
  • Youper by Youper, Inc
  • Pride Counseling by BetterHelp
  • Talkspace by Talkspace
  • BetterHelp by BetterHelp
  • King James Bible – Daily Verse and Audio by iDailybread.org
  • Better Stop Suicide by The Better App Company Limited

If you want to check who got the Mozilla Foundation rating “*privacy not included with this product,” you can find more details on their website.

Is it all bad?

The report from Mozilla sounds grim, but playing the blame game isn’t a bad idea if it’s done with the intention of improving the situation. A competent company might accept this sort of feedback and act on it, fixing issues and improving its overall service offering.

It’s also a wake-up call for mental health app users to be more cautious and conscious. If you know how your data is used and agree to it in exchange for valuable services, that’s fine, but it’s not okay if an organization keeps these things in the dark.


YouTube: AI and mental health – revolutionary reboot or the rise of the “digital asylum”?

Photo credit: The feature image is symbolic and has been done by Albert Shakirov. The screenshot in the body of the article is owned by the Mozilla Foundation.
Source: Press release sent to us directly from the Mozilla Foundation.
Editorial notice: Update May 9 – We received an update from a Mozilla representative advising us that Glorify did respond shortly after we published the article, which reduces the number of apps they warn about from 29 to 28.

Christopher Isak
Hi there and thanks for reading my article! I'm Chris the founder of TechAcute. I write about technology news and share experiences from my life in the enterprise world. Drop by on Twitter and say 'hi' sometime. ;)