Mental health apps found capitalising on users’ sensitive data

May 12, 2022
Mental Health, Mobile Apps, User Sensitive Data, InfoSec, Privacy

Findings published by Mozilla expose worrying gaps in the security and privacy of mental health apps available on mobile application stores. These apps serve people coping with anxiety, PTSD, domestic violence, and other forms of personal distress. Several religion-themed apps were found to have similar shortcomings.

Mozilla’s study details how these applications, despite serving vulnerable users, fall short on security practices and privacy policies. The apps were found to routinely share user data, permit weak password combinations, target users with personalised ads, and publish poorly structured privacy guidelines.

Of the 32 mental health and religious apps included in the study, 25 failed Mozilla’s Minimum Security Standards.

Applications that failed to meet most of the criteria in Mozilla’s Minimum Security Standards were marked down for practices such as unauthorised sharing or sale of users’ data, weak password policies, lack of data encryption, and vague data-management guidelines. These deficient apps were also given Mozilla’s ‘Privacy Not Included’ warning label.

According to Mozilla, the mental health and prayer apps that failed its security standards are the worst it has assessed in the past six years. Apps covered by the study include Calm, 7 Cups, Talkspace, BetterHelp, Headspace, and Glorify. Mozilla also maintains a dedicated page where users can browse these apps and learn more about their privacy and security ratings.

Another mental health application, Better Stop Suicide, also failed Mozilla’s assessment, and its developers did not respond when the firm contacted them with further inquiries. Mozilla’s lead assessor described most of the tested apps as ‘creepy’ because they track, share, and capitalise on their users’ sensitive dispositions, such as their mental state.

Of all the applications reviewed by the organisation, only a few passed and were recognised for taking user security and privacy seriously, among them PTSD Coach and the AI chatbot Wysa.
