If you’ve downloaded any mental health apps, you might want to consider deleting them.
According to a study conducted by researchers at the Mozilla Foundation, many of these applications fail to adequately protect users’ privacy and security.
Although these applications deal with particularly sensitive topics, such as depression, anxiety, violence, eating disorders, post-traumatic stress disorder and suicide, many of them share user data freely.
A study conducted by Mozilla researchers on 32 such applications highlights their failure to comply with privacy standards. Moreover, these applications were found to collect more data than the vast majority of other apps and connected devices.
“The vast majority of mental health and prayer apps are exceptionally creepy,” warns Jen Caltrider, Mozilla’s ‘Privacy Not Included’ lead.
“They track, share, and capitalise on users’ most intimate personal thoughts and feelings, like moods, mental state, and biometric data.”

These mental health applications are particularly targeted at young people, who are especially vulnerable to mental health issues and do not necessarily pay attention to how their data is used.
Some of this data could be used to target them with personalised ads for years to come.
Moreover, a majority of these applications offer poor account security despite holding highly personal information about their users.
“In some cases, they operate like data-sucking machines with a mental health app veneer,” says Mozilla researcher Misha Rykov in a statement.
“In other words: A wolf in sheep’s clothing.”

According to Mozilla, the applications with the worst security and privacy practices are Woebot, Youper, Better Stop Suicide, Pray.com, Talkspace and BetterHelp.
Finally, despite Mozilla’s attempts to learn more about the apps’ privacy policies, only three of the 32 applications (Hallow, Calm and Wysa) responded.
Only two platforms met basic privacy and security standards: PTSD Coach, an application created by the US Department of Veterans Affairs, and the AI chatbot Wysa.