If you have ever assumed that information shared on a mental health app was confidential, you are in good company: many people assume that sensitive medical information is always protected. It is not, and it is important to understand why.
Many of us are familiar with or active users of some type of digital health application. Whether it is nutrition, fitness, sleep tracking, or mindfulness, the arena for apps that can help us track aspects of our health has never been bigger. Similarly, platforms that help us reach out to health care providers and receive virtual care have become more available, and often necessary, during the pandemic. Online therapy in particular has grown over the years, and has become a critical resource for many people during quarantines and remote living.
Making health resources and care more accessible to people is vital, and the appeal of accessing health resources right from your phone is obvious.
However, among the many heavy implications of Roe v. Wade having been overturned are a number of digital privacy concerns. Significant focus recently has been on period-tracking or fertility apps, as well as location information, and reasonably so. On July 8, the House Oversight Committee submitted letters to data brokers and health companies “requesting information and documents regarding the collection and sale of personal reproductive health data.”
What has been less discussed is the large gap in legal protections for all types of medical information that is shared through digital platforms, all of which should be subject to regulations and better oversight.
The US Department of Health and Human Services (HHS) recently released updated guidance on cellphones, health information, and HIPAA, confirming that the HIPAA Privacy Rule does not apply to most health apps because they are not “covered entities” under the law. The Health Insurance Portability and Accountability Act (HIPAA) is a federal law that creates a privacy rule for our “medical records” and “other individually identifiable health information” during the flow of certain health care transactions. Most apps that are selected individually by the user are not covered — only platforms that are specifically used by or developed for traditional health care providers (e.g., a clinic’s digital patient portal where they send you messages or test results).
Mental health apps are a revealing example. Like other digital health apps, they generally are not bound by the privacy laws that apply to traditional health care providers. This is especially concerning because people often seek out mental health platforms specifically to discuss difficult or traumatic experiences with sensitive implications. HIPAA and state laws on this issue would need to be amended to specifically include digital app-based platforms as covered entities. For example, California currently has a bill pending that would bring mental health apps within the scope of its state medical information confidentiality law.
It is important to note that even HIPAA has exceptions for law enforcement, so bringing these apps within the scope of HIPAA would still not prevent government requests for this data. It would be more useful in regulating information that gets shared with data brokers and companies like Facebook and Google.
An example of information that does get shared is what is collected during an “intake questionnaire” that must be filled out on prominent services such as Talkspace and BetterHelp in order to be matched with a provider. The questions cover extremely sensitive information: gender identity, age, sexual orientation, mental health history (including details such as when or if you have thought about suicide, and whether you have experienced panic attacks or have phobias), sleep habits, medications, current symptoms, etc. Jezebel found that BetterHelp shared all of these intake answers with an analytics company, along with the approximate location and device of the user.
Another type is all the “metadata” (i.e., data about your data) generated by your use of the app, which Consumer Reports discovered can include the mere fact that you are a user of a mental health app. Other information shared can include how long you are on the app, how long your sessions are with your therapist, how long you spend sending messages on the app, what times you log in, what times you send messages or speak to your therapist, your approximate location, how often you open the app, and so on. Data brokers, Facebook, and Google were found to be among the recipients of this information from Talkspace and BetterHelp. Apps regularly justify sharing information about users if the data is “anonymized,” but anonymized data can easily be connected to you when combined with other information.
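To see why “anonymized” offers little protection, consider a toy linkage attack. All of the data below is hypothetical, and the quasi-identifiers chosen (3-digit ZIP prefix, device model, typical login hour) are illustrative assumptions, but the mechanism is the standard one: join the “anonymous” records against an auxiliary dataset on the fields they share.

```python
# Hypothetical "anonymized" records shared by an app: names removed,
# but quasi-identifiers (location prefix, device, login habits) retained.
anonymized_records = [
    {"user": "anon_1", "zip3": "112", "device": "Pixel 6", "login_hour": 23},
    {"user": "anon_2", "zip3": "100", "device": "iPhone 13", "login_hour": 7},
]

# Hypothetical auxiliary data obtained elsewhere (e.g., from a data broker),
# which does carry real names alongside the same quasi-identifiers.
auxiliary_data = [
    {"name": "Jane Doe", "zip3": "112", "device": "Pixel 6", "login_hour": 23},
]

def reidentify(anon, aux):
    """Link any anonymized record to a named record when all
    quasi-identifiers match exactly."""
    matches = []
    for a in anon:
        for b in aux:
            if all(a[k] == b[k] for k in ("zip3", "device", "login_hour")):
                matches.append((a["user"], b["name"]))
    return matches

print(reidentify(anonymized_records, auxiliary_data))
# → [('anon_1', 'Jane Doe')]
```

Removing the name column did nothing here: the combination of a coarse location, a device model, and a usage pattern was already unique enough to single the user out.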
Along with the collection and sharing of this data, retention of the data by health apps is incredibly opaque. Several of these apps have no clear policies on how long they retain your data, and there is no rule requiring them to. HIPAA does not create any records retention requirements — retention is governed by state laws, which are unlikely to treat health apps as practitioners subject to them. For example, New York State requires licensed mental health practitioners to maintain records for at least six years, but the app itself is not a practitioner and is not licensed. Requesting deletion of your account or data also may not remove everything, and there is no way of knowing what remains. It is unclear how long the sensitive information these apps collect remains retained, and whether it could be made available to law enforcement at some future point.
Accordingly, there are a few things to keep in mind when navigating health apps that may share your data.
The access to care that these types of apps have created is critical, and everyone should seek the care they need, including via these platforms if they are the best option for you (and for many people, they are). The important takeaway is to be as informed as possible when using them and to take the steps available to you to maximize your privacy.