Health Apps and Services
A particularly dangerous way for a person to compromise his or her privacy is to use smartphone health apps and their associated cloud-based services. People distrust big tech companies to the extent that both Google and Microsoft have discontinued their health-related products. However, the same natural distrust doesn’t seem to extend to smaller health app developers, even though these smaller apps have been found to be less trustworthy than some of Big Tech’s offerings.
Big Tech and Health
For a period of time, the Big Tech companies all hoped to get access to Americans’ health data. However, people were reluctant to share this information with these giants, leading to the closure of Google Health in 2012[1] and the end of Microsoft HealthVault in 2019.[2] Of the biggest tech companies, only Apple and Amazon maintain a significant consumer health presence as of early 2025. Google is instead trying to add health search capabilities to its AI-generated overviews, including crowdsourced answers from random Internet users (who are not medical professionals). For reference, this is the same system that suggested using glue to keep cheese from sliding off a pizza.[3]
Apple is the sole remaining big tech player with a comprehensive, cloud-based health application: Apple Health,[4] which is subject to Apple’s rather extensive privacy policy.[5] It is worth noting that, as of the time of this writing, Apple is facing a lawsuit on the basis that some of its privacy settings are allegedly misleading.[6] Like most tech companies, Apple collects data about its users in a variety of ways and uses that information to target advertising. Its iOS platform devices, including iPhones, use advertising identifiers that are tied to age and gender, among other factors.[7]
Unlike most of its direct big tech competitors, Apple is continuing to expand its healthcare offerings. Newer Apple devices offer compelling capabilities for various user groups. For example, iPhones can use onboard sensors to detect a car crash, Apple Watches can detect falls and monitor heart rhythms, and the company has added a setting that allows AirPods to function as hearing aids. While these features carry clear user benefits, they also serve an important secondary purpose: expanding the amount of data Apple has available for training health-related Artificial Intelligence (AI) models. Apple Health isn’t subject to HIPAA, so all the data that people share with the company might be used for AI training purposes.[8]
Health Apps
A wide variety of third-party health apps are not owned by big tech companies. Instead, they are developed and released by random developers, many of whom might reasonably be described as “shady.” In a study published in 2021, researchers examined 497 health apps related to diabetes management. Around 60% of these apps requested dangerous phone permissions, 40% contained advertising, and over a quarter had no discernible privacy policy. Ninety-five percent of the apps studied were “free” in the sense that there was no upfront charge for the app.[9] Of course, these “free” apps are really paid in the currency of personal data, and health data are particularly sensitive.
We’re going to look at three categories of health apps to understand some of the privacy implications: period trackers, men’s health apps (which seem to be sex trackers, for the most part), and mental health apps. Period trackers are marketed to women as a way to understand their cycles and track fertility windows. Fewer apps seem to be aimed at men (apart from fitness apps), and those that do exist quickly fall into the category of sexual activity tracking. Mental health apps are targeted at everyone and come with their own serious set of risks.
Period Trackers
An entire class of health apps exists for tracking a woman’s menstrual cycle. In the general case, Apple Health provides “cycle tracking” as part of its standard suite of tools. Newer Apple Watches are able to track basal body temperature – the lowest body temperature, generally obtained during sleep – and record this information directly to Apple Health. Basal temperature is one metric that can be used to detect when ovulation has occurred, which can in turn be used to predict the start of the next period.[11] In addition to the automatic temperature tracking, Apple Health allows a woman to self-report menstrual symptoms, flows, and spotting. Tracking this information over time allows for cycle predictions to be made, including fertility windows and notifications of upcoming periods. Apple cautions that period and ovulation tracking should not be used as a substitute for birth control.[10] At least some of the symptom and flow self-reporting can be done via an Apple Watch.[11]
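The calendar arithmetic behind these predictions is straightforward. As a minimal sketch (not Apple's actual algorithm), the snippet below assumes regular cycles, estimates ovulation roughly 14 days before the predicted next period, and treats the fertile window as the five days before ovulation plus ovulation day itself – all common rules of thumb, not medical guidance:

```python
from datetime import date, timedelta

def predict_cycle(period_starts: list[date]) -> dict:
    """Predict the next period start and an approximate fertile window
    from a history of period start dates (simple calendar method)."""
    # Average the lengths of the recorded cycles
    gaps = [(b - a).days for a, b in zip(period_starts, period_starts[1:])]
    avg_cycle = round(sum(gaps) / len(gaps))

    next_period = period_starts[-1] + timedelta(days=avg_cycle)
    # Ovulation is commonly estimated ~14 days before the next period
    ovulation = next_period - timedelta(days=14)
    # Sperm can survive several days, so the window opens before ovulation
    fertile_window = (ovulation - timedelta(days=5), ovulation)
    return {"next_period": next_period,
            "ovulation": ovulation,
            "fertile_window": fertile_window}

# Example with three logged start dates (28-day cycles)
history = [date(2025, 1, 3), date(2025, 1, 31), date(2025, 2, 28)]
print(predict_cycle(history))
```

Note that nothing in this computation requires a cloud service: all the data a period tracker needs fits comfortably on the device itself, which is worth remembering when evaluating why an app insists on uploading it.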
Although Apple has a better privacy track record than many other companies, there are specific concerns with sharing period data now that Roe v. Wade has been overturned. Since Apple Health uploads data to the cloud, it is entirely possible that law enforcement could decide to start mining that data in an attempt to detect abortions.[11] As a reminder from above, Apple Health also comes with the usual caveat that data shared with it are not protected by HIPAA, although this issue is moot from a law enforcement perspective, since law enforcement requests for health data can fall under a HIPAA exception.
Third-party period trackers exist, many of which also upload information to the cloud. Unfortunately, the track record of these third-party apps isn’t always as positive as Apple’s has been. One app provider, Flo, settled with the Federal Trade Commission after it was found to be sharing period and ovulation data with third-party marketing companies, including Facebook. Although Flo didn’t admit any wrongdoing, researchers discovered that the app was notifying Facebook whenever a user got her period or indicated an interest in becoming pregnant.[12]
Men’s Health and Sex Trackers
There are fewer examples of men’s health apps, perhaps because we do a pretty lousy job of taking care of our health. The apps that are targeted at men tend to focus on either fitness (which we’ll cover in a later lesson) or sex. Speaking of the latter, one example of a sexual app targeted specifically at men (and, evidently, boys over 12) is Sequoia. Sequoia provides reminders and guidance for various types of sexual improvement exercises, self-reporting of erectile issues, and tracking of erections, sexual activity, and “rigidity” during sexual activity.[13] Its privacy policy gives the company the right to “process sensitive personal information,” including general health data, sex life information, sexual orientation data, and genetic data. It collects this sensitive information whenever the user gives consent or “as otherwise permitted by applicable law,” which is a nice way to say that it doesn’t ask permission if it isn’t legally required to ask. Sequoia also collects personal information like email addresses, contact information, and place of residence. The privacy policy allows it to aggregate the information it collects with other information purchased from data brokers, use information for advertising purposes, and share information with affiliates and advertisers.[14]
Another example of an app created with men in mind is xTracker, which is specifically designed for tracking sexual activity. It includes the ability to record a date and time, the first and last names of the partner(s), location, which specific sexual acts were performed, heart rates, specific positions used, how long the encounter lasted, whether or not protection was used, who reached orgasm, who initiated the activity, what mood the user was in, whether the other partner swallowed, and similar details (this list is a subset of what is shown in their screenshot).[15] In the app’s privacy policy, the company acknowledges collecting IP addresses, obtaining information from data brokers, sharing information with vendors and business partners, using information for marketing, and retaining information for an indeterminate period of time. The policy is curiously silent about the sensitive information the app collects.[16]
Although these two examples appear to be targeted at men, some sex tracking apps are targeted at women, sometimes under the guise of improving sexual health.[17] From a cybersecurity and privacy perspective, all these apps share the same basic problem: they collect highly intimate data, shuffle it off to a server farm somewhere, and then allow that data to be shared and monetized in various ways. Sex trackers that allow the identity of partners to be recorded present additional concerns, since a partner might not willingly agree to the privacy policy of the company that created the app. In effect, the person using the app gets to share the partner’s intimate data without the partner first having an opportunity to provide informed consent. Of course, these apps likely don’t give their direct users true informed consent, either.
Mental Health
Mental health apps are targeted at everyone and are marketed as a way to help with mental health issues, especially anxiety and depression (both of which are extremely common in the United States). Different apps provide different services. Some are merely diaries or ways for a person to track his or her mood, while others offer guided exercises or interaction with a chat algorithm. Some apps provide electronic appointments with human clinicians, including counselors and therapists. Visiting a counselor or therapist creates records that are subject to HIPAA and various confidentiality rules. These apps, on the other hand, are exempt from such regulations and are therefore a major privacy risk involving extremely sensitive personal information.[18]
The general state of these apps has been summed up like this: “[i]n the world of mental health apps, privacy scandals have become almost routine.”[19] Mental health app privacy policies routinely give the company that develops and maintains the app the right to change the policy at any time, possibly without giving direct notice to the user. Therefore, even if an app is initially developed with a reasonable set of privacy constraints, there is no guarantee that the company won’t change what it can do with your personal data after collecting it. Such a change is particularly likely if the company is acquired by a different company, or if the app is sold to another developer.[19]
Even apps that connect users to other humans can have ways of skirting HIPAA requirements. For example, an app that offers a chat function might be staffed with non-professional employees who do not fall under the definition of a provider subject to HIPAA. Communications via these kinds of apps can be mined and used for commercial purposes.[19] Even apps that connect a person with a licensed therapist (who is subject to HIPAA requirements) can sidestep HIPAA if the app acts as an intermediary between the user and the therapist.[18] In this situation, the therapist’s notes would be covered by HIPAA, but the app’s data collection might not be – and the app may be able to record the entire session. As an example, BetterHelp paid $7.8 million to settle Federal Trade Commission charges that it shared sensitive mental health information with Facebook, Snapchat, and other advertisers; refunds to affected customers began going out in 2024.[20]
Mitigation
Mitigating risks from health apps is tricky, since two things must be true before an app can be used to store health data securely:
- The app must be private, not transmit any data to the cloud, and be open source so that its privacy claims can be verified; and
- The phone on which the app is running must not transmit information that it collects from the app to the cloud.
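To make the first requirement concrete, here is a minimal sketch of what local-only storage looks like: entries go into an on-device SQLite file, and there is no network code anywhere (the file name and schema are hypothetical, not taken from any of the apps discussed here):

```python
import sqlite3
from datetime import date

# All data stays in a local file; nothing is ever transmitted.
def open_log(path: str = "health_log.db") -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    conn.execute("""CREATE TABLE IF NOT EXISTS entries (
                        day    TEXT NOT NULL,
                        metric TEXT NOT NULL,
                        value  REAL NOT NULL)""")
    return conn

def log_entry(conn: sqlite3.Connection, day: date,
              metric: str, value: float) -> None:
    # Parameterized insert; dates are stored as ISO-8601 strings
    conn.execute("INSERT INTO entries VALUES (?, ?, ?)",
                 (day.isoformat(), metric, value))
    conn.commit()

conn = open_log(":memory:")   # use a real file path on an actual device
log_entry(conn, date(2025, 3, 1), "basal_temp_f", 97.3)
rows = conn.execute("SELECT * FROM entries").fetchall()
print(rows)   # [('2025-03-01', 'basal_temp_f', 97.3)]
```

An app built this way can be audited for the absence of network calls, which is exactly what open source makes possible: the privacy claim is verifiable rather than a promise in a changeable policy.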
Satisfying the first objective is relatively straightforward, although open-source apps may not provide as many features as proprietary ones do. MediLog[21] and OpenTracks[22] are examples of open-source apps that can store general medical information, sensor data, and basic fitness data. Open-source period trackers include drip[23] and Mensinator.[24] Sunrise Signal is an open-source men’s health and well-being app that rather humorously tracks nocturnal penile tumescence that persists after waking (“morning wood”) and relates it to alcohol intake, exercise, and other factors.[25]
Although using an open-source app to track health information is decidedly preferable to using some cloud-connected data harvester, there are still security and privacy concerns with the mobile platform itself. A platform compromise could cause sensitive health data to fall into the wrong hands, even if care is taken to avoid commercial apps that steal data. Using a high-security platform like GrapheneOS,[26] particularly without Google Play Services enabled in the same profile as the medical apps, would be a way to mitigate this concern. However, it is a fairly cumbersome approach.
One way to track health information with little risk of it being siphoned up by a large company is to use the methods people used before smartphones were invented. If we step into a time machine and go back to the 1990s or early 2000s, we can use a paper notebook or diary to record health information and track it over time. Simple, disconnected basal thermometers are available from drugstores and other retailers, which allow accurate temperatures to be recorded first thing in the morning to predict ovulation. A simple paper calendar could be used for period tracking, although it might be somewhat less convenient for women with irregular cycles. Sexual activity could be tracked on paper, but WHY? (unless there is an underlying problem to address). It seems like it should be common sense that recording one’s sexual activity, and especially recording that of a partner without their explicit consent, is a bad idea in general.
For tracking health issues over time, many physicians now provide a patient portal that maintains health records in a single, convenient location. Although there are cybersecurity risks – these portals can be hacked, after all – these physician records are at least subject to HIPAA. For mental health concerns, it is best to find and work with a counselor, therapist, or clinical psychologist. They will keep notes on anything that is clinically relevant, and those notes are subject to HIPAA (and generally also enjoy at least some legal privilege). Although HIPAA is far from perfect and isn’t really a comprehensive privacy law, it is still probably the best available legal protection in the United States and is preferable to relying on some company’s privacy policy that can be changed at any time with little or no notice. To put it another way: seek mental health help by calling a provider and setting up an appointment, not by using an app. The same really could be said about scheduling any other kind of medical appointment.
Notes and References
1. Aaron Brown and Bill Weihl. “An update on Google Health and Google PowerMeter.” Official Google Blog. June 24, 2011.
2. Mary Jo Foley. “Microsoft is closing its HealthVault patient-records service on November 20.” ZDNET. April 5, 2019.
3. Noor Al-Sibai. “Google Says Its Error-Ridden ‘AI Overviews’ Will Now Give Health Advice.” Futurism. March 20, 2025.
5. Apple Privacy Policy. January 31, 2025.
6. Jonathan Stempel. “Apple must face narrowed privacy lawsuit over its apps.” Reuters. September 27, 2024.
7. Matt Burgess. “All the Data Apple Collects About You—and How to Limit It.” Wired. January 16, 2023.
8. Kaif Shaikh. “Is Apple’s health data gold rush benefiting users or exploiting them?” Interesting Engineering. November 28, 2024.
9. José Javier Flors-Sidro, Mowafa Househ, Alaa Abd-Alrazaq, Josep Vidal-Alaball, Luis Fernandez-Luque, and Carlos Luis Sanchez-Bocanegra. “Analysis of Diabetes Apps to Assess Privacy-Related Permissions: Systematic Search of Apps.” JMIR Diabetes 6(1): e16146. January 13, 2021.
10. “Track your period with Cycle Tracking.” Apple Support.
11. Catherine Roberts. “How to Keep Your Apple Watch Ovulation Data Private.” Consumer Reports. September 22, 2022.
12. Natasha Lomas. “Flo gets FTC slap for sharing user data when it promised privacy.” TechCrunch. January 13, 2021.
14. Privacy Policy. Sequoia Health. April 18, 2023.
16. Privacy Policy. xTracker. May 26, 2018.
17. Rebecca Saunders. “Sex tracking apps and sexual self-care.” New Media & Society 26(4): 2006-2022. March 11, 2022.
18. Thomas Germain. “Mental Health Apps Aren’t All As Private As You May Think.” Consumer Reports. March 2, 2021.
19. Nicole Wetsman. “Mental health app privacy language opens up holes for user data.” The Verge. May 4, 2022.
20. “BetterHelp customers begin receiving refund notices from $7.8M data privacy settlement, FTC says.” AP News. May 8, 2024.