The Cambridge Analytica scandal put the spotlight on other technological tools that rely on the input of personal data, but there are concerns that some are not totally secure. One area of focus is fertility apps which track a woman’s ovulation, among other personal information.
Some mornings, Renate Leite wakes up, takes her temperature and types it into an app on her smartphone. The app, Fertility Friend, helps let her know if she’s ovulating, and when she’s most likely to be able to conceive. She can also enter her mood, sex drive, and any pelvic pain. But she doesn’t “bother” to track the more intimate stuff because she’s concerned about her privacy.
“That’s my information, and it is private information,” the 41-year-old lab technician told WikiTribune from her home in Trondheim, Norway.
Leite’s unease about telling an app every private detail about her body underlines the current wave of data protection and privacy concerns accentuated by the Cambridge Analytica case, in which up to 87 million Facebook accounts had their information compromised.
In an age when everything from your kettle to your bed can be hooked up to the internet and “tracked” in some way, fertility and menstrual apps are just one example of the many health and “wellness” tools on the rise. The number of apps in this market has grown by 25 percent a year, and there were 325,000 mobile health apps available in 2017, according to health app analyst group Research2Guidance.
In return for inputting sensitive data about their bodies, fertility apps offer to help users prevent pregnancies or aid conception by tracking users’ menstrual cycles and alerting them when they’re most or least fertile. To do that, the apps need information that varies from the more impersonal, like temperature, exercise, and sleep cycles, to intimate data such as bowel movements, cervical fluid, and sex drive.
Women’s health is “big business” right now, says the nonprofit Electronic Frontier Foundation (EFF), which advocates for digital privacy. In response, even fitness trackers like Fitbit are adding menstrual cycle tracking.
The apps that WikiTribune contacted maintained their commitment to personal privacy and data security. But some advocates and users have concerns about whether all these apps are completely secure and how they share user data with others.
A 2017 study by the EFF found that fertility apps have “serious privacy issues” and, while in some cases they’re useful, warned that women should be aware of the “privacy tradeoffs” of using them.
Glow, a Silicon Valley-based fertility app founded by PayPal co-founder Max Levchin that says it has helped more than 800,000 women get pregnant and has 12 million users, was found to have exposed user data in 2016. A Consumer Reports investigation found it was easy for anybody who knew a user’s email address to access that user’s personal details and intimate information.
Healthcare apps that deal with sensitive medical data are required under the U.S. Health Insurance Portability and Accountability Act (HIPAA) to report data breaches to affected individuals, regulators and, for large breaches, the media. General health apps, however, face no such requirement.
Kathy Downing, a health data privacy expert, said because of this users “probably wouldn’t hear about” other fertility app data breaches.
“It would just be an internal issue for them and they wouldn’t have to share it. Of course, why would they? It’s putting their consumers at risk and people would opt out of using that app if they knew,” she told WikiTribune.
Regulation gray area
This regulatory gray area means some health data is protected while other data is not. Health data that is considered medical or is associated with a healthcare organization, such as a hospital, is protected in the United States by HIPAA. But U.S.-based fertility apps are not automatically obliged to adhere to HIPAA security and privacy rules, says Downing.
A potential solution to this unregulated landscape could be the new General Data Protection Regulation (GDPR), which comes into force across the European Union (EU) on May 25.
GDPR applies to all businesses and apps that collect personal data, said Jovan Stevovic, the founder and CEO of Chino.io, a data security platform. Founded in 2014, Chino.io specializes in health data and Stevovic regularly analyzes how health apps work.
“It’s about giving control to the users about their data,” he told WikiTribune.
Under GDPR, health data, including fertility app data, will be protected and cannot legally be shared without a user’s consent. Businesses that break the rules face fines of up to €20 million, or 4 percent of annual turnover, whichever is higher.
However, some apps, such as Glow, are taking the initiative to comply with HIPAA anyway.
A Glow spokesperson said in an email to WikiTribune: “Glow received certification by an independent cyber risk management firm and is compliant with HIPAA.”
The statement also said: “Glow prioritizes the safety of its community’s personal information and has implemented multiple measures to ensure our users’ private information is secure.”
The new EU restrictions will give people more control over how companies use their data and will make companies use easy-to-understand language. Firms will also be made to report any data breaches to authorities within 72 hours.
But GDPR is already affecting business decisions, with some companies moving to minimize their exposure to the regulation. Facebook, for example, is planning to move user data from Ireland to the United States so that fewer accounts fall under GDPR.
The law also won’t solve privacy concerns in the United States, where people will be far less protected against having fertility app data shared than their European counterparts, said Stevovic.
“If you are implementing a fertility app in the U.S., you have even less distinction, because you are not even subject to HIPAA. You don’t, for example, need to encrypt this data by law … There is no law that says, this is sensitive data, this needs to be encrypted.”
Problems with anonymity
Many paid fertility apps do not contain advertising. But free ones, such as Period Calendar and the free version of WomanLog, which has “several million” monthly users according to the app’s team, show users advertisements. When using WomanLog, for example, WikiTribune was shown ads for Amazon’s audiobook service Audible, nearby IVF services, and discount offers on gas bills.
Organizations increasingly rely on anonymization techniques to enable wider use of data, says Britain’s Information Commissioner’s Office (ICO). This can include for research or data analysis purposes. Anonymous details of user demographics are also commonly shared with advertisers to see if they want to place ads with that service. Many fertility apps promise that when using a user’s data, it will be anonymized.
Berlin-based fertility app Clue says it uses anonymized user data to make its services better by helping the app spot new correlations and trends.
Meanwhile, Glow provided the National Institutes of Health (NIH) with an anonymized data set (Cision) of its users to help the NIH improve its fertility prediction algorithms, in what Glow co-founder Ryan Ye said would help “broaden academic understanding of reproductive health.”
However, anonymized data doesn’t equate to total privacy, says Stevovic.
“In order to make [data] thoroughly anonymous you need to do something technically very challenging to anonymize this data … and use it for your business model without violating the privacy of end users. This is not really typically known by developers.”
He says to make data truly anonymous, information that can help identify a person, such as gender or zip code, needs to be removed.
“Therefore you need to remove a lot of info and reduce your data quality, losing some possibility to get some important info out of it afterwards. And, as the final result, you should never consider this data 100 percent safe and fully anonymous because algorithms and external sources of info are progressing rapidly. So at some point a specialized researcher would be able to link this data to the people and violate their privacy.”
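The quasi-identifier stripping Stevovic describes can be sketched in a few lines of Python. The field names and records below are hypothetical, for illustration only; real-world anonymization (with k-anonymity checks and generalization of remaining fields) is considerably harder, which is his point about re-identification risk.

```python
# Minimal sketch of removing quasi-identifiers from app records.
# Field names and values are hypothetical, for illustration only.
QUASI_IDENTIFIERS = {"gender", "zip_code", "birth_year", "ip_address"}

def strip_quasi_identifiers(record):
    """Return a copy of the record with common quasi-identifiers removed.

    Note: this reduces data quality (the researcher loses demographics)
    and still does not guarantee anonymity against linkage attacks.
    """
    return {k: v for k, v in record.items() if k not in QUASI_IDENTIFIERS}

record = {
    "gender": "F",
    "zip_code": "94107",
    "cycle_length_days": 29,
    "temperature_c": 36.7,
}
anonymized = strip_quasi_identifiers(record)
# Only the health measurements remain: cycle_length_days, temperature_c
```

As Stevovic notes, even a record reduced this way can sometimes be linked back to a person by combining it with external data sources, so stripped data sets should never be treated as 100 percent anonymous.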
Health data expert Kathy Downing says it doesn’t take much to turn that anonymized data into identifiable data, which should be a “red flag” to the user.
“They’re selling it, and the person buying it, in a lot of cases, based on your IP address, they know where your house is, or based on different things that you do, they can track that. I wouldn’t ever assume that it’s truly anonymized because with all the data analytics tools, it’s very easy to disconnect a few dots and figure it out.”
Downing says whether fertility apps are trying to be ethical and protect user data or not, they “owe you no privacy.”
Stevovic goes further, saying users can never know for sure what’s being done by apps with their data.
“You can’t really inspect what’s happening inside. It’s impossible. It’s technically impossible. You need to trust them, basically, that they will do proper things.”
These concerns are also extending to fertility app users.
Back in Norway, Leite says she found herself questioning whether she wanted to record if she had had sex with her partner.
“Then I started thinking, ‘What do these apps track and how safe are they?’”
She said an app she previously used was storing backups of user accounts online but didn’t specify if they used or analyzed the data.
“They just stated that you could put a backup at their servers and that’s it, but nothing more. The lack of information made me a bit suspicious, and I started to check out for other apps.”
Despite the privacy concerns, Leite said she finds her current app, Fertility Friend, useful in helping her conceive. Fertility Friend has not yet responded to WikiTribune’s request for comment.
A spokesperson from WomanLog told WikiTribune in an email: “User data on WomanLog server is accessible only through authorized access (using WomanLog account) and stored in a high-level encrypted form. User data is not available to third parties in any form … We do not sell our users data.”
But some apps do appear to take their users’ privacy seriously.
Stevovic, who said he knows the team behind Clue, believes they care about security and privacy. “They’re trying hard to keep the data secure.”
‘Above and beyond’ data protections
“Clue was already above and beyond the existing laws before the introduction of GDPR and has always been very careful with user data,” said Clue co-founder Ida Tin in a statement to WikiTribune.
“The principle of privacy by default and by design has always been a key software development principle for Clue. For example, we only collect a minimum amount of data during our onboarding.”
Tin added that users are not forced to create accounts and their data is stored locally on their phones. Clue uses the data of those who have created accounts when working with research groups, but only partners with companies with high standards of security, she said. “We do not sell nor share personal user data with anyone, and we never will.”
Leslie Heyer, the creator of Dot, a U.S.-based pregnancy prevention and planning app with 60,000 active users, says it doesn’t harvest sensitive data, and all personal information users input is stored only on their phones.
“Individual information that someone might enter, period start dates for instance, we’re not collecting that information, that just resides on your phone…we do not see any of that.”
She says, however, that Dot does monitor how long and how often people use the app, and for what purpose.
“A lot of apps I think are collecting individual data. They’re asking about your BMI or your sleep patterns. They’re collecting that and creating data sets around it. We are not doing that…We’re very clear about what we want to do here and it does not involve collecting people’s data.”
Proceed with caution
Ultimately, Stevovic says users should be “screening” apps as they use them in order to be aware of potential issues.
“[Users] need to be aware that the amount of information you’re giving them is the right amount that they need to run their service. Not more. When an app asks you too many permissions in your phone, you probably say, ‘Okay, why this app now is asking too much.’ Your screening, your awareness, your trust.”
Leite is open about her situation trying to conceive. But she’s concerned for women who may be more vulnerable.
“What I have been sensing when I have been talking to or chatting with people in some [online] forums and what I see other places is that a lot of women want to keep this very private. They don’t talk to their closest friends about even having problems conceiving. That makes me think that a lot of these women might feel extremely vulnerable if their data is going to be misused.”
She says that many people are unaware of the data risks that come with personal apps.
“If you’re not thinking about security or if you don’t know that you should be careful, it might be very easy to get an app that will steal your data.”