Full Disclosure: Your Health Information Isn't Private

CURE, Winter 2020, Volume 19, Issue 1

Personal health information isn’t truly private: Medical practices share records and companies sell internet search details. Here’s what patients should know.

Haley Morgan’s first experience with DNA testing was about as uneventful as exploring one’s fundamental genetic makeup can be. She swabbed her mouth, sent the sample to 23andMe and learned that she had virtually no unexpected ancestry and no health risks beyond a slight propensity toward celiac disease.

Her second experience, on the other hand, provides a vivid illustration of how any medical data that gets online can end up nearly anywhere and change your life in ways you could not predict — even if the medical data is not your own.

When Morgan’s grandmother underwent genetic testing, she signed an agreement allowing Ancestry.com to look for blood relatives who were also using the site, and Ancestry’s computers found something entirely unexpected: another granddaughter.

The woman had been born near the West Virginia college Morgan’s father had attended, just a few months after he graduated, and it quickly became apparent that a college girlfriend had borne his child and placed the baby for adoption without ever telling him.

Stories like that tend to alarm privacy advocates. Neither Morgan nor her father had given Ancestry.com either genetic material or permission to unearth relatives they might not want to meet.

But there also are advantages to Ancestry’s policy. None of Morgan’s relatives felt their privacy had been violated; to the contrary, she and the rest of her family — father, mother and three siblings, as well as aunts, uncles and cousins — were delighted to hear the news. Ancestry.com’s analysis enriched their lives, even if they never agreed to it.

“I can see how some families might be upset to learn of a child that was the product of an affair or something like that, but given that my sister was born years before my parents even met, there was no reason for us to be anything other than happy,” says Morgan, an aspiring veterinarian who studies at Virginia Tech and works holidays as a vet tech at her mother’s practice in New Jersey. “It was obviously a huge surprise, but it was a great surprise.”

Even if no one in your family has undergone genetic analysis, a lot of your health information is out there. Health records are shared and internet searches are tracked, so you can be certain that at least some of your health-related data is being stored, along with your name, address and other identifying details, and sold to people and groups who are not part of your medical team. Indeed, it’s hard to fathom just how much medical data is online somewhere and how many people could access it.

Whether that will begin to routinely interfere with quality of life — for example, by compromising patients’ ability to secure life insurance, bank loans or jobs or enabling identity theft — remains to be seen.

MORE THAN A FAIR SHARE?

Electronic health records (generally called EHRs in the industry) make everything your doctor knows about you available throughout his or her practice, which might employ far more people than you realize. The nation’s largest health care provider, HCA Healthcare, has nearly 250,000 employees; the second largest, Ascension Health, has more than 150,000 employees.

If several of your doctors work for the same entity and you agree to the policy, EHRs allow them to see all the information their peers have collected, so your oncologist would know if you told your regular physician that you drink too much or have a sexually transmitted infection. Furthermore, up to one-third of hospitals share EHRs with provider groups outside their own organizations.

And that’s not all.

Employees at both private and public health insurers can access any EHR information they deem necessary for evaluating claims, and public health officials can see anything in your record that might endanger others. Even outside contractors that help your health care provider operate effectively and efficiently may be able to access all your medical records, complete with your name and address; patients typically grant these permissions when registering with a new health care provider, clinic or hospital.

How frequently do health care providers use such outside contractors? No one knows. There are no disclosure requirements. The largest known arrangement is between Google and Ascension, which operates 150 hospitals and thousands of medical offices in 20 states. Ascension collaborates with Google to help improve its “health care operations,” meaning that the hospital system is allowed by law to share sensitive information with the internet-focused company.

Ascension had already transferred millions of medical records to Google by November 2019, when a story in The Wall Street Journal brought to light what the two companies dubbed Project Nightingale. Many of the records in question reportedly included a wide variety of identifying information, including full patient names and addresses. Neither affected patients nor their doctors were told about the project in advance or given a way to opt out.

Although attorneys who deal with the Health Insurance Portability and Accountability Act (HIPAA) and other privacy laws say everything revealed about the project is legal, news of Project Nightingale spurred both an ongoing congressional investigation and much criticism from privacy advocates, who noted that Google has already paid multiple fines for breaking privacy laws in nations around the world. But the coverage of Project Nightingale largely ignored two details: First, lots of large companies, including many health care providers, transfer sensitive data to Google because it is a major provider of cloud-based data storage and productivity software for firms that don’t want the trouble or expense of buying servers and managing software in-house. Second, the unusual aspects of the deal — the ones that apparently give some Google employees access to patient records — have the potential to greatly improve patient care.

Ascension and Google are already testing software that makes patient records far easier to search than current EHRs, so doctors who wish to know how a patient’s blood test results changed over time can type a single query and instantly see a graph with results on one axis and time on the other. The end goal, however, appears to be far more ambitious.
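To make the idea concrete, a stripped-down version of such a lookup can be expressed in a few lines of code. The sketch below is purely illustrative: the table layout, column names and values are invented, and it bears no relation to Google’s or Ascension’s actual software.

```python
# A minimal, purely illustrative sketch of the "single query, instant graph"
# idea described above. The table layout, column names and values are all
# hypothetical; real EHR systems use far more complex schemas.
import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical lab-result history for one patient
results = pd.DataFrame({
    "date": pd.to_datetime(["2018-01-15", "2018-07-02", "2019-01-20", "2019-08-11"]),
    "hemoglobin_g_dl": [13.8, 12.9, 11.7, 12.4],
})

# The "query": pick one analyte and plot its value against time
results.sort_values("date").plot(x="date", y="hemoglobin_g_dl", marker="o")
plt.title("Hemoglobin over time (hypothetical patient)")
plt.ylabel("g/dL")
plt.show()
```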

Google previously announced efforts to use artificial intelligence (AI) to read EHRs, make diagnoses and recommend treatments. Such endeavors require that the AI be “trained” by analyzing millions of medical records and seeing which symptoms were associated with which illnesses and which treatments produced the best outcomes. Records from Ascension’s 50 million patients could provide ample training material, and employees from Google’s medical AI unit apparently have access to the Ascension patient data.
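The underlying recipe is easier to grasp with a toy example. The sketch below trains a simple classifier to associate symptoms with diagnoses; the handful of invented records stands in for the millions a real system would need, and none of this reflects Google’s actual models.

```python
# A toy illustration of the "training" step described above: learn which
# symptoms are associated with which diagnoses from labeled records. The
# invented records stand in for the millions a real system would need.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

symptom_notes = [
    "fatigue pallor shortness_of_breath",
    "thirst frequent_urination blurred_vision",
    "fatigue thirst frequent_urination",
    "pallor shortness_of_breath dizziness",
]
diagnoses = ["anemia", "diabetes", "diabetes", "anemia"]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(symptom_notes)  # symptom counts per record

model = MultinomialNB().fit(X, diagnoses)

# Apply the learned associations to a new, unseen record
print(model.predict(vectorizer.transform(["dizziness fatigue pallor"])))
```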

“I’m biased because I work in the field, but I believe there is great promise in mining patient EHRs for actionable insights,” says Nicholas Tatonetti, director of clinical informatics at Columbia University’s Herbert Irving Comprehensive Cancer Center. Tatonetti’s own work has already produced several such insights; for example, his data analysis uncovered dangerous drug interactions that had not been revealed by the trials conducted before federal approval. Another team of researchers, led by UCLA’s Marc A. Suchard, analyzed data from 4.9 million patients who received treatment for high blood pressure and demonstrated, contrary to widespread belief, that some types of hypertension drugs are safer and more effective than others. (Thiazide or thiazide-like diuretics were shown to be so much safer and more effective than ACE inhibitors that if the 2.4 million study patients who used ACE inhibitors had instead used thiazide or thiazide-like diuretics, more than 3,100 major cardiovascular events could have been avoided.)
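Those figures are easier to interpret when reduced to a per-patient rate, as in this quick back-of-the-envelope calculation using only the numbers quoted above.

```python
# Back-of-the-envelope reading of the figures quoted above: 3,100 avoided
# events among the 2.4 million patients who used ACE inhibitors implies a
# small absolute risk difference per patient that adds up at scale.
ace_patients = 2_400_000
avoided_events = 3_100

risk_difference = avoided_events / ace_patients
print(f"{risk_difference:.5f}")                            # ~0.00129
print(f"{risk_difference * 1000:.1f} per 1,000 patients")  # ~1.3 per 1,000
```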

“Given that more than 100 million Americans have high blood pressure, that study will save thousands of lives per year, but if Ascension’s data is enough to help Google make a big breakthrough on medical AI, that could be a far bigger breakthrough,” Tatonetti says. “That doesn’t mean we shouldn’t find ways to maximize patient privacy when doing this research, but it does mean we need to weigh the potential benefits of data sharing, as well as the potential costs.”

Privacy advocates acknowledge that data sharing and analysis may lead to lifesaving discoveries, but those who have spoken out about Project Nightingale say Google and Ascension could have achieved their goals without compromising patient privacy.

For example, everyone interviewed for this story believes Google and Ascension probably could have anonymized patient data before sending it on to Google’s research teams.
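De-identification can take many forms. As a purely illustrative example, the sketch below shows one common first step: stripping direct identifiers and replacing them with an irreversible pseudonym before data leaves the provider. Full HIPAA de-identification covers far more (18 identifier categories, dates, small geographies), and nothing here reflects what Google or Ascension actually built; the field names are hypothetical.

```python
# Illustrative sketch of one common de-identification step: strip direct
# identifiers and replace them with an irreversible pseudonym. Field names
# and the salt value are hypothetical placeholders.
import hashlib

SECRET_SALT = b"rotate-me-and-store-separately"  # placeholder value

def pseudonymize(record: dict) -> dict:
    # Derive a stable token that cannot be reversed without the salt
    token = hashlib.sha256(SECRET_SALT + record["patient_id"].encode()).hexdigest()[:16]
    # Drop direct identifiers, keep clinical fields
    cleaned = {k: v for k, v in record.items()
               if k not in {"patient_id", "name", "address", "phone"}}
    cleaned["pseudonym"] = token
    return cleaned

record = {"patient_id": "12345", "name": "Jane Doe",
          "address": "1 Main St", "diagnosis": "C50.9"}
print(pseudonymize(record))  # identifiers gone, diagnosis and pseudonym remain
```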

Privacy specialists also believe that Ascension should have notified patients before transferring records and offered them a way to opt out of the project, but sources who focus on medical research disagree, arguing that such measures might reduce the value of the research.

“Opt-out systems raise ethical concerns, too. They permit free riders who will benefit from the knowledge gained from analyzing the medical records of people who opt in. Second, knowledge from a learning health system could be biased if enough people opt out and the opt-out population is different from those who allow their data to be used,” says Cason Schmit, an attorney and assistant professor of public health at Texas A&M University. “For example, African Americans are underrepresented in clinical research studies. This might be due to mistrust from a history of mistreatment and unethical studies by doctors, researchers and government. If they opted out at significantly higher rates than other groups, then the best treatments identified for all patients might not actually be the best treatments for African Americans.”

REVEALING FINDINGS

Project Nightingale might have received more media coverage than any other story related to medical privacy last year, but according to privacy experts, your health care providers aren’t the biggest threat to your medical privacy — you are.

Unless you work hard to make yourself anonymous, every health-related search you conduct online is noted by dozens of companies and sold by them to anyone who cares to buy it.

A Carnegie Mellon professor named Tim Libert demonstrated the extent of the problem in 2014, when he wrote a program that he dubbed webXray and used it to analyze the top 50 search results for almost 2,000 common diseases. He found that 91% of the more than 80,000 analyzed pages passed some information from patients’ searches on to at least one other party.
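A drastically simplified version of that kind of audit can be run by anyone. The sketch below, which is not webXray itself, fetches a page and lists the outside hosts its resources load from; it inspects only static HTML and ignores JavaScript-driven tracking, so it undercounts badly, and the URL is a placeholder.

```python
# A drastically simplified version of the kind of audit webXray performs:
# fetch a page and list the outside hosts its resources are loaded from.
# Static HTML only (no JavaScript execution), so this undercounts tracking.
# Requires the third-party "requests" and "beautifulsoup4" packages.
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

page_url = "https://example.org/conditions/melanoma"  # placeholder
first_party = urlparse(page_url).hostname

html = requests.get(page_url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

third_parties = set()
for tag in soup.find_all(["script", "img", "iframe", "link"]):
    src = tag.get("src") or tag.get("href")
    host = urlparse(src or "").hostname
    if host and host != first_party:
        third_parties.add(host)

print(sorted(third_parties))
```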

Commercial sites such as WebMD.com explicitly make money by selling information about searches to advertisers and data aggregators, but even sites run by nonprofit organizations such as the Centers for Disease Control and Prevention and Mayo Clinic can compromise privacy. Such sites use tools like Google Analytics to monitor website usage, often adding functionality to their websites by dropping in “free” widgets from data-collecting companies rather than building their own site code from scratch.

Facebook, too, collects information about users, including what they discuss, where they go and what they buy, and gets more information from third parties such as advertisers and app developers. It uses the information to personalize its service and to decide what ads people should see. Facebook promises not to sell users’ data, but it does share that information — some generalized, some identifiable — with third-party partners, including advertisers, data-measurement firms, vendors and researchers.

What does all that mean for patients who discuss their cancer or genetic status in a closed Facebook group? Although it doesn’t seem likely that the privacy of specific discussions will be breached, participants could start to see photos of suggested friends who have the same condition.

Wearable health devices, along with any other medical device that sends data to your smartphone, may also put privacy at risk. Millions of people have apps on their phones that track their diet, exercise, weight, blood pressure, sleep, blood sugar levels, menstrual cycles and other sensitive subjects. Some of those apps have great privacy policies. Others freely share information with advertisers.

Privacy advocates were particularly concerned about Google’s $2.1 billion acquisition of Fitbit, a maker of fitness trackers that reportedly has 27.6 million active users. Google has not said it won’t share users’ health details with third parties but has promised that it won’t sell personal information or use health data it collects from Fitbits in its own ads. Privacy advocates note that no existing regulation prevents the company from changing its policy, though it would have to notify users before proceeding.

Direct-to-consumer DNA testing services operate in a similarly loose regulatory environment. Whereas genetic test results ordered by a doctor are protected under HIPAA, direct-to-consumer testing companies can share identifiable user information with anyone, so long as they tell users about their policies beforehand. According to a 2017 study of 90 DNA testing companies, most have lackluster privacy policies. Fortunately, the biggest companies are generally thought to have the best privacy policies, but that doesn’t mean users have nothing to fear: Hackers gained email and password information for 92 million users of the consumer genealogy website MyHeritage in 2018.

What’s more, genetic testing creates unique privacy risks because relatives share DNA. People who undergo genetic testing often both share and learn personal details about relatives without ever consulting them, as with Morgan’s grandmother. This sometimes proves harmful, but it can also be beneficial: If a close relative’s 23andMe report leads you to learn of a BRCA gene mutation that increases your breast cancer risk, that invasion of your genetic privacy may well save your life.

BEYOND MEDICAL MATTERS

Your genetic test results could also send a relative to jail. At least one testing company, FamilyTreeDNA, allows police investigating rapes and murders to upload crime scene DNA and search most of its user data for suspects or their relatives. A site called GEDmatch, which performs no DNA testing but allows people to upload their results from companies such as 23andMe, recently restricted police access to its records, although a Florida judge later issued a warrant allowing police to search its entire database.

The site had restricted that access after earlier allowing police to use its database to identify the notorious Golden State Killer, who is thought to have committed at least 13 murders and at least 50 rapes between 1974 and 1986. Policies that allow police to use such sites have helped law enforcement identify at least 40 suspects over the past couple of years.

Cooperation between genetic testing companies and the police has generated both passionate support and strenuous objections. Some say such cooperation is an unmitigated good that hurts no one other than murderers and rapists. Others say users are being tricked into ratting on relatives.

Such objections may constitute the single biggest backlash against any intentional distribution of private medical information, but there have been other breaches. For instance, news broke in 2013 that the credit rating company Experian had sold large amounts of consumer data to a criminal operation in Vietnam that posed as an American private investigator and later resold the information. In 2019, the company was hit with large fines related to the episode.

Privacy experts point out that when a company like Experian purchases data about consumers, it combines that information with data it already has and can then sell a fairly detailed profile. One concern is that some of this information, such as an illness or a past bankruptcy, could be sold to banks that could then use it to deny loan applications.

So far, despite all the people who have access to private medical data and others who could gain such access by purchasing files from data aggregators, there isn’t much evidence that people’s own medical data is routinely used against them. If there are tech company employees who use health records to blackmail people or employers who illegally use such data to avoid hiring chronically ill job applicants, their stories have yet to be widely reported. And patients have protections: The Genetic Information Nondiscrimination Act prohibits health insurers and employers from discriminating based on genetic predispositions to disease.

Still, some experts advise bracing for more breaches of privacy.

“Even if there aren’t any widespread problems with abuse right now, I’d guess that it’s just a matter of time. There’s just too much information out there and too much money to be made by using it inappropriately. We need to update and expand our privacy laws,” says Arthur L. Caplan, the Drs. William F. and Virginia Connolly Mitty Professor of Bioethics at New York University’s medical school. “HIPAA was passed before most people were using either EHRs or the internet, and it’s simply not equipped to deal with the challenges of maintaining privacy in the modern era. As for the privacy laws and regulations governing the direct-to-consumer businesses, we essentially don’t have any, and we need them badly.”
