Renowned genetics pioneer Dr. J. Craig Venter is no stranger to controversy.
Back in 2000, he famously raced the public Human Genome Project to decode all three billion letters of the human genome for the first time. A decade later, he ignited a new debate when his team created a bacterial cell with a synthesized genome.
Most recently, he’s jumped back into the fray with a study in the September issue of the Proceedings of the National Academy of Sciences about the predictive potential of genomic data to identify individual traits such as voice, facial structure and skin color.
His study applied whole-genome sequencing and statistical modeling to predict traits in 1,061 people of diverse ancestry. His approach aimed to reconstruct a person’s physical characteristics based on DNA, and 74 percent of the time, his algorithm could correctly identify the individual in a random lineup of 10 people from his company’s database.
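The study's headline metric is easy to misread, so here is a toy sketch of how a "pick the right person out of a lineup of 10" evaluation works. This is not Venter's actual algorithm: the traits, the nearest-neighbor matching rule, and the noise level are all illustrative assumptions.

```python
import random

random.seed(0)

def closest_candidate(predicted, lineup):
    """Return the index of the lineup member whose true phenotype
    vector is closest (squared Euclidean distance) to the prediction."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(lineup)), key=lambda i: dist(predicted, lineup[i]))

def lineup_accuracy(people, trials=1000, lineup_size=10, noise=0.3):
    """Fraction of trials in which the true person ranks first.
    A 'prediction' is simulated as the true phenotype plus Gaussian noise,
    standing in for whatever the genome-based model would output."""
    hits = 0
    for _ in range(trials):
        lineup = random.sample(people, lineup_size)
        target = random.choice(lineup)
        predicted = [t + random.gauss(0, noise) for t in target]
        if lineup[closest_candidate(predicted, lineup)] is target:
            hits += 1
    return hits / trials

# Toy population: each person is a vector of 5 continuous traits
# (imagine height, pigmentation scores, etc.), drawn at random.
population = [tuple(random.gauss(0, 1) for _ in range(5)) for _ in range(1061)]
print(lineup_accuracy(population))
```

Note that chance performance on this task is 10 percent, so any reported accuracy has to be judged against how much of the lift comes from fine-grained prediction versus coarse group differences, which is exactly where the criticism below begins.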
While critics have been quick to cast doubt on the plausibility of his claims, the ability to discern people’s observable traits, or phenotypes, from their genomes may grow more precise as technology improves, raising significant questions about the privacy and usage of genetic information in the long term.
Critics: Study Was Incomplete, Problematic
Before even addressing these potential legal and ethical considerations, some scientists simply said the study’s main result was invalid. They pointed out that the methodology worked much better at distinguishing between people of different ethnicities than between those of the same ethnicity. One of the most outspoken critics, Yaniv Erlich, a geneticist at Columbia University, said, “The method doesn’t work. The results were like, ‘If you have a lineup of ten people, you can predict eight.’”
Erlich, who reviewed Venter’s paper for Science, where it was rejected, said that he came up with the same results—correctly predicting eight of ten people—just by looking at demographic factors such as age, gender and ethnicity. He added that Venter’s recent rebuttal to his criticism was that “once we have thousands of phenotypes, it might work better.” But that, Erlich argued, would be “a major breach of privacy. Nobody has thousands of phenotypes for people.”
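Erlich's baseline claim can be illustrated with a short simulation. The category counts below (8 age brackets, 2 sexes, 5 ethnicity groups) are made-up assumptions, not figures from either study; the point is only that a few coarse demographic attributes can already single out most people in a ten-person lineup, with no genome required.

```python
import random

random.seed(1)

# Hypothetical coarse categories -- illustrative only, not the
# actual groupings used by Venter's team or by Erlich.
N_AGE_BRACKETS = 8
N_SEXES = 2
N_ETHNICITIES = 5

def random_person():
    """A person reduced to an (age bracket, sex, ethnicity) triple."""
    return (random.randrange(N_AGE_BRACKETS),
            random.randrange(N_SEXES),
            random.randrange(N_ETHNICITIES))

def demographic_hit_rate(trials=10000, lineup_size=10):
    """Fraction of trials in which the target's demographic triple is
    unique in the lineup -- i.e. demographics alone pick them out."""
    hits = 0
    for _ in range(trials):
        lineup = [random_person() for _ in range(lineup_size)]
        target = lineup[0]
        if lineup.count(target) == 1:
            hits += 1
    return hits / trials

print(demographic_hit_rate())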
Other critics suggested that the study’s results could discourage the sharing of genetic data, which is becoming increasingly important for medical research. Some went a step further, implying that people’s hesitation to share their genetic information in public databases may actually play into Venter’s hands.
Venter’s own company, Human Longevity Inc., aims to build the world’s most comprehensive private database on human genotypes and phenotypes. The vastness of this information stands to improve the accuracy of whole genome and microbiome sequencing for individuals—analyses that come at a hefty price tag. Today, Human Longevity Inc. will sequence your genome and perform a battery of other health-related tests at an entry cost of $4,900, going up to $25,000. Venter initially agreed to comment for this article, but then could not be reached.
Study Opens Up Pandora’s Box of Ethical Issues
Whether Venter’s study is valid may not be as important as the Pandora’s box of potential ethical and legal issues that it raises for future consideration. “I think this story is one along a continuum of stories we’ve had on the issue of identifiability based on genomic information in the past decade,” said Amy McGuire, a biomedical ethics professor at Baylor College of Medicine. “It does raise really interesting and important questions about privacy, and socially, how we respond to these types of scientific advancements. A lot of our focus from a policy and ethics perspective is to protect privacy.”
McGuire, who is also the Director of the Center for Medical Ethics and Health Policy at Baylor, added that while protecting privacy is very important, “the bigger issue is how do we understand and use genetic information and avoid harming people.” While we’ve taken “baby steps,” she said, towards enacting laws in the U.S. that fight genetic determinism—such as the Genetic Information Nondiscrimination Act, which prohibits discrimination based on genetic information in health insurance and employment—some areas remain unprotected, such as life insurance and disability coverage.
Physical reconstructions like those in Venter’s study could also be inappropriately used by law enforcement, said Leslie Francis, a law and philosophy professor at the University of Utah, who has written about the ethical and legal issues related to sharing genomic data.
“If [Venter’s] findings, or findings like them, hold up, the implications would be significant,” Francis said. Law enforcement is increasingly using DNA identification from genetic material left at crime scenes to implicate or exclude suspects, she explained. This adds another potentially complicating layer.
“There is a shift here, from using DNA sequencing techniques to match other DNA samples—as when semen obtained from a rape victim is then matched (or not) with a cheek swab from a suspect—to using DNA sequencing results to predict observable characteristics,” Francis said. She added that while the former necessitates having an actual DNA sample for a match, the latter can use DNA to pre-emptively (and perhaps inaccurately) narrow down suspects.
“My worry is that if this [the study’s methodology] turns out to be sort-of accurate, people will think it is better than it is,” said Francis. “If law enforcement comes to rely on it, there will be a host of false positives and false negatives. And we’ll face new questions, [such as] ‘Which is worse? Picking an innocent as guilty, or failing to identify someone who is guilty?’”
Risking Privacy Involves a Tradeoff
When people voluntarily risk their own privacy, that involves a tradeoff, McGuire said. A 2014 study that she conducted among people who were very sick, or whose children were very sick, found that more than half were willing to share their health information, despite concerns about privacy, because they saw a big benefit in advancing research on their conditions.
“To make leaps and bounds in medicine and genomics, we need to create a database of millions of people signing on to share their genetic and health information in order to improve research and clinical care,” McGuire said. “They are going to risk their privacy, and we have a social obligation to protect them.”
That also means “punishing bad actors,” she continued. “We’ve focused a lot of our policy attention on restricting access, but we don’t have a system of accountability when there’s a breach.”
Even though most people using genetic information have good intentions, the consequences when they don’t are troubling. “All you need is one bad actor who decimates the trust in the system, and it has catastrophic consequences,” she warned. That hasn’t happened on a massive scale yet. And even if it did, some experts argue, obtaining the data itself is not the real risk; more concerning is the prospect of hacked genetic information being used against individuals, for example to claim someone is unfit for a particular job because of a genetic predisposition to a condition like Alzheimer’s, or that a parent is unfit for custody because of a genetic disposition to mental illness.
Venter, in fact, told an audience at the recent Summit conference in Los Angeles that his new study’s approach could not only predict someone’s physical appearance from their DNA, but also some of their psychological traits, such as the propensity for an addictive personality. In the future, he said, it will be possible to predict even more about mental health from the genome.
What is most at risk on a massive scale, however, is not so much genetic information as demographic identifiers included in medical records, such as birth dates and social security numbers, said Francis, the law and philosophy professor. “The much more interesting and lucrative security breaches typically involve not people interested in genetic information per se, but people interested in the information in health records that you can’t change.”
Hospitals have been hacked for this kind of information. In one 2006 incident at the U.S. Department of Veterans Affairs, a laptop and external hard drive containing unencrypted records on 26.5 million veterans and military personnel were stolen from an agency employee’s house.
So, what can people do to protect themselves? “Don’t share anything you wouldn’t want the world to see,” Francis said. “And don’t click ‘I agree’ without actually reading privacy policies or terms and conditions. They may surprise you.”