Imagine this scenario: A couple is locked in a heated custody dispute over their only child. In an effort to make the case that he or she is the better guardian, one parent goes on a "genetic fishing expedition": obtaining a DNA sample from the other parent in the hope that the data will reveal a genetic predisposition to a psychiatric condition (e.g., schizophrenia) and tilt the judge's custody decision in his or her favor.
This is an example of how "behavioral genetic evidence" (an umbrella term for information gathered from family history and genetic testing about pathological behaviors, including psychiatric conditions) may in the future be introduced by litigants in court proceedings. To date, such evidence has been discussed primarily in criminal cases, where defendants have sought to introduce it to argue that they are not responsible for their behavior or to support requests for reduced sentencing and more lenient punishment.
However, civil cases are an emerging frontier for behavioral genetic evidence. It has already been introduced in tort litigation, such as personal injury claims, and as knowledge of psychiatric genetics is growing, it is further likely to be introduced in other civil cases, such as child custody disputes and education-related cases. But the introduction of such evidence raises a tangle of ethical and legal questions that civil courts will need to address. For example: how should such data be obtained? Who should get to present it and under what circumstances? And does the use of such evidence fit with the purposes of administering justice?
How Did We Get Here?
That behavioral genetic evidence is entering courts is unsurprising. Scientific evidence is a common feature of judicial proceedings, and genetic information may reveal relevant findings. For example, genetic evidence may elucidate whether a child's medical condition is due to genetic causes or medical malpractice, and it has been routinely used to identify alleged offenders or putative fathers. But behavioral genetic evidence differs from these other kinds of genetic data: it comes in shades of gray rather than black and white.
Although efforts to understand the nature and origins of human behavior are ongoing, existing and likely future knowledge about behavioral genetics is limited. Behavioral disorders are highly complex and diverse. They commonly involve not one but many genes, each with a relatively small effect, and they are shaped by numerous, still unknown interactions among genes and familial and environmental factors such as poverty and childhood adversity.
And a specific gene variant may be associated with more than one behavioral disorder and be manifested with significantly different symptoms. Thus, biomarkers of "predisposition" to behavioral disorders generally cannot provide a diagnosis or an accurate estimate of whether, when, or with what severity a behavioral disorder will occur. And unlike genetic testing that can confirm a litigant's identity with 99.99% probability, behavioral genetic evidence is far more speculative.
Whether judges, jurors, and even experts understand the nuances of behavioral genetics is unclear. Many people overestimate how deterministic genetics is and underestimate the role of environment, especially with regard to mental health. The individualistic U.S. culture of self-reliance and independence may further tilt the judicial scales: litigants in civil courts may be unjustly blamed for their "bad genes" while the structural and societal determinants of poor behavioral outcomes are ignored.
These concerns were recently captured in the Netflix series "13 Reasons Why," which depicts a negligence lawsuit brought against a school by the parents of a high-school student, Hannah, who died by suicide. The legal tides shifted from the school's negligence in tolerating a culture of bullying to parental responsibility once cross-examination of Hannah's mother revealed a family history of anxiety and the possibility that Hannah had a predisposition to mental illness, which (arguably) warranted therapy even in the absence of clear symptoms.
Where Is This Going?
These concerns are exacerbated by the ways in which behavioral genetic evidence may come to court in the future. One is "genetic theft," in which genetic material is obtained from discarded property, such as empty soft-drink cans. This method is already used for identification purposes in criminal and paternity proceedings, and it will likely expand to behavioral genetic data once such data are available through the "home kits" offered by direct-to-consumer companies.
Genetic theft raises questions about whose behavioral data are being obtained, by whom, and with what authority. In the child-custody scenario above, for example, sequencing the other parent's DNA necessarily intrudes on that parent's privacy, even though the scientific value of the information is limited. A parent on a "genetic fishing expedition" could also secretly sequence the child for psychiatric genetic predispositions, arguably in order to take preventive measures to reduce the child's risk of developing a behavioral disorder. But should a parent be allowed to sequence the child without the other parent's consent, or regardless of whether the results would provide any medical benefit to the child?
Similarly, although schools are required to identify children with behavioral disabilities and evaluate their educational needs, and may be held accountable for failing to do so, some parents decline to have their child evaluated by mental health professionals. Should schools secretly obtain a sample and sequence children for behavioral disorders regardless of parental consent? In my study of parents, the overwhelming majority opposed genetic testing imposed by school authorities. But should parental preference or the child's best interests be the determinative factor? Alternatively, could schools use secretly obtained genetic data as a defense that they are fulfilling the child-find requirement under the law?
In general, samples obtained through genetic theft may not meet the legal requirements for admissible evidence, and as these examples suggest, they also involve privacy infringement that may be unjustified in civil litigation. But their introduction in courts may influence judicial proceedings. It is hard to disregard such evidence even if decision-makers are told to ignore it.
The costs associated with genetic testing may further intensify power differences among litigants. Because not everyone can pay for DNA sequencing, there is a risk that those with more resources will be "better off" in court proceedings. Simultaneously, the stigma associated with behavioral disorders may intimidate some people enough that they back down from just claims. For example, a good parent may give up a custody claim to avoid disclosure of his or her genetic predispositions for psychiatric conditions. Regulating this area of law is necessary to prevent misuses of scientific technologies and to ensure that powerful actors do not have an unfair advantage over weaker litigants.
Behavioral genetic evidence may also enter the courts through subpoenas of data obtained in clinical, research, or commercial genomic settings, such as ancestry testing (similar to the genealogy database recently used to identify the Golden State Killer). Although court orders to testify or to produce evidence are common, their use to obtain behavioral genetic evidence raises concerns.
One worry is that it may be over-intrusive. Because genetic variants are heritable, such data may reveal information not only about the individual litigant but also about family members, who may subsequently be stigmatized as well. And even if we assume that many people are willing for their data in genomic databases to be used to identify relatives who committed crimes (e.g., a rapist or a murderer), we cannot assume the same for civil litigation, where the public interest in disclosure is far weaker.
Another worry is that it may deter people from participating in activities that society has an interest in advancing, including medical treatment involving genetic testing and genomic research. To address this concern, existing policy provides expanded privacy protections for NIH-funded genomic research by automatically issuing a Certificate of Confidentiality that prohibits disclosure of identifiable information in any Federal, State, or local civil, criminal, and other legal proceedings.
But this policy has limitations. It applies only to specific research settings and does not cover non-NIH funded research or clinical testing. The Certificate's protections can also be waived under certain circumstances. People who volunteer to participate in non-NIH-funded genomic research for the public good may thus find themselves worse-off if embroiled in legal proceedings.
Consider the following: if a parent in a child custody dispute had participated in a genetic study on schizophrenia years earlier, should the genetic results be subpoenaed by the court – and weaponized by the other parent? Public policy should aim to reduce the risks for such individuals. The end of obtaining behavioral genetic evidence cannot, and should not, always justify the means.
On the morning of April 12, 1955, newsrooms across the United States inked headlines onto newsprint: the Salk Polio vaccine was "safe, effective, and potent." This was long-awaited news. Americans had limped through decades of fear, unaware of what caused polio or how to cure it, faced with the disease's terrifying, visible power to paralyze and kill, particularly children.
The announcement of the polio vaccine was celebrated with noisy jubilation: church bells rang, factory whistles sounded, people wept in the streets. Within weeks, mass inoculation began as the nation put its faith in a vaccine that would end polio.
Today, most of us are blissfully unaware of children dying of polio, making it easier to believe that we have not personally benefited from the development of vaccines. According to Dr. Steven Pinker, cognitive psychologist and author of the bestselling book Enlightenment Now, we've become blasé about the gifts of science. "The default expectation is not that disease is part of life and science is a godsend, but that health is the default, and any disease is some outrage," he says.
The Rise and Fall of Public Trust

When the polio vaccine was released in 1955, "we were nearing an all-time high point in public trust," says Matt Baum, Harvard Kennedy School professor and lead author of several reports measuring public trust and vaccine confidence. Baum explains that the U.S. was experiencing a post-war boom following the Allied triumph in WWII, a popular Roosevelt presidency, and the rapid innovation that elevated the country to an international superpower.

The 1950s witnessed the emergence of nuclear technology, a space program, and unprecedented medical breakthroughs, adds Emily Brunson, Texas State University anthropologist and co-chair of the Working Group on Readying Populations for COVID-19 Vaccine. "Antibiotics were a game changer," she states. Where people had once been sick with pneumonia for a month, they suddenly had access to pills that accelerated recovery.

During this period, science seemed to hold all the answers; people embraced the idea that we could "come to know the world with an absolute truth," Brunson explains. Doctors were portrayed as unquestioned gods, so Americans were primed to trust experts who told them the polio vaccine was safe.
The Shift in How We Consume Information

In the 1950s, the media created an informational consensus. The fundamental ideas the public consumed about the state of the world were unified. "People argued about the best solutions, but didn't fundamentally disagree on the factual baseline," says Baum. Indeed, the messaging around the polio vaccine was centralized and consistent, led by President Roosevelt's successful March of Dimes crusade. People of lower socioeconomic status with limited access to this information were less likely to have confidence in the vaccine, but most people consumed media that assured them of the vaccine's safety and mobilized them to receive it.

Today, the information we consume is no longer centralized; in fact, just the opposite. "When you take that away, it's hard for people to know what to trust and what not to trust," Baum explains. We've witnessed a rise in polarization, and in technology that makes it easier to give people what they want to hear, reinforcing the human tendencies to vilify the other side and to confirm our preexisting ideas. When information is engineered to further an agenda, each choice and risk calculation made while navigating the COVID-19 pandemic is deeply politicized.

This polarization maps onto a rise in socioeconomic inequality and economic uncertainty. These factors, associated with a sense of lost control, prime people to embrace misinformation, explains Baum, especially when the situation is difficult to comprehend. "The beauty of conspiratorial thinking is that it provides answers to all these questions," he says. Today's insidious fragmentation of news media accelerates the circulation of mis- and disinformation, reaching more people faster, regardless of veracity or motivation. In the case of vaccines, skepticism about their origin, safety, and motivation is intensified.

Alongside the rise in polarization, Pinker says, "the emotional tone of the news has gone downward since the 1940s, and journalists consider it a professional responsibility to cover the negative." Relentless focus on everything that goes wrong further erodes public trust and paints a picture of the world getting worse. "Life saved is not a news story," says Pinker, but perhaps it should be, he continues. "If people were more aware of how much better life was generally, they might be more receptive to improvements that will continue to make life better. These improvements don't happen by themselves."
The Future Depends on Vaccine Confidence

So far, the U.S. has been unable to mitigate the catastrophic effects of the pandemic through social distancing, testing, and contact tracing. President Trump has downplayed the effects and threat of the virus, censored experts and scientists, given up on containing the spread, and mobilized his base to protest masks. The Trump Administration failed to devise a national plan, so our national plan has defaulted to hoping for the "miracle" of a vaccine. And vaccines are "something of a miracle," Pinker says, describing them as "the most benevolent invention in the history of our species." In record-breaking time, three vaccines have arrived. But their impact will be weakened unless we achieve mass vaccination. As Brunson notes, "The technology isn't the fix; it's people taking the technology."

Significant challenges remain, including facilitating widespread access and supporting on-the-ground efforts to allay concerns and build trust with specific populations with historic reasons for distrust, says Brunson. Baum predicts continuing delays, as well as deaths from other causes that will be linked to the vaccine.

Still, there's every reason for hope. The new administration "has its eyes wide open to these challenges. These are the kind of problems that are amenable to policy solutions if we have the will," Baum says. He forecasts widespread vaccination by late summer and a bounce back from the economic damage, a "Good News Story" that will bolster vaccine acceptance in the future. And Pinker reminds us that science, medicine, and public health have greatly extended our lives in the last few decades, a trend that can only continue if we're willing to roll up our sleeves.
Imagine this scenario: you develop an annoying cough and a bit of a fever. When you wake up the next morning, you've lost your sense of taste and smell. That sounds familiar, so you head to a doctor's office for a Covid test, which comes back positive.
Your next step? An anti-Covid nasal spray of course, a "trickster drug" that will clear the once-dangerous and deadly virus out of the body. The drug works by tricking the coronavirus with decoy receptors that appear to be just like those on the surface of our own cells. The virus latches onto the drug's molecules "thinking" it is breaking into human cells, but instead it flushes out of your system before it can cause any serious damage.
This may sound like science fiction, but several research groups are already working on such trickster coronavirus drugs, with some candidates close to clinical trials and possibly even becoming available late this year. The teams began working on them when the pandemic arrived and continued through lockdown.
Biochemist David Baker, pictured in his lab at the University of Washington.