August 11, 2015
The limits of using DNA to create images of subjects.
Photograph: Heather Dewey-Hagborg
Artist Heather Dewey-Hagborg likes to make faces. But she doesn’t paint or sculpt them, precisely. She doesn’t even decide what they look like.
She started her project by wandering around New York picking up cigarette butts, pieces of chewing gum, and strands of human hair. Then she submitted her biologically marked trash to a laboratory for DNA extraction and analysis. The lab sent back information about the unknown litterers’ sex; ancestry; eye, hair, and skin color; and a variety of facial traits, such as the distance between the eyes and the prominence of the cheekbones. She entered these data into a computer program, which created models of the miscreants’ faces. Finally, she used a 3D printer to produce the wall-mounted objects, eerily like death masks, for her exhibit Stranger Visions.
Not only are the images arresting, but some police think the technology that formed them can actually facilitate an arrest. In January 2015 homicide detectives in Columbia, South Carolina, used crime-scene DNA to produce a sketch of a suspect in a four-year-old unsolved murder for which there were no eyewitnesses. The image depicts a young African American man with a generic face and a close-cropped Afro. The poster says his skin is dark or dark olive in color, he has no freckles, and his ancestry is 92 percent West African and 8 percent European. At least two commercial labs now market forensic DNA phenotyping kits, which have also been used by police in Toronto and Louisiana to develop descriptions of otherwise unknown suspects.
The process, called DNA phenotyping or molecular photo fitting, might be a scientific breakthrough that stands to revolutionize law enforcement. Or it might be incomplete science already gone awry. Will DNA-generated facial images lead to the arrest of innocent people? Are such potentially innocent people more likely to be “of color” or of non-European stock? Plenty could go wrong.
While certain kinds of DNA identification have been used for some time—to settle paternity disputes, for example, and famously to find los desaparecidos, children kidnapped in the 1970s by the notorious Argentine fascist regime—creating computer images of specific faces is new. Most of the researchers testing the methods for accuracy and generality readily acknowledge that the technology is still in its early days. In broad strokes, here is what photo-fitting scientists do: first, using a group of living humans, they measure or otherwise make an image of a specific trait. Some researchers have focused on eye color, others on skin or hair color, and still others on the shape of the face. To record facial structure, for example, a group headed by anthropologist Mark Shriver used a 3D digital imaging device to create computer renditions of several thousand faces. The team also recorded subjects’ age, sex, and stated ethnicity and took a DNA sample from each “face subject.”
Shriver’s group sequenced the collected DNA, looking for specific genetic variations (single nucleotide polymorphisms, or SNPs) thought to be associated with facial development, geographic ancestry, and sex. Using software so new that it is still being refined, they drew correlations between particular SNP variations and the facial shapes obtained from their 3D models—correlations good enough, they say, to reverse engineer. The image could be built from the SNPs alone, in other words—from that cigarette butt or drop of blood left at the crime scene. Shriver is now pushing precisely this use of the technology on behalf of Parabon NanoLabs, the company the Columbia police department hired.
The researchers themselves note, “The results certainly are promising but remain preliminary. Furthermore, methodologically the approach is novel”—meaning, among other things, that to gauge the accuracy of molecular photo fitting, more study and critical evaluation are needed. In a similar vein, the Dutch researcher Manfred Kayser, who has focused on DNA-based prediction of eye color using an all-Dutch sample base, acknowledges the need for data from more heterogeneous populations. Even though prediction of blue eye color works with up to 90 percent accuracy, the prediction of brown eyes is only 45–70 percent accurate. Furthermore, the markers he and his colleagues have used have predictive ability only for white European populations; the same SNPs have no predictive value in non-European samples.
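To make the prediction step concrete, here is a minimal, purely illustrative sketch in Python. The SNP rs12913832 (near the HERC2 gene) really is the strongest known single marker for blue versus brown eyes, but the probability table and decision rule below are invented for illustration; real models like Kayser’s combine several markers in a fitted statistical model, and, as noted above, their accuracy varies by eye color and population.

```python
# Illustrative sketch of SNP-based eye-color prediction.
# The marker rs12913832 is real; the probabilities below are NOT from
# any study -- they are hypothetical numbers chosen for demonstration.

# Genotype = number of copies of the blue-associated G allele (0, 1, or 2).
EYE_COLOR_PROBS = {
    0: {"brown": 0.85, "intermediate": 0.10, "blue": 0.05},
    1: {"brown": 0.55, "intermediate": 0.25, "blue": 0.20},
    2: {"brown": 0.05, "intermediate": 0.15, "blue": 0.80},
}

def predict_eye_color(g_allele_count):
    """Return the most probable eye color for a given rs12913832 genotype."""
    probs = EYE_COLOR_PROBS[g_allele_count]
    return max(probs, key=probs.get)

if __name__ == "__main__":
    for genotype in (0, 1, 2):
        print(genotype, "->", predict_eye_color(genotype))
```

Even this toy version shows why accuracy differs by phenotype: when the probabilities for a genotype are closely split, as in the heterozygous row, the single best guess is wrong much of the time, which echoes the gap between 90 percent accuracy for blue eyes and 45–70 percent for brown.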
Skin-color prediction doesn’t fare better. In one study of African Americans and African Caribbeans, light-skinned research subjects sometimes had more than 50 percent African ancestry, and the lightest-skinned subject had an estimated 85 percent African ancestry. In other words, the percentage of African ancestry does not correlate particularly well with skin color and thus is of limited value for drawing an image of a “person of interest.”
There are many criticisms of DNA photo fitting, not least that methods are still being developed and vetted. But the most profound is that it omits the entire story of development. For example, face shape changes with age, and the manner of change varies with sex and across human populations. Bony though the skull is, even its shape is not static across the life cycle. So there is considerable doubt as to whether a face can accurately be drawn from DNA sequences.
Indeed, one group of skeptical reporters gave the technique a try and came back with discouraging results. Two volunteers from the science desk of the New York Times—a thirty-one-year-old half-Korean, half–northern European woman and a sixty-five-year-old Ashkenazi Jewish man—donated DNA samples and asked Shriver to return computer-generated portraits. The Times testers distributed these to colleagues with a request to identify the individuals pictured. They warned that neither age nor weight could be estimated from DNA, and the real person might be older or younger, heavier or thinner than represented in the portrait. About a dozen respondents deemed the images too generic to allow a guess, but among the fifty or so who tried, none correctly identified their older male colleague. Several thought it might be Andrew Ross Sorkin, a well-known business columnist in his thirties, while others suggested mostly white men who worked at the science desk. About ten people correctly identified the woman, but the others guessed wrong. About half of the wrong guesses pegged women of European ancestry and half women of Asian descent.
These underwhelming results point to a major bias: the portraits evoke youth rather than age. And given that people are more adept at distinguishing faces within their own ethnic group, the potential for identifying the wrong young black man based on one of these portraits seems large. Speaking to the Times, Shriver acknowledged that he is still working on this problem.
Since even the most active DNA photo-fitting scientists appreciate the novelty and provisional quality of their results, it is disconcerting that some of them have developed the Parabon tool, which they claim “accurately predicts genetic ancestry, eye color, hair color, skin color, freckling, and face shape.” Has the profit motive overcome scientific caution? How many police departments will get sucked in by the false hope Parabon offers? In Columbia, the police brought in a few suspects based on the computer-generated sketch, but their DNA did not match that of the crime scene, and no arrests were made in the case. Worse yet, generic faces produced by computer programs using faulty assumptions about how phenotypes develop may lead to the arrests of innocents.