by Eike Blohm, MD
Parabon NanoLabs uses DNA from crime scenes to predict the appearance of suspects. A computer-generated mugshot released by the Edmonton Police Service was so generic that thousands of young Black men fit the profile.
The genetic information in our DNA is our genotype. It consists of encoded instructions for how to build and operate our bodies. The final product – our appearance – is called our phenotype.
Companies like Parabon NanoLabs feed thousands of data points into a deep learning algorithm, a computer program that receives both an individual’s photograph and their DNA as input. Over time, the software associates certain genetic sequences with aspects of a person’s appearance and becomes able to predict facial morphology from DNA – at least in theory.
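The basic idea of learning associations between genetic markers and traits can be sketched in a few lines of code. The following toy example is purely illustrative and is not Parabon NanoLabs' method: the SNP sites, alleles, and eye-color labels are all hypothetical, and the "model" is just a co-occurrence count rather than a deep neural network.

```python
# Toy illustration (NOT Parabon's actual method): learn genotype-to-phenotype
# associations from labeled examples, then predict a trait from a DNA sample.
# All marker positions, alleles, and trait labels below are hypothetical.
from collections import Counter, defaultdict

def train(examples):
    """examples: list of (snp_tuple, trait) pairs.
    Count how often each (position, allele) co-occurs with each trait."""
    counts = defaultdict(Counter)
    for snps, trait in examples:
        for pos, allele in enumerate(snps):
            counts[(pos, allele)][trait] += 1
    return counts

def predict(counts, snps):
    """Each allele 'votes' for the traits it accompanied during training."""
    votes = Counter()
    for pos, allele in enumerate(snps):
        for trait, n in counts[(pos, allele)].items():
            votes[trait] += n
    return votes.most_common(1)[0][0] if votes else None

# Hypothetical training data: three SNP sites, eye-color labels.
data = [
    (("A", "G", "T"), "brown"),
    (("A", "G", "C"), "brown"),
    (("C", "T", "T"), "blue"),
    (("C", "T", "C"), "blue"),
]
model = train(data)
print(predict(model, ("A", "G", "T")))  # prints "brown"
```

The weakness the article goes on to describe is visible even here: the prediction is only as good as the statistical association in the training data, and nothing in the DNA input captures gene expression, environment, or acquired features.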
In reality, the connection between genotype and phenotype is less predictable. Just because a person carries a gene does not mean the gene is expressed. Our cells modify their DNA, turning some genes off by chemically attaching methyl groups. Human phenotypic development is also shaped by chemical exposures during pregnancy; fetal alcohol exposure, in particular, changes the appearance of a person’s face.
Parabon NanoLabs estimates an individual’s height from DNA samples, but exposure to nicotine or cocaine during pregnancy limits growth, as does inadequate nutrition during childhood. Acquired factors such as scars, tattoos, hairstyle, and body mass profoundly change a person’s appearance, yet such characteristics are not encoded in our DNA.
Worst of all, a DNA sample from a crime scene cannot determine a person’s age. Unless a witness can estimate the suspect’s age (and the presence of a witness renders DNA-based morphology predictions largely unnecessary), companies like Parabon NanoLabs are essentially making guesses.
Guesses can be wrong, with dire consequences. A person falsely accused of a crime because of a synthetic mugshot built on questionable science may face legal costs and vigilantism, and have their reputation irreparably damaged. When the Edmonton Police Service published the very generic computer-generated image of a Black suspect, Callie Schroeder of the Electronic Privacy Information Center reacted with incredulity: “What are you going to do with this? Question every approximately 5’4” black man you see? … That is not a suggestion, absolutely do not do that.” The Edmonton Police Service removed the composite image from its website and apologized.
Parabon NanoLabs can point to several cases in which its phenotypic predictions reportedly helped solve murder and assault cases. Odds are its algorithm will come close in some composites.
Yet the overarching question remains unanswered. How many innocent people will suffer adverse consequences in exchange for the handful of successes that phenotype prediction accomplishes? How much privacy are we willing to sacrifice for a little extra security? According to Benjamin Franklin, a society willing to trade its rights for security deserves neither and will lose both.