Prosecutors Dropping Child Porn Charges After Software Tools Are Questioned
More than a dozen cases were dismissed after defense attorneys asked to examine, or raised doubts about, computer programs that track illegal images to internet addresses.
Using specialized software, investigators traced explicit child pornography to Todd Hartman’s internet address. A dozen police officers raided his Los Angeles-area apartment, seized his computer and arrested him for files including a video of a man ejaculating on a 7-year-old girl. But after his lawyer contended that the software tool inappropriately accessed Hartman’s private files, and asked to examine how it worked, prosecutors dismissed the case.
Near Phoenix, police with a similar detection program tracked underage porn photos, including a 4-year-old with her legs spread, to Tom Tolworthy’s home computer. He was indicted in state court on 10 counts of committing a “dangerous crime against children,” each of which carried a decade in prison if convicted. Yet when investigators checked Tolworthy’s hard drive, the images weren’t there. Even though investigators said different offensive files surfaced on another computer that he owned, the case was tossed.
At a time when at least half a million laptops, tablets, phones and other devices are viewing or sharing child pornography on the internet every month, software that tracks images to specific internet connections has become a vital tool for prosecutors. Increasingly, though, it’s backfiring.
Drawing upon thousands of pages of court filings as well as interviews with lawyers and experts, ProPublica found more than a dozen cases since 2011 that were dismissed because of challenges to the software’s findings, because the government or the software’s maker refused to share the programs with defense attorneys, or both. Tami Loehrs, a forensics expert who often testifies in child pornography cases, said she is aware of more than 60 cases in which the defense strategy has focused on the software.
Defense attorneys have long complained that the government’s secrecy claims may hamstring suspects seeking to prove that the software wrongly identified them. But the growing success of their counterattack is also raising concerns that, by questioning the software used by investigators, some who trade in child pornography can avoid punishment.
“When protecting the defendant’s right to a fair trial requires the government to disclose its confidential techniques, prosecutors face a choice: Give up the prosecution or give up the secret. Each option has a cost,” said Orin Kerr, an expert in computer crime law and former Justice Department lawyer. “If prosecutors give up the prosecution, it may very well mean that a guilty person goes free. If prosecutors give up the secret, it may hurt their ability to catch other criminals. Prosecutors have to choose which of those outcomes is less bad in each particular case.”
In several cases, like Tolworthy’s, court documents say that the software traced offensive images to an Internet Protocol address. But, for reasons that remain unclear, those images weren’t found on the defendant’s computer. In others, like Hartman’s, defense lawyers said the software discovered porn in areas of the computer it wasn’t supposed to enter, and they suggested the police conducted an overly broad search.
These problems are compounded by the insistence of both the government and the software manufacturers on protecting the secrecy of their computer code, so as not to imperil other prosecutions or make trade secrets public. Unwilling to take the risk that the sensitive programs could leak publicly, they have rejected revealing the software even under strict court secrecy.
Nevertheless, the software is facing renewed scrutiny: In another case where child pornography identified by the software wasn’t found on the suspect’s computer, a federal judge in February allowed a defense expert to examine it. And recently, the nonprofit Human Rights Watch asked the Justice Department to review, in part, whether one suite of software tools, the Child Protection System, had been independently tested.
“The sharing of child-sex-abuse images is a serious crime, and law enforcement should be investigating it. But the government needs to understand how the tools work, if they could violate the law and if they are accurate,” said Sarah St.Vincent, a Human Rights Watch researcher who examined the practice.
“These defendants are not very popular, but a dangerous precedent is a dangerous precedent that affects everyone. And if the government drops cases or some charges to avoid scrutiny of the software, that could prevent victims from getting justice consistently,” she said. “The government is effectively asserting sweeping surveillance powers but is then hiding from the courts what the software did and how it worked.”
The dismissals represent a small fraction of the hundreds of federal and state child pornography prosecutions since 2011. More often, defendants plead guilty in exchange for a reduced sentence. (Of 17 closed cases brought since 2017 by the U.S. attorney’s office in Los Angeles, all but two resulted in plea deals, ProPublica found.) Even after their charges were dropped, Tolworthy and Hartman are both facing new trials. Still, the dismissals are noteworthy because challenges to the software are spreading among the defense bar and gaining credence with judges.
Software developers and law enforcement officials say the detection software is an essential part of combating the proliferation of child pornography and exploitation on the internet.
“This is a horrendous crime, and as a society we’re obligated to protect victims this young,” said Brian Levine, a computer science professor at the University of Massachusetts at Amherst who helped develop one such tool, called Torrential Downpour. “There are a number of victims who are too young to speak, or can’t speak out of fear. This tool is available to law enforcement to rescue those children who are abused.”
In cases where previously flagged porn isn’t turning up on a suspect’s computer, investigators have suggested the files were simply erased before the arrest, or that they’re stored in encrypted areas of the hard drive that police can’t access. Defense attorneys counter that some software logs don’t show the files were ever downloaded in the first place, or that they may have been downloaded by mistake and immediately purged.
Defense lawyers are given a bevy of reasons why porn-detection software can’t be handed over for review, even under a protective order that limits disclosure to attorneys and their experts. Law enforcement authorities often say that they’re prohibited from disclosing software by their contracts with the manufacturer, which considers it proprietary technology.
Prosecutors are also reluctant to disclose a coveted law enforcement tool just to convict one defendant. A Justice Department spokeswoman referred ProPublica to a government journal article, which argued peer-to-peer detection tools “are increasingly targeted by defendants through overbroad discovery requests.”
“While the Department of Justice supports full compliance with all discovery obligations imposed by law,” wrote lawyers for the Justice Department and the FBI, “those obligations generally do not require disclosure of sensitive information regarding law enforcement techniques which, if exposed, would threaten the viability of future investigations.”
One former Justice Department prosecutor said the government has shielded software in criminal cases for fear that disclosure could expose investigators’ capabilities or classified technology to criminals.
“They don’t want to reveal that in a case because it can be the last time they use it,” said the lawyer, who requested anonymity because of the sensitive nature of the topic. “It sounds like they may, in some circumstances, be using programs that are never intended to see the light of day in the criminal justice system.”
The government’s reluctance to share technology with defense attorneys isn’t limited to child pornography cases. Prosecutors have let defendants monitored with cellphone trackers known as Stingrays go free rather than fully reveal the technology. The secrecy surrounding cell tracking was once so pervasive in Baltimore that Maryland’s highest court rebuked the practice as “detrimental.” As was first reported by Reuters in 2013, the U.S. Drug Enforcement Administration relied in investigations on information gathered through domestic wiretaps, a phone-records database and National Security Agency intercepts, while training agents to hide those sources from the public record.
“Courts and police are increasingly using software to make decisions in the criminal justice system about bail, sentencing, and probability-matching for DNA and other forensic tests,” said Jennifer Granick, a surveillance and cybersecurity lawyer with the American Civil Liberties Union’s Speech, Privacy and Technology Project who has studied the issue.
“If the defense isn’t able to examine these techniques, then we have to just take the government’s word for it — on these complicated, sensitive and non-black-and-white decisions. And that’s just too dangerous.”
The software programs used by investigators scan for child porn on peer-to-peer networks, decentralized collections of computers on the internet whose users share files directly with one another. Those networks work much like Napster, the popular file-sharing program used to download music in the early days of the commercial internet.
Although Napster may have faded, the trading of child pornography on peer-to-peer networks hasn’t. To keep up, police rely on modified versions of popular peer-to-peer programs to flag IP addresses suspected of sharing child pornography, enabling investigators to subpoena the internet provider and unearth the subscriber. They then obtain a search warrant for computers at the physical location they say is involved in sharing porn.
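Conceptually, the flagging step comes down to noticing which internet addresses offer files whose fingerprints appear on a list of known illegal material. The Python sketch below is a simplified, hypothetical illustration of that idea only; the stubbed `peers_advertising` function, the `Hit` record and the sample hash are stand-ins, and nothing here is drawn from the Child Protection System, Torrential Downpour or any other real tool.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical fingerprint of a known illegal file; a real database would
# hold many such entries maintained by law enforcement.
KNOWN_BAD_HASHES = {"3f786850e387550fdab836ed7e6dc881de23001b"}

@dataclass
class Hit:
    """One observation supporting a subpoena: which internet address
    advertised which known file, and when it was seen."""
    ip_address: str
    info_hash: str
    observed_at: datetime

def peers_advertising(info_hash: str) -> list[str]:
    """Stand-in for peer discovery on a file-sharing network. A real tool
    would ask the network which peers are offering this fingerprint; this
    sketch performs no network access and returns nothing."""
    return []

def flag_addresses() -> list[Hit]:
    """Record every IP address observed advertising a known fingerprint."""
    hits: list[Hit] = []
    for info_hash in KNOWN_BAD_HASHES:
        for ip in peers_advertising(info_hash):
            hits.append(Hit(ip, info_hash, datetime.now(timezone.utc)))
    return hits
```

Even in this idealized form, a hit ties a fingerprint to an internet connection, not to a person or a particular machine, which is why the subpoena and the search warrant still stand between a flagged address and an arrest.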
One common suite of software tools, the Child Protection System, is maintained by the Florida-based Child Rescue Coalition. Although the coalition says it’s a nonprofit, it has ties to for-profit data brokers and the data company TLO. (TransUnion, the major credit-reporting agency, has acquired TLO.) CRC has hosted some of its computer servers at TransUnion since 2016, according to a review of internet records collected by the firm Farsight Security.
A redacted user manual filed in a federal case, portions of which were un-redacted by Human Rights Watch and confirmed by ProPublica, indicates that the Child Protection System draws on unverified data gathered by these firms. It says TLO “has allowed law enforcement access to data collected on internet users from a variety of sources,” with enhanced information that includes “marketing data that has been linked to IP addresses and email accounts from corporate sources.”
“No logs are kept of any law enforcement query of corporate data,” the manual continued. It cautioned that subscriber data was unconfirmed, and that it should “be confirmed through other investigative means that are acceptable with your agency and prosecuting attorney.”
Software that relies on unconfirmed information from big data brokers, civil liberties advocates say, may not only point police to the wrong internet address owner, but it also enables them to gather a mountain of personal details about a suspect without a court order, sidestepping constitutional protections.
The software’s makers have resisted disclosing its code. In May 2013, TLO asked a federal court in El Paso, Texas, to quash a subpoena to reveal the software known as the Child Protection System in a child-porn case. The materials sought, it said, “are protected under the law enforcement privilege and trade secrets laws.” After the judge ordered the software produced, prosecutors instead agreed to a plea deal that favored the defendant; he was sentenced to the three years he had already served for “transportation of obscene material.”
CRC says on its website that its software is used in every state and more than 90 countries, and has tracked more than 54 million offenders. CRC President William Wiltse, a former Oregon police officer, has testified for the prosecution in cases in which investigators relied on the Child Protection System.
CRC did not respond to phone and email inquiries from ProPublica this month about its software. It told Human Rights Watch this year, “As a policy, we do not publicly share details of how we identify sex offenders online, as we do not want predators to learn better ways to hide their illegal activity.” A spokesman for TransUnion, which now owns TLO, said the company “supports Child Rescue Coalition in its work with law enforcement to protect children from sexual exploitation online.”
Another widely used detection tool, Torrential Downpour, was developed by the University of Massachusetts a decade ago with U.S. government funding, court records show. Levine told ProPublica in an interview that the program is accurate enough to find probable cause for a search warrant, but that it can only be effective if police and the courts do their jobs.
“The software is one part of an entire process,” Levine said, “followed by investigators and courts to produce reliable evidence and to follow a fair judicial process.”
Investigators using Torrential Downpour said they turned up damning evidence to ensnare Tolworthy, a software engineer from Mesa, Arizona. They accused him of possessing illicit files that included “Pedo Baby 03 - 2 yo Photos 56.jpg.” His IP address was “involved in making those types of files available,” Pinal County Sheriff’s Deputy Randall Snyder testified in May 2015, according to testimony obtained by ProPublica. “I selected five of them off of the laptop for investigative and charging purposes.” Tolworthy pleaded not guilty.
Asked if the files were “on his computer, or were they just observed being downloaded,” Snyder replied that references to the images were part of a torrent file, a kind of digital index used to request specific images or movies for download. But, he said, “We have not conducted a thorough enough investigation of the computers through our forensics yet to find those particular files.”
In other words, the state couldn’t say if half the files Tolworthy, 44, was arrested for possessing — and that were identified by the software — were indeed on his computer. After prosecutors assured grand jurors that the investigation was continuing, they indicted him anyway.
Yet by late 2016, Tolworthy’s defense expert began raising doubts about whether the files existed. “I was unable to locate the torrent, the info hash or the files of child pornography identified during the undercover investigation,” Loehrs said in an affidavit after conducting her own search of Tolworthy’s hard drive.
“In addition, the torrent, the info hash and the files of child pornography were not found by the State’s forensic examiner, either,” she wrote.
That “info hash,” as it’s called, is a fingerprint that identifies computer files, which investigators match against a database of known child porn. That’s how police detect illegal files that might have been renamed with mundane-sounding headings (such as “sunset.jpg”) to avoid detection. The hash is also important to the defense, Loehrs said, because a computer might mistakenly broadcast the hash of a downloaded file when, in fact, it’s the hash of a movie or video a user merely requested — sometimes by accident.
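The matching itself is simple in principle: compute a fingerprint from a file’s contents and look it up in a database of known hashes. (Strictly, a BitTorrent info hash is computed over a torrent’s metadata rather than the raw file, but the idea of comparing content-derived fingerprints against a known list is the same.) The Python sketch below illustrates that general technique only; the SHA-1 choice, the chunk size and the helper names are assumptions, not details of the Child Protection System or Torrential Downpour.

```python
import hashlib
from pathlib import Path

def file_fingerprint(path: Path) -> str:
    """SHA-1 fingerprint of a file's contents, read in 1 MB chunks.
    (SHA-1 is an assumption here; the real tools may use something else.)"""
    digest = hashlib.sha1()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_known_file(path: Path, known_hashes: set[str]) -> bool:
    """True if the file's contents match an entry in a known-hash database.
    The fingerprint depends only on the bytes in the file, so renaming it
    to something mundane like 'sunset.jpg' does not change the result."""
    return file_fingerprint(path) in known_hashes
```

The defense argument turns on where a fingerprint comes from: a hash observed on the network shows at most what a computer advertised or requested, while only a match computed from the seized drive shows what was actually stored on it.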
Arizona Superior Court Judge Mark Brain found the defense arguments persuasive. He said it appeared that Tolworthy had “a substantial need for the software” and ruled in February 2017 that Tolworthy’s lawyers could ask for it.
The defense pressed for the software program, but the University of Massachusetts balked. Its lawyer said in a court document that handing over the software would “destroy its value to the university and its faculty researcher,” citing a $440,000 annual FBI grant. “Releasing it to public view would frustrate public policy and impede law enforcement’s ability to deter peer-to-peer sharing of child pornography,” the lawyer added.
If the images identified by Torrential Downpour are missing from a suspect’s hard drive, as in Tolworthy’s case, that’s not the software’s fault, Levine told ProPublica. Suspects could delete contraband after downloading it, or they might encrypt their computers to prevent illicit materials from being found.
Prosecutors first indicated they’d drop only the charges associated with the search and leave those arising from images found on another computer during a search of Tolworthy’s house. In April 2018, though, they dismissed all charges, saying it was “in the interests of justice.” They have since re-filed charges against Tolworthy relating in part to the material on the old computer; that case is pending in state court.
Tolworthy, through his lawyer, declined to comment. Maricopa County Attorney’s Office spokeswoman Amanda Steele said there was no policy to dismiss charges rather than disclose secretive software tools. “Prosecutors regularly review cases to ensure appropriate charges are filed and just results are achieved,” she said.
The Tolworthy saga is strikingly similar to another Arizona case, which is in federal court. In late 2016 and early 2017, Torrential Downpour identified child pornography at the IP address of Anthony Gonzales, who lived with his family in Surprise, Arizona, northwest of Phoenix. As in the Tolworthy case, the files previously tagged by investigators online weren’t found on Gonzales’ computer, but police say other contraband turned up on a tablet after his house was searched. Gonzales pleaded not guilty.
In February, the judge ordered the software turned over to the defense. Loehrs, the expert for Gonzales as well as Tolworthy, “opined that all software programs have flaws, and Torrential Downpour is no exception,” U.S. District Judge David Campbell wrote.
“Defendant Gonzales has done more than simply request access to the software and argue that it is material to his defense,” the judge wrote. “He has presented evidence that calls into question the government’s version of events.”
Both sides in the pending case are in discussions about how to test the software, according to a person familiar with the matter.
“It matters to find out whether the government is abiding by the law and the Constitution,” said Barbara Hull, a Phoenix lawyer representing Gonzales. “People need to know what the government is doing on the internet, and whether their privacy and their rights are being violated.”
Even when the child porn identified by the software does show up on the suspect’s computer, some of the cases have unraveled, largely due to the government’s penchant for secrecy.
After the Child Protection System led police to illicit files on Hartman’s hard drive, such as the video “Cumming over loli_s pussy 2010 7yo and Dad Brilliant.wmv,” the former church youth counselor faced up to 50 years in prison if convicted. Hartman pleaded not guilty and his public defender, Andrea Jacobs, asked to inspect the software.
Prosecutors said its disclosure would allow child pornographers to evade detection. They also noted that the Child Protection System is available to law enforcement under a strict agreement that maintains its secrecy: “No persons shall publicly demonstrate this system or the software provided without the expressed permission of the software owner.”
To counter the evidence of child pornography turned up by the Child Protection System, Hartman’s lawyer contended that an examination of the software was critical to his defense. The detection program, Jacobs said, likely searched files in private areas of his computer that were never meant to be found on peer-to-peer networks. Hartman, his expert said, had not shared the hundreds of images in four of the six files allegedly identified by the Child Protection System and downloaded by a Newport Beach, California, police officer during the investigation. And the two remaining files had last been shared three months before the investigation started, so the software should not have caught them, she said.
Only the software itself could show whether it went too far, and the prosecution and the manufacturer refused to reveal the program. As a result, “Despite ample opportunity to do so, the government has not refuted this testimony,” U.S. District Judge Josephine Staton wrote in November 2015. She sided with Hartman’s lawyer and ordered disclosure of the software.
Prosecutors stalled. “The government has made substantial progress, but is requesting additional time because items pivotal to the requested testing are in the possession of a non-governmental entity” that also owns the intellectual property, Assistant U.S. Attorney Anne Gannon said on Jan. 8, 2016.
She promised an answer by Jan. 19, 2016. The next day, she dismissed the case in a one-sentence filing.
Jacobs, who is no longer representing Hartman, told ProPublica she couldn’t discuss the case. Hartman’s mother did not respond to an interview request, and family staying at his house in Yorba Linda, California, did not answer a knock at the door and a written message left by a ProPublica reporter in mid-March.
As with Tolworthy, the dismissal didn’t end Hartman’s legal troubles. He was later charged in California state court under child-porn and child-sex laws. That case is pending.
Both Tolworthy and Hartman are in jail awaiting trial.
---
Claire Perlman contributed to this report.
Jack Gillum is a senior reporter at ProPublica covering technology, specializing in how algorithms, big data and social media platforms affect people’s daily lives and civil rights.
This article was originally published April 3, 2019, by ProPublica (propublica.org); reprinted with permission. Copyright, 2019 Pro Publica Inc.