The Young Woman Behind the Software that Catches Child Predators

There are some jobs where you get to change the world for the better. Twenty-eight-year-old Harriet Lester is in one of those jobs. Her goal is to eliminate child sexual abuse material from the Internet and stop the spread of horrific images of children being abused.

The Internet Watch Foundation (IWF), the UK’s official hotline for reporting illegal online content, which works in partnership with police and governments around the world as well as the Internet industry, was launched in 1996. Back then, at the dawn of the Internet age, tackling images of child sexual abuse online was a brand new challenge. But with every evolution of the web, protecting these victims gets tougher, as those who share images of child abuse online find new ways to evade detection.

One of only two women in her university year to study computer science, Lester started her career in video gaming, but felt she wasn’t getting much out of it. When she heard of a job as a content analyst at the IWF—finding and assessing photos and videos of child sexual abuse—she saw a chance to make a difference.

It’s one of the world’s hardest jobs, though. “Nothing before this job could ever have prepared me for seeing the kind of content you do,” she says. “Last year 34 percent of the content we assessed was of child rape or sexual torture. This is not everyday sexting. But because I can get it removed from the Internet, it makes it easier to look.”

Lester, a self-described techie “geek,” is now the technical projects officer for the IWF. And she’s rolling out new technologies in the ever-more complex fight to protect children from online sexual abuse.

The IWF has already made a huge difference in the UK. Today only 0.2 percent of the world’s known child sexual abuse content is hosted in the UK, compared to 18 percent 20 years ago. (North America has a much bigger problem—it hosts 57 percent of this content.) Commercially distributed sexual abuse material is now almost entirely linked to just seven known criminal entities worldwide. But stopping the thousands of people around the world who view and share this content is very difficult. After all, the Internet is borderless.

One of the most important new technologies Lester is implementing is image hashing, which she describes as “a massive game changer.” She tells the story of one victim, Tara*, now an adult, who was repeatedly abused and photographed as a child. Although Tara was rescued by US law enforcement, the online images of her abuse have been shared more than 70,000 times, and Tara has had to live with the knowledge that they continue to spread.

All photos by Mike Thornton

A hash is a unique digital fingerprint of an image. The IWF’s system uses Microsoft’s PhotoDNA to spot duplicates of the same image so that every copy can be removed from the Internet in one go, instead of analysts having to hunt down each one.
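The basic idea of a fingerprint can be sketched with an ordinary cryptographic hash. (PhotoDNA itself is a proprietary perceptual hash that also matches resized or re-encoded copies; the cryptographic hash below is a simplified stand-in that only matches exact byte-for-byte duplicates.)

```python
import hashlib

def image_fingerprint(path: str) -> str:
    """Return a hex digest identifying the file's exact bytes.

    A cryptographic hash like SHA-256 only matches identical copies;
    perceptual hashes such as PhotoDNA are designed to also match
    visually similar versions of the same image.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so even very large files hash in constant memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()
```

Two identical files always produce the same digest, which is what lets one analyst's assessment stand in for every future copy of the same image.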

Over the last year, Lester has developed an ever-growing hash list of 19,000 sexual abuse images so far, which—like Tara’s—are reposted online faster than analysts can find each copy and take it down. Companies like Google, Apple, Facebook, Twitter, and Microsoft will now integrate the hash list into their systems. If an offender tries to upload a known sexual abuse image onto a website, the site owner—for instance, Facebook—will be able to automatically red-light the image and stop the upload.
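The matching step on the platform side amounts to checking each upload’s fingerprint against the list before the file is accepted. Here is a minimal sketch, where `hash_list` is a hypothetical in-memory stand-in for the IWF list, populated with a dummy digest:

```python
import hashlib

# Hypothetical stand-in for the IWF hash list: a set of hex digests
# of known abuse images (dummy value shown here for illustration).
hash_list = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def should_block(upload_bytes: bytes) -> bool:
    """Red-light an upload if its fingerprint appears on the hash list."""
    digest = hashlib.sha256(upload_bytes).hexdigest()
    return digest in hash_list
```

A set lookup is constant-time on average, so checking every upload against even a large hash list adds negligible overhead.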

Image hashing can find images of abuse on the dark web, too, despite its private and unregulated networks. Lester explains that Tor and other browsers used to access the dark web are “an anonymous way for offenders to distribute material. Here we see a lot of new victims; this is where they tend to appear first.” While police work on catching pedophiles active on the dark web, Lester can now “grab those images and hash them so we block them from ever appearing on the open Internet.” Lester has also worked closely with a Google engineer to develop a “web crawler,” which searches websites for abuse images.
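The crawler described above is Google and the IWF’s own tool, but its core loop of fetching pages, following links, and collecting image URLs to hash can be sketched with the standard library. This is a toy illustration of the pattern, not the actual system; the function and class names are my own:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect hyperlink and image URLs from one page of HTML."""

    def __init__(self, base_url: str):
        super().__init__()
        self.base_url = base_url
        self.links = []   # pages a crawler would visit next
        self.images = []  # images it would download and hash

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self.links.append(urljoin(self.base_url, attrs["href"]))
        elif tag == "img" and "src" in attrs:
            self.images.append(urljoin(self.base_url, attrs["src"]))

def extract_links(html: str, base_url: str):
    """Return (page links, image URLs) found in a page's HTML."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links, parser.images
```

A real crawler would fetch each discovered image, compute its hash, and compare it against the hash list before queuing the linked pages for the next round.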

“Offenders always seem to be one step ahead of us and for once we’ve actually caught up with them,” says Lester. “Companies will have to implement hash lists into their internal systems. We’re hoping the amount of images will start dropping.”

Harriet Lester talking to a colleague.

There is a new problem, though: new domains and disguised websites. In the last two years, more top-level domains have been released: dot coms are becoming dot anythings. “2015 was the first time we saw a new generic top-level domain being used for the purpose of distributing child sexual abuse material,” says Lester. “We saw 436 websites in 2015 using generic top-level domains to share this material. The majority of these were disguised websites, an ever-increasing method for offenders of distributing material.”

If, as an ordinary web user, you clicked on a disguised website, you would see an adult porn site. But these sites are designed so that offenders who follow a specific sequence of links and clicks can access a hidden pathway into the site. “If you follow the pathway you see the most horrendous child sexual abuse material,” Lester says. “This is the main technique criminals are using to cover their tracks.”

She remains confident, though. “We daily find new techniques they’re using; we work hard to spot the patterns these people are using and try to disrupt them.”

At most tech organizations, women are still in the minority, but many of the IWF’s staff are female. Lester is one of several women working in key roles, from Internet content analysts to the charity’s CEO, Susie Hargreaves OBE, a position rarely held by women in the tech industry.

For Lester, who describes herself as “a massive feminist,” her job is “a chance to make history.”

“We’re getting there,” she says. “And we’re good.”