Values, Victims, and Other Broken Eggs: Encryption and Child Sexual Exploitation Online

Image: a combination lock cracked in half like an egg, with its contents spilled onto the table.

In 1984, Apple Inc. produced an iconic Super Bowl commercial that kicked off the tradition of entertaining advertising that so many now look forward to during the championship game.

The commercial picks up on George Orwell’s famous novel Nineteen Eighty-Four. In a bleak and colorless future, masses of despondent citizens march mindlessly to a dark auditorium. From a cinema-sized screen, an imposing authority extols the virtues of their uniform ideology.

A cutaway to a vivid, colorful hero sprinting into the auditorium interrupts the dreary sequence. Pursuing riot police are unable to stop her from hurling a sledgehammer at the screen. In a blinding flash she shatters the image of oppression and snaps the audience out of their dystopian trance. Apple’s message was to denounce conformity and the tyranny of the (other) large corporations that controlled the PC market at the time.

A Promise of Privacy

Thirty-six years later, Apple’s marketing team has a new message for the people: “Some things shouldn’t be shared. Privacy. That’s iPhone.”[1] With this presumptively empowering message, Apple entrenches itself on one side of an evolving social, moral, and policy debate over a promise we already know carries clear benefits and horrible trade-offs.

We are all growing familiar, if not uncomfortable, with the inescapable collection of our data in an ever-connected world. Apple is capitalizing on that. This trend in data collection has nourished the idea of a “Big Brother” dimension that George Orwell could not have imagined when he coined the term back in 1949.

New terms like “surveillance capitalism” describe an economic system centered on the collection and commodification of personal data. While nearly every industry hoards and trades this data, consumers in this system – the oil wells from which this new oil is drawn – either ignore its flaws or struggle to keep them in check. Regular reports of huge data breaches have become so commonplace that many are resigned to the fact that our data is perpetually compromised, as if it were the digital equivalent of an unavoidable natural disaster.

Many also worry that privacy as we know it is dead. If so, we’re talking about abandoning a core Western value that undergirds principles like personal autonomy and freedom of speech.

While this recent shift in Apple’s marketing will surely push the discussion of data collection and protection further into the mainstream, it overlooks the negative trade-offs inherent in part of the company’s promise. Enabling privacy from invasive data collectors or overreaching governments also means enabling people with more sinister things to hide.

A Back Door to Trouble

By design, not even Apple can unlock your iPhone. This aspect of its privacy promise is enabled by encryption, a technology that essentially scrambles data so it is unreadable to anyone without the keys. Whenever the phone is locked, its contents are protected by encryption.
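For readers curious about the mechanics, the idea can be shown in a few lines of Python. The sketch below uses the open-source cryptography library’s Fernet recipe purely as an illustration – it is not Apple’s implementation – and simply demonstrates that encrypted data is meaningless to anyone who lacks the key.

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()          # the secret key; without it, the data below is unreadable
    cipher = Fernet(key)

    scrambled = cipher.encrypt(b"Some things shouldn't be shared.")
    print(scrambled)                     # unintelligible bytes to anyone who intercepts them
    print(cipher.decrypt(scrambled))     # the original message, recoverable only with the key

In these terms, a “back door” would be any mechanism that recovers the scrambled data without the owner’s key.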

In December 2015, the horrific San Bernardino terrorist attack left 14 people dead and 22 others injured. Apple refused the FBI’s subsequent request to unlock the shooter’s phone, explaining that to do so would compromise the data on every phone it had ever sold. The FBI was looking for a way to bypass the phone’s encryption – such access is otherwise known as a “back door.” A court order soon followed, and the U.S. Department of Justice filed a motion to compel Apple to comply. The case was withdrawn before Apple could argue it in court, after a third-party contractor working with the FBI was able to break into the phone.[2] Apple then patched the software vulnerability that allowed the third party to do so.

In this instance, U.S. national security was at odds with U.S. corporate interests. But so, depending on one’s point of view, was the public’s safety from terrorist attacks or from government overreach. The tension has only increased over the last five years, and the debate pits technologists, policymakers, and human rights advocates against one another.

Apple’s positive message for its users highlights an important dimension of the broader privacy discussion, but consumers may struggle with conflicting messaging as it plays out in legislatures and courts around the world. While the message is simple, the technology is not. An effective justice system needs to be able to investigate crimes, and encryption makes that difficult and sometimes impossible. Unfortunately, the primary function of encryption becomes ineffectual the moment a back door is created. It is currently infeasible to grant even lawful authorities selective access while ensuring unlawful parties cannot gain access as well. There is no real room for compromise.

The history of computing demonstrates this with near certainty. Alan Turing, the famous mathematician and forefather of modern computing, is known for his effort to crack the Enigma machine, the electromechanical cipher system Germany used to communicate in secret during World War II. When encryption can be unscrambled without the keys, it is no longer useful. The risk of creating a back door to such essential security is thus too great for Apple, its customers, and the many other developers and manufacturers that rely on encrypted technology.

Digital encryption has existed for decades. It is used to secure mundane tasks, from online banking and shopping to communicating via voice and text. It also enables citizens to fight for civil rights against oppressive regimes around the world, as was the case in Hong Kong during the 2019-2020 Anti-Extradition Law Amendment Bill Movement.[3] Pro-democracy protestors were able to organize and largely avoid surveillance by encrypting their communications with the free app Telegram.

What Lurks in the Dark

The 2015 San Bernardino case reveals the dilemma created by this technology. There is no digital equivalent of a search warrant granting lawful access to a private, protected physical space, like a home or a safe. This in turn makes investigating potential crimes difficult and provides little protection or legal recourse for victims of those crimes. In addition to enabling terrorists to communicate “in the dark,” encryption enables criminal enterprises like illegal drug dealing to thrive online.

And, as has been the case over the past two decades, it plays a part in the horrifying increase in the spread of child sexual abuse material (CSAM) online. Messaging around this crisis now drives much of the ethical and policy debate surrounding encryption technology.

The National Center for Missing & Exploited Children (NCMEC) is an international clearinghouse that collects reports on issues related to child victimization. The Center operates the CyberTipline, which allows the public and electronic service providers (ESPs[4]) to report various forms of child sexual exploitation. These reports are then made available to appropriate law enforcement agencies.

In 1998, ESPs and members of the public reported over 3,000 images of sexual abuse. By 2014 reports exceeded one million. Just five years later, over 16.9 million reports were filed, of which 16.8 million were reported by ESPs like Verizon Media, Google, and Facebook.[5] [6]

ESPs based in the U.S. are required by federal law to report illicit content when it is found. They are not required to seek out instances of CSAM on their services, though many do. And while over 1,400 companies from around the world are registered with the CyberTipline, not all report to it regularly. The already disturbing number of reports and known illicit content circulating online is considered to be a fraction of the actual total. Unfortunately, the NCMEC and investigative authorities are inundated, understaffed, and underfunded.

Researchers working with the NCMEC have analyzed the rapid growth of CSAM and point to several factors that contribute to this problem.[7] The researchers describe a proliferation of new methods for creating and trading illicit content, including video-capable smartphones, access to faster Internet connections, and easier means of editing content. Their analysis indicates that chat applications and anonymous communication software like the Tor Project [8] make up as little as 1% of the distribution methods. The researchers do not address how encryption might prevent CSAM from being identified in the first place, but their research demonstrates that encrypted technology is one of many contributors to the CSAM crisis. Policymakers and child protection advocates claim the problem will only grow worse if more ESPs adopt encryption.

Policing Encrypted Technology

At first glance, new legislation that would hold ESPs more accountable or discourage them from implementing encryption at all might seem like a logical path toward a solution to the CSAM crisis. Outgoing Attorney General William Barr has regularly warned of the dangers of the technology. In an open letter signed by an international collaboration of government agencies, he singled out Facebook, urging the company to abandon plans to encrypt its Messenger app for fear that doing so would exacerbate the crisis.[9] In 2019, Facebook reported 15.9 million instances of apparent child pornography out of the 16.8 million total instances reported that year.[10] Encryption could very well make identifying CSAM on the social media platform difficult if not impossible.

There is also bipartisan support for the EARN IT Act of 2020, which seeks to establish a National Commission on Online Child Sexual Exploitation Prevention tasked with defining “best practices” for ESPs. Should an ESP fail to adhere to these yet-to-be-defined practices, it would lose existing content liability protections under Section 230 of the Communications Decency Act. This means ESPs could be held liable for the content users share via their products or services. Freedom from this liability under Section 230 is credited as one of the key reasons for the explosive growth of the Internet since the law’s passage in 1996.

The NCMEC testified in support of the proposed legislation,[11] but critics of the act claim it is primarily an indirect attempt to create legal pathways for demanding access to encrypted systems rather than an honest effort to stem the spread of CSAM. There is no mention of encryption in the proposed bill. But critics bolster their argument by pointing out that Congress regularly reallocates about half of the $60 million already promised to organizations like the NCMEC under the PROTECT Our Children Act of 2008. Of those funds, close to $6 million was recently diverted to immigration enforcement.[12]

While encryption does seem to play a part in the CSAM crisis, it may be disproportionately targeted by policymakers. And if the true intentions of recently proposed legislation are about getting around encryption and not the protection of exploited children, then we can hardly expect the policies to efficiently mitigate the crisis.

Regardless, regulating encryption would be largely ineffective, if not impossible. Even if there were a foolproof method of providing select lawful access to encrypted systems, determined individuals looking to trade in illicit content have numerous alternatives to turn to, and many are well outside U.S. or Western jurisdiction.

There are countless web-based file storage services available. Some, like the website Megaupload, have been shut down for hosting illegal content, only to be reborn under a new name or replaced by alternatives based in countries with laxer content policies.

The encrypted chat app Telegram, originally developed in Russia and now headquartered abroad, touts over 400 million active users. The app allows users to create encrypted chat rooms and was recently at the center of the “Nth Room” criminal case in South Korea, which involved blackmail and the cybersex trafficking of at least 103 victims, including 26 minors.[13] Over the course of two years, two men coerced victims into sharing increasingly explicit material and then charged over 15,000 users for access to the content.

The previously mentioned Tor anonymity network was originally developed by the U.S. Naval Research Laboratory, and later by the U.S. Defense Advanced Research Projects Agency, before being released to the public. It grants access to the “Dark Web” – the part of the Internet that is not accessible through ordinary web browsers like Microsoft Edge, Firefox, or Chrome. Tor also prevents observers from seeing which websites a user visits. It is almost entirely decentralized, making it difficult to govern.

The prevalence of encrypted technology and general ease with which a programmer can create new encrypted apps[14] casts further doubt on the policy approach of the EARN IT Act. If policymakers focus new laws on punishing companies for failing to comply with poorly defined “best practices,” they will find themselves in a perpetual game of whack-a-mole. With manpower and funding already in short supply, such an exhausting approach seems unlikely to keep pace with the CSAM growth we’ve already seen. It might even cause greater harm as the majority of companies affected by such policy would be those already willing to collaborate on mitigating the CSAM crisis.

Hack the Problem

If technology is at the center of the crisis, it might also be at the center of the solution. Many of the same companies being criticized and scrutinized for enabling the crisis are also developing the technology that automatically identifies, removes, and reports the illicit content.

As far back as 2006, members of The Technology Coalition, a network of tech giants like Amazon, Adobe, and PayPal, collaborated and shared expertise to address the emerging CSAM problem. In 2009, Microsoft developed PhotoDNA to convert known images of CSAM to numerical values, which could then be used by other software to automatically identify illicit content. The software was eventually donated to Project VIC, an initiative managed by the NCMEC to streamline the investigation process.
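PhotoDNA itself is proprietary, but the general pattern of hash-based matching it enables can be sketched roughly as follows. This is an illustration only: it substitutes an ordinary cryptographic hash for PhotoDNA’s perceptual signature, and the KNOWN_HASHES set and matches_known_content function are hypothetical names, not part of any real tool.

    import hashlib

    # Hypothetical list of numerical values derived from previously identified images.
    KNOWN_HASHES = {"3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"}

    def matches_known_content(image_bytes: bytes) -> bool:
        # Reduce an uploaded image to a numerical value and check it against the list.
        digest = hashlib.sha256(image_bytes).hexdigest()
        return digest in KNOWN_HASHES

In practice a perceptual hash is used instead of SHA-256, so that resized or re-encoded copies of the same image still produce a match.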

Actors Ashton Kutcher and Demi Moore founded an organization now known as Thorn, which develops cutting-edge countermeasures including Safer, a tool that stops the viral spread of CSAM on third-party websites.

Facebook – often cited as one of the largest platforms complicit in the spread of CSAM – developed technology that can automatically detect identical and nearly identical photos and videos, and then made it freely available to the public and industry partners. Automating the process of identifying and removing CSAM is a promising countermeasure.
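How “nearly identical” detection can work is easiest to see with a toy example. The sketch below assumes images have already been reduced to fixed-length binary fingerprints, as open-source tools in this space do, and compares them by counting differing bits; the function names and the threshold of 10 are arbitrary illustrations, not Facebook’s actual parameters.

    # Two fingerprints (here, 64-bit integers) are treated as a near-duplicate match
    # when they differ in only a handful of bit positions.
    def hamming_distance(a: int, b: int) -> int:
        return bin(a ^ b).count("1")

    def is_near_duplicate(fingerprint_a: int, fingerprint_b: int, threshold: int = 10) -> bool:
        # Small distances indicate visually similar images, even after
        # re-compression, resizing, or minor edits.
        return hamming_distance(fingerprint_a, fingerprint_b) <= threshold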

At the same time, Facebook founder and CEO Mark Zuckerberg is adamant that we should preserve technology that empowers important rights like free speech and privacy. Zuckerberg, Apple, and others in favor of protecting encryption recognize the inherent problems and consider the risk of enabling “truly terrible things”[15] necessary to maintaining these critical values.

On Making Omelets

In a recent conversation, I heard one privacy advocate defend the protection of privacy through encryption and downplay the CSAM crisis by quoting the clichéd idiom, “You can’t make an omelet without breaking a few eggs.”

That is a costly omelet. And these are not eggs anyone should be so content to break. I believe defending fundamental values like privacy and free speech is critical, and that encryption is one of the last bastions in the fight to hold this ground. I am also deeply disturbed by how the technology can be misused. The decisions before us are not so clear-cut, heroic, or trivial as the portrayal in Apple’s 1984 Super Bowl commercial.

I do not believe encryption can, much less needs to, be abandoned, nor do I expect the CSAM crisis would go away if it were. Policymakers should avoid political posturing and proposing legislation ill-equipped to keep pace with technology. They should enforce existing laws meant to support front-line organizations like the NCMEC. No amount of additional ESP policing will resolve the gross underfunding of the cybercrime units and state and local law enforcement agencies that are tasked with physically addressing millions of CSAM reports around the world.

Rather than create policies designed to punish or gain leverage over technology companies, policymakers should incentivize and collaborate with them to address the problems head on. These companies are best suited to develop scalable and effective solutions that would reduce the proliferation of CSAM online and help alleviate existing bottlenecks in funding and investigative manpower.


REFERENCES

[1] Apple Inc. “Privacy. That’s iPhone. – Over Sharing”, YouTube, 2020. https://www.youtube.com/watch?v=-l61NE0eqkw

[2] Dale, Jack; Levine, Mike; Newcomb, Alyssa. “Justice Department Withdraws Request in Apple iPhone Encryption Case After FBI Accesses San Bernardino Shooter’s Phone”. ABC News. https://abcnews.go.com/Technology/justice-department-withdraws-request-apple-iphone-encryption-case/story?id=37986428, 28 March 2016.

[3] Vincent, Danny. “How Apps Power Hong Kong’s ‘Leaderless’ Protests.” BBC News, https://www.bbc.com/News/Technology-48802125, 29 June 2019.

[4] ESPs are persons or entities providing any electronic communication service; the term is defined in section 2510 of title 18, United States Code.

[5] “By the Numbers.” National Center for Missing & Exploited Children, https://www.missingkids.org/gethelpnow/cybertipline#bythenumbers, retrieved 15 Sept. 2020.

[6] Keller, Michael H., and Gabriel J.X. Dance. “The Internet Is Overrun With Images of Child Sexual Abuse. What Went Wrong?” The New York Times, https://www.nytimes.com/interactive/2019/09/28/us/child-sex-abuse.html, 29 Sept. 2019.

[7] Bursztein, Elie, et al. “Rethinking the Detection of Child Sexual Abuse Imagery on the Internet.” In The World Wide Web Conference (pp. 2601–2607), https://dl.acm.org/doi/abs/10.1145/3308558.3313482, May 2019.

[8] The Tor Project is a 501(c)3 US nonprofit with the mission to advance human rights and freedoms by creating and deploying free and open source anonymity and privacy technologies, supporting their unrestricted availability and use, and furthering their scientific and popular understanding. https://www.torproject.org/

[9] Patel, Priti, William P. Barr, Kevin McAleenan, Peter Dutton. “Open Letter: Facebook’s ‘Privacy First’ Proposals” The U.S. Department of Justice, https://www.justice.gov/opa/press-release/file/1207081/download, 4 Oct. 2019.

[10] “By the Numbers.” National Center for Missing & Exploited Children, https://www.missingkids.org/gethelpnow/cybertipline#bythenumbers, retrieved 15 Sept. 2020.

[11] Shehan, John. “The EARN IT Act: Holding the Tech Industry Accountable in the Fight Against Online Child Sexual Exploitation.” U.S. Senate Committee on the Judiciary, https://www.judiciary.senate.gov/imo/media/doc/Shehan%20Testimony.pdf, 11 Mar. 2020.

[12] Keller, Michael H., and Gabriel J.X. Dance. “The Internet Is Overrun With Images of Child Sexual Abuse. What Went Wrong?” The New York Times, https://www.nytimes.com/interactive/2019/09/28/us/child-sex-abuse.html, 29 Sept. 2019.

[13] “Ruling party, gov’t push for abolishing statute of limitations for child sex crime” Yonhap News Agency, https://en.yna.co.kr/view/AEN20200406002200315, 6 April 2020.

[14] Parsons, Nick. “Build an Encrypted Messaging App for Android.” The Stream Blog, https://getstream.io/blog/encrypted-messaging-app-android/, 27 Mar. 2020.

[15] Zuckerberg, Mark. “A Privacy-Focused Vision for Social Networking,” Facebook, https://www.facebook.com/notes/mark-zuckerberg/a-privacy-focused-vision-for-social-networking/10156700570096634/, 6 March 2019.