The UK’s Online Safety Bill, designed to protect online users, especially children, has stirred debate with Clause 122. The provision challenges encryption, a security tool that protects data by scrambling it. It mandates platforms like WhatsApp and Signal to scan for child sexual abuse material, but their reliance on end-to-end encryption makes content access impossible, even for the companies themselves. Importantly, however, the UK government has verbally assured that companies won’t be forced into proactive scanning if a technically feasible solution isn’t available. This raises a question: is the government sidelining the much-touted client-side scanning (CSS) method, earlier proposed as a way to protect children and privacy at the same time? The answer is unclear. CSS, which scans unsent messages against a database of illegal content, presents privacy risks and the potential for misuse in surveillance and censorship. History also cautions us: past surveillance efforts, like the EU Data Retention Directive, did not live up to their promise. With no definitive solution in sight that can single out harmful content without compromising encryption, governments must revisit and reassess their regulatory demands. The UK’s evolving stance prompts countries like India, grappling with similar encryption and privacy questions, to recalibrate their strategies.
The UK Parliament recently passed the Online Safety Bill, concluding one of the most debated legislative processes in UK history.[1] The legislation, which aims to make the internet safer, particularly for children, now awaits Royal Assent to become law. One of its most controversial provisions is Clause 122, which threatens to break encryption on personal messaging services. Encryption ensures the secure exchange of information online by converting plain text into scrambled data, decipherable only by someone who has the secret key to decrypt it.[2] Clause 122, also known as the ‘spy clause’, mandates user-to-user services to proactively scan content for child sexual abuse material (CSAM).[3] Because personal messages on WhatsApp are secured with end-to-end encryption, not even the company itself can see their contents.[4] Asking services to identify CSAM shared on their platforms would therefore compromise this end-to-end encryption. Although the text of the Bill does not demand the removal of end-to-end encryption, the obligations it sets forth would de facto mean breaking it.
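To make that guarantee concrete, here is a minimal sketch of encryption in Python, using the third-party cryptography package. This is an illustrative simplification: real end-to-end encrypted messengers like WhatsApp and Signal use the far more elaborate Signal protocol, with keys negotiated between the communicating devices, not a simple shared symmetric key as shown here.

```python
# Minimal sketch of encryption: plaintext becomes unreadable ciphertext,
# recoverable only with the secret key. Requires the third-party
# 'cryptography' package (pip install cryptography). Real messengers use
# the Signal protocol; this only illustrates the basic guarantee.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # the secret key, held only by the endpoints
cipher = Fernet(key)

ciphertext = cipher.encrypt(b"meet at noon")  # what travels over the network
print(ciphertext)                  # opaque bytes; the platform sees only this

plaintext = cipher.decrypt(ciphertext)        # possible only with the key
print(plaintext)                   # b'meet at noon'
```

Without the key, the ciphertext is useless to the platform, to an eavesdropper, and to any authority that intercepts it; that is precisely the property Clause 122’s scanning obligation collides with.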
Notably, however, the UK government has revised its position on encryption, albeit only through verbal commitments, not in the codified law.[5] During the House of Lords debate last month, Lord Parkinson of Whitley Bay declared that tech companies will not be forced to proactively employ technology on private communications to detect CSAM.[6] He emphasized that the regulator, Ofcom, can only advocate the use of a technology that is both technically feasible and precisely targeted at CSAM. If no technology meeting these requirements exists, Ofcom cannot require its use.[7]
Does this mean that client-side scanning (CSS), touted as a technology that can protect children and privacy at the same time,[8] is now being dismissed by the UK Parliament? The answer remains unclear. CSS scans unsent draft messages against a database of illegal content, using unique digital fingerprints (hashes) generated by software on the user’s device. If there is a match, the software either blocks the message or reports it to the authorities.[9] However, CSS poses significant privacy and security concerns. Foremost, it undermines the fundamental assurance of encryption: that only the sender and the intended recipient can access the message’s content.[10] If hash comparisons are done on a remote server, the service provider and anyone else with access to that server could monitor and filter the content users wish to send.[11] Further, adding CSS features broadens the attack surface by creating additional ways to manipulate the database of objectionable content.[12]
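The matching step itself can be sketched in a few lines of Python. This is a deliberately simplified model with hypothetical names: deployed systems use perceptual hashes (such as PhotoDNA) so that resized or re-encoded images still match, whereas the sketch below uses exact SHA-256 fingerprints purely for clarity.

```python
# Sketch of client-side scanning: before a message leaves the device, its
# fingerprint (hash) is checked against a database of known illegal content.
# Names are hypothetical; real systems use perceptual hashing so that
# slightly altered images still match.
import hashlib

def fingerprint(content: bytes) -> str:
    """Return the content's digital fingerprint (here, a SHA-256 hash)."""
    return hashlib.sha256(content).hexdigest()

# Database of fingerprints pushed to the user's device.
BLOCKED_HASHES = {fingerprint(b"known illegal item")}  # stand-in entry

def scan_before_send(draft: bytes, database: set[str]) -> str:
    """Decide what happens to a draft before it is encrypted and sent."""
    if fingerprint(draft) in database:
        return "BLOCK_OR_REPORT"   # never sent, or flagged to authorities
    return "SEND"                  # proceeds to encryption as normal

print(scan_before_send(b"holiday photos", BLOCKED_HASHES))      # SEND
print(scan_before_send(b"known illegal item", BLOCKED_HASHES))  # BLOCK_OR_REPORT
```

Note that nothing in such a pipeline depends on what the database actually contains, a point that becomes important below.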
What’s more, CSS can be exploited for censorship and suppression of free speech by preventing legitimate content from being shared between users.
Most importantly, a CSS system that exclusively targets CSAM is infeasible. Modifications to the database or to its guiding algorithms let those in control of the system screen for any content of interest. This risks misuse for surveillance and censorship, and even false positives that tag legitimate content as harmful, which in turn can have devastating consequences for free speech.
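The concern can be restated in code: the scanner has no notion of what its database represents, so whoever controls the database controls what is screened. A hypothetical illustration, reusing the matching logic sketched above:

```python
# The same matching logic with a different database: nothing in the code
# restricts scanning to CSAM. Hypothetical example for illustration only.
import hashlib

def fingerprint(content: bytes) -> str:
    return hashlib.sha256(content).hexdigest()

# A database supplied by whoever controls the system could just as easily
# hold the fingerprint of a protest slogan as of illegal imagery.
database = {fingerprint(b"march on parliament at noon")}

draft = b"march on parliament at noon"
if fingerprint(draft) in database:
    print("blocked before sending")  # lawful speech silently suppressed
```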
Moving beyond the fallibility of CSS, there is also a looming question about the efficacy of such mass surveillance tools. Historical instances, like the EU Data Retention Directive between 2008 and 2010, serve as cautionary tales.[13] Lauded by intelligence agencies as a breakthrough tool, it aimed to harness bulk data to trace child sexual abuse offenders through IP addresses. However, evidence suggests that it barely made a dent in crime detection or deterrence. So, when governments advocate another attempt at diluting privacy by scanning all private communications to unearth more CSAM, we must weigh it against the backdrop of past ineffective endeavours and proceed with caution.[14]
The UK government’s rollback on encryption is viewed by many as a turning point.[15] This shift in narrative, even if partial, offers hope to privacy advocates globally and could guide nations grappling with similar issues, such as India. In India, Rule 4(2) of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 requires significant social media intermediaries (entities with 5 million or more registered users) that provide messaging services to enable the identification of the first originator of a message. While the intent is to curb the spread of malicious content, the rule has stirred debate about its ramifications for user privacy and the integrity of end-to-end encryption. As India’s Supreme Court weighs in on this provision,[16] the UK’s evolving perspective could offer valuable insights. Under the current technological paradigm, there is no available solution that targets malicious content or CSAM without breaking end-to-end encryption. In light of this, it is imperative for governments to re-evaluate the implications and practicalities of their regulatory demands.
—
[1] A. Martin, ‘UK passes the Online Safety Bill – and no, it doesn’t ban end-to-end encryption’, The Record (September 2023)
[2] C. Stouffer, ‘What is encryption? How it works?’, Norton (July 2023)
[3] Liberty Human Rights, ‘Joint second reading briefing on the Online Safety Bill for House of Lords: Private Messaging’ (Jan 2023)
[4] J. Porter, ‘The UK’s tortured attempt to remake the internet, explained’, The Verge (May 2023)
[5] C. Trueman, ‘UK rolls back controversial encryption rules of Online Safety Bill’, Computerworld (September 2023)
[6] Online Safety Bill, ‘Lord Parkinson of Whitley Bay at 4:15pm, Volume 832: debated on Wednesday 6 September 2023’, House of Lords, UK Parliament
[7] Online Safety Bill, ‘Lord Parkinson of Whitley Bay at 4:15pm, Volume 832: debated on Wednesday 6 September 2023’, House of Lords, UK Parliament
[8] The deployment of CSS on private messaging systems was highlighted in a research paper published by the technical directors of GCHQ and the National Cyber Security Centre. Available here: I. Levy & C. Robinson, ‘Thoughts on Child Safety on Commodity Platforms’, arXiv (July 2022)
[9] ‘Fact Sheet: client-side scanning’, Internet Society (September 2022)
[10] E. Portnoy, ‘Why adding client-side scanning breaks end-to-end encryption’, Electronic Frontier Foundation (Nov 2019)
[11] ‘Fact Sheet: client-side scanning’, Internet Society (September 2022)
[12] ‘Fact Sheet: client-side scanning’, Internet Society (September 2022)
[13] R. Anderson, ‘Chat Control or Child Protection?’, Foundation for Information Policy Research, arXiv (October 2022)
[14] R. Anderson, ‘Chat Control or Child Protection?’, Foundation for Information Policy Research, arXiv (October 2022)
[15] P. Guest, ‘Britain admits defeat in the controversial fight to break encryption’, Wired (September 2023)
[16] Diary No. 32478/2019 and Diary No. 32487/2019