Technology companies that offer encrypted messaging services, such as WhatsApp, could be required to introduce technology to identify child sexual abuse (CSA) material or risk the threat of large fines.
Home secretary Priti Patel yesterday published an amendment to the draft Online Safety Bill that will give powers to regulators to require tech companies to develop or roll out new technologies to detect harmful content on their platforms.
The move will impact companies such as Facebook, owner of WhatsApp, which has faced repeated attacks from government ministers over its plans to introduce end-to-end encrypted messaging services on its Facebook Messenger and Instagram services.
“Child sexual abuse is a sickening crime,” said Patel. “We must all work to ensure criminals are not allowed to run rampant online and technology companies must play their part and take responsibility for keeping our children safe.
“Privacy and security are not mutually exclusive – we need both, and we can have both, and that is what this amendment delivers.”
The amendment requires technology companies to use their “best endeavours” to identify, and to prevent people seeing, CSA material posted publicly or sent privately.
Under the amendment, telecommunications regulator Ofcom will have the power to impose fines of up to £18m, or 10% of annual turnover, on companies that fail to comply.
Client-side scanning
The draft legislation is likely to put pressure on messaging companies to incorporate technologies such as client-side scanning, which uses software placed on mobile phones or computers to inspect the content of messages before they are encrypted.
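In concept, client-side scanning compares each outgoing item against a list of fingerprints of known abuse material before the content reaches the encryption layer. The following is a minimal, hypothetical sketch of that flow: it uses an exact cryptographic hash for simplicity, whereas real proposals (such as Apple's abandoned NeuralHash design) rely on perceptual hashes that survive resizing and re-encoding; the blocklist entry and function names here are illustrative only.

```python
import hashlib

# Hypothetical blocklist of fingerprints of known illegal images.
# A plain SHA-256 digest, used here for simplicity, only matches
# exact byte-for-byte copies; deployed systems would use perceptual
# hashing to catch re-encoded or resized variants.
KNOWN_FINGERPRINTS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(content: bytes) -> str:
    """Compute a fingerprint of the outgoing content."""
    return hashlib.sha256(content).hexdigest()

def scan_before_encrypt(content: bytes) -> bool:
    """Return True if the content is cleared to be encrypted and sent."""
    return fingerprint(content) not in KNOWN_FINGERPRINTS

outgoing = b"holiday photo bytes"
if scan_before_encrypt(outgoing):
    pass  # hand off to the end-to-end encryption layer and send
else:
    pass  # block the send and/or report the match
```

The key point of the design, and of the controversy, is that the check runs on the user's own device before encryption, so the service provider never needs to decrypt messages in transit.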
According to the National Crime Agency (NCA), there are an estimated 550,000 to 850,000 people in the UK who pose a serious risk to children. In the year to 2021, the NCA said that more than 33,000 obscene publications offences were recorded by the police.
Ministers argue that end-to-end encryption makes it difficult for technology companies to see what is being posted on messaging services, although tech companies have argued that there are other ways to police child sexual abuse.
“Tech firms have a responsibility not to provide safe spaces for horrendous images of child abuse to be shared online,” said digital minister Nadine Dorries. “Nor should they blind themselves to these awful crimes happening on their sites.”
Apple attempted to introduce its own client-side scanning software in August 2021, but abandoned its attempt after 14 top computer scientists, including encryption pioneers Ron Rivest and Whit Diffie, found Apple’s plans were unworkable, were open to abuse, and threatened internet security.
Their paper, Bugs in our pockets: The risks of client-side scanning, published by Columbia University and available on arXiv, identified 15 ways in which states, malicious actors, or even targeted abusers could turn the technology around to cause harm to others or to society.
Neil Brown, a technology lawyer, wrote in a detailed analysis of the proposals on Twitter that Parliament needs to hold a serious debate over whether the proposed measures are necessary and proportionate.
“I hope Parliament has a robust and detailed debate as to whether forcing what some have called ‘bugs in your pocket’ – breaking end-to-end encryption (unsurprisingly, others argue it doesn’t) to scan your private communications – is a necessary and proportionate approach,” Brown wrote.
The Department for Digital, Culture, Media and Sport (DCMS) launched a Safety Tech Challenge Fund in September 2021, which aims to develop technologies that detect child sexual abuse material in end-to-end-encrypted services while, it claims, respecting the privacy of users.
The government announced five winning projects in November 2021, but has yet to publish an independent assessment of whether the technologies are effective both at detecting abuse material and protecting the privacy of people using end-to-end encryption.
Brown asked on Twitter: “What is the point of this very sensible idea from DCMS – running a challenge to determine viability – if we are to see legislation before the outcomes of the challenge are published and capable of scrutiny?”
‘Scope creep’
Critics say the technology could be subject to “scope creep” once installed on phones and computers, and could be used to monitor other types of message content, potentially opening up backdoor access to encrypted services.
Writing in the Telegraph, Patel said the Online Safety Bill would protect both the safety of users and their right to privacy and freedom of expression.
“Things like end-to-end encryption significantly reduce the ability for platforms to detect child sexual abuse,” she wrote. “The Online Safety Bill sets out a clear legal duty to prevent, identify and remove child sexual abuse content, irrespective of the technologies they use.”
Patel said that while the government supports the responsible use of encryption technologies, for example to protect financial transactions, “the implementation of end-to-end encryption or other technologies in a way that intentionally blinds companies to abhorrent child sexual abuse happening on their platforms will have a disastrous impact on child safety”.
The onus is on tech companies to develop or source technologies to mitigate the risk, regardless of their design choices, she wrote.
Rob Jones, director general of the NCA, said online platforms can be a key tool for child abusers, who use them to view and share abuse material, identify potential victims and to discuss their offending.
“Identifying these individuals online is crucial to us uncovering the real-world abuse of children,” he said.
Peter Wanless, chief executive of the NSPCC, said the amendment would “strengthen the protections around private messaging”, adding: “This positive step shows there doesn’t have to be a trade-off between privacy and detecting and disrupting child abuse material and grooming.”
Data protection watchdog the Information Commissioner’s Office (ICO) said in January that the debate around end-to-end encryption had become unbalanced, with too much focus on the risks and not enough on the benefits.
Stephen Bonner, the ICO’s executive director of innovation, defended end-to-end encryption services, saying: “E2EE serves an important role in safeguarding both our privacy and online safety. It strengthens children’s online safety by not allowing criminals and abusers to send them harmful content.”