The head of Meta’s WhatsApp messaging service has traveled to the UK to whip up a row with the government about end-to-end encryption. Speaking to journalists in London on Thursday, Will Cathcart did everything but compare the UK’s proposed new internet law to the erosion of online privacy in countries like Iran, India, and Brazil. Out of all the regulations he has seen in the Western world, he says, the UK’s Online Safety Bill is the one he’s most alarmed about.

Cathcart says he is concerned that the bill could make it harder for WhatsApp and other messaging platforms to provide end-to-end encryption, a security measure that ensures no one other than the sender and recipient can see the content of a message.

“It’s hard to imagine we’re having this conversation about a liberal democracy that might go around people’s ability to communicate privately,” he says.

But, despite what Cathcart and others say, the bill isn’t really about encryption. It’s a sprawling Frankenstein’s monster of a bill that has endured a period of extreme turbulence in British politics, outlasting four prime ministers and five digital ministers, with each change of government adding new amendments and concessions. It is supposed to tackle a broad range of potentially harmful content on social media and to hold tech companies accountable for much of the activity on their platforms. But Cathcart’s worries stem mainly from a single sentence, which requires tech companies to use “accredited technology” to identify child abuse content being sent publicly and privately on their platforms. That technology, WhatsApp asserts, doesn’t exist.

“I haven’t seen anything close to effective,” Cathcart says.

In 2021, Apple did try to introduce a system that would scan users’ iCloud photos for child sexual abuse material (CSAM). Critics of that plan warned that governments could use the system to look for other types of content, and it was shelved in late 2022.

If the technology to scan messages for CSAM can’t be developed, the only way for companies to comply with the law would be to break their encryption, which platforms like WhatsApp and Signal have refused to do. In February, Signal threatened to leave the UK if the new law compelled it to weaken its encryption. “We would absolutely 100 percent walk rather than ever undermine the trust that people place in us to provide a truly private means of communication,” Signal president Meredith Whittaker told the BBC.

Cathcart says WhatsApp would not comply with any efforts to undermine the company’s encryption. “We’ve recently been blocked in Iran,” he says. “We’ve never seen a liberal democracy do that, and I hope it doesn’t come to that. But the reality is, our users all around the world want security.” 

The bill does not explicitly call for the weakening of encryption, but Cathcart and others who oppose it say it creates legal gray areas and could be used to undermine privacy down the line.

“It is a first step,” says Jan Jonsson, CEO of Swedish VPN company Mullvad, which counts the UK as one of its biggest markets. “And I think the general idea is to go after encryption in the long run.” 

“Nobody’s defending CSAM,” says Barbora Bukovská, senior director for law and policy at Article 19, a digital rights group. “But the bill has the chance to violate privacy and legislate wild surveillance of private communication. How can that be conducive to democracy?” 

The UK Home Office, the government department that is overseeing the bill’s development, did not supply an attributable response to a request for comment. 

Children’s charities in the UK say that it’s disingenuous to portray the debate around the bill’s CSAM provisions as a black-and-white choice between privacy and safety. The technical challenges posed by the bill are not insurmountable, they say, and forcing the world’s biggest tech companies to invest in solutions makes it more likely the problems will be solved.

“Experts have demonstrated that it’s possible to tackle child abuse material and grooming in end-to-end encrypted environments,” says Richard Collard, associate head of child safety online policy at the British children’s charity NSPCC, pointing to a July paper published by two senior technical directors at GCHQ, the UK’s cyber intelligence agency, as an example.

Companies have started selling off-the-shelf products that claim the same. In February, London-based SafeToNet launched its SafeToWatch product that, it says, can identify and block child abuse material from ever being uploaded to messengers like WhatsApp. “It sits at device level, so it’s not affected by encryption,” says the company’s chief operating officer, Tom Farrell, who compares it to the autofocus feature in a phone camera. “Autofocus doesn’t allow you to take your image until it’s in focus. This wouldn’t allow you to take it before it proved that it was safe.” 

WhatsApp’s Cathcart has called for private messaging to be excluded entirely from the Online Safety Bill. He says that his platform already reports more CSAM to the National Center for Missing and Exploited Children (NCMEC) than Apple, Google, Microsoft, Twitter, and TikTok combined. 

Supporters of the bill disagree. “There’s a problem with child abuse in end-to-end encrypted environments,” says Michael Tunks, head of policy and public affairs at the British nonprofit Internet Watch Foundation, which is licensed to search the internet for CSAM. 

WhatsApp might be doing better than some other platforms at reporting CSAM, but it doesn’t compare favorably with other Meta services that are not encrypted. Although Instagram and WhatsApp have the same number of users worldwide, according to data platform Statista, Instagram made 3 million reports versus WhatsApp’s 1.3 million, the NCMEC says.

“The bill does not seek to undermine end-to-end encryption in any way,” says Tunks, who supports the bill in its current form, believing it puts the onus on companies to tackle the internet’s child abuse problem. “The Online Safety Bill is very clear that scanning is specifically about CSAM and also terrorism,” he adds. “The government has been pretty clear they are not seeking to repurpose this for anything else.”