The Metropolitan Police commissioner accused the tech giants of making it harder to detect and stop terrorists. Dame Cressida Dick wrote in the Telegraph on Saturday that the tech giants’ focus on end-to-end encryption has made it “impossible in some cases” for the police to do their job.
On Wednesday, Home Secretary Priti Patel launched a new fund for technologies to keep children safe online, and urged tech firms to put user safety before profit. But cybersecurity experts said they were not sure it was possible to build the solutions the government wants.
Dame Cressida, writing on the 20th anniversary of the September 11 attacks, emphasized that advances in communication technology mean terrorists can now reach “anyone, anywhere and anytime” through social media and the internet. As a result, the UK must constantly develop its own digital capabilities to keep pace with terrorists who use technology to their advantage.
Dame Cressida’s message echoes that of Ms Patel, who launched the Safety Tech Challenge Fund at the G7 interior ministers’ meeting earlier this week. Open to experts from around the world, the fund aims to combat child sexual abuse online: up to £85,000 each will be awarded to five applicants to develop new technologies that can detect child sexual abuse material (CSAM) online without breaking end-to-end encryption.
End-to-end encryption is a privacy feature that makes it impossible for anyone other than the sender and recipient to read messages sent online. While tech giants like Facebook have said that using this type of technology will protect users’ privacy, many governments, including the US, UK and Australia, have repeatedly challenged the idea since 2019.
Apple plan controversy
Cybersecurity and privacy experts believe Ms Patel and Dame Cressida’s comments may be in response to Apple’s decision earlier this month to delay its plan to scan iPhones for CSAM.
First announced in August, the detection technology compares images against unique “digital fingerprints”, or hashes, of known CSAM from a database provided by the National Center for Missing and Exploited Children, before the images are uploaded to iCloud.
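The matching step can be illustrated with a minimal sketch. Note the simplifying assumptions: real systems such as Apple’s use a perceptual “NeuralHash” that matches visually similar images, whereas the cryptographic hash below matches exact bytes only, and the database contents here are invented placeholders.

```python
import hashlib

# Hypothetical database of fingerprints of known images (illustrative values
# only). A real deployment would ship hashes supplied by a clearinghouse such
# as the National Center for Missing and Exploited Children.
known_hashes = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """Return True if the image's fingerprint appears in the database."""
    fingerprint = hashlib.sha256(image_bytes).hexdigest()
    return fingerprint in known_hashes

# An exact copy of a known image matches; any other bytes do not.
print(matches_known_database(b"known-image-bytes"))  # True
print(matches_known_database(b"a-holiday-photo"))    # False
```

The key design point, and the source of the controversy, is where this check runs: Apple proposed running it on the user’s own device before upload, rather than on its servers.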
Apple’s technology has been widely criticized by privacy groups and the cybersecurity industry because it turns a person’s own device into a tool for checking whether they hold potentially criminal content, setting a dangerous precedent.
Cybersecurity consultant Alec Muffett said: “We already have end-to-end encryption on Apple’s iMessage messaging technology. It’s strange that law enforcement and the government haven’t reached out to Apple about this. Instead it’s all about attacking Facebook and WhatsApp.”
Much has been written about the wealth of data tech giants hold on the users of their services, particularly the fact that they constantly monitor user behavior and interests to serve personalized ads. Muffett argues that tech firms therefore already have the technology they need to detect pedophiles and terrorists simply by monitoring behavior, without compromising users’ privacy by inspecting every personal file on their phones.
Muffett, who has more than 30 years of experience in cybersecurity and cryptography, said: “If you have the Facebook account of a middle-aged man randomly texting a dozen young people, then you have potentially suspicious activity. It may be innocent, but it’s definitely a topic worth investigating.”
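The kind of behavioral check Muffett describes can be sketched in a few lines. Everything here is an illustrative assumption rather than any platform’s real detection logic: the `Account` structure, the age and contact thresholds, and the idea that recipient ages are known are all hypothetical, and the point is simply that the heuristic looks at metadata, never at message content.

```python
from dataclasses import dataclass

@dataclass
class Account:
    age: int
    # Ages of users this account has recently started conversations with,
    # none of whom are existing contacts (hypothetical metadata).
    new_unconnected_recipient_ages: list

def is_worth_reviewing(account: Account,
                       adult_age: int = 40,
                       minor_age: int = 16,
                       threshold: int = 12) -> bool:
    """Metadata-only heuristic: flag an older account that has recently
    messaged many unconnected minors. No message content is inspected."""
    minors_contacted = sum(
        1 for age in account.new_unconnected_recipient_ages if age < minor_age
    )
    return account.age >= adult_age and minors_contacted >= threshold

suspicious = Account(age=52, new_unconnected_recipient_ages=[13, 14, 12] * 5)
ordinary = Account(age=52, new_unconnected_recipient_ages=[34, 29])
print(is_worth_reviewing(suspicious))  # True — worth a human look, may be innocent
print(is_worth_reviewing(ordinary))    # False
```

As Muffett stresses, a flag like this is a starting point for investigation, not an accusation: the activity “may be innocent”.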
“The UK government is trying to detect CSAM by looking at the content, as if spying, rather than trying to observe the behavior.”
In addition, he says, multiple cybersecurity researchers have tested Apple’s NeuralHash algorithm and found that it can produce the same hash for two completely different images, a so-called collision, raising fears that Apple could falsely accuse users of holding criminal content.
Criticism of the new technology fund
A leading cybersecurity expert, who did not want to be named, said what the government wanted was technically not possible.
“You can change the law of the land, but you cannot change the law of science.
“There is no way to allow mass scanning of devices without undermining the protections of end-to-end encryption. If someone manages to maintain valid end-to-end encryption while still detecting images of child sexual abuse, they will earn a lot more than £85,000, so I don’t understand the economics here.”
Another cybersecurity boss agreed that the government is pressing Facebook and other platforms to do more, while arguing that weakening encryption would give them (pedophiles and terrorists) greater reach. “If you read between the lines, Ms Patel is basically saying they want to hire hackers.”
There are also privacy concerns. Dr Rachel O’Connell, an online child safety expert and founder of TrustElevate, said: “Can we trust that those in power will not abuse these powers?”
According to data protection expert Pat Walshe, Apple’s solution is illegal. He said he had asked the tech giant to explain how it could be deployed in Europe and had yet to receive a response. “The European Court of Justice (ECJ) says the mobile phone is an extension of our private sphere. The courts state that the device, and any information on it, is part of the private sphere of our lives. In other words, it requires protection under the European Convention on Human Rights (ECHR).”
Mr Walshe, who led the team handling government and law enforcement requests at mobile operator Three, also has serious concerns about the tech fund proposal, saying it raises too many privacy questions. Instead, he says, there should be better, more direct reporting channels so that both citizens and communications providers can report CSAM to tech firms or law enforcement. “And law enforcement needs a huge boost in training, manpower and funding to deal with the reports,” he said. “I want more emphasis on that, rather than breaking the technology that keeps us safe every day.”