It is no news that technological advancements come loaded with data security and privacy concerns, and this is not the first time we are witnessing governments demand a "backdoor" that undermines encrypted messaging protocols. The latest salvo is the UK's Online Safety Bill (OSB), which has put lawmakers on a collision course with Meta-owned WhatsApp.
The bill includes provisions that could require messaging app companies to implement content moderation policies that would only be possible to comply with by compromising end-to-end encryption (E2EE). A service that fails to comply could face fines of up to 4% of its parent company's annual turnover. This significant penalty would leave companies with little choice but to either comply with the regulations or withdraw from the UK market altogether.
E2EE is regarded as the "gold standard" for secure communication. An encryption algorithm transforms each message into a seemingly random string of characters, making it practically impossible for anyone to decipher it without the encryption "key." The keys live only on the sending and receiving devices, meaning that even if hackers intercept a message in transit, they cannot decode it without a key. While E2EE does not guarantee complete security, keeping the encryption keys on the devices makes it extremely difficult for unauthorized parties to access the contents of messages.
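To make the mechanics concrete, here is a minimal sketch of the idea using the PyNaCl library (chosen purely for illustration; WhatsApp and Signal actually implement the more elaborate Signal Protocol, which adds per-message "ratcheting" keys). The essential property is that private keys are generated and kept on the devices, so a server relaying the ciphertext cannot read it.

```python
from nacl.public import PrivateKey, Box

# Each device generates its own keypair locally; private keys never leave it.
alice_sk = PrivateKey.generate()
bob_sk = PrivateKey.generate()

# Only the *public* keys are exchanged, e.g. via the messaging server.
alice_box = Box(alice_sk, bob_sk.public_key)   # Alice's end of the channel
bob_box = Box(bob_sk, alice_sk.public_key)     # Bob's end of the channel

# Alice encrypts; this ciphertext is all the server ever relays or stores.
ciphertext = alice_box.encrypt(b"meet at noon")

# Without a private key, the ciphertext is just opaque bytes.
# Bob, who holds his private key, recovers the plaintext.
assert bob_box.decrypt(ciphertext) == b"meet at noon"
```

Because the server only ever sees ciphertext, scanning messages for illegal content on the server side is impossible in this model; proposals for "accredited technology" therefore generally amount to scanning on the device before encryption, or to weakening the encryption itself.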
Secure messaging apps like WhatsApp, Signal, and Element, which safeguard users' privacy by employing this robust encryption technology, are openly criticizing the OSB, arguing that the bill will jeopardize online safety and that they will stop providing services in the UK if it passes.
"Our users all around the world want security - 98% of our users are outside the UK, they do not want us to lower the security of the product," Will Cathcart, UK WhatsApp head
said , “And the app would rather accept being blocked in the UK.”
Meredith Whittaker, the president of Signal, has also said that Signal would rather leave the UK than undermine the privacy of its users.
While detractors of the bill argue that it would grant Ofcom (the Office of Communications) the authority to mandate that private, encrypted messaging apps and other services implement "accredited technology," the government maintains that the bill "does not represent a ban on end-to-end encryption."
The legislation seeks to tackle a range of issues the government regards as dangers posed by the internet, including illegal content such as child sexual abuse and terrorism, as well as "harmful" content such as pornography and bullying. The bill aims to increase technology platforms' responsibility for the content they host, requiring them to prevent such content from appearing or to remove it swiftly when it does.
According to a recent report on recorded online child abuse offences, more than 75% of reports that named a social media or gaming site were attributed to only two companies: Snapchat and Meta. Snapchat was responsible for over 4,000 incidents, while Meta's flagship apps - Facebook, Instagram, and WhatsApp - were mentioned in more than 3,000 incidents.
For years, the government and child protection organizations have argued that encryption poses a major obstacle to combating online child abuse.
“It is important that technology companies make every effort to ensure that their platforms do not become a breeding ground for pedophiles," the Home Office said.
In response to such criticisms, platforms like Meta have pointed to other safety measures, arguing that abuse can be detected and reported using unencrypted signals such as user reports, account behaviour, and metadata, without weakening encryption for everyone.
While the government insists that "accredited technology" will make the internet a safer place, many privacy activists and IT professionals fear that it will do more harm than good.
Dr. Monica Horten from the Open Rights Group said: "With over 40 million users of encrypted chat services in the UK, this turns it into a mass-surveillance tool, with potentially damaging consequences for privacy and free-expression rights."
“Rather than using kids and terrorists as an excuse to expand bulk intercept capabilities, governments need to calmly revisit several policy areas, including family violence, political violence, and online crime. Details matter; they will vary from one country to another depending on local law, police practice, the organization of social work, the availability of firearms, and political polarisation (this list is not exhaustive),” states Prof Ross Anderson in his paper titled “Chat Control or Child Protection?”
Governments have a history of compelling tech companies to weaken or eliminate end-to-end encryption in their products, or to create "back doors" that give authorities a way into otherwise secure systems.
In 2016, the FBI obtained a court order demanding that Apple help unlock an iPhone used by one of the San Bernardino shooters by building software to bypass its own security protections. Apple refused, with CEO Tim Cook warning that such a tool could be abused.
“Maybe it’s an operating system for surveillance, maybe the ability for law enforcement to turn on the camera,” Mr. Cook said. “I don’t know where it stops.”
Apple’s feud with the FBI does not seem to rest. Back in December 2022, when Apple announced it would extend end-to-end encryption to iCloud backups, the FBI said it was "deeply concerned" about the threat posed by end-to-end and user-only-access encryption.
In 2018, Australian legislators approved a bill known as the “Assistance and Access Act,” which allows law enforcement agencies to compel technology companies to help them access encrypted communications.
This law is modeled after Britain’s 2016 Investigatory Powers Act, which gave UK authorities sweeping powers to compel communications providers to assist with surveillance.
Other countries are also exploring the possibility of implementing new encryption laws. For instance, in India, officials informed the country's Supreme Court in October 2019 that Facebook is required by Indian law to decrypt messages and provide them to law enforcement when requested.
“They can’t come into the country and say, ‘We will establish a non-decryptable system,’” India’s attorney general, K.K. Venugopal, told the court.
As it stands, the United States is a party to several international intelligence-sharing arrangements, one of the most prominent being the “Five Eyes” alliance with the United Kingdom, Canada, Australia, and New Zealand.
With the addition of India and Japan, the surveillance group in 2020 issued an international statement on end-to-end encryption and public safety, urging tech companies to ensure that law enforcement can lawfully access content:
“We call on technology companies to work with governments to take the following steps, focused on reasonable, technically feasible solutions: Embed the safety of the public in system designs, thereby enabling companies to act against illegal content and activity effectively with no reduction to safety, and facilitating the investigation and prosecution of offenses and safeguarding the vulnerable; enable law enforcement access to content in a readable and usable format where an authorization is lawfully issued, is necessary and proportionate, and is subject to strong safeguards and oversight; and engage in consultation with governments and other stakeholders to facilitate legal access in a way that is substantive and genuinely influences design decisions.”
While the tech companies are pushing back, the government institutions concerned remain optimistic that the proposed laws will help find a middle ground where national security is assured without compromising data privacy.
"It is not a choice between privacy or child safety - we can and we must have both,” the UK government statement read.
In the summer of 2021, Apple announced plans to scan photos on users' devices for known child sexual abuse material (CSAM) before they were uploaded to iCloud.
This was the kind of middle ground the government appreciated, but soon after the announcement, the company faced intense backlash from privacy and security experts. Apple initially defended the plan, then postponed it, and later abandoned it altogether.
Activists suggest that the FBI and other authorities should build up their own technical expertise instead of depending on tech vendors or third-party security experts to crack encryption and other security systems.
"Enhancing the government's technical capability is one potential solution that does not mandate backdoors," Representative Diana DeGette, a Colorado Democrat,
said during a hearing of the House of Representatives Energy and Commerce Committee's oversight subcommittee.
An FBI representative argued that the bureau is unlikely to be able to hire all the experts it needs to keep up with the new encryption services that continue to roll out.
"We live in such an advanced age of technology development, and to keep up with that, we do require the services of specialized skills that we can only get through private industry."
Privacy advocates hold that encryption backdoors cannot guarantee safety and are not foolproof. Even if the vulnerabilities are hidden or kept secret, there is a risk of their being discovered by others and misused. The 2015 paper "Keys Under Doormats," co-authored by leading security researchers including Ross Anderson, argued that mandating exceptional access for law enforcement would introduce serious new vulnerabilities and make systems far harder to secure.
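To make that risk concrete, here is a hypothetical sketch, again using PyNaCl, of a naive "exceptional access" design in which every message is additionally encrypted to an escrow key. The escrow scheme is invented for illustration and does not describe any real messaging product; the point is that every user's security then hinges on a single private key.

```python
from nacl.public import PrivateKey, SealedBox

# Normal E2EE: only the recipient's private key can decrypt their messages.
recipient_sk = PrivateKey.generate()

# Hypothetical "exceptional access": an escrow keypair whose private half
# is held by an authority or the vendor (illustrative only).
escrow_sk = PrivateKey.generate()

message = b"private conversation"

# The sender's app is required to produce a second copy for the escrow key.
copy_for_recipient = SealedBox(recipient_sk.public_key).encrypt(message)
copy_for_escrow = SealedBox(escrow_sk.public_key).encrypt(message)

# The intended recipient reads their copy as usual...
assert SealedBox(recipient_sk).decrypt(copy_for_recipient) == message

# ...but anyone who obtains escrow_sk -- lawfully, through a leak, or by
# theft -- can silently read every message ever escrowed with that key.
assert SealedBox(escrow_sk).decrypt(copy_for_escrow) == message
```

That single point of failure is essentially the "Keys Under Doormats" argument: the backdoor key becomes an immensely valuable target, and its compromise is both catastrophic and invisible to the people it exposes.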