Apple has quietly deleted all mention of its controversial iCloud photo-scanning tech for detecting Child Sexual Abuse Material (CSAM) from the company’s Child Safety webpage, months after the feature was delayed over a wave of privacy and security concerns. In a nutshell, the scanning tech was designed to detect images depicting child sexual abuse and exploitation before they were uploaded to the iCloud online storage service. All the scanning was supposed to happen on-device, and the technique behind it involves cryptographic hashing.

A hash is essentially a digital fingerprint of an image, obtained by passing the image through an algorithm so that it can be matched against other hashes. In simple terms, Apple would obtain hashed versions of known CSAM images from an authority like the National Center for Missing & Exploited Children and store them on iPhones. Photos uploaded to iCloud were supposed to be scanned on-device, and if any matched the stored CSAM hashes, they would be flagged for review and the authorities notified. Activists and cybersecurity experts raised many red flags around the whole implementation, and that backlash appears to have had a definitive impact.
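To make the matching idea concrete, here is a minimal Swift sketch of the general approach: compute a fingerprint of an image and compare it against a locally stored list of known hashes. This is purely illustrative and not Apple's actual implementation, which relies on a perceptual NeuralHash and a private set intersection protocol rather than the plain SHA-256 digest and hypothetical hash list shown here.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: a plain SHA-256 digest stands in for the image
// fingerprint, and the hash list below is purely hypothetical.

/// Compute a hex-encoded SHA-256 "fingerprint" of an image file's bytes.
func fingerprint(of imageData: Data) -> String {
    let digest = SHA256.hash(data: imageData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

/// Hypothetical on-device list of known hashes supplied by an authority.
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

/// Check an image against the known-hash list before a (hypothetical) upload.
func shouldFlagBeforeUpload(_ imageData: Data) -> Bool {
    knownHashes.contains(fingerprint(of: imageData))
}

// Example usage with placeholder bytes standing in for a real image file.
let sampleImage = Data([0x89, 0x50, 0x4E, 0x47])
print(shouldFlagBeforeUpload(sampleImage)) // false for this placeholder
```

The key difference from this sketch is that a cryptographic hash like SHA-256 only matches bit-identical files, whereas Apple's system was designed to match visually similar images even after resizing or recompression.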


As first spotted by MacRumors, Apple has removed all mention of the iCloud photo-scanning tech for CSAM imagery from its Child Safety webpage without explanation. However, the FAQ document for its Expanded Protections for Children is still live on its servers, and it answers some of the most important questions (read: concerns) about the whole system. So far, Apple hasn’t provided an official explanation for why all traces of its CSAM detection feature for photos destined for iCloud storage have vanished. Given the fierce criticism Apple has received from non-profits like the Electronic Frontier Foundation, one might assume that the company has finally caved to the pressure and decided to pull the plug on it. But Apple has also defended the feature on multiple occasions over the past few months, so it is entirely plausible that it is simply reworking the language or making a few necessary tweaks before bringing the feature back to the Child Safety page. As of now, the feature remains in a state of unspecified delay.

Why Apple’s Move Stirred A Storm


While the system is complex in itself and experts have broken down almost all aspects of it in detail, there’s a blatant violation at its core: the promise of a safe and private ecosystem. The whole idea of scanning photos before cloud storage and in the Messages app was seen by many as crossing a sacred line. Then there’s the issue of iCloud photo storage not being end-to-end encrypted, which leaves it vulnerable to excessive data requests from oppressive governments, third-party hacking, and even possible intrusion by Apple employees. That last scenario is not unheard of in the industry, as a recent investigation revealed that Amazon employees routinely spied on customers’ sensitive data.

Coming back to Apple’s iCloud photo-scanning plans for spotting CSAM imagery, there were also worries that Apple could be pressured into widening the scope of the content it actively looks for in images. This could have huge implications, especially in countries where freedom of expression is in peril and activists and journalists are at constant risk of surveillance. Even though Apple has assured users that it won’t budge under pressure and will not entertain such demands, the company is known to have made significant compromises in the past. An explosive report from The Information also shed light on a $275 billion deal that Tim Cook helped secure to stave off regulatory pressure in China, one of Apple’s biggest markets and the hub of its hardware assembly.


Sources: Apple, MacRumors, The Information