Meta executives have cautioned that the proposed encryption plan might hinder the detection of child abuse cases
Meta Platforms executives pressed ahead with plans to encrypt messaging services tied to its social media platforms despite internal warnings that the change could sharply curtail the company's ability to detect and report child exploitation to authorities, according to newly disclosed court filings.
Internal documents filed in a lawsuit brought by New Mexico Attorney General Raúl Torrez in state court reveal concerns among senior safety and policy officials as the company prepared to roll out default end-to-end encryption across the messaging services tied to Facebook and Instagram.
“We are on the verge of making a questionable decision as a company,” Monika Bickert wrote in an internal chat in March 2019, as Mark Zuckerberg prepared to publicly announce the encryption plan. “This is so irresponsible.”
The documents, released on Friday, include internal emails, messages and briefing materials obtained during discovery in the case against Meta. They offer a glimpse into the company's internal assessments of the risks of the encryption initiative and how senior executives viewed its likely impact at the time.
Torrez's lawsuit alleges that Meta gave online predators broad access to underage users on its platforms, enabling them to connect with victims and, in some cases, leading to real-world abuse and human trafficking. The case, which went to trial this month, is the first of its kind against the company to be heard by a jury.
The disclosures come as Meta faces mounting legal and regulatory scrutiny worldwide over the safety of young users on its platforms. Beyond the New Mexico case, a coalition of more than 40 U.S. state attorneys general has sued, alleging that the company's products harm youth mental health. Multiple school districts have also brought legal action, and Zuckerberg recently testified in a separate case in Los Angeles County Superior Court filed by attorneys for a teenager who claims to have been harmed by the company's products.
The New Mexico filing specifically accuses Meta of misrepresenting the safety implications of its plan to make end-to-end encryption the default in its Messenger service, first announced in 2019 and later extended to direct messages on Instagram.
End-to-end encryption scrambles messages so that only the sender's and recipient's devices can read them, a privacy protection widely used in messaging platforms such as iMessage, Google Messages and WhatsApp.
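The core idea can be sketched in a toy example: two devices agree on a shared secret via a Diffie-Hellman exchange, then encrypt messages so that a server relaying traffic between them sees only ciphertext. This is an illustrative sketch only, not a real protocol; the prime, generator, and XOR keystream cipher below are toy choices, whereas production messengers rely on vetted primitives (e.g. X25519 key agreement and authenticated ciphers).

```python
# Toy sketch of end-to-end encryption (NOT a real protocol).
import hashlib
import secrets

# Toy Diffie-Hellman parameters: a Mersenne prime and a small generator.
# Real systems use standardized, vetted groups or elliptic curves.
P = 2**127 - 1
G = 5

def keypair():
    """Generate a private exponent and the matching public value."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(priv, other_pub):
    """Both endpoints compute G^(a*b) mod P and hash it into a 32-byte key."""
    s = pow(other_pub, priv, P)
    return hashlib.sha256(s.to_bytes((s.bit_length() + 7) // 8, "big")).digest()

def xor_stream(key, data):
    """Toy cipher: XOR data against a SHA-256-derived keystream.
    Applying it twice with the same key recovers the plaintext."""
    out = bytearray()
    for i, b in enumerate(data):
        if i % 32 == 0:
            block = hashlib.sha256(key + (i // 32).to_bytes(8, "big")).digest()
        out.append(b ^ block[i % 32])
    return bytes(out)

# Each device keeps its private key; only public values cross the server.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()
k_sender = shared_key(a_priv, b_pub)
k_recipient = shared_key(b_priv, a_pub)
assert k_sender == k_recipient  # both ends derive the same key

ciphertext = xor_stream(k_sender, b"hello")   # all the server ever sees
plaintext = xor_stream(k_recipient, ciphertext)
print(plaintext)  # b'hello'
```

Because the server never learns either private exponent, it cannot derive the shared key, which is why a platform operator cannot scan such messages for abusive content after the fact.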
Child safety advocates, such as the National Center for Missing and Exploited Children, have cautioned that the integration of such technology into large social networks may increase risks, as it allows children to easily connect with strangers.
Internal communications from Meta reveal that certain safety officials within the company expressed comparable concerns.
Bickert alleged that the company's promotion of encryption involved “gross misstatements of our ability to conduct safety operations,” according to the documents.
“I must admit, I'm not particularly invested in assisting him with this sale,” Bickert wrote of Zuckerberg's efforts to pitch the initiative on privacy grounds. “Due to end-to-end encryption, it is impossible to detect the planning of terror attacks or instances of child exploitation,” she wrote, or to proactively report those cases to law enforcement.
A February 2019 briefing document cited in the filings estimated that Meta's reports to the National Center for Missing and Exploited Children concerning child nudity and sexual exploitation imagery would have fallen from 18.4 million to 6.4 million, a 65% reduction, had Messenger been encrypted.
An update indicated that the company would have been “unable to provide data proactively to law enforcement in 600 child exploitation cases, 1,454 sextortion cases, 152 terrorist cases [and] 9 threatened school shootings.”
Safety officials also warned that children could be groomed on the company's social networks before being exploited in private messaging channels.
“FB enables pedophiles to connect with one another and locate children through the social graph, facilitating a seamless transition to Messenger,” wrote Antigone Davis, Meta's global head of safety, in a 2019 email assessing the plan.
Davis contrasted those risks with WhatsApp, Meta's encrypted messaging service, noting that it is not directly tied to a social network.
“WA (WhatsApp) does not facilitate social connections easily, which suggests that implementing end-to-end encryption on Messenger will be significantly more challenging than anything we have experienced or observed on WA,” she wrote.
In response to inquiries from Reuters, Meta spokesperson Andy Stone indicated that the issues highlighted by Bickert and Davis prompted the company to create enhanced safety features prior to the launch of encrypted messaging on Facebook and Instagram in 2023.
“The issues highlighted in 2019 are precisely why we created a variety of new safety features aimed at detecting and preventing abuse, all intended to function within encrypted chats,” Stone stated.
Under the revised system, messages are encrypted by default, but users can still report problematic conversations to Meta, which can then review those messages and, when warranted, refer cases to law enforcement.
Meta has also added protections specific to underage users, including measures aimed at preventing adults from contacting minors they do not know.