Staff raised concerns about the company’s inadequate response to child grooming, according to internal records disclosed in a lawsuit.
Meta’s internal documents, revealed in a legal filing, suggest that around 100,000 children on Facebook and Instagram face online sexual harassment every day, including receiving explicit images from adults. The unsealed filing, from the New Mexico attorney general’s office, lays out multiple allegations against Meta, drawn from employee presentations and internal communications. One incident from 2020 involved the 12-year-old daughter of an Apple executive being solicited through IG Direct, Instagram’s messaging feature.
According to the documents, a Meta employee expressed concern that incidents like these could anger Apple enough for it to consider removing Meta’s apps from the App Store. A high-ranking Meta employee testified to the US Congress last year that his own daughter had been solicited through Instagram, and claimed that his efforts to raise the issue internally were disregarded.
The recent legal filing is part of an ongoing lawsuit brought by the New Mexico attorney general’s office on December 5. The lawsuit contends that Meta’s social networks have become hubs for child predators, and Raúl Torrez, the state’s attorney general, accuses Meta of making it easy for adults to find, message, and groom children. The company disputes the claims, saying the lawsuit “mischaracterizes our work using selective quotes and cherry-picked documents.”
In response to the filing on Wednesday, Meta released a statement, saying, “We aim to provide teenagers with safe, age-appropriate online experiences, offering over 30 tools to support both teens and their parents. Over the past decade, we have dedicated efforts and hired professionals committed to ensuring the safety and support of young people in the online space.”
The complaint also underscores Meta employees’ concerns about child safety. In an internal Meta chat from July 2020, one employee asked, “What specifically are we doing for child grooming (something I just heard about that is happening a lot on TikTok)?” According to the complaint, the response he received was: “Somewhere between zero and negligible.”
In its statement, Meta also pointed to the significant work it says it has done to prevent teenagers from encountering unwanted contact, especially from adults.
The New Mexico lawsuit follows an April investigation by The Guardian, which revealed Meta’s failure to detect or report the use of its platforms for child trafficking. The investigation also showed how Messenger, Facebook’s private messaging service, is used by traffickers to communicate when buying and selling children.
Documents included in the lawsuit reveal that Meta employees discussed the use of Messenger for “coordinating trafficking activities” and serving as a platform for “every human exploitation stage (recruitment, coordination, exploitation).”
However, according to the lawsuit, a 2017 internal email outlined executive resistance to scanning Facebook Messenger for “harmful content,” citing concerns that doing so would put the service “at a competitive disadvantage compared to other apps that might offer more privacy.”
In December, Meta faced widespread criticism for rolling out end-to-end encryption for messages on Facebook and Messenger. End-to-end encryption makes a message’s contents readable only to the sender and the intended recipient: text and images are converted into ciphertext on the sender’s device and decrypted only on the recipient’s. Child safety experts, policymakers, and law enforcement argued that encryption hinders efforts to rescue victims of child sex trafficking and prosecute predators. Privacy advocates, meanwhile, applauded the decision for protecting users from surveillance by governments and law enforcement.
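As a concrete illustration of that principle, the Python sketch below uses the open-source PyNaCl library for public-key encryption. It is a minimal sketch of the idea, not Meta’s implementation: Messenger’s rollout is based on the Signal protocol, which layers key agreement, ratcheting, and authentication on top of the same core mechanism, and the names used here are purely illustrative.

# Minimal sketch of the end-to-end principle using the PyNaCl library.
# Illustrative only; not Meta's actual implementation.
from nacl.public import PrivateKey, Box

# Each party generates a key pair; private keys never leave their device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
ciphertext = Box(alice_private, bob_private.public_key).encrypt(b"hi Bob")

# A server relaying this message sees only the ciphertext and cannot read it.
# Bob decrypts with his private key and Alice's public key.
plaintext = Box(bob_private, alice_private.public_key).decrypt(ciphertext)
assert plaintext == b"hi Bob"

The relaying server stores and forwards ciphertext it holds no key to open, which is precisely what law enforcement objects to and privacy advocates welcome.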