Snapchat failed to adequately warn its users about the extent of rampant “sextortion schemes” targeting underage users – even as employees debated internally about how to address the crisis without causing panic, according to an unredacted version of a lawsuit released Tuesday.

The new details surfaced in a complaint originally filed last month by New Mexico Attorney General Raúl Torrez.

It alleges that the photo-sharing app popular with kids is a major platform for online sex predators who coerce minors into sending graphic images and then use them for blackmail.

Internal data showed that Snap was receiving “around 10,000 user reports of sextortion each month,” a member of the company’s trust and safety team said in a November 2022 email, according to the updated suit. The employee described the situation as “incredibly concerning.”

Another employee replied that the data, while “massive,” was likely a “small fraction of this abuse” that was actually taking place on the app because it is an “embarrassing issue” for users to report, according to the complaint.

“It is disheartening to see that Snap employees have raised many red flags that have continued to be ignored by executives,” Torrez said in a statement Tuesday.

The unredacted lawsuit also included an internal Snap marketing brief sent in December 2022 that acknowledged that “sexting or the sending of nudes has become a common behavior” that can “lead to disproportionate consequences and severe harms” for users.

The document called for Snap to provide information to users about the risks “without striking fear into Snapchatters,” according to the lawsuit.

“We can’t tell our audience NOT to send nudes; this approach is likely futile, ‘tone deaf’ and unrealistic,” the document said. “That said, we also can’t say, ‘If you DO do it: (1) don’t have your face in the photo, (2) don’t have tattoos, piercings, or other defining physical characteristics in view,’ etc.”

A Snap spokesperson said the app was designed with “built-in safety guardrails” and with “deliberate design choices to make it difficult for strangers to discover minors on our service.”

“We continue to evolve our safety mechanisms and policies, from leveraging advanced technology to detect and block certain activity, to prohibiting friending from suspicious accounts, to working alongside law enforcement and government agencies, among so much more,” the spokesperson said in a statement.

“We care deeply about our work here and it pains us when bad actors abuse our service,” the statement added.

Snapchat — known for its disappearing messages — is one of several social media apps that have drawn the ire of lawmakers for allegedly failing to protect kids online.

As The Post has reported, Snap has broken ranks with other social media firms and endorsed the Kids Online Safety Act, a bipartisan bill that would impose a legal “duty of care” on firms to ensure their apps don’t fuel child sex abuse and other online harms.

In March 2022, a Snap consultant warned company employees that the “ephemeral nature of Snaps” can lull young users into a “false sense of privacy.”

Elsewhere in New Mexico’s complaint, a Snap executive emailed colleagues in 2022 expressing concern about the firm’s ability to “actually verify” user ages – despite its claim that it did not allow kids under 13 to use Snapchat.

“[T]he app, like many other platforms, doesn’t use an age-verification system, so any child who knows how to type a fake birthday can create an account,” the executive said.

In August 2022, a Snap employee discussed the importance of taking steps to “ensure that user reports related to grooming and sextortion are not continuing to fall through the cracks.”

Other employees replied to the email, with one citing a case in which a particular user’s account had “75 different reports against it since Oct. ’21, mentioning nudes, minors and extortion, yet the account was still active.”

At one point, a fed-up Snap employee vented that the app was “over-run by this sextortion s— right now.”

“We’ve twiddled our thumbs and wrung our hands all f—king year,” the employee said, according to the complaint.

Last December, the New Mexico attorney general’s office sued Facebook and Instagram parent Meta for failing to protect kids from outreach by sex predators on the apps.
