Snapchat failed to adequately warn its users about the extent of rampant “sextortion schemes” targeting underage users, even as employees internally debated how to handle the crisis without causing panic, according to an unredacted version of a lawsuit released on Tuesday.
The new details emerged in a complaint first filed last month by New Mexico Attorney General Raúl Torrez.
It claims the photo-sharing app popular with children is a key platform for online sex predators who coerce minors into sending graphic images and then use those images for blackmail.
Internal data showed Snap was receiving “about 10,000 user reports of sextortion each month,” a member of the company’s trust and safety team said in a November 2022 email, according to the updated lawsuit. The employee described the situation as “extremely disturbing.”
Another employee responded that the data, while “massive,” was likely a “small fraction of this abuse” that was actually happening on the app because it’s an “embarrassing thing” for users to report, according to the complaint.
“It’s disheartening to see that Snap employees have raised many red flags that have continued to be ignored by management,” Torrez said in a statement Tuesday.
The unredacted lawsuit also included an internal Snap marketing summary sent in December 2022 that acknowledged that “sexting or sending nudes has become a common behavior” that could “lead to disproportionate consequences and serious harm” to users.
According to the lawsuit, the document called for Snap to provide information to users about the risks “without causing fear in Snapchatters.”
“We can’t tell our audience NOT to send nudes; that approach is likely to be pointless, ‘tone-deaf’ and unrealistic,” the document states. “That said, we also can’t say, ‘If you do this: (1) don’t have your face in the photo, (2) don’t include tattoos, piercings or other defining physical characteristics, etc.'”
A Snap spokesperson said the app was designed with “built-in security safeguards” and “intentional design choices to make it difficult for outsiders to detect minors on our service.”
“We continue to evolve our security mechanisms and policies, from using advanced technology to detect and block certain activities, to banning suspicious accounts from friending, to working together with law enforcement and government agencies, among many more,” the spokesperson said in a statement.
“We care deeply about our work here and it hurts us when bad actors abuse our service,” the statement added.
Snapchat — known for messages that disappear within 24 hours — is one of several social media apps that have drawn the ire of lawmakers for allegedly failing to protect children online.
As The Post has reported, Snap has broken ranks with other social media firms by endorsing the Kids Online Safety Act, a bipartisan bill that would impose a legal “duty of care” on firms to ensure their apps do not promote child sexual abuse and other harms online.
In March 2022, a Snap consultant warned company employees that the “ephemeral nature of Snaps” could lull new users into a “false sense of privacy.”
Elsewhere in the New Mexico complaint, a Snap executive emailed colleagues in 2022 expressing concern about the firm’s ability to “actually verify” the ages of users — despite its claim that it did not allow children under 13 to use Snapchat.
“[T]hat app, like many other platforms, doesn’t use an age verification system, so any kid who knows how to write a fake birthday can create an account,” the executive said.
In August 2022, a Snap employee discussed the importance of taking steps to “ensure that user reports of grooming and fraud do not continue to fall through the cracks.”
Other employees responded to the email, with one citing a case in which a certain user’s account had “75 different reports against it since October ’21, citing nudity, minors and extortion, yet the account was still active.”
At one point, a fed-up Snap employee said the app was “overwhelmed by this scam right now.”
“We’ve been crossing our fingers and wringing our hands all year long,” the employee said, according to the complaint.
Last December, the New Mexico Attorney General’s office sued Facebook and Instagram parent Meta for failing to protect children from contact by sexual predators on the apps.