
Meta tests tools to protect teen users from sextortion and unwanted nudes


Meta is testing new tools and features to protect young people from unwanted nudity and sextortion scams, including Nudity Protection in DMs, a feature that automatically blurs nude images.

The crackdown comes in part because Meta says it is seeing a rise in financial sextortion, a scam in which someone obtains nude photos of another person and threatens to post them online unless the victim sends money or gift cards.

“It is a really horrific crime that preys on making people feel alone and ashamed,” Antigone Davis, Meta’s global head of safety, told CNN. “It’s been well-documented that the crime is growing, and this is one of the reasons that we want to get out ahead and make sure people are aware of what we’re doing as we continually evolve our tools.”

Nudity Protection in DMs will be on by default for users under 18, and adults will be encouraged to turn it on. When users with the protection enabled try to send a photo containing nudity, they’ll see a reminder to be cautious and a note that they can unsend the photo at any time. If a user tries to forward a nude image, they’ll be prompted to reconsider and to be “responsible and respectful.” Finally, when a user receives an image containing nudity, it will be blurred automatically, and they’ll be encouraged not to feel pressured to respond in kind. People will also be directed to safety tips when sending or receiving nude images.
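Meta hasn’t published implementation details, but the behavior described above can be pictured as a simple policy layered on top of the messaging client. The sketch below is purely illustrative; the names (User, protection_active, on_receive_image, and so on) are assumptions for this example, not Meta’s actual code.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class User:
    age: int
    nudity_protection_enabled: Optional[bool] = None  # None = user hasn't chosen

def protection_active(user: User) -> bool:
    """Nudity Protection defaults to on for under-18s and is optional for adults."""
    if user.nudity_protection_enabled is not None:
        return user.nudity_protection_enabled
    return user.age < 18

def prompts_before_sending(user: User, contains_nudity: bool, is_forward: bool) -> list:
    """Warnings shown when a protected user tries to send or forward a nude image."""
    prompts = []
    if protection_active(user) and contains_nudity:
        prompts.append("Be cautious; you can unsend this photo at any time.")
        if is_forward:
            prompts.append("Please be responsible and respectful.")
        prompts.append("See safety tips")
    return prompts

def on_receive_image(user: User, contains_nudity: bool) -> dict:
    """Incoming nude images are blurred by default, with a nudge not to respond in kind."""
    if protection_active(user) and contains_nudity:
        return {"blurred": True,
                "notice": "You don't have to respond. Tap to see safety tips."}
    return {"blurred": False, "notice": None}

# Example: a 16-year-old with default settings receives a flagged image.
print(on_receive_image(User(age=16), contains_nudity=True))
```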

“This feature is designed not only to protect people from seeing unwanted nudity in their DMs but also to protect them from scammers who may send nude images to trick people into sending their own images in return,” Meta said in a blog post.

Meta can detect whether an image contains nudity because the analysis is done by machine learning running on the user’s device, which means the feature also works in end-to-end encrypted chats. Unless someone chooses to report an image in a DM, Meta says it won’t have access to it.
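A rough way to picture that privacy boundary: the classifier runs on the phone after a message is decrypted, and nothing is uploaded unless the user explicitly reports the image. The function names below are placeholders assumed for illustration, not Meta’s real client code.

```python
def classify_nudity_on_device(image_bytes: bytes) -> bool:
    """Stand-in for the bundled on-device model. The key property is that it
    makes no network calls, so classification never exposes the image to Meta."""
    return False  # placeholder: a real model would return a prediction here

def handle_decrypted_image(image_bytes: bytes) -> dict:
    """Runs on the recipient's device after end-to-end decryption.
    Only a local blur decision is produced; the image itself stays on the phone."""
    return {"blurred": classify_nudity_on_device(image_bytes)}

def submit_report(image_bytes: bytes) -> None:
    """The one path by which an image reaches Meta: an explicit user report."""
    # Placeholder: a real client would upload the reported image for review here.
    pass
```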

Meta is also expanding its efforts to stop sextortion across its apps by developing technology to proactively identify accounts that may be engaging in sextortion. If an account is flagged as a potential sextortion account, any messages it sends will go straight to the recipient’s hidden requests folder, and the “Message” button won’t appear on a teen’s profile for that account, even if the two are already connected. In addition, anyone interacting with an account that has been removed for sextortion will see a pop-up message directing them to resources. Finally, when teens report relevant issues, they’ll be directed to local child safety helplines.
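In practice, those measures amount to routing and UI decisions keyed off an account-level risk flag. Here is a minimal illustrative sketch; the field and function names are assumptions, not taken from Meta’s systems.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Account:
    username: str
    flagged_potential_sextortion: bool = False
    removed_for_sextortion: bool = False

def route_incoming_message(sender: Account) -> str:
    """Messages from flagged accounts skip the inbox and land in hidden requests."""
    return "hidden_requests" if sender.flagged_potential_sextortion else "inbox"

def show_message_button(viewer: Account, profile_is_teen: bool) -> bool:
    """Flagged accounts don't get a 'Message' button on teen profiles,
    even if they're already connected to the teen."""
    return not (profile_is_teen and viewer.flagged_potential_sextortion)

def popup_on_interaction(other: Account) -> Optional[str]:
    """Interacting with an account removed for sextortion triggers a resources pop-up."""
    if other.removed_for_sextortion:
        return "This account was removed. Here are resources on sextortion and staying safe."
    return None

# Example: a flagged account messages a teen.
suspect = Account(username="example_account", flagged_potential_sextortion=True)
print(route_incoming_message(suspect))                       # -> hidden_requests
print(show_message_button(suspect, profile_is_teen=True))    # -> False
```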

Meta also said it has begun sharing sextortion-specific signals with other companies through Lantern, the Tech Coalition’s cross-platform program for sharing child safety signals. “This industry cooperation is critical because predators don’t limit themselves to just one platform – and the same is true of sextortion scammers,” Meta said. “These criminals target victims across the different apps they use, often moving their conversations from one app to another. That’s why we’ve started to share more sextortion-specific signals to Lantern, to build on this important cooperation and try to stop sextortion scams not just on individual platforms, but across the whole internet.”

Mashable