
Meta will automatically blur nudity in Instagram direct messages in latest teen safety measure

Meta has announced that it is testing a new feature on Instagram designed to help protect young people from unwanted nudity and sextortion scams. The test includes a DM feature called “Nudity Protection” that automatically blurs images detected as containing nudity.

The tech giant will also issue a warning to encourage teens to think twice before sharing intimate images, urging them to protect themselves. Meta said it hopes this will increase protection against scammers who may send nude photos to trick people into sending images of themselves in return.

Meta is also making changes designed to make it harder for potential scammers and criminals to find and interact with teens. The company said it is developing new technology to identify accounts “likely” to be involved in sextortion scams and to restrict how these suspicious accounts can interact with other users.

In another move announced Thursday, Meta said it was increasing the data it shares with Lantern, a cross-platform online child safety program, to include more “signals of targeted extortion.”

The social networking giant has a long-standing policy against sending unwanted nude photos or trying to force other users to send intimate images. However, that doesn’t stop these issues from being prevalent online and causing distress to many teenagers and young adults, sometimes with extremely tragic consequences.

We’ve summarized some of the latest changes in more detail below.

Nudity

Nudity Protection in DMs aims to protect Instagram teen users from online flashing by placing nude photos behind a secure screen. The user can then choose whether to view it.

“We will also show them a message encouraging them not to feel pressure to respond, with the option to block the sender and report the chat,” Meta said.

Nudity safety screens are turned on by default for users under 18 around the world. Adult users will see a notification encouraging them to turn the feature on.

“When nudity protection is on, people sending images containing nudity will see a message reminding them to be careful when sending sensitive photos and can unsend them if they change their mind,” it added.

Anyone trying to forward a nude photo they have received will see the same warning, encouraging them to reconsider.

The feature is powered by on-device machine learning, so Meta says it will work in end-to-end encrypted chats because image analysis is done on the user’s own device.
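The client-side flow described above can be sketched as follows. This is an illustrative sketch only: Meta has not published its implementation, so `predict_nudity_score`, the threshold value, and the message states are all assumptions standing in for a real on-device model.

```python
# Sketch of on-device nudity screening: classify the image locally,
# and deliver anything flagged behind a blurred "safety screen".
from dataclasses import dataclass

NUDITY_THRESHOLD = 0.8  # illustrative cutoff, not Meta's actual value

@dataclass
class InboundImage:
    image_bytes: bytes
    nudity_score: float  # would come from a real on-device ML model

def predict_nudity_score(image: InboundImage) -> float:
    """Stub for the on-device classifier; returns a score in [0, 1]."""
    return image.nudity_score

def render_message(image: InboundImage) -> str:
    # Classification happens locally, so the plaintext image never
    # leaves the device -- which is why this can work under E2EE.
    if predict_nudity_score(image) >= NUDITY_THRESHOLD:
        return "blurred (tap to view, block, or report)"
    return "shown"

print(render_message(InboundImage(b"...", nudity_score=0.95)))  # blurred
print(render_message(InboundImage(b"...", nudity_score=0.10)))  # shown
```

The key design point is that the classifier and the threshold check both run on the recipient's device, so no server ever needs to see the decrypted image.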

Safety warnings

In another protective measure, Instagram users who send or receive nude photos will be directed to safety tips with information about potential risks, which Meta says were developed with the guidance of experts.

“These tips include warnings that people may screenshot or forward images without your knowledge, that your relationship with that person may change in the future, and that you should review profiles carefully in case they are not who they say they are,” it read. “They also link to a range of resources, including Meta’s Safety Center, support hotlines, StopNCII.org for those 18+ and Take It Down for those under 18.”

It is also testing pop-up messages for users who may have interacted with accounts that Meta removed for sextortion, which will also direct them to relevant expert resources.

“We’re also adding new child safety helplines from around the world to our in-app reporting process. This means that when teens report relevant issues – such as nudity, threats to share private images, or sexual exploitation or solicitation – we will direct them to the child safety hotline available locally,” it added.

Techniques for spotting sextortionists

While Meta says it will remove sextortionists’ accounts once it discovers them, it first needs to identify bad actors in order to shut them down. So Meta is trying to go further: it says it’s developing technology to help identify accounts that may be participating in sextortion scams, based on a range of signals that could indicate sextortion behavior.

“While these signals are not necessarily evidence that an account is violating our rules, we are taking precautions to help prevent these accounts from finding and interacting with teen accounts,” it continued, adding that this builds on the steps it already takes to prevent other potentially suspicious accounts from finding and interacting with teens.

It’s unclear what technology Meta uses for this, or which signals might indicate a potential sextortionist (we’ve asked for more detail) – but presumably it may analyze communication patterns to try to detect bad actors.
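Since Meta has not disclosed its signals or model, the following is only a sketch of the general shape of signal-based flagging: score an account on behavioral signals and restrict it above a threshold. Every signal name and weight here is hypothetical.

```python
# Illustrative signal-based flagging: each behavioral signal carries a
# weight, and accounts whose total score crosses a threshold get
# precautionary restrictions (not removal -- per Meta's statement,
# these signals are not proof of a rule violation).
ILLUSTRATIVE_SIGNALS = {
    "mass_messaging_teens": 3.0,      # hypothetical weights
    "recently_created_account": 1.0,
    "reported_for_threats": 4.0,
}
RESTRICT_THRESHOLD = 4.0  # hypothetical

def risk_score(account_signals: set[str]) -> float:
    """Sum the weights of the signals observed on an account."""
    return sum(ILLUSTRATIVE_SIGNALS.get(s, 0.0) for s in account_signals)

def should_restrict(account_signals: set[str]) -> bool:
    """Restrict interaction with teens when the score is high enough."""
    return risk_score(account_signals) >= RESTRICT_THRESHOLD
```

A thresholded score like this matches the hedged framing in Meta's statement: flagged accounts are restricted as a precaution rather than banned outright, because the signals are probabilistic, not proof.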

Accounts flagged by Meta as potential blackmailers will face restrictions on messaging or interacting with other users.

“[A]ny message requests potential sextortion accounts try to send will go straight to the recipient’s hidden requests folder, meaning they won’t be notified of the message and never have to see it,” it reads.

Users who are already chatting with potential scam or sextortion accounts will not have their chats shut down, but will be shown a safety notice “encouraging them to report any threats to share their private images, and reminding them that they can say no to anything that makes them feel uncomfortable,” Meta said.

Teen users are already protected from receiving private messages from adults they are not connected to on Instagram (and in some cases from other teens). But Meta is taking a step further by not showing the “Message” button on teens’ profiles to potential sextortion accounts, even if they are connected.

“We are also testing hiding teens from these accounts in people’s follower, following and likes lists and making it harder to find teen accounts in search results,” it added.

Notably, the company has come under increasing scrutiny in Europe over child safety risks on Instagram, with regulators questioning its approach since the EU’s Digital Services Act (DSA) came into effect last summer.

The long, slow process toward safety

Meta has previously announced measures to combat sextortion, most recently in February when it expanded access to Take It Down.

The third-party tool lets people generate hashes of their intimate images locally on their own devices and share them with the National Center for Missing and Exploited Children – creating a repository of non-consensual image hashes that companies can search against in order to find and remove revenge porn.
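The hash-matching idea can be sketched in a few lines. Note the simplification: production systems typically use perceptual hashes (such as Meta's open-source PDQ) that survive re-encoding and resizing, whereas plain SHA-256 only matches byte-identical files; it is used here solely to keep the example self-contained.

```python
# Sketch of local hash generation and matching, as in Take It Down:
# the image is hashed on the user's device, only the hash is shared,
# and platforms compare uploads against the shared hash list.
import hashlib

def local_hash(image_bytes: bytes) -> str:
    """Computed on the user's device; the image itself never leaves it."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hash repository a platform would sync from the clearinghouse
# (illustrative stand-in for the NCMEC-held database).
known_hashes = {local_hash(b"<intimate image bytes>")}

def should_block(upload_bytes: bytes) -> bool:
    """Check an upload against the repository of reported hashes."""
    return local_hash(upload_bytes) in known_hashes

print(should_block(b"<intimate image bytes>"))  # True
print(should_block(b"<unrelated image>"))       # False
```

The privacy property the article describes follows from the one-way hash: the repository can confirm a match but cannot reconstruct the image, which is why users no longer need to upload the photos themselves.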

Meta’s previous practices have been criticized for requiring young people to upload nude photos. Lacking hard laws regulating how social networks protect children, Meta has been self-regulating for years, with mixed results.

However, with some requirements arriving in recent years, such as the UK Children’s Code coming into force in 2021 and more recently the EU’s DSA, tech giants like Meta have finally had to pay more attention to protecting minors.

For example, in July 2021, Meta defaulted young people’s Instagram accounts to private ahead of a UK compliance deadline. In November 2022, it tightened privacy settings for teenagers on Instagram and Facebook.

In January, Meta also announced stricter default messaging settings for teens on Facebook and Instagram, limiting messages from people they are not already connected to, shortly before the DSA’s full compliance deadline in February.

When it comes to protections for younger users, Meta’s slow, iterative rollout raises questions about why it took so long to adopt stronger safeguards – suggesting it has opted for the bare minimum in order to manage the impact on usage and prioritize engagement over safety. (That is exactly what Meta whistleblower Frances Haugen repeatedly accused her former employer of.)

When asked why Facebook wasn’t getting the latest protections announced for Instagram users, a Meta spokesperson told TechCrunch: “We want to respond to where we see the biggest need and relevance – and when it comes to protecting users from unwelcome nudity and educating teens about the risks of sharing sensitive images, we think that’s in Instagram DMs, so that’s where we’re focusing first.”

