Instagram’s rolling out some additional protection measures to combat sextortion scams in the app, while also providing more informational notes to help teens understand the implications of intimate sharing online.
First off, Instagram’s launching a new process that will blur images in DMs that its systems detect as likely containing nudity.
Potential nudes will now be blurred by default for users under the age of 18. The process will not only protect users from exposure to such images, but will also include warnings about replying to them, or sharing their own nude images.
Which may seem like a no-brainer: if you don’t want your nudes to be seen by others, don’t share them on IG, or better yet, don’t take them at all. But for younger generations, nudes are, for better or worse, a part of how they communicate.
Yeah, I’m old, and it makes no sense to me either. But given that this is now an accepted, and even expected sharing process in some circles, it makes sense for IG to add more warnings to help protect youngsters, in particular, from exposure.
And as noted, it will also help in sextortion cases:
“This feature is designed not only to protect people from seeing unwanted nudity in their DMs, but also to protect them from scammers who may send nude images to trick people into sending their own images in return.”
In addition, Instagram says that it’s developing new technology to help identify accounts that may be engaging in sextortion scams, “based on a range of signals that could indicate sextortion behavior”. In such cases, Instagram will take action, including reporting users to the National Center for Missing & Exploited Children (NCMEC) where deemed necessary.
Instagram will also display warnings when people go to share nude images in the app.
Instagram’s also testing pop-up messages for people who may have interacted with an account that it’s removed for sextortion. It’s also expanding its partnership with Lantern, a program run by the Tech Coalition that enables technology companies to share signals about accounts and behaviors that violate their child safety policies.
The updates build on Instagram’s already extensive child protection tools, including its recently added processes to limit exposure to self-harm related content. Of course, teens can opt out of some of these measures, but Instagram can’t be responsible for every element of protection and safety in this respect.
Instagram also has its “Family Center” oversight option, so parents can keep tabs on their kids’ activity, and in combination, there’s now a range of options to help keep younger users safe in the app.
You can read more about Instagram’s new sextortion protection measures here.