People sliding into Instagram DMs with unsolicited nudes and sexually explicit images has been a long-standing problem for users.
It looks like Instagram is finally working on a solution to protect users from this unpleasant experience.
App researcher Alessandro Paluzzi published an early image of the new safety feature on Twitter this week.
Meta confirmed the feature to Metro.co.uk, saying that the new set of user controls to help people protect themselves from unwanted DMs will be optional.
‘This technology doesn’t allow Meta to see anyone’s private messages, nor are they shared with us or anyone else,’ a Meta spokesperson told Metro.co.uk.
This checks out with Paluzzi’s image, which shows options for people to shield themselves from nude photos.
The new feature will keep photos that may contain nudity covered unless you choose to view them.
‘Technology on your device covers photos that may contain nudity in chats. Instagram CAN’T access photos,’ tweeted Paluzzi.
‘We’re working closely with experts to ensure these new features preserve people’s privacy, while giving them control over the messages they receive,’ said the Meta spokesperson.
According to a report published earlier this year by the Center for Countering Digital Hate, a British nonprofit organization, Instagram’s tools failed to act upon 90 per cent of image-based direct messages sent to high-profile women.
Many were sent sexual images by men, and not even Instagram’s ‘hidden words’ feature could completely filter out swear words like ‘b*tch’.
In June, Instagram introduced new parental supervision tools for the accounts of teenagers in the UK and Ireland.