Instagram Downranks Sexual Content, Fixes Bug In Stories
Facebook-owned photo-sharing app Instagram has decided to demote sexually suggestive and other borderline content – video snippets, memes and pictures – shared and viewed on its platform by its one billion global users.
"We have begun reducing the spread of posts that are inappropriate, but do not go against Instagram's Community Guidelines, limiting those types of posts from being recommended on our Explore and hashtag pages," Guy Rosen, Vice President of Integrity, and Tessa Lyons, Head of News Feed Integrity at Facebook, wrote in a blog post on Wednesday.
Instagram has always maintained strict policies against nudity on its platform.
It states that nudity is only allowed if the photos show a nude sculpture or a painting, post-mastectomy scarring, or a woman breastfeeding, The Next Web (TNW) reported.
As part of the change, Instagram specified that sexually suggestive posts shared on the app would still appear in the Feed for followers; however, they may not be shown to the broader community.
In 2014, there was a major uproar when Instagram removed a topless photo posted by global pop star Rihanna. The #FreeTheNipple movement also gained traction on the app at the time, but Instagram refused to alter its policies.
Adding to the privacy scandals surrounding Facebook and its family of apps, a bug on Instagram exposed Stories from certain users to complete strangers.
On Thursday, TechCrunch reported that it "first received word of the problem from a user InternetRyan via Twitter, who was confused about seeing strangers in his Instagram Stories tray."
When asked, the Facebook-owned app confirmed that the glitch, which had affected "a small number of people," was real and had been resolved.
Of Instagram's one billion global users, more than 500 million actively use the Stories feature, and the incident once again raised major safety and privacy concerns.