Last year, Instagram added a way for users to filter certain types of “sensitive” content out of the Explore tab. Now, Instagram is expanding this setting, allowing users to opt out of this content in app recommendations.
Instagram doesn’t offer much transparency on how it defines sensitive content, or even what falls under it. When it introduced Sensitive Content Control last year, the company defined sensitive content as “posts that do not necessarily violate our rules but could potentially upset some people, such as posts that may be sexually suggestive or violent.”
Expanded content controls will soon apply to search, Reels, hashtag pages, “accounts you might follow,” and suggested posts in the feed. Instagram says the changes will be rolling out to all users in the coming weeks.
Rather than letting users opt out of specific content topics, Instagram’s controls offer only three settings: one that shows you less of this bucket of content, the standard setting, and one that shows you more of it. Instagram users under the age of 18 will not be able to opt in to that last setting.
In a Help Center article explaining the content controls in more detail, Instagram describes the category as content that “impedes our ability to foster a safe community.” According to Instagram, this includes:
“Content that may depict violence, such as people fighting. (We remove graphically violent content.)
Content that may be sexually explicit or suggestive, such as photos of people wearing see-through clothing. (We remove content containing adult nudity or sexual activity.)
Content promoting the use of certain regulated products, such as tobacco or vaping products, adult products and services, or pharmaceutical drugs. (We remove content that attempts to sell or trade most regulated goods.)
Content that may promote or describe cosmetic procedures.
Content that may attempt to sell products or services based on health claims, such as promoting a supplement to help someone lose weight.”
In images accompanying its blog post, Instagram notes that “some people don’t want to see content about things like drugs or guns.” As we noted when Instagram introduced the option, the company’s lack of transparency about how it defines sensitive content, and its decision not to offer users more granular content controls, is troubling, especially given its decision to lump sex and violence together as “sensitive.”
Instagram is a platform known for its hostility toward sex workers, sex educators, and even sexually suggestive emoji. The update is generally bad news for accounts affected by Instagram’s aggressive policing of sexual content, but those communities are already well accustomed to bending over backwards to stay in the platform’s good graces.
From where we stand, it’s not at all intuitive that a user who doesn’t want to see posts promoting weight loss scams and diet culture would also be opposed to photos of people wearing see-through clothes, but Instagram is clearly painting in broad strokes here. The result is a tool that prompts users to turn off an opaque blob of “adult” content rather than a meaningful way for users to easily avoid the things they’d rather not see while surfing Instagram’s algorithms.