Instagram has added a new way to help users better manage their on-platform experience with a variable ‘Sensitive Content Control’, which provides three options for restricting the content that you’re shown in the app.
The new Sensitive Content Control options are now available in the latest version of the app.
If you head to Settings > Account > Sensitive Content Controls, you’ll now be able to choose between these options:
- Allow – You may see more photos or videos that could be upsetting or offensive
- Limit (Default) – You may see some photos or videos that could be upsetting or offensive
- Limit Even More – You may see fewer photos or videos that could be upsetting or offensive
The middle-ground ‘Limit’ option is the default setting for all users, while only those over the age of 18 will be able to select the ‘Allow’ option, which removes any restrictions on displayed content.
To be clear, Instagram notes that it already has various rules and processes in place to protect users from offensive content, with specific parameters in place both for your regular feed/Stories and for Explore.
“We don’t allow hate speech, bullying, and other content that might present a risk of harm to people. We also have rules about what kind of content we show you in places like Explore; we call these our Recommendation Guidelines. These guidelines were designed to help ensure that we don’t show you sensitive content from accounts you don’t follow. You can think of sensitive content as posts that don’t necessarily break our rules, but could potentially be upsetting to some people – such as posts that may be sexually suggestive or violent.”
So it’s about providing an extra level of protection for users who may not want to see any of this type of material, with Instagram’s systems now able to detect certain types of content automatically, and then keep it out of view for those who choose to up their sensitivity settings.
Instagram has been advancing its systems on this front over time. Back in 2019, Instagram outlined how its image recognition systems were increasingly able to identify content that came close to violating its community guidelines, but didn’t quite cross the line.
That borderline content will often see less reach, as part of Instagram’s effort to protect users from exposure to offensive content in the app. But to avoid incorrectly penalizing creators, Instagram needs its systems to detect these elements within uploaded posts with a high level of accuracy.
For this, Instagram’s content moderators have been labeling borderline content as part of their regular work, which Instagram then uses to train its AI systems. That process, over time, has better enabled the platform to limit the reach of borderline material, and it’s now advanced to the stage where Instagram can give users more control options, based on this improved system understanding.
That detection won’t always be right. As with any AI system, there will be false positives, but if you’re looking to avoid this type of material, the new control could be a simple way to limit exposure and improve your in-app experience.
And given the younger skew of Instagram’s audience, and the reach the platform now has, it’s important for Instagram to protect its users where it can. It’s also an element worth noting in your marketing approach, particularly if you’re looking to push the boundaries, or if your visuals could potentially be misidentified as offensive based on the above-noted categories.