Instagram has announced it is rolling out new tools to help lead the fight against online bullying. Woohoo!
The tools are grounded in a deep understanding of how people bully each other and how they respond to bullying on Instagram, and they are only two steps on a longer path.
Instagram has been using artificial intelligence (AI) for years to detect bullying and other types of harmful content in comments, photos and videos. In the last few days, it has started rolling out a new feature powered by AI that notifies people when their comment may be considered offensive before it is posted.
This intervention gives people a chance to reflect and undo their comment, and it prevents the recipient from receiving a notification about the harmful comment. Early tests have shown that it encourages some people to undo their comment and share something less hurtful once they have had a chance to reflect.
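The nudge flow described above can be sketched as a simple gate in front of the post action: score the comment, and if it looks offensive, prompt the author to reconsider instead of publishing. This is a minimal illustrative sketch; the scoring function, threshold, and function names are assumptions, not Instagram's actual AI or API (their real system is a machine-learned classifier, not a word list).

```python
# Hypothetical sketch of the pre-post nudge flow.
# The toy scorer, threshold, and names are illustrative assumptions,
# not Instagram's actual implementation.

def offensiveness_score(comment: str) -> float:
    """Stand-in for an AI model that scores a comment from 0.0 to 1.0."""
    flagged_terms = {"idiot", "loser", "ugly"}  # toy word list
    words = {w.strip(".,!?").lower() for w in comment.split()}
    overlap = len(words & flagged_terms)
    return min(1.0, overlap / 2)

def submit_comment(comment: str, threshold: float = 0.5) -> str:
    """Nudge the author before posting a potentially offensive comment."""
    if offensiveness_score(comment) >= threshold:
        # Nothing is posted yet; the author may edit or undo.
        return "nudge"
    return "posted"
```

The key design point mirrored here is that the check runs *before* the comment is delivered, so the recipient is never notified unless the author goes ahead.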
As well as identifying and removing bullying on the platform, Instagram is also empowering its community to stand up to this kind of behaviour. It is creating a feature that allows people to control their Instagram experience without notifying someone who may be targeting them. Instagram will soon begin testing Restrict, a new way for people to protect their account from unwanted interactions.
Once you Restrict someone, that person's comments on your posts will be visible only to them. You can choose to make a restricted person's comments visible to others by approving them. Restricted people won't be able to see when you're active on Instagram or when you've read their direct messages. The rationale behind Restrict is that young people are often reluctant to block, unfollow or report their bully because it could escalate the situation, especially if they interact with the bully in real life.
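The visibility rule above is essentially a three-way check: a comment from a restricted person is shown to its author, hidden from everyone else, and only surfaces more widely if the post owner approves it. Here is a minimal sketch of that rule; the data model and names are illustrative assumptions, not Instagram's actual API.

```python
# Hypothetical sketch of the Restrict visibility rule.
# Data model and names are illustrative assumptions, not Instagram's API.
from dataclasses import dataclass, field

@dataclass
class Account:
    name: str
    restricted: set = field(default_factory=set)  # usernames this account restricted

@dataclass
class Comment:
    author: str
    text: str
    approved: bool = False  # the post owner can approve a restricted comment

def comment_visible_to(comment: Comment, post_owner: Account, viewer: str) -> bool:
    """A restricted person's comment is visible only to its author,
    unless the post owner has explicitly approved it."""
    if comment.author not in post_owner.restricted:
        return True  # normal comments are visible to everyone
    return comment.approved or viewer == comment.author
```

Note the design choice this captures: the restricted person sees their own comment as usual, so they get no signal that anything has changed, which is what avoids the escalation that blocking can trigger.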