Instagram trains Artificial Intelligence to spot offensive captions

Instagram is to warn users when their captions on a photo or video could be considered offensive.

According to the BBC, the Facebook-owned company says it has trained an AI system to detect offensive captions. The idea is to give users “a chance to pause and reconsider their words”.

Instagram announced the feature in a blog post on Monday, saying it would be rolled out immediately in some countries.

The tool is designed to help combat online bullying, which has become a major problem for platforms such as Instagram, YouTube, and Facebook.

In July 2017, Instagram was ranked the worst online platform in a cyber-bullying study.

If a user with access to the tool types an offensive caption on Instagram, they will receive a prompt informing them that it is similar to others that have been reported for bullying.

Users will then be given the option to edit their caption before it is published.
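Instagram has not published details of its model, but the general idea described above — flagging a caption when it closely resembles captions previously reported for bullying — can be sketched in a few lines. The example below is purely illustrative: the `REPORTED_CAPTIONS` list, the `caption_warning` function, and the similarity threshold are all assumptions, and a simple string-similarity check stands in for whatever classifier Instagram actually uses.

```python
from difflib import SequenceMatcher

# Hypothetical examples of captions previously reported for bullying.
REPORTED_CAPTIONS = [
    "you are so stupid and ugly",
    "nobody likes you, loser",
]

def caption_warning(caption: str, threshold: float = 0.8) -> bool:
    """Return True if the caption closely resembles a reported one."""
    return any(
        SequenceMatcher(None, caption.lower(), reported).ratio() >= threshold
        for reported in REPORTED_CAPTIONS
    )

if caption_warning("You are so stupid and ugly"):
    # In the real feature, the user would be shown a prompt and
    # offered the chance to edit the caption before publishing.
    print("Warning: this caption looks similar to ones reported for bullying.")
```

In practice a production system would use a trained text classifier rather than raw string similarity, but the user-facing flow is the same: check the caption before posting, warn, and let the user edit.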

“In addition to limiting the reach of bullying, this warning helps educate people on what we don’t allow on Instagram and when an account may be at risk of breaking our rules,” Instagram wrote in the post.