Instagram Introduces Keyword Moderation Tool to Filter Inappropriate Comments
September 13, 2016
Instagram Blog
Instagram has released a tool that makes it easier for users to keep their own posts safe. The photo-sharing app has introduced a keyword moderation tool that lets users filter comments, part of its fight against cyberbullying and abuse.
Instagram’s CEO and co-founder Kevin Systrom announced the update through Instagram’s blog, explaining, “All different types of people — from diverse backgrounds, races, genders, sexual orientations, abilities and more — call Instagram home, but sometimes the comments on their posts can be unkind. To empower each individual, we need to promote a culture where everyone feels safe to be themselves without criticism or harassment.”
To use the new function, users can tap the gear icon at the top of their profile and choose to filter comments in one of two ways: hiding inappropriate comments based on a default keyword list provided by Instagram, or entering custom keywords they deem offensive or inappropriate. Any comment containing one of those words or phrases will be hidden from the post.
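Instagram has not published how its filter works, but the behavior described above can be sketched in a few lines. The snippet below is an illustrative stand-in only: the default keyword list, function names, and simple substring matching are all assumptions, not Instagram's actual implementation.

```python
# Illustrative sketch of keyword-based comment filtering, as described in
# the article. NOT Instagram's real code: the default list and the
# substring-matching rule here are assumptions for demonstration.

DEFAULT_KEYWORDS = {"spam"}  # stand-in; Instagram's default list is not public

def is_hidden(comment: str, custom_keywords: frozenset = frozenset()) -> bool:
    """Return True if the comment contains any default or custom keyword."""
    text = comment.lower()
    blocked = DEFAULT_KEYWORDS | set(custom_keywords)
    return any(keyword.lower() in text for keyword in blocked)

def visible_comments(comments, custom_keywords=frozenset()):
    """Keep only the comments that should remain visible on a post."""
    return [c for c in comments if not is_hidden(c, custom_keywords)]
```

Note that naive substring matching (used here for brevity) would hide "colonial" when "colon" is blocked; a production system would more likely match whole words or normalized tokens.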
According to TechCrunch, this feature was first made available to business accounts in July, reportedly going through a test phase by deleting snake emoji from Taylor Swift’s photos. Model Chrissy Teigen used the tool to filter out words like “trump,” “colon” and “cleanse.”
Additional features, including swipe-to-delete comments, reporting inappropriate comments and blocking accounts, were previously rolled out to the app’s 500 million users in an effort to combat online harassment. Seventy-three percent of adult internet users were found to have witnessed online harassment, according to a 2014 Pew Research Center study.
TechCrunch reported that Instagram will also be debuting a new feature that displays personalized comments from friends in a post’s comments preview section. “My commitment to you is that we will keep building features that safeguard the community and maintain what makes Instagram a positive and creative place for everyone,” Systrom wrote.
Do you think the keyword moderation tool will help prevent cyberbullying? Share your thoughts in the comments section below!
Digital & Social Articles on Business 2 Community
Author: Erica Abbott
Erica has been a part of Business 2 Community since becoming an intern in 2014. During this time, she has focused on writing articles on trending and entertainment topics, and will be learning the ins and outs of the editorial side of the site in her current role.