Facebook pulls back the curtain on its content moderators

Richard Lawler, @Rjcc

July 26, 2018
When someone reports an offensive post on Facebook (or asks for a review of a message caught by its automatic filters), where does it go? Part of the process is, as it always has been, powered by humans, with thousands of content reviewers around the world. Last year Facebook said it would expand the team to 7,500 people, and in an update posted today explaining more about their jobs, it appears that mark has been hit.

The team’s size is meant to ensure reviewers are available in a post’s native language, although some items, like nudity, might be handled without regard to location. Of course, there’s extensive training and ongoing review to try to keep everyone consistent, though some would argue that the bar for consistency is misplaced.

Facebook didn’t reveal too much about the individuals behind the moderation curtain, specifically citing the shooting at YouTube’s HQ, even though it has had firsthand experience with leaking identities to the wrong people before. It did, however, bring up how the moderators are treated, insisting they aren’t required to hit quotas while noting that they have full health benefits and access to mental health care. While it might not make understanding Facebook’s screening criteria any easier (or tell us whether Michael Bivins is part of the rulemaking process), the post is a reminder that, at least for now, there is still a human side to the system.

Engadget RSS Feed
