Instagram will now reduce the visibility of ‘potentially harmful’ content

Facebook and Instagram reveal content ‘recommendation guidelines’

The guidelines are Facebook’s internal rulebook for recommended content.

Karissa Bell
August 31st, 2020

Facebook and Instagram are trying to peel back the curtain on what has long been one of the least understood parts of their platforms: how they recommend content from accounts users don’t already follow. Today, the company published its “recommendation guidelines” for both Facebook and Instagram.

The guidelines are essentially Facebook’s internal rulebook for determining what type of content is “eligible” to appear prominently in the app, such as in Instagram’s Explore section or in Facebook’s recommendations for groups or events. The suggestions are algorithmically generated and have been a source of speculation and scrutiny. 

Notably, the guidelines shared today don’t shed much light on how Facebook determines its recommendations. In a statement, Facebook’s Guy Rosen notes suggestions are personalized “based on content you’ve expressed interest in and actions you take on our apps,” but doesn’t offer specifics. What the guidelines do detail is the type of content Facebook blocks from recommendations throughout its platform.

Specifically, the published guidelines list five categories of content that “may not be eligible for recommendations.” This includes borderline content that doesn’t break the company’s rules but that Facebook considers objectionable, such as “pictures of people in see-through clothing;” spammy content, like clickbait; posts “associated with low-quality publishing;” and posts that have been debunked by fact checkers.
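To make the distinction concrete, here is a minimal sketch of how an eligibility gate of this kind might conceptually sit in front of a recommendation pool. It is a hypothetical illustration only: the category names, the Post structure, and the is_eligible_for_recommendations and build_recommendation_pool functions are assumptions for the example, not Facebook’s actual (unpublished) implementation.

```python
# Hypothetical illustration of a recommendation "eligibility" filter.
# None of this reflects Facebook's real code; the category names simply
# mirror the kinds of content the published guidelines say may not be
# eligible for recommendations.

from dataclasses import dataclass, field

INELIGIBLE_CATEGORIES = {
    "borderline",                 # allowed on the platform, but considered objectionable
    "spam_or_clickbait",          # spammy or engagement-bait content
    "low_quality_publishing",
    "debunked_by_fact_checkers",
    "vaccine_misinformation",
}

@dataclass
class Post:
    post_id: str
    labels: set = field(default_factory=set)  # labels a moderation pipeline might attach

def is_eligible_for_recommendations(post: Post) -> bool:
    """A post can stay on the platform but be excluded from recommendation
    surfaces (like Explore) if any of its labels fall into an ineligible category."""
    return post.labels.isdisjoint(INELIGIBLE_CATEGORIES)

def build_recommendation_pool(candidates: list[Post]) -> list[Post]:
    # The ranking/personalization step is out of scope here; this only shows
    # the eligibility gate that would precede it.
    return [p for p in candidates if is_eligible_for_recommendations(p)]

if __name__ == "__main__":
    posts = [
        Post("a1", labels={"news"}),
        Post("a2", labels={"spam_or_clickbait"}),
        Post("a3", labels={"borderline", "photo"}),
    ]
    pool = build_recommendation_pool(posts)
    print([p.post_id for p in pool])  # -> ['a1']
```

The point of the sketch is simply the two-tier model the guidelines describe: content can remain visible to a user’s followers while being kept out of algorithmic recommendation surfaces.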

Though the rules themselves aren’t new (Facebook says it’s been using the guidelines since 2016), it’s the first time the company has made these policies visible to users.

Publishing the guidelines may also help Facebook address criticism, as the social network has come under increasing scrutiny for its algorithmically generated recommendations. The suggestions have been widely criticized for leading people to conspiracy theories or extremist content they might not otherwise seek out. People who follow anti-vaccine pages on Instagram, for example, may also see recommendations for QAnon accounts and conspiracy theories about COVID-19. (Facebook’s guidelines confirm that vaccine misinformation and QAnon content are both considered ineligible for recommendations.)

At the same time, users have long accused Facebook of censorship and “shadow bans,” the idea that the company quietly hides some content for real or perceived infractions. By opening up these guidelines, Facebook at least makes it clearer why some posts never make it into Instagram’s Explore section, for example.

Engadget is a web magazine with obsessive daily coverage of everything new in gadgets and consumer electronics.
