The narratives Facebook uses to keep its employees happy are crumbling

By Mark Sullivan

When you run a business whose very design lends itself to nefarious—even dystopian—uses, you need a set of carefully crafted narratives to keep your mainly younger, liberal-minded workforce feeling good about coming to work every day. Sometimes the high salaries and wicked perks just aren’t enough.

Now is one of those times. Facebook employees are complaining loudly after CEO Mark Zuckerberg’s decision last week to allow a Donald Trump post to remain on the social network, unlabeled. The post encouraged the use of force to put down nationwide protests over the death of George Floyd at the hands of Minneapolis police officer Derek Chauvin, and it included the historically loaded phrase “when the looting starts, the shooting starts.” The line echoes a statement by Miami’s police chief in the late 1960s, and many Facebook employees read it as a call to violence.

Zuckerberg defended the decision in a post on Friday, saying “people need to know if the government is planning to deploy force.” Facebook employees staged an online walkout in protest on Monday, and so far two of them have resigned. Zuckerberg held an emergency town hall for employees on Tuesday afternoon to address the matter; he held fast to his earlier decision while, according to reports, offering multiple ways Trump’s post could be interpreted.

But among tech workers, it’s Twitter’s decision to fact-check or hide Trump tweets containing misinformation or threats that is winning applause; Zuckerberg remains a target of criticism.

Zuckerberg’s decision, while far from the company’s first policy misstep, seems to have hit a raw nerve among some Facebook employees. The company’s acquiescence to Trump, in particular, may call into question some of the core narratives Facebook employees need to believe to feel good about working there.

Such as:

“Facebook is a champion of free speech”

Facebook says it operates as a neutral, free-speech zone and stays out of many content moderation decisions because a tech company should not act as an “arbiter of truth,” as Zuckerberg puts it. The company’s policy shows special deference to public officials, who can post misinformation and even calls to violence because making their speech known is “in the public interest.”

Facebook’s real reason for abdicating its moderation duties may be less philosophical than practical. Its hands-off approach makes the company look like a neutral tech platform rather than a publisher that makes judgments about content. That matters because the moment a tech company begins making editorial decisions, it may open the door to a different kind of regulation. If Facebook were treated as a publisher, for example, it could lose its current legal immunity from lawsuits over harmful content posted by users. Trump retaliated for the Twitter fact-check by issuing an executive order aimed at stripping away that immunity, which is granted by Section 230 of the Communications Decency Act. Without those protections, Facebook could be forced to spend far more resources on content moderation.

“Facebook’s leaders care mainly about users”

Some Facebook employees are now openly questioning the credibility of the company’s leaders. One of them, software engineer Timothy Aveni, quit on Monday over Facebook’s handling of Trump’s posts. The last straw for Aveni was Facebook’s decision on Friday not to remove or label Trump’s “when the looting starts, the shooting starts” post.

“Mark always told us that he would draw the line at speech that calls for violence,” Aveni said in a Facebook post announcing his departure. “He showed us on Friday that this was a lie.” Aveni said Facebook has constantly “moved the goalposts” to accommodate ever more toxic posts from Trump.

“Facebook, complicit in the propagation of weaponized hatred, is on the wrong side of history,” Aveni concluded.

“Facebook’s business model is a virtuous cycle”

Internally, Facebook describes its business model as a social network component and an ad network component working together in a “virtuous cycle” that creates ever more value for users and advertisers (and for Facebook and its investors). Revenue from the ad network funds cooler experiences for users on the social network, which attracts more users and gives advertisers still greater “reach.” Facebook’s leadership and culture constantly signal to employees that their work, directly or indirectly, connects users to content in ever more fun and meaningful ways, and that the advertising network is the business component that makes those things possible.

Facebook has always explained its strategic and policy decisions to the world outside the company in terms of their benefit to users, while directing attention away from the clear benefits those decisions hold for the advertising business. Internally, the company tells employees that its decisions are driven by its mission to connect the world, and that they simultaneously benefit the advertising business. This belief system starts with Mark Zuckerberg (who may honestly believe in this happy coincidence) and circulates throughout the company. It makes it easier for Facebook’s employees to spend their days working for a company that enriches itself and its investors by harvesting the personal data of its users. Combined with the generous salaries they’re paid, it’s a good reason to stay.

This has worked nicely for most of Facebook’s history. But now Facebook employees are being asked to accept their employer’s decision to host content that’s plainly harmful to the public. The company has said it shouldn’t act as the “arbiter of truth,” but it’s clear to many employees that there are limits to that stance, and that Trump seems intent on pushing, and perhaps moving, those limits.

“Facebook will keep moving the goalposts every time Trump escalates, finding excuse after excuse not to act on increasingly dangerous rhetoric,” Aveni wrote in his post.

“People have been murdered this weekend at the protests and we’ve hosted content encouraging it,” another employee wrote on Facebook’s internal chat platform, The Washington Post‘s Rachel Siegel and Elizabeth Dwoskin report.

Meanwhile, it’s no secret that Mark Zuckerberg has a friendly relationship with Donald Trump. The CEO reportedly spoke with the president by phone about the inflammatory posts. It’s also known that Trump is important to Facebook: He’s a huge social media figure who gives many Facebook users a reason to come to the site every day to share content about the president’s latest exploits and trade opinions about them. And among political campaigns, the Trump operation has been the biggest customer of Facebook’s advertising products.

Facebook employees may find it increasingly difficult to reconcile their company’s tolerance of Trump’s posts with the idea that the company truly cares about its users and the tenor of public discourse on its platform.

“Facebook is the town square”

Much of the public discussion about the unrest over the death of George Floyd will take place on social media. Social media is the new gatekeeper for news and information, not traditional media outlets, many of which are now struggling to survive.

But social media platforms may not be the ideal forums for a national discussion on race.

Social networks know that the most sensational or polarizing content attracts the most eyeballs, holds them the longest, and therefore creates the most opportunities to show ads. The Wall Street Journal‘s Jeff Horwitz and Deepa Seetharaman recently reported that Facebook’s own research confirmed as much in 2018.

“Our algorithms exploit the human brain’s attraction to divisiveness,” says a slide from the research results. The researchers concluded that if Facebook’s algorithms were left in place, users’ feeds would contain “more and more divisive content in an effort to gain user attention and increase time on the platform.” Facebook, however, decided not to change its algorithms based on the research findings.

There is little debate over whether filter bubbles exist: online experiences in which users see content that reinforces, rather than challenges, their existing beliefs. The debate is over whether social media algorithms promote them or whether they form organically, as people associate and share content with circles of friends who already share their values and beliefs. Some lawmakers have called on the social media giants to open their algorithms to public scrutiny, but so far the platforms have not done so, and likely will not.

Zuckerberg often talks about Facebook being a virtual town square where people can debate issues, fact-check news items, and learn from each other’s points of view. But because of the current hyperpartisan atmosphere, Facebook may be more often used to spread misinformation about the George Floyd demonstrations, their causes, and their participants.

Many people who work in Silicon Valley are hard-working and very well paid. But increasingly they are looking critically at their employer’s place in the world and asking whether the work they do every day is part of the problem or part of the solution. (Google employees have been asking similar questions in recent months.) Some Facebook employees may be wrestling with that question for the very first time. That’s why Zuckerberg held his surprise meeting on Tuesday.
