Lessons From Silicon Valley’s Biggest Mistakes Of 2016

Sometimes it’s good to take stock, not only of your own errors, but also of those of your peers. This year, Silicon Valley startups, unicorns, and public companies made blunders they’d probably like to forget. But in the spirit of self-education, we’re highlighting three such miscalculations so you don’t repeat the missteps of giants.

If You’re Going To Be Transparent, Don’t Double Back

Two years ago, major companies like Apple, Facebook, Google, and Twitter began releasing diversity reports as a way to signal their intention to address the homogeneity within their ranks. But the road to an inclusive company has been a slow slog for just about every company that has tried to make diversity a priority. The percentage of black employees at most of these companies remains woefully below 5%, and women make up less than one-third of their staffs; their representation in management and technical positions is more dismal still. Given the lack of momentum, several companies, including Twitter, Pinterest, eBay, and Salesforce, have delayed the release of their latest diversity reports.

While it may be painful to show little or no progress, retreating after deciding to be open about their demographics is a bad idea. For one, it looks weak. It can also be detrimental: in the case of diversity reports, the whole point (as my colleague Rich Bellis points out) is to keep the issue top of mind, to remember that it is and should remain a priority, because reports continue to indicate that the workforces at top tech companies are disappointingly uniform. To support ethnic, racial, and gender diversity, we need to keep talking about how inclusive we are or aren’t.

Algorithms Are Limited

This year, we learned the limitations of algorithms through Facebook’s repeated attempts to replace human judgment with technology. In late 2015, Facebook was criticized for how it deployed its Safety Check feature: it turned the tool on for Paris during the terrorist attacks there, but not for Beirut after dual suicide bombings killed more than 40 people. At the time, Facebook said Paris was the first time it had activated the feature for a terrorist attack (it had primarily been used for natural disasters). In 2016, Facebook automated the feature so it could be triggered when a large number of people in a region were discussing a local catastrophe. Emergency response organizations were concerned about this change, because it allows misinformation to dictate when a situation is deemed a crisis.

“The biggest thing we always say in emergency response is, bad information is worse than no information,” emergency response communications expert Rebecca Gustafson told Fast Company’s Steven Melendez earlier this year. “People can criticize emergency responders for taking too long, but the tech community moves at warp speed, and I think being able to take this extra beat to say, is this, is this not, is worth it to verify.”

This worry is echoed in critiques of Facebook’s fake news problem, which grew louder over the course of the 2016 election cycle. When confronted about its role in spreading misinformation, Facebook CEO Mark Zuckerberg pointed to community flagging as a way to stifle false content and added that the company wasn’t interested in becoming an arbiter of truth. Either way, both Safety Check and the flow of fake news demonstrated that algorithms are not equipped to differentiate factual information from exaggeration or fabricated reality. After public pressure, Facebook announced an effort to flag fake news through a partnership with Poynter’s International Fact-Checking Network; posts of questionable accuracy will soon be marked as such and ranked lower in users’ news feeds. But 2016 serves as a lesson to companies that think they can solve every inefficiency with machines: some services require a human touch.

The Customer Is Always Right

This hackneyed phrase may seem quaint, but it’s true nonetheless. We learned in 2016 that with enough haranguing, a company will cave to customer demands. Facebook, mentioned above, is one example. Another is Evernote. The company recently released a new privacy policy under which customers’ notes would be periodically reviewed by employees supervising its machine learning technology. Evernote’s CEO explained that any notes shared with employees would be stripped of identifying information, but users remained concerned. After much backlash, the company made participation in the program voluntary: only users who have signed off on having their notes occasionally viewed by employees will be subject to the new policy.

And Jessica Alba’s Honest Company is reportedly altering the formula of its laundry detergent after a report from the Wall Street Journal called the natural-products company out for its use of sodium coco sulfate. This year, the Honest Company faced several customer complaints questioning whether its products were truly natural or organic. The Honest Company told the Journal it was changing the formula to “improve the efficacy of our products.”

How will these lessons apply in 2017? As companies dive deeper into machine learning, self-driving cars, and other forms of automation, it would be wise to bear in mind the limitations of technology and to remember where humans excel. And as we think about transparency in our companies and professional relationships, it’s important to remember that transparency, while a great tactic for building trust between companies and consumers, and between managers and employees, is not a panacea; it’s a tool. Retracting or withholding information you once gave willingly can damage trust in any relationship. Finally, as you build new products or think about the positive change your business can create in the world, listen to your customers. They may have more answers than you think.
