Why Meredith Whittaker believes AI is ‘replicating patterns of inequality’


By Jessica Bursztynsky

Consumers need to be increasingly aware of how historical biases and marginalization shape AI systems and their outputs, said Signal Foundation President Meredith Whittaker, speaking on a panel during Fast Company’s annual Impact Council meeting this week.

“[Artificial intelligence] is determining our access to resources, our access to opportunity, our place in the world. And, of course, it’s determining this based on past data,” Whittaker said. “So it is, and I believe inextricably will be, replicating patterns of historical inequality and marginalization. And in many powerful ways, it will be obscuring accountability for those decisions.”

[Photo: Alyssa Ringler]

For months now, AI has dominated the tech industry, while executives in other sectors weigh potential applications. Big tech firms like Google and Microsoft are building out their own AI systems or partnering with industry players to gain dominance in the landscape.

“It’s naturalizing often racist and misogynist determinations about people’s place in the world behind the veil of computational sophistication, in a way that makes it harder and harder to push back against those histories and correct them in the present,” Whittaker added.

AI systems have been used for years for tasks like analyzing data sets and recommending solutions. But the technology’s rapid adoption, with what some view as the potential to disrupt every industry, threatens to perpetuate centuries of mistreatment. After all, the tech is only as good as the data being fed into it.

Too often, people treat whatever AI spits out as undeniable fact.

“This is not a system that has any understanding of veracity or truth, right? Like facts are bolted on post hoc as sort of an afterthought,” said Whittaker, whose career has included positions in academia as well as more than a decade at Google. “This is an aggregate of some of the worst and maybe best parts of text on the internet spit back out as plausible-sounding content . . . that has nothing to do with veracity or truth, and it is wild to me that we’ve sort of let that slide. That we have companies that are integrating this into search—the whole thing is a red flag to me.”


Fast Company
