Technology can still live up to its promise as a force for justice

By Mary Rinaldi and Ashish Prashar

George Floyd, 46, was handcuffed and pinned down by Minneapolis cops. One officer pressed his knee into Floyd’s neck for eight minutes and 46 seconds as three others stood by. Floyd told them he couldn’t breathe. Then he died.

It’s shocking, but for Black people and their allies in America, sadly, not surprising. This brutality and violence have been a daily reality for decades; it is only the advent of smartphone cameras and social media, which allow videos like the one of Floyd’s death to spread widely, that has forced people to look up, take notice, and, crucially, take to the streets. Over the past week, people across the U.S. have captured what may be the most comprehensive live picture of police brutality ever, only strengthening demonstrators’ resolve.

As protests erupted around the nation, brands and companies rushed to show their solidarity. Tech companies and CEOs began to weigh in on an extremely delicate topic for corporations that are not accustomed to taking a stand against racism and whose employees often do not reflect the diversity of society at large.

Of course, there has been backlash. Companies have been called out for their documented failure to stop racist language on their platforms, their inadequacy in hiring and promoting Black people, and, in some cases, their enabling of police brutality and surveillance.

For decades, leaders at well-funded tech companies have promised magical change. They painted a picture of green, pollution-free cities dotted with gleaming buildings and energy-efficient transportation, citizens moving through the environment elegantly and gracefully, like a well-choreographed dance.

At first, we believed the myths they spun, that the internet is a great equalizer and that algorithms are pure structures, incapable of discrimination. We let them convince us that all those pesky human challenges, such as racism, sexism, and wealth inequality, would disappear in a poof if we kept doubling down on their vision of technological advancement.

Behind the smoke and mirrors is a different reality. The companies owned by tech gurus often make a lot of money designing and shipping the technology of oppression—co-opting purchasing and home-safety technology for surveillance and policing. Since these are for-profit companies trying to satisfy shareholders, why wouldn’t they? Typically, law enforcement contracts are backed by well-filled coffers. For example, taxpayers funded the New York Police Department at a whopping $5.8 billion in 2019.

Let’s start with Amazon, which has numerous contracts with law enforcement agencies. Of particular note, Ring, Amazon’s home-surveillance company, partners with at least 200 police departments across the country. As part of its contracts with some police departments, Ring incentivizes police to encourage residents to adopt the company’s neighborhood watch app, where racial profiling has been a documented problem. After reviewing more than 100 posts on the app, Motherboard found that the majority of people whom users deemed “suspicious” were people of color.

Google, not to be outdone, enables “geofence warrants.” These warrants permit police to request data on devices that were in the area surrounding a crime. Google initially supplies anonymized information for phones within the area specified by the warrant. Once police narrow down their list of suspects, Google supplies usernames and location data for those specific devices.

Clearview AI is by far the most frightening. Its facial-recognition app could end our ability to walk down the street anonymously. The user takes a picture of a person, uploads it, and sees auto-populated public and semi-public photos of that person, along with links to where those photos appeared. The system’s backbone is a database of more than three billion images that Clearview claims to have scraped from Facebook, YouTube, Venmo, and millions of other websites.

Clearview has contracts with many local and federal law enforcement agencies, including ICE and the U.S. Attorney’s Office for the Southern District of New York. The company also has credentialed users at the FBI, U.S. Customs and Border Protection, and hundreds of local police departments. The ACLU is now suing Clearview AI, calling the tool an unprecedented violation of privacy rights.

These technologies have the potential to injure people, undermine our democracies, and erode human rights, and they are growing in complexity, power, and ubiquity. This forces us to reckon with the utopian vision of society that tech companies have served up so liberally.

Building anti-racist technology

Technology, on its own, has never been the answer. In the wrong hands, it is a tool of oppression. But in the right hands it helps cultivate ideas, build communities, and ignite artistic vision. It has been crucial in casting into sharp relief the behavior of the police when they think that nobody is watching: few things have drawn more attention to America’s longstanding problem of police brutality than the unflinching eye of the smartphone, which can capture and broadcast shocking images in real time. Without the “I can’t breathe” video of Eric Garner’s death or the video of George Floyd’s execution, the public would likely never have heard of these incidents.

Despite tech companies’ entrenched power and dubious relationships with law enforcement, there are countless people working to deconstruct and rethink the ways that technology is built and used.

In 2013, American cryptographer Moxie Marlinspike founded Open Whisper Systems, the organization behind Signal, an encrypted messaging app, to ensure people everywhere could speak freely. He elaborated on why in a 2016 interview with Wired journalist Andy Greenberg, saying, “From very early in my life I’ve had this idea that the cops can do whatever they want, that they’re not on your team . . . that they’re an armed, racist gang.” That resonates today. As thousands of people protest police brutality across all 50 states, Signal ensures safety and privacy for many of them. Just days ago, Signal released a new blurring tool for photos, in support of “people on the ground,” according to Marlinspike.

Lineage, a company founded in 2019, offers people a new way to discover and catalog visual art data, powered by a critical approach to AI. Founder Noya Kohavi outlines why repurposing computer-vision surveillance technologies for other tasks doesn’t make sense:

We need tools to explore and discover visual data that are expressly made for that purpose. What I’m attempting to do with Lineage is to “bake” a critical approach to the data into the tool’s algorithmic foundation. The tools we build to organize data have incredible impact on the meaning of that data, on how we understand and contextualize it. We don’t have to leave this task to technologies built for surveillance capitalism.

Creative founders are also building organizations with input from their communities to ensure they put the well-being of the people first. The Black School, an experimental art school, teaches Black and PoC students and allies how to become agents of change through art and technology workshops. Ampled, a cooperative platform whose mission is to make music more equitable for artists, is building a digital toolkit to help techies and artists create their own digital cooperatives, as an alternative to the conventional Silicon Valley startup model of venture-capital ownership.

Most big tech firms today are supposedly trying to “connect” us, to create community. It’s time tech actually started putting community first and stopped building technology for the military and law enforcement, or systems that discriminate against its own customers.

The only way for tech businesses to understand how they can truly serve communities is to look within: to ask whether they have enough Black or PoC team members. If the answer is no, that should be treated as a vital growth challenge for the business. It is those team members who can see where a technology could be used against Black communities, even if that was not its intended purpose. And it is that diversity of understanding that will help ensure technology is a liberating force for all, not just for the mostly white leaders of today’s big tech platforms.

Mary Rinaldi is a social justice advocate, creative technologist, and writer who is a mentor-in-residence at NEW INC. Ashish Prashar is a justice reform campaigner who sits on the boards of Exodus Transitional Community, Getting Out and Staying Out, Leap Confronting Conflict, and the Responsible Business Initiative for Justice, and is a fellow at the Royal Society of Arts.

Fast Company