I became part of the alt-right at age 13, thanks to Reddit and Google

By Anonymous

When I was 13, I was convinced that Jews controlled global financial networks and that black Americans committed homicide at a higher rate than whites. I believed that the wage gap was a fallacy fabricated by feminists, and I was an avid supporter of the men’s rights movement. I accepted all of the alt-right maxims I saw as a Reddit moderator, despite my Jewish upbringing in a liberal household with a tight-knit family that taught me compassion, empathy, and respect for others.

Now, I’m 16, and I’ve been able to reflect on how I got sucked into that void—and how others do, too. My brief infatuation with the alt-right has helped me understand the ways big tech companies and their algorithms are contributing to the problem of radicalization—and why it’s so important to be skeptical of what you read online.

My own transformation started when I switched to a new school in the middle of eighth grade. Like anyone pushed into unfamiliar territory, I was lonely and friendless and looking for validation and social connection. But unlike others, I found that validation in the alt-right corners of the internet. The alt-right and the tech platforms that enable it became the community I needed—until I finally opened my eyes and realized it was turning me into someone I never wanted to be.

A few weeks after I started going to my new school, I noticed that a bunch of the guys in my class were browsing a website called Reddit. I didn’t understand what the site was or how it worked, but I was desperate to fit in and make a mark in my new environment. I went up to one of those guys during study hall and asked how to use Reddit. He helped me set up an account and subscribe to “subreddits,” or mini communities within the Reddit domain. I spent the rest of that period scrolling through Reddit and selecting the communities I wanted to join.

That’s how I discovered r/dankmemes. At first, I only understood about half of the posts I saw. A lot of the content referenced political happenings I had never heard of. There were hundreds of sarcastically written posts that echoed the same general themes and ideas, like “there are only 2 genders” or “feminists hate men.” Since I had always been taught that feminism and social justice were positive, I initially dismissed those memes as abhorrent and wrong.

But while a quick burst of radiation probably won’t give you cancer, prolonged exposure is far more dangerous. The same is true for the alt-right. I knew that the messages I was seeing were wrong, but the more I saw them, the more curious I became. I was unfamiliar with most of the popular discussion topics on Reddit. And when you want to know more about something, what do you do? You probably don’t think to go to the library, check out a book on the subject, and then fact-check and cross-reference what you find. If you just google what you want to know, you can get the information you want within seconds.

So that’s what I did. I started googling things like “illegal immigration,” “Sandy Hook actors,” and “black crime rate.” And I found exactly what I was looking for.

The articles and videos I first found all backed up what I was seeing on Reddit—posts that asserted a skewed version of reality, using carefully selected, out-of-context, and dubiously sourced statistics to prop up a hateful worldview. On top of that, my results were heavily shaped by algorithms. As I understood them then, an algorithm is a proprietary piece of code that a site like YouTube uses to put the content you are most likely to click on at the top of the page. Because all of the content I was reading or watching came from far-right sources, all of the links the algorithms dangled on my screen for me to click were from far-right perspectives.
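
Looking back, I can sketch what that feedback loop actually does. The toy Python program below is purely my own illustration, not Google’s or YouTube’s real code (their systems are proprietary and vastly more complex); the class name, topics, and scoring formula are all invented for this example. All it shows is the mechanism: nothing in the loop asks whether content is true, only whether it resembles what the user already clicked.

```python
# A deliberately simplified, hypothetical engagement-based ranker.
# Not any real platform's code: names, topics, and the scoring
# formula are invented to illustrate one mechanism only.
from collections import defaultdict
import random

class EngagementRanker:
    def __init__(self):
        self.topic_clicks = defaultdict(int)  # stand-in for a click/watch history
        self.total_clicks = 0

    def record_click(self, topic):
        """Update the user's profile each time they click an item."""
        self.topic_clicks[topic] += 1
        self.total_clicks += 1

    def score(self, topic):
        """Crude predicted click probability: the user's past click share
        for this topic, smoothed so unseen topics aren't scored at zero."""
        if self.total_clicks == 0:
            return random.random()  # no history yet, so rank arbitrarily
        return (self.topic_clicks[topic] + 0.1) / (self.total_clicks + 1.0)

    def rank(self, items):
        """Order candidate (title, topic) pairs by predicted engagement.
        Note what is absent: no check for accuracy, context, or sourcing."""
        return sorted(items, key=lambda item: self.score(item[1]), reverse=True)

ranker = EngagementRanker()
candidates = [
    ("Cast-iron cooking basics", "cooking"),
    ("SHOCKING political outrage compilation", "outrage"),
    ("Classic British comedy sketch", "comedy"),
]
for step in range(3):
    feed = ranker.rank(candidates)
    print(step, [title for title, _ in feed])
    ranker.record_click("outrage")  # the user keeps clicking the outrage item
```

After a click or two on the “outrage” item, it sits at the top of every subsequent feed. That, in miniature, is the trap I fell into.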

I liked Reddit so much that after around a month of lurking, I applied for a moderator position on r/dankmemes. Suddenly, I was looking at far-right memes 24/7, with an obligation to review 100 posts a day as a moderator. I was the person deciding whether to allow a meme onto the subreddit or keep it off. Every day, for hours on end, I had complete control over what content was allowed on r/dankmemes. That made me even more curious about what I was seeing, which led to more Google searches—all of which showed me exactly what I already believed to be true—and shoved me deeper into the rabbit hole of far-right media. I spent months isolated in my room, hunched over my computer, removing and approving memes on Reddit and watching the conservative “comedians” that YouTube served up to me.

In my case, the alt-right did what it does best. It slowly hammered hatred into my mind like a railroad spike into limestone. The inflammatory language and radical viewpoints of the alt-right worked in YouTube’s and Google’s favor—the more videos and links I clicked on, the more ads I saw, and in turn, the more ad revenue they generated.

Some of the other moderators were under the influence of this poison, too. They started to focus on the same issues that alt-right forums and online media pushed into the headlines, and we would sometimes discuss how women who abort their children belong in jail, or how “trauma actors” had been used to fake school shootings like the 2012 massacre at Sandy Hook Elementary. Granted, not all of the moderators took part in these talks. It only takes a few, though, and those few were the ones I admired the most. It soon felt like a brotherhood or a secret society, as if we were the few conscious humans who had managed to escape the matrix. We understood what we believed to be the truth, and no one could convince us otherwise.

The alt-right’s appeal started to dissipate that summer, when I took a month-long technology break to go to sleepaway camp before the start of my ninth-grade year. But the biggest step in my recovery came when I attended a pro-Trump rally in Washington, D.C., in September 2017, about a month after the “Unite the Right” rally in Charlottesville, Virginia, where counter-protester Heather Heyer was murdered by a white supremacist. I wanted to show my support for Trump and finally meet the people behind the internet forums where I had found my community. After many tries, I managed to convince my mom to take me, telling her I simply wanted to watch history unfold (she wrote about the experience in the Washingtonian). But really, I was excited to meet the flesh-and-blood people who espoused alt-right ideas, instead of just talking to them online.

The difference between the online persona of someone who identifies as alt-right and the real thing is so extreme that you would think they were different people. Online, they have the power of fake and biased news to back up their arguments. They sound confident and deliver their standard messages forcefully. When I met them in person at the rally, they were awkward and struggled to support their statements. They tripped over their own words, and when counter-protesters in the crowd called them out, they would immediately fall back on a stock response such as “You’re just triggered.” They couldn’t come up with any coherent arguments; they just rambled and repeated talking points.

The rally left me with a bad taste in my mouth. Seeing for myself that the people I had been talking to online were weak, confused, and backwards was the turning point for me. It wasn’t immediate, but I gradually reduced my time on Reddit, and I eventually messaged the other moderators to tell them I was quitting to focus on school. They all said they wanted me to stay and pleaded with me to just take a break and come back later. I stayed on as a moderator in name only, no longer making decisions about any of the content assigned to me. A few months later, Reddit sent me a message with the subject line: “You have been removed as a moderator of r/dankmemes.” I felt like the character James Franco plays in 127 Hours as he walks out of the canyon that had imprisoned him for days on end, bloodied but alive.

At this point, we’re too far gone to reverse the damage that the alt-right has done to the internet and to naive adolescents who don’t know any better—children like the 13-year-old boy I was. It’s convenient for a massive internet company like Google to deliberately ignore why people like me get misinformed in the first place, as its profit-oriented algorithms continue to steer ignorant, malleable people into the jaws of the far right. My own situation was difficult for me personally but had no wider consequences. But don’t forget that Dylann Roof, the white supremacist who murdered nine people in a Charleston, South Carolina, church in 2015, was radicalized by far-right groups that spread misinformation with the aid of Google’s algorithms. It all started when Roof asked Google about black-on-white crime.

YouTube is an especially egregious offender. Over the past couple of months, I’ve been getting anti-immigration YouTube ads that feature an incident, presented as a “news” story, about two immigrants who raped an American girl. The ad offers no context or sources, and it uses heated language to denounce immigration and to call for our county to allow ICE to seek out illegal immigrants in our area. I wasn’t watching a video about immigration or even politics when those ads came on; I was watching the old Monty Python “Cheese Shop” sketch. How does British satire, circa 1972, relate to America’s current immigration debate? It doesn’t.

If we want to stop destructive far-right and alt-right ideologies from spawning domestic terrorism in the future, tech companies need to be held accountable for the radicalization that results from their systems and standards. Google and YouTube should own up to their part in this epidemic, but I doubt they will. Ethics and morals have no meaning when millions of dollars are at stake. That’s the America that I, along with millions of other Gen Z kids, am growing up in.

In my journey into and out of the online alt-right, I’ve learned that anyone can be manipulated the way I was. It’s so easy to find information online that we collectively forget how much of the content the internet offers us is biased. Everyone has ulterior motives when they try to persuade you to come over to their way of thinking, and it’s our job as human beings to understand what those motives are.

Fast Company