With the ‘Skull Breaker Challenge,’ TikTok gets its ‘parental warning’ moment

By Harrison Weber

Congrats, TikTok.

The super-popular social video app—the one dogged by unsettling reports of censorship after devouring another super-popular video app—is having a moment that may finally get parents to pay attention. TikTok is already huge, it’s been huge (and so on), but now parents are ready to talk about it. Ask Bing yourself, or better yet turn to your local news affiliate, because a worrying thing called the “Skull Breaker Challenge” is happening.

CBSN New York perhaps put it best: “There’s a new disturbing trend on a popular cellphone app that’s causing children to get severely injured.”

Actually, it’s less a trend and more a prank in which one kid, apparently unaware they’re about to be tripped, is tripped. At least a few kids have reportedly wound up in the hospital over it, and news outlets have picked up a handful of warnings shared by concerned parents on social media.

Child psychiatrist Dr. Jodi Gold said to CBSN New York, “It’s really important that parents and teachers are explaining to kids that this is actually an assault. It’s a form of cyber-bullying and it absolutely has to stop.”

It’s horrible, and it also triggers a certain parental Spidey sense that only a “Your Kids Are in Danger” story can, while feeding into the “kids are stupid” trope that some news outlets just won’t quit.

YouTube has dealt with these kinds of “challenges” for years (revisit Momo, Tide Pods, and all the rest for a trip down memory lane), so perhaps the Skull Breaker Challenge represents a passing of the social-video torch, so to speak.

We reached out to TikTok for comment and will update if we hear back.

With timing so perfect it’s almost suspect, TikTok came out with new parental controls today. The kids are already planning their escape, no doubt.

TikTok later sent Fast Company the following:

“The safety and well-being of our users is a top priority at TikTok. As we make clear in our Community Guidelines, we do not allow content that encourages, promotes, or glorifies dangerous challenges that might lead to injury, and we remove reported behavior or activity that violates our guidelines. To help keep our platform safe, we have introduced a slate of safety features geared towards enhancing our users’ experience, including tools for reporting inappropriate content and for managing privacy settings.”

