Twitch’s Clips feature has reportedly enabled child abuse to fester on the platform

A troubling report found at least 83 clips with sexualized content involving children.

An investigative report from Bloomberg paints a disturbing picture of Twitch’s difficulties in moderating the livestreaming platform — especially its Clips feature, which allows users to preserve short videos. The outlet reports that, after analyzing about 1,100 clips, it found at least 83 with sexualized content involving children. Twitch removed the videos after it was alerted, and a company spokesperson wrote to Engadget in an email that it has since “invested heavily in enforcement tooling and preventative measures, and will continue to do so.”

Bloomberg highlighted one incident that exemplified the problem with Clips’ permanent nature on the otherwise transient platform. It recounts the unsettling story of a 12-year-old boy who took to Twitch last spring “to eat a sandwich and play his French horn.” He soon began taking requests from viewers, which (in a sad reflection of online behavior) somehow led to the boy pulling his pants down.

The outlet describes the incident as being over “in an instant.” Still, Clips’ recording function allowed one viewer — who allegedly followed over a hundred accounts belonging to children — to preserve it. The 20-second clip was reportedly viewed more than 130 times before Twitch was notified and removed it.

Clips launched in 2016 as a way to preserve otherwise ephemeral moments on the platform. The feature captures the 25 seconds before (and five seconds after) a viewer taps the record button. This has the unfortunate side effect of allowing predators to save a troubling moment and distribute it elsewhere.
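As a rough illustration of how this kind of “pre-roll” capture can work, here is a minimal Python sketch of a rolling buffer that always holds the most recent 25 seconds of frames, so pressing the clip button can retroactively save a moment that has already happened. This is not Twitch’s implementation; the class, frame rate, and structure are assumptions made for the example.

```python
from collections import deque

# Illustrative sketch only, not Twitch's implementation: a rolling buffer
# that keeps the most recent PRE_ROLL seconds of a stream so a "clip"
# button press can retroactively capture moments that already aired.
PRE_ROLL_SECONDS = 25    # seconds kept before the button press
POST_ROLL_SECONDS = 5    # seconds recorded after the button press
FRAMES_PER_SECOND = 30   # assumed frame rate for this example

class RollingClipBuffer:
    def __init__(self):
        max_frames = PRE_ROLL_SECONDS * FRAMES_PER_SECOND
        # deque with maxlen silently drops the oldest frames as new ones arrive
        self.buffer = deque(maxlen=max_frames)

    def push_frame(self, frame):
        """Called continuously while the stream is live."""
        self.buffer.append(frame)

    def create_clip(self, upcoming_frames):
        """On button press: return the buffered pre-roll plus a short post-roll."""
        post_roll = upcoming_frames[: POST_ROLL_SECONDS * FRAMES_PER_SECOND]
        return list(self.buffer) + list(post_roll)
```

The convenience for viewers is also the risk: once a moment lands in that buffer, anyone watching can turn it into a permanent, shareable file.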

Twitch plans to expand Clips this year as part of a strategy to produce more TikTok-like content on the platform, including a discovery feed (also similar to TikTok) where users can post their short videos.

Bloomberg’s report cites the Canadian Centre for Child Protection, which reviewed the 83 exploitative videos and concluded that 34 depicted young users showing their genitals on camera. The bulk were allegedly boys between the ages of five and 12. An additional 49 clips included sexualized content featuring minors “exposing body parts or being subjected to grooming efforts.”

The organization said the 34 “most egregious” videos were viewed 2,700 times. The rest tallied 7,300 views.

Twitch’s response

“Youth harm, anywhere online, is unacceptable, and we take this issue extremely seriously,” a Twitch spokesperson wrote to Engadget. In response to being alerted to the child sexual abuse material (CSAM), the company says it’s developed new models to detect potential grooming behavior and is updating its existing tools to more effectively identify and remove banned users trying to create new accounts (including for youth safety-related issues).

Twitch adds that it’s stepped up its safety teams’ enforcement of livestreams, the root of Clips. “This means that when we disable a livestream that contains harmful content and suspend the channel, because clips are created from livestreams, we’re preventing the creation and spread of harmful clips at the source,” the company wrote. “Importantly, we’ve also worked to ensure that when we delete and disable clips that violate our community guidelines, those clips aren’t available through public domains or other direct links.”

“We also recognize that, unfortunately, online harms evolve,” the spokesperson continued. “We improved the guidelines our internal safety teams use to identify some of those evolving online harms, like generative AI-enabled Child Sexual Abuse Material (CSAM).” Twitch added that it’s expanded the list of external organizations it works with to (hopefully) snuff out any similar content in the future.

Twitch’s moderation problems

Bloomberg reports that Clips has been one of the least moderated sections on Twitch. It also notes the company laid off 15 percent of its internal trust and safety team in April 2023 (part of a harrowing year in tech layoffs) and has grown more reliant on outside partners to squash CSAM.

Twitch’s livestream-focused platform makes it a trickier moderation challenge than more traditional video sites like YouTube or Instagram. Those platforms can compare uploaded videos with hashes — digital fingerprints that can spot previously known problematic files posted online. “Hash technology looks for something that’s a match to something seen previously,” Lauren Coffren of the US National Center for Missing & Exploited Children told Bloomberg. “Livestreaming means it’s brand new.”
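For context on what hash matching does (and why it falls short for live video), here is a simplified Python sketch that flags an upload only when its digest matches a previously catalogued file. Real systems use perceptual hashes, such as PhotoDNA, that tolerate re-encoding and cropping; the exact-match version below and its hash list are assumptions made for clarity.

```python
import hashlib

# Simplified, exact-match illustration of hash-based screening, not any
# platform's actual pipeline. Production systems use perceptual hashes
# that survive re-encoding; this version only catches identical files.
KNOWN_BAD_HASHES: set[str] = set()  # hypothetical hash list from a partner org

def file_hash(path: str) -> str:
    """Compute a SHA-256 digest of an uploaded file, streamed in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_known_material(path: str) -> bool:
    """True only if the upload is byte-identical to a catalogued file."""
    return file_hash(path) in KNOWN_BAD_HASHES
```

Because a match requires the file to have been seen and catalogued before, a livestream producing brand-new footage has nothing to match against, which is exactly Coffren’s point.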
