What this week’s Senate hearing with Big Tech execs means for the future of children’s online safety laws


By Sara Morrison

As a 21-year-old college senior, Zamaan Qureshi is part of a generation that grew up with social media. Like so many of his peers, Qureshi has spent hours staring at social media platforms designed to feed users an endless stream of algorithm-fueled content—some of it potentially harmful—in a bid to retain their attention for as long as possible. 

“Whether it’s unwanted sexual advances, endless scrolling, or harmful nudges that are keeping you on a platform longer than you want to be, we’re the product of all of those various experiments and features,” he says. “We really just ended up being the commodity at the end of the day.”

Qureshi is one of several young people trying to tackle the problem head-on. He’s the cochair of Design It For Us, a coalition of organizations advocating for laws to make the internet and social media safer. The group has met with lawmakers including Sens. Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN), and Reps. Jamaal Bowman (D-NY) and Cathy McMorris Rodgers (R-WA).

Qureshi plans to attend tomorrow’s Senate Judiciary hearing on child sexual exploitation and safety on Big Tech platforms.

Five tech CEOs are scheduled to testify: X’s Linda Yaccarino, Snap’s Evan Spiegel, Discord’s Jason Citron, TikTok’s Shou Chew, and Meta’s Mark Zuckerberg. The hearing will serve as a sort of kickoff for the latest push for federal legislation that addresses social media’s perceived harms to children. Qureshi and his peers will be following it up with a rally outside the Capitol building.

“We really see this moment as a point to organize around, given the fact that these past couple of months have been pretty tumultuous, to say the least, for the tech industry,” Qureshi says. “We see this as a real opportunity to drive Congress to act on some of these issues.”

They face an uphill battle, but public sentiment increasingly seems to be on their side. We’re in the midst of a teen mental health crisis that is often attributed to the rise of social media. Two whistleblowers have now come forward with allegations that Meta has long known its platforms are harmful to children but has failed to take measures to mitigate that harm. Nearly every state in the country is suing Meta, accusing the company of knowingly designing its platforms to get young people addicted to them and lying about it to the general public. A growing list of school districts is suing social media companies for allegedly harming their students’ mental health. And TikTok is seen as everything from a cybersecurity threat to an amplifier of dangerous, sometimes deadly, viral challenges.

Many states, tired of waiting for the federal government to act, have passed or are in the process of passing their own laws governing what children can see and do on online services. A lot of parents—that is, voters—are concerned about the effect of social media on their children. They want their representatives to finally do something about it.

Of the handful of children’s online safety bills now on the table, the Kids Online Safety Act (KOSA) seems to have the best shot at becoming law. It’s sponsored by 47 senators from both sides of the aisle and unanimously passed out of committee last summer. Now it’s just waiting for a floor vote, and its supporters continue to push Sen. Chuck Schumer (D-NY) to hold one. A spokesperson for the Senate Majority Leader didn’t tell Fast Company when that vote might happen, saying only that “the sponsors of the online safety bills will work to lock in the necessary support.”

KOSA would require social media platforms to implement various tools and design features for users under 16 years old, including additional privacy controls and a “duty of care” to prevent children from seeing harmful content on their services or becoming addicted to them.

KOSA was first introduced in 2022 by Blumenthal and Blackburn in response to Meta whistleblower Frances Haugen’s revelations that the company had evidence its platforms were harmful to children but refused to act. A revised version was reintroduced last year. (Blumenthal and Blackburn are both members of the Judiciary Committee, so they’ll be at the hearing.)


“Without real and enforceable reforms, social media companies will only continue publicly pretending to care about young people’s safety while privately prioritizing profits,” the two senators said to Fast Company in a statement.

Regulating Big Tech and internet services for the entire U.S. population has proven to be exceedingly difficult, as the inability to pass federal online privacy and digital antitrust laws shows. But Congress has been more responsive when it comes to protecting children online. The Children’s Online Privacy Protection Act, passed in 1998, gives children under 13 special privacy rights. Section 230, which says that online services can’t be held liable for content posted by their users and is considered to be one of the foundational laws of the internet, was an amendment to a 1996 law meant to protect children from accessing online pornography.

“I’m more optimistic now about KOSA than I’ve ever been,” says Dick Gephardt, the former 14-term Missouri congressman who now serves as cochair of the Council for Responsible Social Media, which supports KOSA. “More members understand [the issue]. They’re concerned about it. They’re hearing from their constituents in their states or congressional districts. So that gives me hope.”

But, Gephardt adds, Congress will likely focus its attention on other, more pressing matters first, and Big Tech companies have “more money than God” to spend lobbying lawmakers against bills they don’t like. For their part, tech companies say they continue to invest in and roll out tools to help teens and their parents navigate social media safely. Meta, for instance, just announced a default setting that blocks adults from messaging teens who don’t follow them. The company has also proposed federal legislation that would require app stores to get parental consent before users under 16 can download an app. Snap, on the other hand, is the first and so far only social media company to come out in favor of KOSA, albeit less than a week before a hearing that its CEO had to be subpoenaed to attend.

KOSA is also opposed by some digital rights advocacy groups, which say it could promote censorship while harming the privacy of users of all ages, depending on whether and how platforms are required to verify users’ ages. According to former Biden adviser Tim Wu, their arguments have been at least somewhat effective: KOSA failed last session, he said, due in part to those complaints, or at least because they gave lawmakers a convenient excuse not to move the bill forward. Without a floor vote, lawmakers never have to take what could be an unpopular public stand against a children’s safety bill or face accountability for that opposition.

Indeed, what may well give KOSA the edge it needs to pass into law is widespread public support. A plethora of surveys show that a majority—in some cases, a vast majority—of parents are concerned about the impact social media is having on their children and support laws that would mitigate it. (Despite those worries, it must be said, a lot of parents can’t or won’t stop their kids from using the apps, even when their children are younger than the platforms’ own age minimums.) And while Meta denies many of the allegations made by the whistleblowers and in the lawsuits, that doesn’t matter much in the court of public opinion.

Gephardt says when it comes to regulating the internet, there’s plenty to do. Protecting children on social media, he thinks, should come first.

“If we can’t do that,” he says, “we can’t do anything.”

