Social media is bad for kids. At least it is according to multiple studies, former social media company employees, and the president of the United States. It’s also just one more way that the internet can be a dangerous place for children — a problem that lawmakers have been trying for decades to solve. They’re still trying.
In the past, this has led to laws that are ill-considered, short-sighted, narrowly targeted, and even unconstitutional. We all use the internet these laws affect, yet they don’t apply to everyone, and everyone’s rights aren’t always taken into account. Laws that focus on children have also taken attention and time away from passing laws that help everyone. It looks like this cycle is starting again.
When President Joe Biden gave his first State of the Union address on March 1, he laid out his vision for how to make the internet a better place for its virtual inhabitants. Specifically, the president called for privacy protections that included a ban on targeted advertising and the collection of personal data. Social media platforms, Biden said, were running a lucrative and harmful “national experiment” on their users. They needed to be held accountable for it.
Privacy advocates were surely happy about the prominent placement. But there was one problem: Biden demanded that all these protections apply to children only. Adults, it seems, would have to continue to fend for themselves. So while Biden’s speech might have been novel in calling out the potential harms of the data-hungry internet economy, framing them as a children’s safety issue was very familiar territory.
“No parent wants their kid to be hurt, and kids are certainly vulnerable in ways that adults aren’t,” Sen. Ron Wyden (D-OR) told Recode. “It makes sense that politicians and the press would make protecting children the focus of a lot of energy.”
The latest round of internet safety legislation for children was kicked off by Frances Haugen, the former Facebook employee who leaked piles of internal documents, including some that showed that the company knew its products could be harmful to young users. Lawmakers, led by Sens. Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN), jumped at the chance to use her revelations to investigate social media and privacy harms to children. In the subsequent months, they released two bills: the Kids Online Safety Act (KOSA) and the Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act.
As Congress makes yet another bipartisan push for child-focused internet laws — now with the president’s endorsement — it’s worth looking at some of the unintended consequences of past efforts. Sometimes, the laws are half-measures that help some people but leave others out. Other times, laws that are supposed to keep children safe in theory end up harming everyone in practice. And one of those laws essentially created the internet as we know it today, even as the child protection element of it was struck down by the courts.
Wyden co-wrote Section 230 and has been working on internet legislation (including but not limited to privacy laws) for decades. He’s also seen where and how Congress’s past efforts have fallen short.
“In my experience, a lot of pundits and politicians look at tech issues very narrowly, without thinking through all the implications of what they’re proposing or considering everyone who uses technology,” he said.
The well-worn “new tech battleground”
Last month, the New York Times deemed child safety to be “the new tech battleground.” But child safety has actually been a tech battleground for quite some time.
Some of the very first attempts to regulate the internet focused on its potential danger to children. In the mid-’90s, lawmakers became increasingly concerned about how easily children could access porn online. They tried to solve this with the Communications Decency Act, which made it illegal to knowingly transmit or display porn on the internet to anyone under the age of 18. Most of the CDA was struck down in the courts for being unconstitutional. In an effort to spare children from the potential harms of seeing sexually explicit content, courts said, the CDA violated the free speech of adults. But one part remained: Section 230, which says that internet platforms can’t be held civilly liable for content their users post. This law has allowed websites that host third-party content — think Yelp, YouTube, Facebook, even the comments sections of news sites — to exist.
This was followed by 1998’s Children’s Online Privacy Protection Act (COPPA), which gave children under 13 some privacy protections, including limits on collecting and retaining their data. More than 20 years after COPPA took effect — and with internet platforms and mobile apps collecting more of our data from more places than ever — Congress still can’t get it together to pass a consumer internet privacy law that covers the rest of us. Actually, it can’t even get it together to pass an update to COPPA, despite many attempts by its author, Sen. Ed Markey (D-MA), to do so over the ensuing years (here’s his most recent).
COPPA is also an example of how a law that applies only to certain people can become needlessly complicated and may even introduce new privacy problems. In the United States, websites typically verify ages through self-declaration, which means all kids have to do is lie to get access to their favorite sites. To get around COPPA, many sites simply forbid anyone under 13 from using them, but they don’t require anyone to prove how old they are when signing up for an account. The United Kingdom, by contrast, wants to make pornography sites verify ages by having users supply credit cards or passports, which introduces a new security risk: those sites will hold yet another set of sensitive data that bad actors could potentially access. Either the age verification is essentially useless, or it’s an invasion of privacy.
“When you start looking at how to effectively verify somebody’s age on the internet without also invading their privacy as well as everybody else’s? That’s a really hard question,” India McKinney, director of federal affairs at the Electronic Frontier Foundation, told Recode. “In order to verify somebody’s age, you have to collect a lot more information about them. How does that protect anybody’s privacy?”
That’s not to say that COPPA is a failure by any means. In fact, COPPA was what allowed the Federal Trade Commission (FTC) to go after Weight Watchers for collecting data about users as young as 8 years old. Last week, Weight Watchers was forced to pay a $1.5 million fine and delete young users’ data.
Somewhat ironically, Section 230, the aforementioned byproduct of an attempt to protect children from sex online, is now being undermined by laws framed as attempts to protect children from sex online. Though many lawmakers have tried to reform Section 230 in various ways and for various reasons over the years, the only attempt to succeed so far is FOSTA-SESTA, the combination of the Fight Online Sex Trafficking Act and the Stop Enabling Sex Traffickers Act. FOSTA-SESTA was framed as a way to prevent online sex trafficking — primarily of children — by removing Section 230’s protections for websites that promote sex work.
Free speech and civil liberties advocates had serious concerns about FOSTA-SESTA. They argued that, under the law, overly cautious websites would censor anything remotely related to sex to avoid even the possibility of a lawsuit. Sex workers also feared that their jobs would become more dangerous if the platforms they used to screen customers or advertise their services shut down. But it’s hard to vote against a bill that says it’s meant to protect children from some of the worst abuses imaginable, and the bill passed both houses by wide margins. In the Senate, only two people voted against FOSTA-SESTA. One was Wyden, Section 230’s co-author and one of its biggest defenders. The other was Rand Paul (R-KY).
After the bill was signed into law in April 2018, many of its detractors’ fears were realized. Several websites removed entire sections and content that had nothing to do with sex trafficking. Consensual sex workers said they had to work the streets and take on unknown customers when their online advertisements and screening networks went dark. Sex workers who were LGBTQ+ or people of color tended to be the hardest hit.
Meanwhile, the benefits of FOSTA-SESTA seem nonexistent. A government report issued last June found that the law has almost never been used. Lawmakers — some of whom voted for FOSTA-SESTA — are now trying to pass the SAFE SEX Workers Study Act, which would study the law’s effectiveness and impact. The bill was introduced last Congress and was reintroduced earlier this month. Evan Greer, director of Fight for the Future, a digital rights advocacy group, told Recode that she thinks this study should be done before any additional laws that change Section 230 are passed.
“If lawmakers are serious about passing legislation to reduce Big Tech harms to children, they need to learn from their past failures,” she said. “The stakes are incredibly high.”
The future of Big Tech regulation looks narrow
The newest bills to come out of the latest child internet safety panic — the Kids Online Safety Act and the EARN IT Act — seem to be repeating the mistakes of the past. EARN IT would remove Section 230 protections from web services that don’t follow a to-be-decided list of best practices for detecting or removing child sexual abuse material. The bill’s opponents, which include more than 60 civil rights and internet freedom groups, fear that those best practices will include forbidding encryption, which could have a stifling effect on the speech of people of all ages around the world who rely on encrypted services to keep their messages private. EARN IT nearly made it to a Senate vote in the last session of Congress. This time, it has already sailed through a committee markup and is once again waiting for a floor vote.
“EARN IT is even worse [than FOSTA-SESTA],” Wyden said. “It wouldn’t do a thing to help law enforcement catch predators, or to help prevent kids from becoming victims. Experts say it would make it very difficult for companies to offer secure, encrypted communications. And it would give states incredible power over how the law would work. After seeing the awful laws that Florida, Texas, and other Republican states are passing to target discussion of race, LGBTQ issues, and abortion access, that’s a huge concern.”
The Kids Online Safety Act would require social media platforms to give users 16 and younger tools that prevent the promotion or amplification of content considered harmful, and it would give parents the ability to monitor or limit their kids’ use of those platforms. Its potential benefits and downsides are still being evaluated by digital rights advocates; it has the endorsement of several children’s and conservative groups, as well as the American Psychological Association. But it’s clear that some lawmakers want child safety and privacy to be at the forefront of any internet legislation push, even though other initiatives, like privacy protections for people of all ages and antitrust bills targeting Big Tech, still haven’t passed. Blumenthal has compared the need to tackle the harms of social media to a “Big Tobacco moment,” which implies that social media, like tobacco products, harms everyone. Yet KOSA only applies to children.
“It’s cynical and disturbing to use children’s safety as a way to score political points and advance legislation that won’t help children,” Greer said. “The best way to protect children online is to protect everyone online.”
But consumer internet privacy laws that protect everyone have gone nowhere in this Congress, as they have in pretty much every session since COPPA’s passage. Blackburn and Blumenthal are well aware of this. Blackburn has sponsored several failed privacy bills that apply to all ages over the years, including as recently as this Congress. (It’s also worth pointing out that Blackburn’s support for internet privacy didn’t extend to an FCC regulation forbidding internet service providers from selling customers’ browsing history, which she led the charge to block.) Blumenthal tried to come up with a bipartisan privacy bill last Congress but ultimately couldn’t. This Congress, he’s called on the FTC to write new privacy rules.
The White House did not respond to a request for comment on whether the president believes adults should get the same privacy protections he said he wanted for kids. The president has also stayed pretty quiet on some of the other internet-related issues that were once expected to be a major part of his administration, such as efforts to curb Big Tech’s power through a package of antitrust bills and initiatives. The State of the Union address made only a passing reference to competition.
On the other hand, Haugen, the Facebook whistleblower, was one of Biden’s special guests. He thanked her for her courage, and she got a standing ovation.