
Concern for Kids Prompts Problematic Internet Regulation (Opinion)

Concern for kids prompts problematic Internet regulation “that turned out to be unwise, unconstitutional, or both.” (Opinion)

By Daniel Lyons

“What about the kids?” plays an outsized role in the short history of Internet law. From the Communications Decency Act to the Child Online Protection Act, California’s violent video game law, and more, the contours of online regulation have been shaped by well-meaning legislation that turned out to be unwise, unconstitutional, or both.

Last month, the Senate Commerce Committee reported out a bill poised to become the next entry in this dubious canon. The Kids Online Safety Act (KOSA) purports to protect minors from the very real risks posed by social media. But it does so by placing a vague duty of care on a wide range of Internet-based companies in a manner likely to chill significant amounts of online speech and do more harm than good to minors and society in general—in the unlikely event that it survives judicial review.

The crux of the act is Section 3, which requires “covered platforms” to take reasonable measures to prevent and mitigate specific harms to minors, including:

  • Anxiety, depression, eating disorders, substance-use disorders, and suicidal behaviors;
  • Patterns of use that indicate addiction-like behavior;
  • Physical violence, bullying, and harassment of minors;
  • Sexual exploitation and abuse;
  • Promotion of narcotics, tobacco, or alcohol; and
  • Predatory, unfair, or deceptive marketing practices or other financial harms.

A “covered platform” is any public-facing website, online service, or mobile application that allows sharing of user-generated content. The duty applies whenever a platform knows, or should know based on objective circumstances, that someone under 17 is using the service.

At first glance, this seems a laudable purpose, which helps explain why the bill has nearly 50 co-sponsors. But complications arise when one considers what content should be shielded from minors. What counts as a “reasonable” measure? How can a platform shield minors from content likely to exacerbate anxiety or depression, the triggers of which vary from minor to minor? To avoid liability, platforms are likely to err on the side of censorship, shielding minors from a broad array of online content.

This chilling of speech is exacerbated by the enforcement mechanism. The Federal Trade Commission and any state attorney general may bring suit under the act, meaning platforms must block content that any of 51 jurisdictions may consider problematic.

State attorneys general are far from unified on these issues. As the Taxpayer Protection Alliance’s (TPA) David McGarry notes, left-leaning and right-leaning jurisdictions can and do disagree on whether, for example, puberty blockers or gay conversion therapies are helpful or harmful to teens. Platforms would steer minors clear of many such controversial topics for fear that a government official somewhere would believe the platform had exposed minors to harm.

The Supreme Court has roundly condemned similar governmental efforts to dictate what content is appropriate for minors. In Brown v. Entertainment Merchants Association, the Court reiterated that “minors are entitled to a significant measure of First Amendment protection, and only in relatively narrow and well-defined circumstances may government bar public dissemination of protected materials to them.” Like the violent video game law at issue in that case, KOSA involves legislation whose effect “may indeed be in support of what some parents of the restricted children actually want,” but “its entire effect is only in support of what the State thinks parents ought to want.” This substitution of the state for the parent is “not the narrow tailoring…that restriction of First Amendment rights requires.”

Moreover, as the Electronic Frontier Foundation (EFF) notes, while certain topics may be controversial, it is by no means clear that banning them is the solution. EFF highlights research showing that Tumblr’s efforts to ban self-harm and eating disorder blogs inadvertently blocked not only pro-anorexia sites but also content designed to help people struggling with the disorder—and similar efforts may make harmful content just as easy to find by prompting users to create new keywords to avoid platform filters.

It’s important not to downplay the very real concerns that prompted KOSA and similar legislation. A growing body of research links smartphone adoption and social media use to declines in numerous measures of adolescent well-being, and we do ourselves no favors as a society by ignoring this phenomenon. But the solution lies in empowering users and parents to understand these risks and take steps to mitigate them.

TPA Executive Director Pat Hedger recently highlighted a missive by Ohio Lieutenant Governor Jon Husted giving parents step-by-step directions about how to set time limits for apps on kids’ devices. Government can play a role as information broker, helping parents better guide their children through the digital world. But the First Amendment largely precludes government from displacing parents as the primary arbiters of what content minors can and cannot consume.

Originally published by the American Enterprise Institute. Republished with permission.


Daniel Lyons is a nonresident senior fellow at the American Enterprise Institute.
