Tech trade group sues Ohio over social media parental consent law

The measure requiring minors under 16 to get consent before joining services like TikTok is set to take effect Jan. 15

The following article was originally published in the Ohio Capital Journal and is published on News5Cleveland.com under a content-sharing agreement.

An Ohio law requiring social media companies to get parental consent for some underage users is set to take effect next Monday, but a tech industry group wants a federal judge to block it. The organization, known as NetChoice, filed suit in the U.S. District Court for the Southern District of Ohio last Friday.

Under the law, social media companies would have to get “verifiable” consent from a parent before allowing minors younger than 16 to use their services. The tech companies complain the measure’s provisions are too ambiguous. Ditto its carve-outs for e-commerce and news sites. Worse, they argue, enforcement would likely impose age verification on all users, which courts have repeatedly rejected as a prerequisite for accessing speech.

The NetChoice lawsuit asks the court to declare the statute unconstitutional and to block it from taking effect while the case plays out.

The arguments

The organization asserts three arguments against Ohio’s statute: First, its “blanket parental-consent requirement” would violate minors’ First Amendment rights. Next, the group contends the law imposes restrictions based on content and speaker, both of which courts have viewed skeptically. NetChoice also argues the provisions are vague enough that companies won’t know whether the law applies to them, or what their responsibilities are if it does.

NetChoice’s arguments rely heavily on a case known as Brown v. Entertainment Merchants Association in which the U.S. Supreme Court struck down a California law restricting minors’ access to violent video games.

Writing for the majority, Justice Antonin Scalia described video games as the latest in a succession of media that governments have attempted to regulate as “harmful” to minors. While the state has a legitimate interest in protecting youth, he reasoned, “that does not include a free-floating power to restrict the ideas to which children may be exposed.”

“This lawsuit is cowardly but not unexpected,” Lt. Gov. Jon Husted said in a statement.

Husted was the driving force behind the measure, which lawmakers included in the state operating budget back in July. Because it advanced as part of the budget, the legislation eschewed the typical committee process in which supporters and detractors could testify on its content. Since then, a similar law in Arkansas has been halted by a federal court.

Still, Husted insisted the measure’s consent provisions are needed.

“In filing this lawsuit, these companies are determined to go around parents to expose children to harmful content and addict them to their platforms,” Husted said. “These companies know that they are harming our children with addictive algorithms with catastrophic health and mental health outcomes.”

In its court filing, NetChoice insisted parents already have myriad options for controlling what their children see. The group describes control options at essentially every step in the chain: with their internet service provider, their router, an internet-connected device, a third-party app, or the social media site itself.

The question of who’s covered

The problem for Husted and the law’s backers isn’t knowing what they want — it’s describing it in statute. How do you define “social media” so that kids need consent before signing up for Instagram, TikTok, or whatever comes next, without also sweeping in a slew of other unrelated online entities?

The state law applies to any website where users have a profile and interact with other users by posting or reacting to content.

NetChoice explained that definition would sweep in an exceptionally broad array of platforms. Some are unsurprising, like gaming platforms for Xbox or PlayStation. Others fall into a grey area. Most kids aren’t setting up a profile on LinkedIn or GitHub, but young entrepreneurs and programmers aren’t unheard of. Educational sites like Blackboard and general information services like Quora or TripAdvisor could all be subject to the law.

It’s a definition broad enough that it would include user reviews on a site like Amazon or the comments section on a news website. To get around that, drafters included carve-outs for e-commerce and “established and widely recognized media outlet(s).”

NetChoice derides those distinctions. It notes the bill never attempts to define which media outlets qualify, and the e-commerce provision yields puzzling outcomes. A minor could review a toaster on Amazon, the group argues, but be prevented from posting the same review on Facebook.

“A minor would be able to create an account to read and post reviews about televisions without parental consent,” the group adds, “but would need parental consent to read and post reviews about television shows.”

More confounding, NetChoice argues, is the law’s 11-point list that “may” be used to determine whether a website targets children. Again, some of those factors make sense — for instance, the “use of animated characters” or “presence of child celebrities.” Others are vague and open-ended. A website’s “subject matter,” “design elements,” “visual content” and “music” will influence its audience, but those terms offer little in the way of useful direction to the site’s owners.

Even if a company attempts to get parental consent, it will face several challenges. Confirming a parental relationship between the individual giving consent and the minor trying to access a website is not always straightforward.

“These difficulties are compounded when parents do not, for example, have the same last name or address as their children or cases in which parents disagree about whether to grant consent,” the complaint adds.

But for all its complaints about the law’s application, NetChoice describes existing restrictions for minors under the age of 13 as a success. The organization’s complaint highlights different versions of YouTube and TikTok available for that age group.

The question of what’s covered

The core of NetChoice’s argument relies on the First Amendment. The group insists Ohio’s law places a barrier between minors and protected speech. Courts have repeatedly found that minors, even though they aren’t adults, still have First Amendment rights, NetChoice argues.

Citing Brown, the group argues Ohio’s law doesn’t enforce parental authority. Instead, it attempts to “impose governmental authority, subject only to a parental veto.”

The group also warns that by attempting to impose limitations for minors, the law might require age verification for every user. Although the law doesn’t explicitly require websites to verify all users’ ages, in practice it’s hard to see how they’d avoid it, which presents broader First Amendment problems.

“The Supreme Court has rejected laws that require people to provide identification or personal information to access protected speech,” NetChoice notes.

The group also raises questions about how much of a website minors can access. While the law prohibits a minor from setting up a profile without consent, NetChoice argues it’s unclear whether that means the site owner must bar access completely or only to the portions of the site that require a login.