Social Links: AI, Age-Gating, and the COPPA Cabana

Morrison & Foerster LLP - Social Media

CONGRESS MOVES TO BLOCK ALL STATE AI LAWS

In a late-night addition to a sweeping budget reconciliation package, the House Energy and Commerce Committee inserted a provision that would block state and local governments from regulating artificial intelligence for the next decade.

The provision would bar enforcement of any local or state law that attempts to regulate “artificial intelligence models, artificial intelligence systems, or automated decision systems,” unless those laws are designed specifically to speed up AI development. The scope of the provision is broad, encompassing technologies ranging from generative AI to facial recognition and algorithmic decision-making in hiring, housing, and public benefits.

The legislative effort arrives as states have begun to step into the regulatory void. New York recently passed a law requiring bias audits of automated hiring tools. California adopted measures mandating AI transparency in healthcare and public services. But under the new federal proposal, such state efforts could be nullified.

“This bill is a sweeping and reckless attempt to shield some of the largest and most powerful corporations in the world . . . from any sort of accountability,” said Lee Hepner, senior legal counsel at the American Economic Liberties Project.

That accountability has already come under scrutiny. RealPage, a software company specializing in property management, is facing a multistate lawsuit alleging it helped landlords collude to drive up rents using its pricing algorithm. SafeRent, another firm, recently settled claims that its opaque tenant scoring system discriminated against Black and Hispanic renters.

If passed, the federal measure could derail those regulatory efforts. For now, the battle over AI oversight is shifting from statehouses to Capitol Hill, where industry access is deep and bipartisan skepticism of regulation runs strong.

WHAT’S UP WITH THE TIKTOK BAN?

Remember 20 minutes ago when it appeared TikTok was about to be banned in the United States? Is that even still a thing? Well, sort of.

President Trump says he’s willing to hit the snooze button again on a TikTok ban if its Chinese parent company, ByteDance, doesn’t sell the app by the current deadline of June 19. In an interview with Meet the Press, the president said he’d extend the pause, adding he has “a little sweet spot in my heart” for the platform, especially given its popularity with young Americans.

This isn’t the first reprieve. Earlier this year, President Trump signed an executive order delaying enforcement of a divest-or-ban law by 75 days, establishing the current deadline. His own TikTok account has since become a juggernaut, racking up billions of views.

Even TikTok CEO Shou Zi Chew took note, publicly thanking the president for being open to a solution. For now, the clock keeps ticking and tocking.

OHIO APPEALS AGE-GATING RULING

Ohio Attorney General Dave Yost isn’t giving up on the state’s push to limit kids’ access to social media. He’s appealing a federal judge’s ruling that struck down a law requiring platforms to verify users’ ages and get a parent’s OK for anyone under 16.

The law was blocked in April by U.S. District Judge Algenon L. Marbley, who said it stepped on the First Amendment. Yost, who recently suspended his gubernatorial campaign, filed the appeal with the Sixth Circuit Court of Appeals, hoping to revive the measure.

Under the law, social media companies would have to check users’ ages through things like government ID, credit cards, or digital consent forms, and obtain “verifiable” permission from a parent or guardian for younger users. But NetChoice, unsurprisingly, sued to stop the law, arguing it was so vague it could end up limiting access to entirely legal content. Your favorite blog covered all the action last April.

Judge Marbley said he appreciated the state’s effort to protect kids but added that “even the government’s most noble entreaties” still have to respect the Constitution.

Ohio’s not alone here. A handful of states, including Arkansas, Texas, Louisiana, Utah, and Florida, have passed similar laws aimed at age-gating social media. Most of those have either been blocked by courts or are tied up in legal battles of their own.

Still, lawmakers in Columbus are already mulling a new workaround. One idea being tossed around would shift the consent requirement to app stores instead of the platforms themselves. Whether that works out remains to be seen, but it’s clear Ohio officials are determined to stay in the fight when it comes to kids and social media.

MEANWHILE, IN TEXAS . . .

Texas is once again on the far edge of the national debate over kids and social media. A new proposal, House Bill 186, would ban anyone under 18 from creating a social media account and require platforms to verify users’ ages using personal data like IDs, credit cards, or other “transactional” information. The bill passed the Texas House with bipartisan support and appears to have momentum in the Senate.

Parents could also request the deletion of their children’s accounts, and companies would be required to act within 10 days. The bill would apply broadly to sites designed for sharing posts, images, or messages, but not to email, news, or gambling platforms. That last exemption is a bit of a head-scratcher, as minors cannot legally gamble anywhere in the U.S.

Supporters, including Rep. Jared Patterson (the bill’s sponsor), say the legislation addresses a growing youth mental health crisis. “Social media is the most harmful product our kids have legal access to in Texas,” Patterson wrote.

Sen. Adam Hinojosa echoed that message, telling colleagues that online spaces now pose the greatest risk to children. “With House Bill 186,” he said at a recent hearing, “we confront the evil before us and boldly say, ‘You cannot have our children.’”

But critics say the law could backfire. Morgan McGuire, a 17-year-old Texan and TikTok creator, warned that cutting teens off from social media until age 18 drops them into the digital world just as support systems fall away. Others argue the bill oversteps. The usual suspects, NetChoice and the Computer & Communications Industry Association, both cite First Amendment concerns and say the bill tramples on parental choice.

Texas isn’t alone. Ten states have passed some form of social media restriction for minors since mid-2024. But only Florida has floated an outright ban, which stops at age 14.

Privacy concerns loom large, too. The bill mandates that platforms delete personal data collected during age checks but offers no timeline or clear rules for doing so. Critics warn it could lead to massive data hoards that could wind up in the hands of bad actors.

HB 186 may aim to protect kids, but it’s raising new questions about who should decide how young people navigate life online: parents, platforms, or the state.

FTC AMENDS COPPA

The Federal Trade Commission has finalized the first amendments to the Children’s Online Privacy Protection Rule (COPPA) since 2013. The revisions take effect on June 23 of this year, with most operators required to comply by April 2026. After weighing roughly 300 public comments, the Commission adopted the changes unanimously, citing shifts in the way children use online services.

Here are some key updates:

  • The multifactor test for deciding whether a site or service is “directed to children” is unchanged but now lists illustrative evidence (marketing materials, user reviews, similar-site demographics) to clarify how the FTC applies the standard.
  • A new standalone “mixed audience” definition confirms that services appealing to both children and older users may rely on COPPA’s limited parental-consent exceptions once they collect age information.
  • Operators must maintain a written security program with safeguards appropriate to children’s data. A company-wide security policy will suffice if it expressly protects minors’ data.
  • Parents must give an additional, distinct authorization before a child’s data is disclosed to any outside party. Sharing for advertising, monetization, or AI training is not considered integral to service operation.
  • COPPA still lets companies collect certain data without parental consent if it’s only used for basic site functions, but now they must clearly explain what the data is used for and how it’s protected from misuse, like being used for ads.
  • Biometrics such as fingerprints, voiceprints, retina patterns, and genetic data are expressly covered, but broader references to gait or facial-derived data were dropped.
  • Children’s data may be kept only as long as “reasonably necessary,” and operators must publish a written retention policy.
  • Knowledge-based authentication and facial recognition (with human review) join the list of approved parental-consent methods.
  • The COPPA Safe Harbor programs will be required to publicly disclose their membership lists and report additional information to the FTC.

The Commission declined to curb push notifications or let schools provide blanket consent for educational technology but signaled it may use future enforcement or rulemaking to address engagement tactics. These are significant changes that could cause real compliance headaches for social media companies.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Attorney Advertising.

© Morrison & Foerster LLP - Social Media
