OHIO SOCIAL MEDIA LAW STRUCK DOWN
On April 16, 2025, a federal judge in Ohio ended the state’s ambitious plan to severely limit social media access for minors. The court permanently blocked Ohio’s Attorney General from enforcing the Parental Notification by Social Media Operators Act, effectively shelving the law.
Passed with admirable optimism in July 2023 and set to roll out in January 2024, the Act required social media platforms to make sure users were old enough to drive before letting them post memes and dance videos (unless their folks said it was okay). The court was unpersuaded by the state's well-intentioned lawmaking and struck the law down on First Amendment grounds. U.S. District Court Judge Algenon L. Marbley pointedly noted the state’s failure to clear the notoriously high bar of strict scrutiny, which doomed the Act from the start. Specifically, the judge found that the legislation cherry-picked which online platforms it wanted to target, essentially trying to referee a game it didn’t fully understand. “Ohio’s response to a societal worry that children might be harmed if they are allowed to access adult-only sections cannot be to ban children from the library altogether absent a permission slip,” Judge Marbley wrote in the decision.
The court framed the situation as the classic standoff between two American sacred cows: the rights of kids to express themselves freely online and the rights of parents to raise their offspring without government micromanagement. Ultimately, the ruling reminded Ohio legislators that freedom of speech, even adolescent babble, is still constitutionally protected.
This decision adds Ohio to the growing list of states whose youth social media laws have been successfully challenged by NetChoice, the tech trade association that’s rapidly becoming the grim reaper of legislation restricting minors’ social media use.
ANOTHER ARKANSAS SOCIAL MEDIA LAW STRUCK DOWN
Arkansas has once again found itself on the losing side of a First Amendment battle. The state’s latest effort, the Social Media Safety Act (Act 689), met the same fate as its 2023 predecessor (SB 396): enjoined as unconstitutional.
The plaintiff in both cases, unsurprisingly, was NetChoice. Unless you’ve been living under a rock with no Wi-Fi, you know it from similar successful challenges in California, Utah, Maryland, Mississippi, Ohio, and Texas (not to mention extensive coverage in your favorite blog). Courts in those states, as in Arkansas, have found that these laws impose undue burdens on free speech and are not properly tailored to address harms to minors. The Arkansas ruling marked the first time NetChoice obtained a permanent injunction against a restrictive social media law, and it was quickly followed by another victory in Ohio.
The Social Media Safety Act would have required social media companies to verify that users were either at least 18 years old or had verifiable parental consent before creating an account. The law also mandated the use of third-party vendors to perform age verification, which could involve government-issued identification or other “commercially reasonable” methods. After granting access, platforms would be prohibited from retaining any identifying information, a provision that invited as much skepticism as it did compliance challenges.
Although Arkansas’ latest statute has been struck down, the broader state-level push to regulate minors’ use of social media continues. Virginia is considering an amendment to its data privacy law that would restrict children under 16 to one hour of daily social media use, while also requiring age verification mechanisms. Arkansas has introduced additional legislation targeting social media companies, and Utah has moved forward with app store age restrictions that will be phased in between May 2025 and December 2026.
Texas has introduced HB 186, another attempt at mandating age verification, despite previous setbacks. Not to be outdone, Florida has proposed SB 868, which would authorize law enforcement to access minors’ messages during investigations, grant parents full access to a minor’s communications, and prohibit the use of accounts featuring disappearing messages.
The trend is clear. States remain committed to policing minors’ online activity, but so far their efforts have largely faltered under constitutional scrutiny. NetChoice, it seems, will continue to have no shortage of work.
X SUES MINNESOTA OVER DEEPFAKE LAW
X has filed a federal lawsuit against Minnesota Attorney General Keith Ellison, challenging the constitutionality of a state law aimed at combating election-related deepfakes.
Passed with bipartisan support in 2023, the law makes it a crime to post fake videos intended to influence elections, with penalties including fines or even prison time. Legislators were understandably concerned about the growing potential for AI-generated misinformation to distort political campaigns. X, however, argues that the statute’s language is so vague that no reasonable platform could confidently determine what it allows.
According to the lawsuit, the risk of criminal penalties would force platforms to err on the side of caution, pulling down anything that even faintly resembles a deepfake. In X’s view, this would chill lawful political speech—precisely the kind of “uninhibited, robust, and wide-open” debate the First Amendment was built to protect.
X also points out that it already maintains internal mechanisms, such as the “Community Notes” feature, to address misleading content without heavy-handed censorship. The platform is asking the court to, among other things, declare the statute in violation of both the Minnesota and U.S. Constitutions, nullify it, and prevent its enforcement.
Supporters of the law are unsurprised by the challenge. State Senator Erin Maye Quade, one of the bill’s sponsors, defended the statute as narrowly tailored to prevent voter deception. In remarks that left little room for ambiguity, she suggested X’s motives might be less about free speech and more about preserving the ability to influence elections through technology. As for Attorney General Ellison’s office, the response so far has been measured: officials say they are reviewing the lawsuit and will respond “in the appropriate time and manner.”
THE NINTH CIRCUIT’S EXPANSIVE TAKE ON INTERNET JURISDICTION
On April 21, 2025, the Ninth Circuit, sitting en banc, delivered a 10-1 decision reviving Briskin v. Shopify, a class action accusing e-commerce platform Shopify of violating privacy laws through the use of internet cookies. The court adopted a notably broad view of specific personal jurisdiction over online businesses—a development likely to raise eyebrows across the tech and legal communities.
The case arose when California resident Brandon Briskin alleged that Shopify surreptitiously installed tracking cookies on his iPhone after he made a purchase through Shopify’s platform. According to the complaint, the cookies monitored Briskin’s movements and online behavior without his consent, violating a host of California privacy statutes, the state constitution, and even the Fourth Amendment.
Both the district court and a Ninth Circuit panel initially dismissed the claims for lack of jurisdiction. Shopify is headquartered in Canada, and none of its U.S. subsidiaries are based in California. However, the en banc majority reversed the panel, concluding that Shopify had “purposefully directed” its conduct at California consumers by allegedly monetizing their personal data.
Judge Consuelo Callahan, the sole dissenter, warned that the majority’s reasoning effectively creates a “traveling cookie” doctrine, suggesting that if a company’s tracking software attaches to a consumer’s device, jurisdiction follows that individual wherever they go.
The decision opens the door to broader assertions of jurisdiction, with potentially far-reaching consequences for social media, online commerce, and any platform that uses cookies. Given the stakes, we’ll be watching this one closely.
UTAH CRACKS DOWN ON “FAMILY VLOGGING” AFTER HIGH-PROFILE ABUSE CASE
In a state where family vlogging is practically an industry, Utah has become the latest to step into the legal void around child influencers. Gov. Spencer Cox signed a new law aimed at protecting minors who appear in online content, a move that follows the child abuse conviction of former YouTube creator Ruby Franke.
The new legislation gives adults the right to scrub content they were featured in as minors and requires parents to set aside earnings for child participants. Specifically, online creators who earn more than $150,000 annually from child-centric content must deposit 15% of those earnings into a trust, accessible when the child turns 18. The protections extend beyond YouTubers and TikTokers, applying also to child actors in film and television.
Illinois and Minnesota have already adopted laws allowing children to sue for their share of online profits, but Utah’s approach goes further by offering a right to erasure. This is a notable shift for a state with deep roots in family-focused social media culture.
The legislation arrives against the backdrop of the Franke scandal, where the once-popular “8 Passengers” YouTube channel crumbled amid allegations (and admissions) of severe child abuse. Franke’s ex-husband told lawmakers earlier this year that he regretted allowing their family to be filmed for profit, warning that “children cannot give informed consent to be filmed on social media, period.”
Utah’s family-centric online economy, buoyed by groups like the so-called “MomTok” community, has long flourished in the absence of meaningful regulation. Now, amid growing public discomfort with the commodification of children’s lives, lawmakers are trying to put some guardrails in place. As Eve Franke, the youngest of the Franke children, wrote in testimony supporting the law: “Kids deserve to be loved, not used by the ones that are supposed to love them the most.”