Last week, New Jersey Attorney General Matthew J. Platkin announced a lawsuit against Discord, Inc., a popular messaging application provider, for allegedly misleading parents about the efficacy of its safety controls and obscuring risks facing children on the application.
The complaint (filed partially under seal) generally alleges the following:
- Structure of Discord’s Platform: The complaint asserts that Discord designed its application to appeal to children and to “encourage unmoderated engagement among users.” Specifically, according to AG Platkin, the app by default allows users to (i) receive friend requests from anyone on the app and (ii) freely message and receive messages from any friend. AG Platkin’s press release explains that, as a result, child users “can be—and are—inundated with explicit content.”
- “Safe Direct Messaging” Feature: The complaint alleges that from 2017 to 2023, Discord offered a “Safe Direct Messaging” setting that allowed users to determine whether the app would “scan” messages for explicit content. The setting offered three options: (1) “Keep me safe. Scan direct messages from everyone”; (2) “My friends are nice. Scan direct messages from everyone unless they are a friend”; and (3) “Do not scan. Direct messages will not be scanned for explicit content.” According to the complaint, Discord made option 2 (i.e., “my friends are nice”) the default setting, meaning messages that child users received from friends were not automatically scanned for explicit content. In addition, the complaint suggests that options 1 (i.e., “keep me safe”) and 3 (i.e., “do not scan”) did not work as intended, despite Discord’s representations (although details relating to these allegations are largely redacted).
- Use of App by Children Under 13: The lawsuit also alleges that Discord “actively chose not to” verify the date of birth of new users of its app, allowing children under 13 to register for accounts. Moreover, the suit alleges that banned users (such as those who previously circulated child sexual abuse material) can register for new accounts by using a new email address. In the press release, AG Platkin said, “Discord only requires individuals to enter their date of birth to establish their age when creating an account—nothing more. Discord does not require users to verify their age or identity in any other way. Simple verification measures could have prevented predators from creating false accounts and kept children under 13 off the app more effectively.”
Through the suit, AG Platkin seeks a number of remedies, including an injunction to stop Discord from violating the New Jersey Consumer Fraud Act (CFA), civil penalties, and disgorgement of any ill-gotten profits. Notably, the complaint also alleges that because Discord violated the Children’s Online Privacy Protection Act (COPPA) by failing to obtain verifiable parental consent, Discord “thereby engaged in a presumptively unlawful commercial practice in violation of the CFA.” The suit does not explicitly include a COPPA count, however.
The Discord complaint builds upon similar actions by the New Jersey AG’s Office, including a multistate suit in 2024 against TikTok for failing to protect children from harms resulting from excessive time on the app (which we discussed here), and a 2023 suit against Meta based on similar allegations. This action also demonstrates that state attorneys general are continuing their focus on children’s (and teens’) privacy in 2025.
If you offer a product or service that potentially exposes children to unwanted contact or puts their personal information at risk, or if you misrepresent the safeguards you have in place to protect children, you should expect enforcers to take notice and use their full authority to address those concerns.