[co-author: Stephanie Kozol]*
On April 29, Michigan Attorney General (AG) Dana Nessel filed a lawsuit against Roku, Inc. (Roku), a smart TV, streaming device, and streaming service provider, alleging that Roku collects and monetizes personal data from children without proper parental consent in violation of the Children’s Online Privacy Protection Act (COPPA) and other laws, including the Video Privacy Protection Act (VPPA) and the Michigan Consumer Protection Act.
The Complaint
The Michigan AG’s complaint details the accusations against Roku. At a high level, the complaint alleges that:
- Roku fails to protect minor users: The AG asserts that Roku is an “operator” of online services under COPPA, knew that children use its products, and directed specific content toward children. Roku allegedly failed to meet its COPPA obligations by not screening for underage users and not obtaining parental consent for child users.
- Roku improperly collects, uses, and retains children’s personal data: The AG alleges that Roku uses, retains, and sells children’s personal data including voice, location, IP addresses, and browsing history in violation of COPPA.
- Roku misrepresents its privacy practices: According to the allegations, Roku’s privacy policy failed to disclose its use of the data of minors, which constitutes “unfair, unconscionable, or deceptive” practices that violate consumer protection laws.
In addition to the various alleged statutory violations, the complaint included intrusion upon seclusion and unjust enrichment claims.
Why It Matters
The complaint brings two important issues to light:
- Protecting children and teens online is a top regulatory priority. Children’s and teens’ online safety and privacy issues have bipartisan support at both state and federal levels.
  - The enforcement priority is evident not only in the dozens of bills introduced across multiple states in the form of social media legislation, age-appropriate design codes, and children’s data protection acts, but also in the Federal Trade Commission’s (FTC) continued emphasis on children’s rights as an enforcement priority, the finalization of the first set of updates to the COPPA Rule in more than a decade, and the FTC’s recent workshop, “The Attention Economy: How Big Tech Firms Exploit Children and Hurt Families.”
  - Additionally, the Michigan AG’s action underscores state AGs’ sustained focus on child online safety. For example, groups of state AGs advocated in favor of the COPPA Rule amendments and have urged Congress to pass the Kids Online Safety Act (KOSA). Just last month, a group of 28 state AGs sent a letter to Meta demanding that the company implement basic safeguards to prevent its AI assistant from exposing children to sexually explicit content.
- Technology speaks louder than words. Regulators are taking an increasingly technology-focused and forensically sophisticated approach to reviewing companies’ practices. They will continue to employ technologists, computer scientists, engineers, and publicly available information about data brokerage and the online advertising ecosystem to follow code and data, understand how data is used, and ensure those uses align with both applicable laws and covered companies’ consumer-facing disclosures.
What Companies Can Do Now
There have always been commercial and compliance-driven incentives for companies to invest in technology to understand and manage their data in real time, but there is now a pressing need for organizations to review and verify operational controls throughout their data ecosystems. Though not new advice, companies should:
Commit to Privacy-by-Design. The complaint singled out Roku for not offering users the option to create separate profiles for child users, a feature its major competitors provide. As technology and law continue to evolve at a dizzying pace, companies must demonstrate a commitment to continually evolving privacy-protective features, especially for minor audiences. Privacy assessments help organizations identify opportunities for programmatic maturity and enhancement, and participating in industry groups committed to best practices, standards, and consistent privacy-protective experiences for parents and minors across platforms is another good bet.
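The profile-separation idea above can be illustrated with a minimal sketch. Everything here is hypothetical (the `Profile` class, field names, and the gating rule are illustrative, not any company's actual design); the point is simply that a per-profile child flag can gate data collection until verifiable parental consent is recorded.

```python
from dataclasses import dataclass

# Hypothetical sketch of profile-based gating. All names are illustrative.
@dataclass
class Profile:
    name: str
    is_child: bool
    parental_consent: bool = False  # verifiable parental consent on file?

def may_collect_personal_data(profile: Profile) -> bool:
    # COPPA-style rule of thumb: no personal-data collection from a
    # child profile without verifiable parental consent.
    if profile.is_child:
        return profile.parental_consent
    return True

adult = Profile("parent", is_child=False)
kid = Profile("kid", is_child=True)
print(may_collect_personal_data(adult))  # True
print(may_collect_personal_data(kid))    # False until consent is recorded
```

In practice the gate would sit in front of every collection pathway (telemetry, voice, location), not a single function, but the design choice is the same: the child flag and consent status travel with the profile and are checked before collection, not after.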
Invest (or reinvest) in data mapping. Organizations must have a dynamic and technically reliable way of maintaining a current inventory of personal data and the systems that hold it. These “data maps” must identify data by source and by the relevant data subjects, especially where knowing a data subject’s age and consistently managing appropriate consents is required. Organizations should be able to tie personal data to its specific purposes so they can ensure legal and contractual compliance, make accurate disclosures, and confirm that data is processed only intentionally. This is critically important for sensitive data, including the personal data of children and teens, to ensure it is properly minimized, deleted, and anonymized, as applicable.
Invest in and test technology. It is evident from the complaint that the Michigan AG used technology to understand Roku’s data practices and gather evidence to substantiate its allegations, and state AGs are increasingly recruiting technologists and data scientists to join their enforcement teams. Companies and their legal teams must communicate closely with data-adjacent business teams (e.g., product engineers, backend developers, marketing teams, data scientists, and data analysts) to ensure internal procedures and code are consistent with the company’s marketing claims, consumer disclosures and representations, commercial terms, and contractual language. Critically, the legal and marketing claims and tools organizations rely on to support product performance or compliance initiatives must be technically validated through internal testing, assessments, and compliance audits. These routinely scheduled, cross-functionally staffed efforts establish ongoing accountability and reduce compliance risk through routine monitoring and programmatic adjustment of compliance controls and other safeguards.
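One concrete form the technical validation above can take is an automated check that what the code actually collects matches what the privacy policy discloses. The sketch below is hypothetical (the disclosed-category set and the event payload are invented for illustration): it diffs the fields emitted by a telemetry event against the categories the company has disclosed, so undisclosed collection fails a routine test rather than surfacing in an AG’s forensic review.

```python
# Hypothetical compliance check: compare the data categories a pipeline
# actually emits against the categories the privacy policy discloses.
# All names and values are illustrative.

DISCLOSED_CATEGORIES = {"ip_address", "device_id", "viewing_history"}

def collected_categories(event: dict) -> set[str]:
    # In practice this would inspect real telemetry payloads or schemas.
    return set(event.keys())

def undisclosed(event: dict) -> set[str]:
    # Anything collected but not disclosed is a gap to remediate.
    return collected_categories(event) - DISCLOSED_CATEGORIES

event = {"ip_address": "203.0.113.7", "device_id": "abc", "voice_clip": b"..."}
print(undisclosed(event))  # {'voice_clip'} -> collection exceeds disclosures
```

Run as part of a scheduled test suite, a check like this turns the “code matches disclosures” obligation into something engineering can enforce continuously rather than a point-in-time audit finding.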
Conduct regular risk assessments. Privacy risk assessments are systematic evaluations of the processing of personal information against internal and external standards and organizational principles. Most comprehensive U.S. state privacy laws and the EU General Data Protection Regulation require privacy risk assessments. Routine use of privacy risk assessments can help organizations achieve privacy-enhanced commercial objectives by enabling organizations to analyze their capabilities, risk profile, and limitations.
*Senior Government Relations Manager