Articles in this Issue
- Warby Parker Fined $1.5 Million Following HHS Investigation of Credential Stuffing Security Breach
- EU Digital Services Act: Harmonizing Transparency Reporting Requirements
- Europe's Push to Safeguard Children's Personal Data
- Unregistered Data Broker Shuttered for Three Years by the California Privacy Protection Agency
- Dechert Tidbits
Warby Parker Fined $1.5 Million Following HHS Investigation of Credential Stuffing Security Breach
On February 20, 2025, the U.S. Department of Health and Human Services (“HHS”) Office for Civil Rights (“OCR”) announced a $1.5 million civil penalty against eyewear manufacturer and online retailer Warby Parker Inc. after a cyberattack compromised the electronic protected health information (“ePHI”) of almost 200,000 Warby Parker customers. OCR initiated its investigation in December 2018, after Warby Parker filed a breach report with OCR. OCR’s investigation found that from September to November 2018, unauthorized third parties accessed Warby Parker customer accounts using credentials obtained from data breaches on other websites, a tactic known as “credential stuffing.” The ePHI that was potentially compromised included customers’ names, addresses, payment card details, and eyewear prescription information. Warby Parker subsequently reported additional credential stuffing attacks to OCR in September 2019, January 2020, April 2020, and June 2022, which led to unauthorized access to ePHI.
In its Notice of Proposed Determination, issued in September 2024, OCR determined that Warby Parker violated three provisions of the Health Insurance Portability and Accountability Act’s (“HIPAA”) Security Rule, which sets forth standards for covered entities and business associates to safeguard electronic protected health information. Specifically, OCR found that Warby Parker failed to: (1) conduct an accurate and thorough risk analysis to identify the potential risks and vulnerabilities to ePHI in Warby Parker’s systems; (2) implement security measures sufficient to reduce the risks and vulnerabilities to ePHI to a reasonable and appropriate level; and (3) implement procedures to regularly review records of information system activity. The Notice of Proposed Determination sought to impose the $1.5 million civil penalty, and, according to the Notice of Final Determination, the eyewear company did not contest the penalty or request a hearing.
Takeaway: It is essential that companies subject to HIPAA have a comprehensive HIPAA compliance program in place, conducting regular risk assessments and maintaining appropriate security controls to protect health information. In particular, companies that have been the subject of repeated, successful attacks will want to be in a position to demonstrate to regulators that they have taken appropriate steps to address the attacks and reduce the risk of recurrence. While no system is totally immune to compromise, the key to successful negotiations with regulators in the wake of a security incident is often the ability to point to robust policies, processes, and procedures that were in place notwithstanding the compromise.
EU Digital Services Act: Harmonizing Transparency Reporting Requirements
In November 2024, the European Commission published an Implementing Regulation that standardizes the format, content, and reporting periods for transparency reports under the EU Digital Services Act (“DSA”). From July 1, 2025, providers of online intermediary services (such as online marketplaces, social media platforms, app stores, cloud providers, and search engines) must use the templates prescribed in the Implementing Regulation for their mandatory transparency reports.
The DSA includes a baseline set of obligations generally applicable to intermediary service providers, with additional obligations scaling progressively based on the nature and size of the business. For instance, the DSA mandates that all intermediary service providers publish annual transparency reports concerning their content moderation activities. These reports must include data on orders received from authorities, complaints handled, and other content moderation practices. “Very large online platforms” and “very large online search engines” face the strictest transparency requirements, including twice-yearly reporting covering detailed information on their content moderation teams, such as their language capabilities.
Providers must make their transparency reports publicly available no later than two months after the end of each reporting period. The Implementing Regulation sets out the reporting periods and includes a transition period ending on December 31, 2025. The first fully harmonized reporting cycle will cover January 1, 2026, through December 31, 2026.
Takeaway: Entities in scope of the DSA will want to begin establishing data collection processes and familiarizing themselves with the reporting templates to integrate the specific requirements of the Implementing Regulation into their operations ahead of the first reporting cycle. Early preparation will help to reduce the risk of fines under the DSA, which can total up to 6% of annual worldwide turnover.
Europe's Push to Safeguard Children's Personal Data
On February 11, 2025, the European Data Protection Board (“EDPB”) adopted Statement 1/2025 on age assurance, which aims to standardize age verification practices across the EU. The statement seeks to balance the protection of children's rights with the safeguarding of personal data, providing both high-level principles and specific guidance for service providers. The EDPB emphasized that age assurance mechanisms must align with the EU’s regulatory framework, including the General Data Protection Regulation (“GDPR”), the Audiovisual Media Services Directive, and the Digital Services Act.
Service providers are advised to conduct Data Protection Impact Assessments to evaluate the necessity and proportionality of age assurance measures, which should be the least intrusive option that remains effective and reliable. The EDPB highlights the importance of respecting individuals' fundamental rights and freedoms and adhering to GDPR principles such as lawfulness, fairness, transparency, purpose limitation, and data minimization. According to the EDPB, when implementing age assurance measures, the best interests of the child should be a primary consideration for all parties involved.
The UK Information Commissioner's Office (“ICO”) has also focused increasingly on this issue since the Children’s Code came into force in 2021. Since then, the ICO has continued to scrutinize how social media and video platforms collect and use children’s data in the UK. Recently, the ICO launched investigations into TikTok, Reddit, and Imgur to examine their data protection practices concerning UK child users, including their implementation of age assurance measures.
Takeaway: The new EDPB statement marks a significant step toward a unified approach to age verification in the EU, balancing child protection with data privacy. The commitment in both the EU and the UK to protecting children's data online underscores the importance of handling children’s data with care. Companies will want to review and, if appropriate, adjust their age verification processes to comply with the EDPB’s guidance, and take the opportunity to check their compliance with the UK Children’s Code.
Unregistered Data Broker Shuttered for Three Years by the California Privacy Protection Agency
On February 27, 2025, the California Privacy Protection Agency (“CPPA”) announced that it had reached a settlement agreement with Background Alert Inc. (“Background Alert” or the “Company”), a California-based data broker that advertised its “scary” ability to “dig up” consumer information. The settlement requires the Company, within 15 days, to cease operations through 2028 for allegedly failing to register with the CPPA as a data broker and pay annual fees as required by the California Delete Act, and it provides for a $50,000 fine if the Company violates any term of the settlement agreement. Specifically, the CPPA alleged that Background Alert amassed billions of public records, drew inferences from those records to create groups of associated individuals, and generated consumer profiles for sale, all without registering as required by the Delete Act. This is the seventh enforcement settlement between a data broker and the CPPA relating to an alleged failure to register under the Delete Act, and it comes amidst a CPPA enforcement sweep with respect to the Delete Act that began in October 2024. It is the first settlement to require an unregistered data broker to cease its operations; the six prior settlements involved monetary penalties only.
Takeaway: This settlement is a significant turning point in the CPPA’s ongoing investigative sweep into data brokers operating in California and makes clear that companies that fail to comply with the Delete Act could face draconian consequences, up to and including being shut down. In light of the CPPA’s enforcement sweep, companies that purchase and license consumer information will want to review their compliance with the Delete Act so that, if the agency comes calling, their practices will pass muster.
Dechert Tidbits
Virginia Legislature Passes Act Regulating High-Risk Artificial Intelligence
The Virginia legislature has passed the High-Risk Artificial Intelligence Developer and Deployer Act, aiming to regulate artificial intelligence (“AI”) systems defined as “high risk,” including tools used to make critical decisions in areas such as education, employment, financial services, legal services, and health care. Similar to the provisions of Colorado’s AI Act (which we previously covered in Issue 71 of Cyber Bits), the bill would require AI developers to disclose risks posed by AI systems and how they tested and mitigated those risks. “Deployers” who use AI tools would be required to use reasonable care to protect consumers from foreseeable algorithmic discrimination. The bill passed narrowly and will not become law unless signed by Governor Glenn Youngkin by March 24, 2025.
California Attorney General Agrees to Settlement Striking Portions of Social Media Law as Unconstitutional
California Attorney General Rob Bonta has agreed to a settlement with X Corp. (formerly Twitter) that would strike as unconstitutional certain provisions of Assembly Bill 587, which required social media companies to disclose their content moderation policies for identifying and removing hate speech, disinformation, and other illegal or offensive content. The settlement follows a ruling from the Ninth Circuit in X Corp.’s favor last September. The rest of the law remains intact, including a provision requiring social media companies to report their terms of service and any modifications to the attorney general’s office.