Dechert Cyber Bits - Issue 29

Dechert LLP

Articles in this issue

  • The Proposed EU-U.S. Data Privacy Framework Faces Potential Obstacles
  • The Federal Trade Commission Launches New Office of Technology
  • The EU’s Digital Services Act
  • Supreme Court of Illinois Adopts Accrual Rule that Will Likely Lead to Increases in Awards under BIPA

The Proposed EU-U.S. Data Privacy Framework Faces Potential Obstacles

On February 14, 2023, the European Parliament Committee on Civil Liberties, Justice and Home Affairs (the “EP Committee”) released a draft opinion advising against adopting an adequacy decision for the U.S. based on the proposed EU-U.S. Data Privacy Framework (“DPF”). In swift succession, the European Data Protection Board (“EDPB”) followed with its own opinion on the DPF, adopted on February 28, which welcomed substantial improvements but expressed remaining concerns.

The EU-U.S. Privacy Shield framework was invalidated in the Schrems II case in 2020, leaving uncertainty around future EU to U.S. data transfers. As discussed in our OnPoint, in October 2022, President Biden signed an Executive Order (“EO”) for the new DPF. However, privacy activists have criticized the DPF, pointing to continued permitted use of “bulk surveillance” and claiming that redress and oversight mechanisms are insufficient.

The EP Committee adopted similar criticisms, pointing out that:

  • simply using the term “proportionality” is insufficient where the definitions and likely interpretation under U.S. law are different;
  • the EO lacks certainty and foreseeability in its application since it can be amended by the U.S. President at any time;
  • bulk data collection by signals intelligence is not prohibited, and the list of national security objectives can be expanded by the President without public communication;
  • the U.S. does not have a federal data protection law, unlike other countries with an adequacy decision;
  • the DPF’s redress mechanism, the Data Protection Review Court, does not meet the impartiality or independence standards of the EU Charter of Fundamental Rights, and there is no federal appeal route for data subjects.

The EDPB stopped short of making an express recommendation as to whether the Commission should adopt an adequacy decision, and instead outlined remaining concerns on points including rights of data subjects, onwards transfers, the scope of exemptions, bulk data collection, and the redress mechanism. Further, the EDPB’s stance is that the adoption of an adequacy decision for the DPF should be conditional upon the adoption of updated policies and procedures to implement the EO by all U.S. intelligence agencies.

Takeaway: Although the EDPB’s opinion appears ultimately more positive on its face than the EP Committee’s stark conclusion, it still highlights a number of concerns. The final version of the EP Committee opinion is expected to go to a full European Parliament vote in April 2023. Although non-binding, the EP Committee and EDPB opinions will be considered by the European Commission when deciding whether to adopt an adequacy decision, and may make it challenging for the Commission to do so with the DPF in its current form. Even if the Commission does go ahead with issuing an adequacy decision, it seems almost inevitable that this will face legal challenge by Max Schrems or other activists.

 

The Federal Trade Commission Launches New Office of Technology

On February 17, 2023, the Federal Trade Commission (FTC) launched a new in-house Office of Technology with three main functions: (i) to strengthen and support law enforcement investigations and actions; (ii) to advise and engage with staff and the FTC on policy and research initiatives; and (iii) to highlight market trends and emerging technologies that impact the FTC’s work. The Office of Technology will be run by Stephanie T. Nguyen, the FTC’s Chief Technology Officer. Ms. Nguyen has published a blog post describing how she envisions the Office of Technology will interact with the FTC’s three Bureaus and 11 other offices.

According to FTC Chair Lina Khan, “[o]ur Office of Technology is a natural next step in ensuring we have the in-house skills needed to fully grasp evolving technologies and market trends as we continue to tackle unlawful business practices and protect Americans.” Indeed, journalists and industry watchers have noted that the FTC “has long been dwarfed by Silicon Valley titans . . . , each staffed with thousands of engineers and technologists.” The launch of the Office of Technology will help the FTC keep pace with rapid advances of technology by bolstering its subject-matter expertise in this area and aligning it with the capabilities of other global enforcement authorities responsible for data privacy and security and consumer protection.

Takeaway: It's unsurprising that the Office of Technology’s launch comes as the FTC has homed in on big tech companies. As the FTC has continued to move aggressively against Silicon Valley, companies – particularly tech companies – should anticipate that the FTC will ask even more intrusive questions regarding the technology underlying a company’s products, services, and solutions – especially those dealing with artificial intelligence, machine learning, or novel uses of data collection. The FTC has signaled that it intends to scrutinize companies’ claims of using AI, to determine whether they are exaggerated.

 

The EU’s Digital Services Act

The new Digital Services Act (“DSA”), a key pillar of the EU’s overhaul of the digital economy, aims to harmonize rules for online intermediaries (such as online marketplaces, social media platforms, app stores, cloud providers, and search engines). It seeks to promote transparency and to allow for safer digital spaces by preventing the dissemination of illegal and harmful content online. As a regulation, the DSA will be directly applicable across all 27 EU Member States and supersede any national legislation. Similar to the General Data Protection Regulation (“GDPR”), the DSA has extra-territorial effect and non-EU established businesses will also be subject to the DSA to the extent they offer intermediary services to recipients established or located in the EU.

The obligations imposed by the DSA are cumulative across four different categories of intermediaries, being: (i) all intermediary service providers; (ii) hosting service providers; (iii) online platform providers; and (iv) very large online platforms (“VLOPs”) and very large online search engines (“VLOSEs”) (those that have at least 45 million average monthly active users in the EU and that have been designated as such by the European Commission). A set of “base” obligations will apply across all categories of intermediary service providers, with hosting service providers, online platform providers and VLOPs and VLOSEs having additional (and increasing) obligations.

Some of the key provisions include:

  • a ban on targeted advertising based on profiling using data of minors or special categories of data;
  • a ban on dark patterns (for example, “nudging” a user to make a certain choice by using particular color and/or size emphasis);
  • obligations to provide further information about adverts and the reasons a recipient is shown an advert;
  • obligations on identifying and removing illegal content;
  • reporting requirements;
  • mandated information for terms and conditions.

While the bulk of the DSA is not applicable until February 17, 2024, there was an earlier deadline of February 17, 2023 for online platforms to publish information on their average monthly active recipients in the EU so that the European Commission can make its VLOP and VLOSE designations. Once designated, VLOPs and VLOSEs will have four months until the DSA applies to them.

Takeaway: Businesses should review the extent to which the DSA applies to their operations and begin to consider the likely implications. In the privacy space, DSA provisions such as the prohibition of certain advertising and dark patterns, as well as additional transparency requirements, mean that affected businesses will likely need to look to each of the GDPR, e-Privacy Directive (or, once adopted, Regulation) and the DSA to understand their obligations. As with the GDPR, businesses should bear in mind the potential for large fines as the DSA sets an upper limit of 6% of annual worldwide turnover.

 

Supreme Court of Illinois Adopts Accrual Rule that Will Likely Lead to Increases in Awards under BIPA

A separate claim for damages can arise under Illinois’ Biometric Information Privacy Act (“BIPA”) each time a business fails to seek permission to gather biometric data from workers or consumers or fails to disclose retention plans for that information: so held the Supreme Court of the State of Illinois (the “Illinois Supreme Court”) on February 17, 2023, in Cothron v. White Castle System, Inc. The Supreme Court divided 4-3 in answering a certified question from the United States Court of Appeals for the Seventh Circuit.

In Cothron, a White Castle employee filed a putative class action in federal district court seeking to hold White Castle responsible for its policy of requiring employees to scan their fingerprints to access pay stubs and computers. The plaintiff alleged that White Castle implemented its biometric-collection system without obtaining her consent in violation of BIPA. White Castle allegedly collected the plaintiff’s biometric data for years, and did not seek her consent to acquire her fingerprint biometric data until 2018, a decade after BIPA took effect. The district court denied White Castle’s motion for judgment on the pleadings on its statute of limitations defense. An interlocutory appeal followed, and the question of when BIPA claims accrued was certified to the Illinois Supreme Court.

Agreeing with the federal district court’s interpretation of the statute’s provisions, the Illinois Supreme Court held that sections 15(b) and 15(d) of BIPA are violated each time an entity collects a person’s biometric information or transfers it to someone else, without that person’s consent. In so holding the Illinois Supreme Court rejected White Castle’s argument that claims accrue only upon the first scan or transmission, concluding that BIPA’s plain language “demonstrates that such violations occur with every scan or transmission.”

The three dissenting Justices contended that the majority’s interpretation was inconsistent with BIPA’s plain language, the purposes behind BIPA, and precedent. They also predicted that the result would “lead to consequences that the legislature could not have intended.” “Moreover,” they opined, “the majority’s interpretation renders compliance with [BIPA] especially burdensome for employers,” and that their construction of the statute “could easily lead to annihilative liability for businesses.” To some extent acknowledging these concerns, the Court’s majority expressly asked Illinois’ legislature “to review these policy concerns and make clear its intent regarding the assessment of damages under the Act.”

Takeaway: The Illinois Supreme Court’s ruling that BIPA damages are calculated for each violation dramatically expands the calculation of damages for companies that improperly use or store biometric identifiers. Companies that rely on use of biometric identifiers will want to review their use, consent, and storage practices to ensure compliance with BIPA and be prepared to face claims alleging multiple violations of the Act.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Dechert LLP | Attorney Advertising
