Articles in this issue
- European Data Protection Board Publishes Strategy for 2024-27
- FTC Seeks Authority to Bring its Own Consumer Cases, Obtain Court-Ordered Monetary Awards
- Data Accuracy in the Context of GenAI - UK Data Regulator Seeks Comments on Draft Guidance
- UK Data Regulator Publishes Guidance on Transparency in Health and Social Care
- White House/HHS Publish HIPAA Privacy Rule to Support Reproductive Health Care Privacy
- Dechert Tidbits
European Data Protection Board Publishes Strategy for 2024-27
The European Data Protection Board (“EDPB” - the EU body tasked with promoting consistency and cooperation in enforcement of the GDPR) has outlined its strategy for the years 2024 to 2027.
The strategy is built upon four main pillars: (i) enhancing harmonization and promoting compliance; (ii) reinforcing a common enforcement culture and effective cooperation; (iii) safeguarding data protection in the developing digital and cross-regulatory landscape; and (iv) contributing to the global dialogue on data protection.
The EDPB intends to focus on strengthening enforcement collaboration and initiating coordinated enforcement actions. It also aims to integrate data protection rights within the broader regulatory framework through guidance on the intersection between data protection and the suite of new digital regulations that are part of the EU’s “Digital Decade” strategy (such as the AI Act, Digital Markets Act and the Digital Services Act).
Takeaway: The EDPB’s strategy for the next three years reflects the changing regulatory landscape in the EU. In the early days of the GDPR, the EDPB’s priority was providing guidance on the GDPR. In its latest strategy, the EDPB acknowledges its increasing “mediator” role between national data regulators and the need for it to evolve its guidance to account for intersecting new EU regulations governing the digital space.
FTC Seeks Authority to Bring its Own Consumer Cases, Obtain Court-Ordered Monetary Awards
The United States Federal Trade Commission (“FTC” or “Commission”) recently issued a report to Congress pursuant to the FTC Collaboration Act of 2021 detailing its cooperation with state law enforcement agencies and recommending legislative initiatives to further enhance collaboration efforts. The FTC’s report contains three sections, with the section on “Legislative Recommendations to Enhance Collaboration Efforts” being the most notable because it asks Congress to expand the FTC’s authority to bring civil penalty cases and to obtain equitable monetary relief.
First, the FTC urged Congress to restore the FTC’s authority under section 13(b) of the FTC Act to directly obtain court-ordered equitable monetary relief, such as restitution or disgorgement, from subjects of its investigations. The United States Supreme Court rejected this authority in its 2021 AMG Capital Management v. FTC decision, holding that section 13(b), which permits the FTC to go to the courts to obtain a “temporary restraining order or a preliminary injunction,” did not authorize the Commission itself to obtain court-ordered monetary relief. In her statement on the report, FTC Chair Lina Khan characterized this change as “critical” to ensuring that “lawbreakers do not profit from lawbreaking and that victims of illegal conduct are made whole.”
Second, the FTC requested that Congress strike the existing requirement that the FTC refer civil penalty cases to the United States Department of Justice (“DOJ”). The change would allow the FTC to file its own lawsuits seeking civil penalties without having to consult with the DOJ beforehand. The report argues that independent authority to seek civil penalties would streamline the FTC’s enforcement efforts and improve the Commission’s ability to protect consumers from unfair or deceptive acts or practices.
Takeaway: The FTC has adopted a more aggressive approach to enforcement in recent years, and these changes would only embolden those efforts. Many in the industry already believe that the FTC is significantly overstepping in its enforcement actions. If Congress were to grant the FTC’s requests, restoring its authority to obtain court-ordered equitable monetary relief and allowing it to bring civil penalty cases without referring them to the DOJ, the FTC would have significantly more authority and ability to secure monetary awards than it does currently.
Data Accuracy in the Context of GenAI - UK Data Regulator Seeks Comments on Draft Guidance
The UK Information Commissioner’s Office (“ICO”) has initiated a new phase in its ongoing series of consultations on data protection issues relating to generative AI. This third phase focuses on the principle that personal data must be accurate - the “accuracy principle.” The ICO has previously consulted on training generative AI on web-scraped data and defining the purposes for which personal data can be used in the context of generative AI.
The ICO emphasizes that the purpose for which a generative AI model is used is critical to assessing compliance with the accuracy principle, and it encourages developers to put in place measures to prevent their generative AI systems from being used for purposes that are incompatible with the level of accuracy of the AI’s outputs. For example, an AI system designed purely for creative purposes may not have the level of accuracy required for it to be used to make decisions about individuals or to source information about individuals. The ICO also considers that businesses deploying third-party generative AI have responsibilities to ensure that the generative AI is not used in a manner that contravenes the accuracy principle.
Takeaway: With no equivalent to the EU AI Act in the UK at this time, data protection law provides a key framework for regulating AI. The ICO’s consultation series and guidance to date also demonstrate that the ICO sees AI as a priority. The accuracy of personal data in AI outputs is an important issue for AI developers, deployers and users to consider. Businesses using generative AI will want to understand the use restrictions imposed by their vendors, as well as the measures in place to ensure that end users use the AI only for purposes appropriate to the level of accuracy of the personal data involved. The ICO’s current consultation is open for responses until May 10, 2024.
UK Data Regulator Publishes Guidance on Transparency in Health and Social Care
The UK Information Commissioner’s Office (“ICO”) has issued guidance for organizations involved in providing health and social care in the UK. Highlighting the sensitivity of personal data processed in the context of health and social care, the guidance provides sector-specific advice on complying with data protection requirements regarding transparency and on fostering trust in the health and social care systems.
The guidance adopts a proportionate approach, acknowledging that there can be circumstances in a healthcare setting where providing privacy information may not be a priority (e.g., in the case of emergency treatment). The guidance also explains that, whilst some uses of personal data may be obvious to patients, other uses, such as the use of personal data for secondary purposes (e.g., medical research), may require additional steps to provide privacy information.
Takeaway: The ICO’s guidance provides practical and detailed explanations of its expectations regarding transparency for organizations involved in delivering health or social care services or handling health and social care data. The ICO emphasizes that complying with the transparency principle under the UK GDPR is not limited to providing the specific privacy information that is listed in the UK GDPR. According to the ICO, transparency, in particular in the health and social care sector, involves a more rounded approach that goes beyond publishing a privacy notice on an organization’s website. This will be context-specific but may include actions such as publishing policy documentation that is not specifically required to be provided under the UK GDPR.
White House/HHS Publish HIPAA Privacy Rule to Support Reproductive Health Care Privacy
On April 22, 2024, the White House and the United States Department of Health and Human Services (“HHS”) Office for Civil Rights announced the HIPAA Privacy Rule to Support Reproductive Health Care Privacy (the “Rule”). The Rule bolsters the existing Privacy Rule under the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”) by prohibiting certain disclosures of individuals’ protected health information related to lawful reproductive health care.
According to HHS Secretary Xavier Becerra, the Rule protects individuals “seeking lawful reproductive health care regardless of whether the care is in their home state or if they must cross state lines to get it.” Under the Rule, the HHS Office for Civil Rights will administer and enforce protections that prohibit healthcare providers, health plans, healthcare clearinghouses, and their business associates from using or disclosing a patient’s protected health information to: (1) “conduct a criminal, civil, or administrative investigation into or impose criminal, civil, or administrative liability on any person for the mere act of seeking, obtaining, providing, or facilitating reproductive health care, where such health care is lawful under the circumstances in which it is provided;” or (2) identify “any person for the purpose of conducting such investigation or imposing such liability.”
Takeaway: The Rule is a reaction to rising concerns over the sharing of information concerning reproductive health care and the potential chilling effect such disclosures may have on individuals’ healthcare decisions. As courts and government bodies at all levels continue to wrestle with the issue of reproductive health care, especially in a post-Dobbs world, companies processing health information should be hyper-vigilant in complying with current laws and regulations and in monitoring potential legal and regulatory changes.
Dechert Tidbits
FTC Finalizes Order with X-Mode
On April 12, 2024, the U.S. Federal Trade Commission (“FTC”) finalized an order with the data broker Outlogic (formerly X-Mode), addressing allegations that Outlogic sold raw location data in combination with users’ mobile ad identifiers that enabled recipients to track individuals’ visits to specific locations, such as medical and reproductive health clinics and places of worship. The FTC order prohibits the company from sharing or selling any sensitive location data and requires, among other things, that Outlogic (i) delete or destroy all the location data it previously collected and any products developed from this data and (ii) “implement procedures to ensure that recipients of its location data do not associate the data with locations that provide services to LGBTQ+ people, with locations of public gatherings of individuals at political or social demonstrations or protests, or use location data to determine the identity or location of a specific individual.” Please see our discussion of the FTC’s proposed order with Outlogic here.
House Passes Bill to Limit Personal Data Sales to Intelligence Agencies, Law Enforcement
The United States House of Representatives passed the Fourth Amendment Is Not For Sale Act (“HR 4639”) last month, marking a victory for privacy advocates. HR 4639 would prohibit law enforcement and intelligence agencies from purchasing personal information about customers or subscribers of electronic and remote computing service providers (e.g., social media, cell phone, email, and cloud computing companies) without first obtaining a court order.
CFPB Focuses Attention on Data Brokers
In a speech at the White House, Consumer Financial Protection Bureau (“CFPB”) Director Rohit Chopra outlined the CFPB’s initiatives to rein in the activities of companies that buy and sell consumer data. Of note, the CFPB is considering whether to define a data broker as a “consumer reporting agency,” which would require data brokers to comply with the Fair Credit Reporting Act (“FCRA”). If adopted, this proposal would ban data brokers from sharing certain consumer data with entities unless those entities serve a particular purpose listed in the FCRA.