Déjà Vu All Over Again: The CPPA Releases Draft Regulations on Cybersecurity Audits and Risk Assessments (Part 2 of 2)

On August 29, 2023, the California Privacy Protection Agency (“CPPA”) released a set of draft regulations on cybersecurity audits and risk assessments.  In Part 1 of this two-part series of posts, we explored the CPPA’s draft Cybersecurity Audit Regulations. In this Part 2, we focus on the Draft Risk Assessment Regulations.

The Draft Risk Assessment Regulations Mirror the Draft Cybersecurity Audit Regulations in Several Key Respects.

As with the draft Cybersecurity Audit Regulations, the draft Risk Assessment Regulations apply only to certain businesses: in this case, those engaged in the processing of personal information that presents a “significant risk” to consumers’ privacy. To that end, the draft Risk Assessment Regulations make clear that both (i) the sale or sharing of personal information and (ii) the processing of sensitive personal information for non-employment purposes fall within the category of high-risk processing. The draft regulations go on, however, to enumerate several other processing activities that could, pending discussion by the CPPA board, also qualify as presenting a significant risk to consumer privacy, including:

  • Processing personal information to monitor a business’s employees, independent contractors, job applicants, or students;
  • Processing personal information of consumers that the business has actual knowledge are under age 16;
  • Processing personal information of consumers in public places to monitor their behavior, location, movements, or actions;
  • Processing personal information to train artificial intelligence or automated decisionmaking technologies; and
  • Using automated decisionmaking technologies in furtherance of a decision that results in the provision or denial of financial or lending services, housing, insurance, educational enrollment or opportunity, criminal justice, employment or contracting opportunities or compensation, healthcare services, or access to essential goods, services, or opportunities.

Another point of similarity between the draft Cybersecurity Audit Regulations and the draft Risk Assessment Regulations is the obligation, imposed both directly and indirectly through mandatory contractual terms, on service providers and contractors to cooperate with businesses conducting risk assessments by providing all information necessary for the business to complete its assessment. Both sets of regulations suggest that factual misrepresentations made by a service provider to a business in the course of an audit or assessment will be treated as both a breach of contract and a direct regulatory violation.

Finally, as with the draft Cybersecurity Audit Regulations, businesses should expect they will be required to certify compliance with the Risk Assessment Regulations and submit an abridged version of their risk assessment to the CPPA.

At Last—Definitions and (Some) Requirements for Artificial Intelligence and Automated Decisionmaking Technology.

Among the more notable aspects of the draft Risk Assessment Regulations are the proposed definitions of “artificial intelligence” and “automated decisionmaking technology”:

  • “Artificial Intelligence” refers to “an engineered or machine-based system that is designed to operate with varying levels of autonomy and that can, for explicit or implicit objectives, generate outputs such as predictions, recommendations, or decisions that influence physical or virtual environments.” Generative artificial intelligence is specifically identified as a type of artificial intelligence subject to the draft regulations.
  • “Automated Decisionmaking Technology” is defined as “any system, software, or process—including one derived from machine-learning, statistics, other data-processing techniques, or artificial intelligence—that processes personal information and uses computation as whole or part of a system to make or execute a decision or facilitate human decisionmaking.” The term explicitly includes profiling.

In addition to requiring risk assessments as described below (with certain additional obligations for businesses that use automated decisionmaking technology), the current draft Risk Assessment Regulations require businesses that use personal information to train artificial intelligence or automated decisionmaking technology, and that subsequently provide those technologies to others, to take measures to help ensure the technology is not misused downstream. Those measures include:

  • Preparing and distributing to each downstream user of the technology a plain language explanation of appropriate uses for the technology; and
  • Implementing and documenting in a risk assessment any safeguards designed to ensure the appropriate downstream use of artificial intelligence or automated decisionmaking technology by other persons.

These requirements thus put the onus on businesses to take a broad and forward-thinking view of the potential uses of their artificial intelligence and automated decisionmaking technologies and to demonstrate that they have considered and attempted to mitigate the possible negative uses of those technologies.

Businesses Should Prepare to Describe and Defend the Benefits of High-Risk Processing and Must Have a Clear and Articulable Plan for Addressing the Negative Impacts.

The draft Risk Assessment Regulations require in-scope businesses to describe their high-risk processing activities in significant detail before analyzing both the “benefits resulting from processing to the business, the consumer, other stakeholders, and the public” and the “negative impacts to consumers’ privacy associated with the processing.”

In that regard, the requirements for assessing the benefits of the processing are relatively minimal and straightforward, while the requirements for assessing its negative impacts are substantially more involved. Indeed, the assessment requirements are heavily weighted toward an evaluation of negative impacts, requiring businesses to:

  • Identify all negative impacts and their sources;
  • Describe their magnitude and likelihood; and
  • Describe the criteria and methods by which the business made those determinations.

This assessment must evaluate, at a minimum, ten separate categories of negative impacts ranging from constitutional harms and discrimination to more tangible injuries like physical and psychological harms stemming from the high-risk processing activities undertaken by the business.

The draft regulations make clear that it is not enough merely to consider these negative impacts. Rather, businesses must develop and implement safeguards that are specifically designed to address them, and they must be able to articulate the degree to which each safeguard does so. These obligations are ongoing, requiring businesses to identify and implement additional safeguards to “maintain knowledge of emergent risks and countermeasures.”

Moreover, the draft Risk Assessment Regulations would require businesses to justify their high-risk processing activities by explaining how the benefits of the processing outweigh the negative impacts as mitigated by the aforementioned safeguards. Given the substantial requirements surrounding the analysis of negative impacts, businesses will need to work hard to identify and articulate sufficiently compelling benefits of processing to successfully support their processing activities.

Businesses will also need to devise meaningful mitigation techniques that bring the level of risk associated with such processing low enough that the benefits of the processing outweigh the negative impacts. Making that showing could require businesses to hire, and retain on an ongoing basis, outside consultants to evaluate their processing activities, design mitigation initiatives, and conduct regular evaluations to ensure the balance of risks remains weighted in favor of continued processing.

Looking Ahead

The draft Cybersecurity Audit and Risk Assessment Regulations have a long way to go before they become effective, but the drafts provide a strong indication of the agency’s current thinking on those subjects. Companies that could be in scope for these requirements should, therefore, watch this space closely.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Wyrick Robbins Yates & Ponton LLP | Attorney Advertising

Written by:

Wyrick Robbins Yates & Ponton LLP
