On November 27, 2023, the California Privacy Protection Agency (CPPA) published proposed Automated Decision-Making Rules to be discussed by the CPPA Board at its upcoming meeting on December 8, 2023. While the proposed rules are far from final—indeed, they are not even official draft rules—they signal that the CPPA is considering rules that would have significant impact on businesses subject to the California Consumer Privacy Act (CCPA).
The proposed rules define “automated decisionmaking technology” broadly as “any system, software, or process—including one derived from machine-learning, or other data-processing or artificial intelligence—that processes personal information and uses computation as a whole or part of a system to make or execute a decision or facilitate human decisionmaking.” Automated decisionmaking technology includes, but is not limited to, “profiling,” defined to mean any form of automated processing of personal information to evaluate, predict, or analyze a person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.
The proposed rules require companies to provide pre-use notice, the ability to opt-out, and a right of access with respect to automated decisionmaking technologies in six specific scenarios:
- For decisions that produce legal or similarly significant effects concerning a consumer;
- Profiling a consumer in their capacity as an employee, independent contractor, job applicant, or student;
- Profiling a consumer while they are in a publicly accessible place;
- Profiling a consumer for behavioral advertising (listed as a discussion topic);
- Profiling a consumer that the business has actual knowledge is under the age of 16 (listed as an additional option for board discussion); and
- Processing personal information to train automated decisionmaking technology (listed as an additional option for board discussion).
The application to employees will be particularly important, as the only other rules on automated decision-making (Colorado’s) do not apply in the employment context. Further, the proposed rules make clear that profiling employees would include keystroke loggers, productivity or attention monitors, video or audio recording or live-streaming, facial- or speech-recognition or -detection, automated emotion assessment, location trackers, speed trackers, and web-browsing, mobile-application, or social-media monitoring tools. In other words, the proposed rules would have significant impacts on common technologies used in the employment context—technologies that may not currently be configured to support opt-outs or the required disclosures.
With respect to the right of access, companies would have to disclose not only that automated decision-making technology is used and how its decisions affect the individual, but also details on the system’s logic and possible range of outcomes, as well as how human decision-making influences the final outcome. These requirements will be difficult to satisfy in practice and, as with the Colorado Privacy Act’s regulations and the CPPA’s proposed regulations on risk assessments, should influence the nature and amount of information that companies require now from vendors that leverage AI.
As noted above, the proposed rules have a long way to go. Additionally, various exceptions are incorporated into the rules that may mitigate the operational burden in some contexts. However, the proposed rules will almost certainly result in expanded regulatory obligations for subject companies over what they currently face. While compliance efforts may be premature, companies should start assessing whether they could, if necessary, comply with the proposed rules from an operational standpoint.