Automated Decisionmaking Technology in California: New Rules Would Impose Transparency, Notice, and Consumer Opt-Out Obligations on Businesses | Davis Wright Tremaine LLP

Draft rules would apply to the use of AI and machine learning technologies to make automated decisions related to financial services, housing, insurance, education, criminal justice, employment, and healthcare

On November 27, 2023, the California Privacy Protection Agency (CPPA) released draft regulations mandating notice, opt-out, and information access requirements for companies using automated decision-making technology (ADMT) to process personal information. These draft rules, required by the California Consumer Privacy Act (CCPA) as amended by the California Privacy Rights Act (CPRA), follow those released by the CPPA last summer relating to privacy risk assessments and cybersecurity audits.

Under the draft discussion rules, businesses would have to make significant disclosures about their implementation of ADMT, such as the business’s purpose for the use of ADMT; information about the ADMT logic; the range of possible outcomes; how human decision-making influenced the outcome for a particular consumer; whether the ADMT was evaluated for validity, reliability, fairness, and the outcome of any such evaluation; and how the ADMT operated with respect to a particular consumer.

The draft rules also contain definitions of important terms and concepts such as:

  • What is ADMT?
  • What constitutes a decision with “legal or similarly significant effects concerning a consumer”?
  • What kind of automated processing constitutes profiling?

Overall, the draft rules would expand regulatory obligations related to the use of ADMT. Interested parties should consider engaging with the CPPA in the upcoming rulemaking process to educate the CPPA regarding the real-world impact these rules would have and shape their final form. After the CPPA finalizes these rules, they may become a model for other states and regulatory agencies considering ways to regulate the use of AI/ML applications and systems.

As with previous draft regulations, this draft is intended to facilitate CPPA and public discussion. The CPPA has indicated that it will initiate a formal rulemaking process in 2024, at which time the final version of the proposed rules will be formally published. A discussion of these draft rules will occur at the next CPPA meeting, scheduled for December 8.

Broad Scope of Proposed Rules Regulating Use of ADMT

The proposed regulations define ADMT very broadly to include any decision-making tool, even one that merely facilitates human decision-making, that incorporates a process, system, or software that processes personal information and “uses computation.”[1] This definition specifically calls out artificial intelligence (AI) tools, but it sweeps well beyond AI and would also cover less sophisticated forms of algorithmic processing.

Additionally, the CPPA has taken a broader approach than other states by defining ADMT to include those tools that contribute to rather than control a decision-making process. For example, Colorado’s regulations create a tiered approach with varying obligations based on whether the technology relies on Solely Automated Processing, Human Reviewed Automated Processing, or Human Involved Automated Processing.

New Consumer Rights

Opt-Out

The CPPA proposes to require businesses to provide consumers two or more designated methods with minimal steps for easily submitting requests to opt out of one or all of the business’s uses of ADMT (Covered Uses), as follows:

  • When used to make decisions that produce legal or similarly significant effects on a consumer, defined as “a decision that results in access to, or the provision or denial of, financial or lending services, housing, insurance, education enrollment or opportunity, criminal justice, employment or independent contracting opportunities or compensation, healthcare services, or essential goods or services”;
  • When used to profile employees, independent contractors, job applicants, or students, where “profiling” is defined as “any form of automated processing of personal information to evaluate certain personal aspects relating to a natural person and in particular to analyze or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements”; and
  • When used to profile a consumer in a publicly accessible place, which is defined as “a place that is open to or serves the public,” and includes shopping malls, stores, restaurants, cafes, movie theaters, amusement parks, convention centers, stadiums, gymnasiums, hospitals, medical clinics or offices, transportation depots, transit, streets, or parks.

In addition, the CPPA has included profiling consumers for behavioral advertising when the business has actual knowledge that the consumer is under 16 years of age as a possible fourth scenario from which consumers could opt out, subject to Board discussion. Other potential subjects for Board discussion include allowing consumers to opt out of ADMT that involves either profiling a consumer that the business has actual knowledge is under 16 or processing personal information to train ADMT.

Exceptions

The draft rules include a number of significant exceptions to the opt-out requirement. Specifically, businesses would not be required to provide an opportunity to opt out if (1) the personal information used by the ADMT is proportionate and in line with the reasonable expectations of the consumer, and (2) the ADMT is necessary to achieve and used solely for the following purposes (Exempt Uses):

  • To prevent, detect, and investigate cybersecurity incidents;
  • To resist malicious, deceptive, fraudulent, or illegal actions directed at the business and to prosecute those responsible for those actions;
  • To protect the life and physical safety of consumers; or
  • To provide the good or perform the service specifically requested by the consumer.

The draft regulations elaborate that to rely on the last exception, businesses must not have a reasonable alternative to using the ADMT to provide a good or service. While the draft regulations would create a rebuttable presumption that an alternative method exists if businesses in the same or similar industry have used an alternative method to provide a similar good or service, businesses could overcome this presumption by establishing that:

  • It would be futile for the business to develop or use alternative methods of processing;
  • Developing and using an alternative method of processing would result in a good or service that is not as valid, reliable, and fair; or
  • Developing an alternative method of processing would impose extreme hardship upon the business.

These are subjective standards. Notably, the draft regulations also state that businesses engaged in profiling for the purpose of behavioral advertising would not be permitted to rely on any of these exceptions, and would be required to provide consumers the opportunity to opt out of ADMT.

Transparency/Access

Businesses would also be required to provide consumers with access to a broad range of information about their use of ADMTs for Covered Purposes. Specifically, businesses would need to provide “plain language explanations” of:

  • The purpose for using the ADMT, which could not be explained in generic terms, such as “to improve our services”;
  • The discrete output(s) of the ADMT with respect to the consumer;
  • If the business has used or plans to use the output to make a decision with respect to the consumer, provide “plain language explanations” of:
    • The decision itself (e.g., placement into a category or segment as a result of profiling), if the output has already been used to make a decision, or how the business plans to use the output to make that decision, if the decision has not yet been made;
    • Any factors other than the output that the business used or plans to use to make the decision;
    • The role of any human involvement in the business’s use of the ADMT; and
    • Whether the business’s use of the ADMT has been evaluated for validity, reliability, and fairness, and the outcome of any such evaluation;
  • How the ADMT worked with respect to the consumer, including:
    • How the logic, including its assumptions and limitations, was applied; and
    • The key parameters that affected the output, why the parameters were key, and how they were applied;
  • A “simple and easy-to-use” method to obtain the range of possible outputs, which may include the aggregate output statistics (e.g., the most common average outputs over the past year and the percentage of consumers that received each output during that time);
  • Instructions for submitting a complaint to the business about its use of the ADMT – including complaints about specific decisions and how the decision was or will be made with respect to the consumer – and information about the consumer’s ability to file a complaint with CPPA and the California Attorney General, along with links to the complaint forms on their respective websites; and
  • Instructions for exercising the other CCPA rights and links to any online request forms or portals offered by the business, including the ADMT opt-out right, unless the business uses the ADMT solely for the Exempt Uses.

The proposed regulations would provide a useful exception to this requirement: if a business uses ADMT solely for the purpose of three of the four Exempt Uses (i.e., to detect security incidents, prevent malicious and fraudulent activity, and protect life and physical safety), then it would not be required to provide consumers access to information that would compromise its processing of personal information for those purposes. This exception would not apply, however, when a business uses ADMT to provide goods or services specifically requested by the consumer.

The draft regulations would also create a separate transparency obligation for businesses that use ADMT to make a decision that results in the denial of goods or services with legal or similarly significant effects for the consumer (e.g., the denial of an employment opportunity or lowered compensation). Such businesses would also need to affirmatively notify the consumer, via the method by which they primarily interact with the consumer, of the following information:

  • That the business made a decision with respect to the consumer;
  • The consumer’s right to access information about the business’s use of that ADMT and how it can be exercised; and
  • That the consumer may file a complaint with CPPA and the California Attorney General, along with links to the complaint forms on their respective websites.

Notice

The draft regulations also would require businesses that use ADMTs for any of the Covered Uses to provide a “Pre-use Notice” to consumers before the business uses ADMT to process the consumer’s personal information. The notice would need to be posted publicly where consumers would encounter it and include the following elements:

  • An explanation of the purpose for which the business proposes to use the ADMT, which – again – could not be provided in generic terms;
  • A description of the consumer’s right to opt out of and access information relating to the business’s use of ADMT;
  • A link to or a layered notice that contains further information about the business’s use of ADMT, including:
    • The logic used in the ADMT, including key parameters that affect the output;
    • The intended output of the ADMT;
    • How the business intends to use the output to make a decision; and
    • Whether the business’s use of ADMT has been evaluated for validity, reliability, and fairness and, if so, the outcome of such evaluations.

Finally, if a business relies on one of the exceptions to the opt-out requirement, it must say so in its Pre-use Notice and explain which exception it is relying on, but it would not be required to disclose information in a Pre-use Notice that would compromise its processing of personal information under the first three of the exceptions relating to fraud, illegal conduct, or safety.

Going Forward

The proposed rules do not provide an effective date. If the CPPA handles these proposed rules in the same manner as the most recent CPRA regulations that take effect in March 2024 (which were delayed by court order), then they would become enforceable one year after approval by the CPPA and California Office of Administrative Law.

Overall, the draft ADMT rules offer a highly prescriptive and detailed set of requirements, with subjective elements and provisions that may make compliance difficult. The CPPA has already proposed risk assessment requirements for ADMT as part of a parallel rulemaking process, so the general regulatory environment for ADMT is becoming more complicated. Businesses should prepare early, because the final ADMT rules are likely to be very similar to these draft discussion regulations.


[1] Specifically, ADMT is defined as “any system, software, or process—including one derived from machine-learning, statistics, or other data-processing or artificial intelligence—that processes personal information and uses computation as whole or part of a system to make or execute a decision or facilitate human decisionmaking.”

