While generative artificial intelligence (AI) apps – such as ChatGPT, Google Gemini, and Stable Diffusion – are capturing the national spotlight, many businesses are using other forms of machine learning, algorithms, and other data processing technology to inform decisions made by the business – or, in some cases, to make those decisions automatically.

The California Privacy Rights Act amendments to the California Consumer Privacy Act (CCPA) directed the California Privacy Protection Agency (CPPA) to adopt regulations governing such “automated decision-making technology.” On March 8, the CPPA met to discuss its latest draft of proposed regulations (Draft Regulations). At the conclusion of the meeting, the CPPA voted to move forward with the formal rulemaking process.

The material terms of the Draft Regulations and key takeaways are below.

Definition of Automated Decision-Making Technology

An algorithm used by a rideshare app to allocate rides or determine fares and bonuses for its drivers. Surveillance software that scans human faces and matches them against a database of known individuals. Formulas in a spreadsheet that a business uses to determine which employees to retain or lay off.

All of these are examples of automated decision-making technology under the Draft Regulations. More formally, the Draft Regulations define “automated decision-making technology” (ADMT) as any technology—meaning software or programs, including those derived from machine learning, statistics, or other data processing or AI—that processes personal information and uses computation to execute a decision, replace human decision-making, or substantially facilitate human decision-making.

  • ADMT specifically includes “profiling,” which means evaluating personal information automatically in order to predict or assess characteristics about a certain person – such as their intelligence, interests, health, behavior, or location. One example many readers will be familiar with is online targeted advertising, whereby businesses deliver personalized ads to web users based on a consumer profile derived from a variety of information, such as browsing habits and location. The CCPA specifically defines profiling as “any form of automated processing of personal information to evaluate certain personal aspects relating to a natural person, particularly to analyze or predict aspects concerning that natural person’s intelligence, ability, aptitude, performance at work, economic situation, health, including mental health, personal preferences, interests, reliability, predispositions, behavior, location, or movements.”
  • The Draft Regulations exclude the following technologies from the definition of ADMT, provided that the technologies do not execute a decision, replace human decision-making, or substantially facilitate human decision-making: web hosting, domain registration, networking, caching, website-loading, data storage, firewalls, anti-virus, anti-malware, spam- and robocall-filtering, spellchecking, calculators, databases, spreadsheets, or similar technologies. ADMT “substantially facilitates human decision-making” if the output of the ADMT is used as a key factor in a human’s decision-making.

Limitations on Applicability

While the Draft Regulations define ADMT broadly, they only impose requirements on a narrower set of businesses that:

  • Use ADMT to make or facilitate a “significant decision,” i.e., decisions that result in access to or the provision or denial of financial or lending services, housing, insurance, education enrollment or opportunity, criminal justice, employment or independent contracting opportunities or compensation, healthcare services, or essential goods or services.
    • Decisions concerning “education enrollment or opportunity” specifically include decisions affecting admission into academic or vocational programs, credentials, and suspension and expulsion.
    • Decisions concerning “employment or independent contracting opportunities or compensation” include decisions concerning hiring, allocation or assignment of work, salaries, hourly or per-assignment compensation, incentive compensation such as bonuses or other benefits, promotion and demotion, suspension and termination.
  • Use ADMT for “extensive profiling,” i.e., profiling a consumer: (1) through systematic observation when the consumer is acting in their capacity as an applicant to an educational program, job applicant, student, employee, or independent contractor; (2) through systematic observation of a publicly accessible place; or (3) for behavioral advertising.
  • Train ADMT that is capable of being used: (1) for a significant decision; (2) to establish individual identity; (3) for physical or biological identification or profiling; or (4) for the generation of a deepfake.

Risk Assessments

Under the Draft Regulations, any business that engages in regulated uses of ADMT would be required to conduct a risk assessment on its use of ADMT. For ADMT risk assessments, the business would need to identify all of the following:

  • Its purpose for processing consumers’ personal information. The business must be specific and cannot use generic terms, such as “to improve our services” or “security purposes.”
  • The categories of personal information to be processed and whether they include sensitive personal information.
  • The actions the business has taken or any actions it plans to take to maintain the quality of personal information processed by the ADMT.
  • The operational elements of its processing activities, including: (1) the planned methods for collecting, using, retaining, and otherwise processing personal information and the sources of the personal information; (2) the retention period for each category of personal information; (3) the relationship between the consumer and the business; (4) the approximate number of consumers whose personal information will be processed; (5) what disclosures will be made to the consumer, and how the disclosures are made; (6) the names and categories of service providers and third parties who will have access to the personal information and the purpose for that access; (7) the technology used in the processing, including the logic of the ADMT, any assumptions underlying that logic, the output of the ADMT, and how the output will be used.
  • The benefits to the business, the consumer, other stakeholders, and the public from the processing of the personal information.
  • The safeguards that it plans to implement to address any negative impacts, including: (1) whether it evaluated the ADMT to ensure it works as intended and does not discriminate based upon protected classes; and (2) the policies, procedures, and training the business has implemented or plans to implement to ensure that the ADMT works as intended for the business’s proposed use and does not discriminate based upon protected classes. As an example, the Draft Regulations suggest a business could evaluate if and when human involvement may be needed and implement procedures and training to involve human decision-making as appropriate.
  • Whether the business will engage in the processing based upon the risk assessment.
  • The identity of the contributors to the risk assessment.
  • The date the assessment was reviewed and approved, and the names and positions of the individuals responsible for the review and approval.

Additionally, a business that makes an ADMT available for a regulated use by another business would be required to provide the recipient business with all facts necessary for the recipient business to conduct its own risk assessment.

The Draft Regulations also require that businesses conduct and document these risk assessments prior to engaging in the regulated use of the ADMT and review and update the risk assessment whenever there is a material change to the processing operations, but no less than every three years.

If a business chooses to move forward with the use of ADMT after the risk assessment, the business would need to submit a certification of compliance and an abridged copy of the risk assessment to the CPPA.


Pre-Use Notice

The Draft Regulations require a business that engages in a regulated use of ADMT to provide consumers a “Pre-Use Notice” informing consumers how the business uses ADMT and of consumers’ rights to opt out and to access certain additional information, including:

  • A plain language description of the specific purpose for which the business will use ADMT. Generic descriptions, such as “to improve our service,” are not permissible. If personal information is processed to train ADMT, the notice would also need to identify the categories of personal information used for training and the specific uses for which the ADMT is capable of being used.
  • A description of the right to opt-out of the business’s use of ADMT and instructions for submitting an opt-out request. (Rights to opt out are discussed below.)
  • A description of the right to access information about how the business uses ADMT with respect to that particular consumer. If a business only trains ADMT and does not use ADMT for significant decision-making or extensive profiling, the business is not required to include a description of this right to access.
  • Information—in plain language—about: (1) the logic used in the ADMT, including key parameters; and (2) the intended output of the ADMT and how the business plans to use the output to make a decision, including how humans are involved in such a decision. A business that only trains ADMT and does not use ADMT for significant decision-making or extensive profiling is not required to provide this information.

Notice for Adverse Significant Decisions

Certain “adverse” significant decisions require additional notice. Specifically, if a business uses ADMT to make a significant decision that results in a consumer being denied an educational credential; having their compensation decreased; being suspended, demoted, terminated, or expelled; or being denied financial or lending services, housing, insurance, criminal justice, healthcare services, or essential goods or services, then the business would also be required to notify the consumer within 15 business days of that decision. That notice must include each of the following details:

  • That the business used ADMT to make the significant decision with respect to the consumer.
  • That the business is prohibited from retaliating against consumers for exercising their CCPA rights.
  • That the consumer has a right to access information about the business’s use of the ADMT and how the consumer can exercise their access right.
  • If the business relies on the human appeal exception in relation to a consumer’s right to opt out, that the consumer can appeal the decision and how the consumer can submit their appeal and any supporting documentation.

Opt-Out Requests

Under the Draft Regulations, businesses that engage in regulated uses of ADMT are required to provide consumers with the ability to opt out of such uses. The Draft Regulations require that businesses do the following:

  • Offer the ability to opt out with at least two easy-to-use methods that require minimal steps to execute. If a business interacts with consumers online, at least one method must be an interactive form that is accessible via a link in the Pre-Use Notice. Cookie controls are not, by themselves, sufficient opt-out mechanisms.
  • Refrain from processing a consumer’s personal information using ADMT if a consumer makes an opt-out request in response to the Pre-Use Notice.
  • If a consumer makes an opt-out request after processing has begun, delete the consumer’s personal information and discontinue its use as soon as feasible, but no later than 15 days from the date of the opt-out request.
  • Notify service providers, contractors, and other recipients of the opt-out request and instruct them to comply with the request.
  • Provide a means to confirm the business has processed the opt-out request.
  • Permit consumers to exercise opt-out rights through an authorized agent.

In some situations, a business would not be required to provide a right to opt out:

  • A business does not need to provide consumers a right to opt out of uses of ADMT: (1) to prevent, detect, and investigate security incidents; (2) to resist malicious, deceptive, fraudulent, or illegal actions; or (3) to ensure the life and physical safety of consumers.
  • A business using ADMT to facilitate significant decisions may provide consumers the ability to appeal the decision to a human decision-maker in lieu of an opt-out right. The business would need to designate a qualified human reviewer to review decisions and the business would need to clearly describe how the consumer can submit an appeal.
  • A business using ADMT for admission, acceptance, hiring, allocation of work assignments, compensation decisions, or work or educational profiling does not need to provide a right to opt out of such uses if the business has: (1) evaluated the ADMT to ensure it works as intended for the business’s proposed use and does not discriminate based upon protected classes; and (2) implemented accuracy and nondiscrimination safeguards.

Note, however, that none of these exceptions apply to profiling for behavioral advertising using ADMT or for training uses of ADMT.

Right to Access

The Draft Regulations require that a business that engages in a regulated use of ADMT provide consumers with a right to access information about how the business uses ADMT to process information about that consumer. In response to such a request, the business would need to inform the consumer of the following:

  • The specific purpose for which the business used ADMT in relation to the consumer.
  • The output of the ADMT with respect to the consumer and an easy-to-use method to access multiple outputs relating to the consumer, if any.
  • How the business used or plans to use the output to make a decision with respect to the consumer:
    • If the business used or plans to use the ADMT to make a significant decision, the business would be required to inform the consumer of the role of the ADMT’s output in the decision and the role of any human involvement.
    • If the business used or plans to use the ADMT to engage in extensive profiling of the consumer, the business would be required to explain the role of the output of the ADMT or how the business plans to use the output to evaluate the consumer.
  • How the ADMT worked with respect to the consumer, including: (1) how the logic was applied to the consumer; and (2) key parameters that affected the output and how those parameters apply to the consumer.

If a business uses ADMT to protect against security incidents, fraudulent activity, or harm to a consumer, the business need not disclose information that would compromise its processing of information for those purposes.

The Draft Regulations also require service providers and contractors to assist the business in responding to requests for access.

Key Takeaways

The Draft Regulations regulate not only the use of “artificial intelligence,” but also any technology that substantially facilitates human decision-making. Although the definition of ADMT appears to exclude a number of existing technologies, the Draft Regulations are clear that those technologies are still ADMT when used to facilitate or execute human decision-making. As an example, the Draft Regulations state that a business’s use of formulas in a spreadsheet to determine which employees it will terminate is a use of automated decision-making technology subject to the requirements of the Draft Regulations. Businesses should not assume that their existing processes are not covered by the Draft Regulations merely because those processes do not involve AI.

While the final form of the Draft Regulations remains to be seen, certain core requirements are very likely to remain because they are required by the text of the CCPA. These include the requirements to provide opt-outs, access to information, an explanation of the logic behind the ADMT, and notice of possible outcomes.

Businesses would be well-advised to take a close look at current and proposed uses of ADMT, particularly in HR, lending, healthcare, and other contexts. Because regulated uses of ADMT relate to significant decisions affecting consumers and also intersect with federal and state civil rights and consumer protection laws, businesses need to assess whether the disclosure requirements will require disclosing information that could significantly increase the risk of lawsuits for discrimination or other claims, such as unfair trade practices. In that analysis, businesses should strongly consider whether that increased risk outweighs the benefit of continuing to use the ADMT. On the liability front, businesses can expect that third-party providers of ADMT will seek indemnity for claims resulting from a decision made by the business, even if that decision results, in part, from the third party’s ADMT.

Businesses that ultimately decide to proceed with regulated uses of ADMT can consider doing the following to prepare for the final rules:

  • All businesses that use ADMT for a regulated purpose should prepare to update both the notices provided to consumers and their operational processes for consumer rights requests.
  • Businesses that use ADMT should begin developing processes and policies regarding the development and use of ADMT. Those processes should include the development of risk assessments that assess the ADMT for validity, reliability, and fairness. Those developing proprietary ADMT also should begin documenting how the ADMT works, including by developing descriptions of the output of the ADMT, the logic and key parameters used in the ADMT, and why those parameters are key.
  • Businesses using a third-party ADMT should assess what amendments may be needed to agreements with those third-party providers to comply with the new regulations. Businesses also should work with third-party providers to gather information for a risk assessment and identify what information will need to be passed through to consumers in notices and in responses to requests for access. Businesses should expect hesitancy from some vendors, who may view some of this information as proprietary or confidential.

In developing information to provide to consumers, many businesses may struggle with meeting the plain language and access requirements of the rules while also accurately describing the logic of complex algorithms with many parameters. Businesses that use self-training machine-learning models may face considerable challenges because those key parameters may be constantly changing.

We will continue to monitor for updates to the Draft Regulations. If you have any questions about how the Draft Regulations may impact your business, please contact the authors.