Privacy professionals rang in the New Year with two new consumer privacy laws coming into effect and three more on the horizon. Among them are the California Consumer Privacy Act (CCPA) amendments enacted through the California Privacy Rights Act (CPRA). The California Privacy Protection Agency (Agency) certainly has its hands full, and subject businesses (Businesses) are looking forward to additional clarity flowing from regulations (the CPRA took effect on January 1, but related regulations are still not finalized). On Friday, February 3, the Agency board finally approved submission of a proposed final rulemaking package for CCPA Regulation Sections 7000-7304 (the 7000-7304 Package) to the Office of Administrative Law (OAL). The OAL will have 30 business days to approve or reject the 7000-7304 Package. The draft regulations covered in the 7000-7304 Package (which we previously examined in November 2022 and June 2022) remain ominously silent on mandatory cybersecurity audits, risk assessments, and automated decision-making.

After applauding approval of the 7000-7304 Package for submission, the Agency board discussed what form the outstanding regulations on cybersecurity audits, risk assessments, and automated decision-making might take and what types of questions the Agency is looking to answer in crafting regulations on these topics (invitation for public comment available here). Below, we’ve summarized a few noteworthy tidbits from that discussion and the public comment invitation.

As background, the CCPA requires Businesses whose personal information processing poses “significant risk to consumers’ privacy or security” to do the following:

  • Perform annual cybersecurity audits.
  • Submit regular risk assessments to the Agency.

In determining whether processing may create a significant risk, a Business must consider the nature and scope of personal data processed. As we look forward to draft regulations addressing cybersecurity audits and submission of assessments, one concern is that such factor-based approaches could result in subjective determinations (and discretionary enforcement), making likely outcomes all the more difficult to predict. While we await regulations, remember that certain businesses operating in Virginia are already required to undertake data protection assessments where processing presents a heightened risk of harm to consumers, and other states like Colorado will soon have a similar requirement. If a Business does not believe it is subject to such requirements, it might consider proactively preparing documentation showing why it believes that its processing activities do not present a “significant risk.”

Audits

In considering its approach to drafting potential regulations, the Agency appears open to leveraging popular cybersecurity certification models, which often involve third-party audits, and similar audit requirements under other laws. Additionally, the Agency may consider allowing the submission of certifications under these frameworks (e.g., ISO) as potentially fulfilling the CCPA’s cybersecurity audit requirement. While it remains to be seen whether third-party certifications will be sufficient to meet CCPA requirements under future regulations, Businesses may want to consider whether the cybersecurity audits or similar certifications they already undertake are likely to suffice or will, at a minimum, make future reporting easier. Depending on the timing of the regulations, Businesses should also be aware that additional funding may be needed this year to cover cybersecurity audit costs if such audits become mandatory. Note that any required audits will likely need to be “independent,” and Businesses likely will not be permitted to “self-certify” compliance with the audit requirement. The cybersecurity audits discussed here are distinct from the sweeping audit rights the Agency granted itself in § 7304 of the proposed final regulations approved for submission to the OAL at the February 3 Agency meeting.

Assessments

Distinct from the annual cybersecurity audit requirement, Businesses also must regularly submit any required risk assessments to the Agency. This requirement applies to all risk assessments, regardless of whether there is an investigation. At a minimum, the risk assessments required under the CCPA must indicate whether processing involves sensitive personal information and must identify and weigh “the benefits and risks” of such processing.

The CCPA defines “Sensitive Personal Information” to include:

  • Personal information that reveals a consumer’s: social security, driver’s license, state identification card, or passport number; account log-in, financial account, debit card, or credit card number in combination with any required security or access code, password, or credentials allowing access to an account; precise geolocation; racial or ethnic origin, religious or philosophical beliefs, or union membership; contents of a consumer’s mail, email, and text messages unless the business is the intended recipient of the communication; or genetic data.
  • Biometric information processed to uniquely identify a consumer.
  • Personal information collected and analyzed concerning a consumer’s health.
  • Personal information collected and analyzed concerning a consumer’s sex life or sexual orientation.

It appears the Agency expects the required risk assessments to function similarly to data protection impact assessments (DPIAs) under the General Data Protection Regulation (GDPR) and the Data Protection Assessments (DPAs) required under the Colorado Privacy Act. Access the current draft regulations here and our previous alerts discussing the Colorado Privacy Act here and here.

Specifically, the Agency may follow the GDPR’s lead in determining whether processing presents a significant risk to consumers’ privacy and security. At a high level, the European Data Protection Board (EDPB) considers large-scale processing of “special category data” (similar to “sensitive personal information” under the CCPA), profiling, automated decision-making, and systematic monitoring of publicly accessible areas to all present high risks to individuals’ rights and freedoms (see here). Of note for California employers in particular, the EDPB includes personnel monitoring, performance evaluations, and employee location-tracking among the high-risk processing activities that require a DPIA. There is reason to think that automated decision-making, and any sort of profiling, will be construed as posing a significant risk to consumers’ privacy and security.

The Agency also noted that it could follow the assessment content requirements adopted for Colorado’s DPAs. Colorado requires the following to be included in any DPA, in addition to identifying and weighing any benefits and risks flowing from the processing:

  1. Any mitigation of risk achieved through the use of technical and organizational security measures.
  2. Potential use of de-identified data.
  3. Reasonable expectations of the individuals in regard to the processing.
  4. The context of the processing.
  5. The relationship between the business and consumer.

Note that final regulations (including additional specifications for DPAs) remain pending. Our previous analysis of the Colorado Privacy Act draft regulations is available here. Colorado’s draft regulations also suggest that compliance with the requirements of other states, such as California, may be sufficient to meet Colorado’s requirements, thus potentially permitting Businesses to pick and choose which state’s DPA requirements they wish to follow.

Businesses should continue to identify which processing activities might pose a “significant risk to consumers’ privacy and security” and complete a preliminary risk assessment that meets the content requirements of either the DPIAs required under the GDPR or the DPAs required under the Colorado Privacy Act.
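
For privacy and engineering teams that want to get ahead of this, below is a minimal sketch (in Python, purely illustrative) of how a preliminary risk assessment record covering the Colorado-style content elements might be structured. The field names, example values, and the simple benefit/risk weighing are our own assumptions for illustration; they are not prescribed by the CCPA, the Colorado Privacy Act, or any draft regulation.

# Minimal illustrative sketch of a preliminary risk assessment record.
# Field names and example values are assumptions for illustration only; they are
# not prescribed by the CCPA, the Colorado Privacy Act, or any regulation.
from dataclasses import dataclass
from typing import List

@dataclass
class PreliminaryRiskAssessment:
    processing_activity: str              # the processing activity being assessed
    involves_sensitive_pi: bool           # whether sensitive personal information is processed
    benefits: List[str]                   # benefits flowing from the processing
    risks: List[str]                      # risks to consumers flowing from the processing
    mitigations: List[str]                # technical/organizational measures (Colorado item 1)
    uses_deidentified_data: bool          # potential use of de-identified data (item 2)
    consumer_expectations: str            # reasonable expectations of individuals (item 3)
    processing_context: str               # context of the processing (item 4)
    business_consumer_relationship: str   # relationship between business and consumer (item 5)

    def weigh(self) -> str:
        """Very rough illustrative weighing of identified benefits against identified risks."""
        if len(self.benefits) > len(self.risks):
            return "identified benefits outnumber identified risks"
        if len(self.benefits) < len(self.risks):
            return "identified risks outnumber identified benefits"
        return "identified benefits and risks are evenly balanced"

if __name__ == "__main__":
    assessment = PreliminaryRiskAssessment(
        processing_activity="Loyalty-program purchase profiling",
        involves_sensitive_pi=False,
        benefits=["Personalized offers", "Fraud reduction"],
        risks=["Inference of household income", "Unexpected secondary use of purchase data"],
        mitigations=["Encryption at rest", "Role-based access controls"],
        uses_deidentified_data=True,
        consumer_expectations="Purchase data used only to administer loyalty rewards",
        processing_context="Retail loyalty program enrollment and administration",
        business_consumer_relationship="Direct retailer-customer relationship",
    )
    print(assessment.weigh())

However a Business ultimately documents its assessments, keeping them in a consistent, structured form should make it easier to adapt them to whatever submission format the Agency’s final regulations require.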

Automated Decision-Making

“Automated decision-making” is not explicitly defined under the CCPA and, in fact, the Agency invites public comment on whether such a definition should be formally adopted. Profiling, however, is defined as “any form of automated processing of personal information…to evaluate certain personal aspects relating to a natural person and in particular to analyze or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.” The Agency appears ready to adopt a broad definition of “automated decision-making” that encompasses many activities, particularly those involving algorithmic bias and/or discrimination.

From an operational perspective, Businesses should note two specific consumer rights relating to automated decision-making and profiling.

Right to Access

For automated decision-making, responses to an access request must include meaningful information about the Business’s use of automated decision-making technology/processing, the logic involved in such processing, and a description of the likely outcome specific to the requesting consumer.

For example, if a Business uses automated decision-making technology for purchase/incentive qualification determinations, the Business’s response to an access request must include the following:

  1. That the Business uses automated decision-making technology for such purchase/incentive qualification determinations.
  2. The underlying logic or inputs for the technology (e.g., purchase history of at least $10,000 in the last six months, a zip code within 50 miles of a physical store location, and employment with a government agency; this hypothetical logic is sketched in code after the list).
  3. That use of the automated decision-making technology would likely have resulted in the requesting consumer qualifying/not qualifying for the purchase/incentive.
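
To make the example concrete, here is a minimal sketch (in Python, purely hypothetical) of what the qualification logic described in the list above might look like in code. The thresholds and criteria simply mirror the illustrative inputs in item 2; they are not drawn from any actual Business’s system or from the regulations.

# Hypothetical sketch of the purchase/incentive qualification logic described above.
# The criteria mirror the illustrative example only; they are not drawn from any
# real system, the CCPA, or any regulation.

def qualifies_for_incentive(
    purchase_total_last_6_months: float,
    miles_from_nearest_store: float,
    employed_by_government_agency: bool,
) -> bool:
    """Return True if the consumer meets all three illustrative qualification inputs."""
    return (
        purchase_total_last_6_months >= 10_000      # at least $10,000 in the last six months
        and miles_from_nearest_store <= 50          # zip code within 50 miles of a store
        and employed_by_government_agency           # employment with a government agency
    )

if __name__ == "__main__":
    # An access-request response would need to disclose these inputs (item 2) and the
    # likely outcome for the requesting consumer (item 3).
    print(qualifies_for_incentive(12_500.00, 12.0, True))   # True: consumer likely qualifies
    print(qualifies_for_incentive(8_000.00, 12.0, True))    # False: purchase threshold not met

The point of the disclosure requirement is that each element of this logic, and the likely result for the specific requesting consumer, would need to be described in plain language in the access-request response.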

The Agency notes that it does not want to require the divulgence of trade secrets but otherwise appears committed to providing consumers with as much information as possible regarding such technologies and Businesses’ use of them.

Right to Opt Out

The CCPA explicitly grants consumers the right to opt out of the sale of personal information or the processing of personal information for “cross-context behavioral advertising” (i.e., targeted marketing across platforms). The CCPA did not explicitly grant an opt-out right from automated decision-making but did direct the Agency to draft regulations “governing…opt-out rights with respect to [Businesses’] use of automated decision making technology, including profiling…”

The Agency’s public comment invitation assumes a right to opt out of automated decision-making and goes further, appearing to view the opt-out right as a mechanism by which it can fight algorithmic discrimination based on characteristics such as race, sex, and age. Note that the timeframe for responding to an opt-out request is only 15 business days (as opposed to 45 calendar days for most other consumer rights requests), and Businesses are required to cascade consumer rights requests to applicable service providers/vendors.

Key Dates

  • January 1, 2023: CPRA Effective Date
    • Cure period for CCPA violations sunsets.
    • Broad exemption of employee/employment context data expires.
  • February 3, 2023: CPPA Board Meeting
    • Draft regulations approved for submission to OAL available here.
    • Meeting materials available here.
  • July 1, 2023: CPRA Enforcement Commences

Enforcement of the CPRA amendments, including compliance with the items noted above, commences on July 1, 2023. The automatic cure period for CCPA violations expired when the ball dropped on December 31, 2022. Our team will continue to monitor developments from the CPPA. Please click here to subscribe to additional alerts.