The California Privacy Protection Agency (CPPA) gained new powers under the California Privacy Rights Act (CPRA) to issue regulations on “Automated Decision-Making Technology” (ADMT).
These new rules represent an early step towards meaningful private-sector AI regulation in the US. The first draft of the CPPA’s regulations arrived on 28 November 2023 and will affect many types of AI-driven processes and profiling activities.
The regulations will impact many businesses covered by the California Consumer Privacy Act (CCPA). This article, the first of three parts, examines what types of activities the draft regulations cover.
What are ADMT and Profiling?
The draft regulations provide some broad definitions that would likely capture the activities of thousands of businesses.
Here are two of the draft regulations’ four definitions. We’ll address the other two in the appropriate sections below.
- Automated Decision-Making Technology: “Any system, software, or process – including one derived from machine-learning, statistics, or other data-processing or artificial intelligence – that processes personal information and uses computation as whole or part of a system to make or execute a decision or facilitate human decision-making. Automated decision-making technology includes profiling.”
- Profiling: “Any form of automated processing of personal information to evaluate certain personal aspects relating to a natural person and in particular to analyze or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.”
In other words…
- “ADMT” means any system that facilitates (even partly) automated or AI-driven decisions based on personal information.
- “Profiling” means the automated or AI-driven processing of personal information to analyze a person’s characteristics or behavior in certain contexts.
In adopting these definitions, the CPPA has been strongly influenced by Article 22 of the EU and UK General Data Protection Regulation (GDPR). But the CPPA’s conception of automated decision-making is arguably even broader than the EU’s.
What uses of ADMT are covered by the regulations?
The proposed rules apply only to businesses engaged in certain uses of ADMT, which are listed in Section 7030(b) of the draft regulations.
Here are the covered activities, annotated with explanations and definitions.
Decisions with legal or similarly significant effects
A business has obligations under the CPPA’s regulations if it uses ADMT for “a decision that produces legal or similarly significant effects concerning a consumer.”
An automated decision has “legal or similarly significant effects” if it results in the provision or denial of, or access to, services in the following areas:
- Financial or lending services
- Education enrollment or opportunity
- Criminal justice
- Independent contracting opportunities or compensation
- Healthcare services
- Essential goods or services
Profiling employees, contractors, applicants, and students
The CPPA’s regulations cover a business that profiles “a consumer who is acting as an employee, independent contractor, job applicant, or student.”
The CCPA is a particularly powerful privacy law because, unlike any other comprehensive US state privacy law, it protects employees, contractors, job applicants, and students as “consumers”.
The regulations provide a non-exhaustive list of examples of such profiling activities, which include profiling an employee using:
- Keystroke loggers
- Productivity or attention monitors
- Video or audio recording or live streaming
- Facial or speech recognition or detection
- Automated emotion assessment
- Location trackers
- Speed trackers
- Web browsing, mobile application, or social media monitoring tools