Last week, three CDT team members gave statements on various privacy and data issues in front of the new California Privacy Protection Agency (CPPA) as part of a pre-rulemaking stakeholder event. California was the first U.S. state to pass a comprehensive privacy law, and it is far along in crafting rules to effectuate that law.
It is vitally important that the California privacy laws are interpreted broadly and enforced effectively to ensure that the privacy of Californians is protected. CDT urged the Agency to address the harms associated with the use of sensitive data (particularly health data) and with automated decision-making (especially harms to protected classes and people with disabilities), and to adopt strong data minimization and purpose limitation rules.
Summaries of and links to the full statements are below.
The pre-rulemaking stakeholder sessions regarding the California Privacy Rights Act (CPRA) regulations opened with the topic of automated decision-making. CDT has analyzed the privacy and equity impacts that automated decision-making systems can have in employment, education, housing, credit, public benefits, and other areas. These systems affect not only the immediate decisions for which they are used, but also have a cascading effect on decisions in other areas. For instance, an automated benefits determination system may deny unemployment benefits, affecting the payment history that a financial institution's algorithm-driven tool evaluates for lending decisions, or that tenant screening algorithms flag in rental applications. Given the potentially expansive reach of data-driven harms, the California Privacy Protection Agency's rulemaking presents an important opportunity.
To address the issues that automated decision-making poses, we spoke about the considerations that should inform the Agency’s rulemaking. The CPRA regulations should:
- Account for both the methodologies involved in an automated decision-making system and the context of the system’s use,
- Require notices to offer meaningful substance that enables consumers to recognize when a system's use may violate their rights,
- Mitigate, if not eliminate, the impacts of discriminatory systems that affect access to critical opportunities, and
- Be scoped to prevent disruption of necessary government services provided through service providers.
Appropriate collection, sharing, and use of sensitive data, like health data, can empower individuals and yield truly remarkable and beneficial outcomes. Sometimes, however, the benefits are minimal, and the collection, sharing, and use of sensitive data can be harmful. When sensitive data is shared and used in ways that consumers do not want or anticipate, they lose agency over their data and face a greater likelihood of harm.
Right now, the burden of protecting their sensitive health data falls almost entirely on consumers. That needs to change. It is time to rebalance these relationships and empower consumers with more control over how their sensitive information is collected, shared, and used. CDT recommended that the CPPA embrace the following priorities in subsequent rulemaking regarding sensitive personal information:
- Adopt a broad definition of “sensitive personal information” with a focus on the purpose and use of data;
- Narrowly define what an average consumer's reasonable expectations are, to prevent exceptions from swallowing the rule; and
- Ensure that the consumer opt-out process is simple and straightforward.
Data minimization and purpose limitations are critical data protection principles that are often overlooked and not taken seriously in the United States. Many businesses set their own data agendas, crafting essentially limitless practices in dense privacy policies. Businesses often do not think critically about their data practices nor try to limit the potential data-related harm they can cause.
Implementing strong, effective, and enforceable data minimization and purpose limitations is a potential solution to those problems. One approach taken by CDT in its comprehensive privacy framework is to prohibit certain harmful data processing practices when those practices are not required to provide, and do not add to, the functionality of the product, service, or specific feature that a person has requested. Those practices include:
- Biometric tracking,
- Precise location tracking,
- Cross-device tracking,
- Tracking of children under 13 years of age,
- Collection of the content of and parties to communications,
- Audio and visual recording, and
- Collection of health information.
The CPPA should also consider limiting secondary uses of data and discriminatory uses of data.