

Report: Navigating Demographic Measurement for Fairness and Equity

CDT report, entitled “Navigating Demographic Measurement for Fairness and Equity.” Illustration of a compass and streams of data.

[ PDF version ]

Executive Summary

Governments and policymakers increasingly expect practitioners developing and using AI systems in both consumer and public sector settings to proactively identify and address bias or discrimination that those AI systems may reflect or amplify. Central to this effort is the complex and sensitive task of obtaining demographic data to measure fairness and bias within and surrounding these systems. This report provides methodologies, guidance, and case studies for those undertaking fairness and equity assessments — from approaches that involve more direct access to data to ones that don’t expand data collection. 

Practitioners are guided through the first phases of demographic measurement efforts, including determining the relevant lens of analysis, selecting which demographic characteristics to consider, and navigating how to home in on relevant sub-communities. The report then delves into several approaches for uncovering demographic patterns.

Approaches for Measuring Demographic Characteristics for Fairness Measurement

Measuring disparities related to real people

  • Collection: Directly asking individuals to self-report their demographic information
  • Observation and inference: Assigning perceived demographic characteristics based on observable features or predicting them using statistical methods or machine learning
  • Proxies and surrogate characteristics: Using signals that correlate with demographic characteristics to detect patterns or disparities without directly inferring individual demographics (see the sketch after this list)
  • Auxiliary datasets: Combining existing datasets containing demographic information with the data of interest
  • Cohort discovery: Using pattern detection techniques to identify groups experiencing negative outcomes, without explicitly naming demographic characteristics
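
To make the proxy approach concrete, the sketch below shows how two weak signals (surname and geography) might be combined into probabilistic group estimates, in the spirit of Bayesian Improved Surname Geocoding. The probability tables, group names, and function here are invented placeholders for illustration, not figures or methods drawn from the report.

```python
# Illustrative proxy sketch (BISG-style): combine surname and geography signals
# into posterior group probabilities. All probability tables below are
# invented placeholders, not real census figures.

GROUPS = ["group_a", "group_b", "group_c"]

# P(group | surname), e.g. derived from surname frequency tables (placeholders)
P_GROUP_GIVEN_SURNAME = {
    "garcia": [0.10, 0.80, 0.10],
    "smith":  [0.70, 0.10, 0.20],
}

# P(group | geography), e.g. derived from neighborhood demographics (placeholders)
P_GROUP_GIVEN_TRACT = {
    "tract_001": [0.50, 0.30, 0.20],
    "tract_002": [0.20, 0.60, 0.20],
}

# P(group) in the overall population (placeholders)
P_GROUP = [0.55, 0.30, 0.15]


def proxy_group_probabilities(surname: str, tract: str) -> dict[str, float]:
    """BISG-style combination: P(g | s, t) is proportional to P(g | s) * P(g | t) / P(g).

    Assumes surname and geography are conditionally independent given group.
    """
    ps = P_GROUP_GIVEN_SURNAME[surname.lower()]
    pt = P_GROUP_GIVEN_TRACT[tract]
    unnormalized = [s * t / prior for s, t, prior in zip(ps, pt, P_GROUP)]
    total = sum(unnormalized)
    return {g: u / total for g, u in zip(GROUPS, unnormalized)}


print(proxy_group_probabilities("Garcia", "tract_002"))
```

In a disparity analysis, such probabilistic estimates would typically be used in aggregate (for example, weighting outcome rates by group probabilities) rather than treated as any individual's actual identity.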

Measuring disparities related to representations 

  • Keywords and terms: Manually or automatically constructing lists of words and topics that relate to demographic characteristics and using them to probe systems (a brief sketch follows this list)
  • Observation and labeling of content: Automatically or manually assigning labels of apparent traits to unidentified people represented in audiovisual or text content
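
As a rough illustration of keyword-based probing, the sketch below fills a template with terms associated with different groups and compares how a system scores the variants. The template, terms, and system_score function are hypothetical stand-ins for whatever model or API is actually being audited.

```python
# Schematic keyword-probing sketch: vary demographic terms in a fixed template
# and compare system behavior across the variants. The terms, template, and
# scoring function are placeholders for illustration only.

TEMPLATE = "The {term} applicant asked a question about the loan terms."

GROUP_TERMS = {
    "group_a": ["young", "younger"],
    "group_b": ["elderly", "older"],
}


def system_score(text: str) -> float:
    """Hypothetical stand-in for the system under audit (e.g., a classifier score)."""
    return len(text) / 100.0  # placeholder logic only


average_scores = {}
for group, terms in GROUP_TERMS.items():
    scores = [system_score(TEMPLATE.format(term=term)) for term in terms]
    average_scores[group] = sum(scores) / len(scores)

# Large gaps between groups can flag representational disparities worth reviewing.
print(average_scores)
```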

Measuring disparities across contexts

  • Synthetic data: Using artificially generated data that simulates the structure and distribution of real-world examples or populations (see the sketch after this list)
  • Exploratory analysis: Reviewing a system to reason about how its design, behavior, or other characteristics might lead to negative impact for certain communities
  • Qualitative research: Directly engaging with people using and affected by systems to capture more nuanced insights about people’s lived experience
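
The following sketch shows one simple way synthetic data might be used: generating artificial records from assumed marginal distributions so that a disparity metric can be exercised without touching real people's data. The group shares and approval rates are invented placeholders, not statistics from the report.

```python
# Minimal synthetic-data sketch: generate artificial records from assumed
# distributions and compute a simple disparity metric. All numbers are
# invented placeholders, not real population statistics.
import random

GROUP_SHARES = {"group_a": 0.6, "group_b": 0.3, "group_c": 0.1}
APPROVAL_RATES = {"group_a": 0.72, "group_b": 0.65, "group_c": 0.70}


def synthetic_record() -> dict:
    """Draw one artificial record with a group label and a simulated outcome."""
    group = random.choices(list(GROUP_SHARES), weights=list(GROUP_SHARES.values()))[0]
    approved = random.random() < APPROVAL_RATES[group]
    return {"group": group, "approved": approved}


dataset = [synthetic_record() for _ in range(10_000)]
for group in GROUP_SHARES:
    rows = [r for r in dataset if r["group"] == group]
    rate = sum(r["approved"] for r in rows) / len(rows)
    print(f"{group}: simulated approval rate = {rate:.3f}")
```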

Given long histories of demographic data being misused to the detriment of vulnerable communities, the report emphasizes that responsibly handling demographic data is just as critical as the measurement methods themselves. Many of the approaches described can be combined with one another to strengthen protections against potential harms while still enabling critical work.

Approaches for Handling Demographic Characteristics for Fairness Measurement

Data and infrastructure controls

  • Pseudonymization: Replacing personal identifiers with placeholder information or otherwise breaking the link between identifying data and other data about an individual (see the sketch after this list)
  • Infrastructure controls: Data and system architecture choices that limit how and by whom data and measurement methods can be accessed or used
  • Encryption: Scrambling data so it can’t be easily deciphered without a mathematical key
  • Retention and ephemerality: Avoiding creating data unnecessarily and preventing it from being stored longer than needed
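
As a concrete example of the first control, the sketch below pseudonymizes a direct identifier with a keyed hash so that records can still be joined for analysis without exposing the raw identifier. The key, field names, and record are illustrative assumptions, not details from the report.

```python
# Minimal pseudonymization sketch: replace a direct identifier with a keyed,
# non-reversible token. The key would be held outside the analysis environment
# (e.g., by a restricted service) so analysts cannot recover the identifier.
import hashlib
import hmac

SECRET_KEY = b"placeholder-key-held-outside-the-analysis-environment"


def pseudonymize(identifier: str) -> str:
    """Return a stable token for the identifier using an HMAC."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()


record = {"email": "person@example.com", "outcome": "approved"}
record["subject_token"] = pseudonymize(record.pop("email"))
print(record)  # the email is gone; the token still supports joins across datasets
```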

Privacy-enhancing methods

  • Aggregation: Combining and summarizing data to reduce identifiability of individual data points
  • Differential privacy: Adding a calibrated amount of random statistical noise to data or query results to satisfy formal privacy guarantees (see the sketch after this list)
  • Secure multi-party computation: A cryptographic protocol that allows parties to conduct analyses across multiple datasets without sharing data with one another
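
To illustrate the differential privacy entry, the sketch below releases a noisy group count using the standard Laplace mechanism. The epsilon value and the data are arbitrary examples for illustration, not recommendations from the report.

```python
# Minimal differential-privacy sketch: release a count with the Laplace
# mechanism. A counting query has sensitivity 1 (adding or removing one person
# changes the count by at most 1), so noise with scale 1/epsilon gives epsilon-DP.
import numpy as np


def dp_count(flags: list[bool], epsilon: float) -> float:
    """Return a noisy count of True values satisfying epsilon-differential privacy."""
    true_count = sum(flags)
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise


# e.g. flags marking an adverse outcome within one demographic group (made up)
adverse_flags = [True, False, True, True, False, False, True, False]
print(dp_count(adverse_flags, epsilon=0.5))
```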

Procedural controls

  • User controls: Providing people with the opportunity to decide whether to share their data and to request data be corrected or deleted
  • Organizational oversight: Processes to review proposed uses of data or measurement methods to ensure they comply with policies and follow necessary procedures
  • Separate teams: Assigning a specific team to be responsible for oversight and compliance with laws that implicate demographic measurement
  • Privacy impact assessments: Structured impact assessments to evaluate whether proposed uses of data sufficiently mitigate privacy risks

As policymakers and practitioners build regulatory and technical infrastructure to make progress in this domain, we highlight several recommendations to ensure that the balance remains tipped toward beneficial measurement efforts.

Practitioners should:

  • Establish ongoing relationships with communities affected by measurement activities to co-design data collection and handling strategies, discuss potential risks and benefits, and collaboratively define fairness goals.
  • Where possible, consider methods that avoid generating or storing sensitive demographic information in a way that can be easily connected to individuals. 
  • Take great care before using observation and inference methods to identify characteristics, especially those lacking precedent or that resist observation. 
  • Clearly differentiate between perceived or implied characteristics and actual ones.
  • Employ a robust combination of approaches to handling data and measurement methods to ensure appropriate use.
  • Communicate openly about demographic measurement efforts, as well as how data is handled.

Government agencies and regulators should:

  • Recognize that a variety of approaches are available for companies to identify and measure disparities, even in the absence of comprehensive demographic data collection.
  • Clarify criteria and expectations about acceptable measurement methods when it comes to civil rights compliance, and articulate minimum expectations for how data and methods should be handled.
  • Explore how a broader range of measurement methods can be used to monitor compliance with federal civil rights laws, including to conduct investigations and enforcement actions.
  • Facilitate collaboration between NGOs, research institutes, and government data agencies to explore creative ways that existing administrative data can be used to conduct measurements in a privacy-respecting manner. 
  • Encourage continued research to explore how unsupervised, synthetic, privacy-enhancing, and content-related methods can be used to further the detection and remediation of bias and discrimination.

While there is no one-size-fits-all solution, this report makes clear that the lack of ready access to raw demographic data should not be considered an insurmountable barrier to assessing AI systems for fairness, nor should it provide a blanket justification for widespread or incautious data collection efforts. From exploring privacy-preserving techniques to pursuing measurement of content-related bias when disparities affecting people are hard to measure directly, practitioners have a range of tools at their disposal. As practitioners navigate this complex but important landscape, they should engage early and often with impacted communities, clearly document and communicate their practices, and embed strong technical and institutional safeguards to prevent misuse. Ultimately, responsible demographic measurement demands extraordinary care — for technical choices and their implications, but even more for the people and communities this work must serve.

Read the full report.