AI Policy & Governance, Equity in Civic Technology, Privacy & Data

CDT Letter in Support of Washington State’s SB-5116

This week, CDT wrote a letter in support of SB 5116, a bill in Washington State to establish new standards, approval processes, and enforcement mechanisms for government use of automated decision systems.

Our letter highlights CDT’s recent report, Challenging the Use of Algorithm-driven Decision-making in Benefits Determinations Affecting People With Disabilities. The report illustrates what can go wrong when states use poorly designed algorithm-driven decision-making tools. It describes cases in which disabled people experienced significant harm from algorithm-driven benefits reductions, leading advocates to sue states over constitutional due process violations, among other issues.

***

2 February 2021
Committee on State Government & Elections
Washington State Senate

Dear Committee Members,

As government agencies increasingly integrate automated decision systems (ADS) into crucial government functions, law and policy must ensure that these systems operate fairly, without discrimination, and in a transparent and accountable manner. SB 5116 makes important strides towards these ends by creating new standards, approval processes, and enforcement mechanisms. The specifics of how to implement these steps are critical and raise complex issues. Accordingly, the Center for Democracy & Technology (CDT) strongly encourages the Committee to schedule SB 5116 for a hearing to examine further how the bill could best achieve its laudable goals.

To demonstrate how poorly designed ADS may harm citizens and subject governments to legal liability, we share our recent report, Challenging the Use of Algorithm-driven Decision-making in Benefits Determinations Affecting People With Disabilities. It explains how ADS used in benefits determinations have cut crucial government benefits to people in need and resulted in states losing challenges under the Constitution and the Administrative Procedures Act.

  • First, without careful design and auditing, government ADS may use faulty or incomplete data or incorporate formulas that do not serve the intended purposes. Idaho’s Department of Health and Welfare, for example, designed and configured its assessment system for Medicaid Home and Community Based Services with an extremely small number of records, 66% of which were found to be incomplete, inaccurate, or unverifiable. After removing the unreliable data, only 733 records remained, which a court found insufficient for creating an accurate and reliable system.
  • Second, poorly designed ADS may subject states to liability for failing to comply with applicable legal requirements. For example, cases in Arkansas and Idaho found that disabled people whose benefits were reduced through an AI-based system often did not receive adequate notice, explanation, or opportunity to appeal reductions in their benefits, as required by states’ due process obligations. People did not know why their states had cut their benefits, could not understand the methodologies and standards used to make the decisions, or were unable to mount a meaningful appeal of the reductions. Advocates in Oregon have also brought a case alleging that the Office of Developmental Disabilities Services uses an undisclosed formula that automatically determines hours of care for people with developmental disabilities, and produces wildly variable results. Their core argument is that the formula’s variability and inconsistency violate constitutional due process standards.
  • Additionally, states implementing ADS without giving the public a meaningful opportunity to provide input can violate notice-and-comment rulemaking requirements. State government agencies must provide the public with sufficient information and explanation about proposed policy changes before they take place. This includes states’ adoption of automated decision-making systems that constitute substantial changes to policy. Advocates in Arkansas, for instance, successfully brought suit against the state for adopting its new system without following the requirements of notice-and-comment rulemaking.
  • Finally, states with poorly designed or implemented ADS may also violate civil rights laws that require fair and nondiscriminatory treatment. For example, the benefits determinations cases we reviewed in our paper showed that states can violate the Americans with Disabilities Act, which prohibits unjustified institutionalization or isolation of disabled people, when disabled people lose access to care necessary to stay in the community and outside of an institution.

Standards, processes, and enforcement mechanisms of the kind set forth in SB 5116 can help prevent harmful ADS development and implementation. In consideration of the high stakes, governments at all levels have begun examining how to regulate ADS. A recent executive order, for example, requires the entire federal government to create inventories of its ADS and audit their compliance with cross-cutting principles. And a review of previous state legislative sessions finds that this is a topic of broad concern. We commend the Committee for seeking to get out in front and address these issues now, rather than waiting for problems to develop.

Please do not hesitate to reach out to our team if you have any questions or would like to discuss this further. Thank you for your time and consideration.

Read the full letter here.