

Report – The Federal Government’s Power of the Purse: Enacting Procurement Policies and Practices to Support Responsible AI Use


Executive Summary

Government spending on artificial intelligence (AI) has reached unprecedented levels. In fiscal year 2022, the United States government awarded over $2 billion in contracts to private companies providing services that rely on AI, and total spending on AI has increased nearly 2.5-fold since 2017.

Meanwhile, federal policymakers’ attention to AI continues to grow, with multiple legislative and executive actions aimed at encouraging the federal government to adopt AI while accounting for issues of bias, privacy, transparency, and efficacy. The increase in government spending on AI, together with growing acknowledgement of both the potential and the risks of the technology, has raised new and urgent questions about whether and how tenets of responsible AI use are addressed in federal procurement policies and practices.

Building on years of bipartisan federal efforts to govern AI use – including legislation, agency actions and guidance, and executive orders – two recent executive actions have taken direct aim at the federal government’s procurement of AI: the Biden Administration’s 2023 Executive Order 14110 on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence (2023 AI EO) and the Office of Management and Budget’s (OMB) subsequent guidance, the 2024 Memorandum for Agency Use of AI (Final OMB AI Memo). The Biden Administration’s 2023 Executive Order on Further Advancing Racial Equity and Support for Underserved Communities Through the Federal Government (2023 Racial Equity EO) also requires that agencies designing, developing, acquiring, and using AI and automated systems do so in a manner that advances equity.

The 2023 AI EO lays out a whole-of-government strategy for federal agencies to support robust AI governance, including in the public sector. It specifically addresses procurement by focusing on expanding the federal workforce to ensure that the government can appropriately solicit and assess procured AI systems, clarifying the guidelines that procured AI systems are expected to adhere to, and directing OMB to provide guidance to agencies on the design, use, and procurement of AI systems.

Following the 2023 AI EO, OMB released the required agency guidance, the Final OMB AI Memo, in March 2024 (following a round of public comment on a proposed version). The Final OMB AI Memo provides extensive guidance on the use of AI systems and will invariably shape procurement processes, because agencies need confidence that a procured system can comply with the Memo’s requirements. The Memo also offers some explicit procurement recommendations, including aligning procured systems with legal requirements, increasing transparency around procured systems, and promoting competition. Both the 2023 AI EO and the Final OMB AI Memo are important steps toward guiding AI procurement, but achieving equitable, ethical, and effective government procurement of AI will require more support, including more robust guidance from OMB.

Nonetheless, individual federal agencies are still making decisions about whether and how to procure AI-driven technology from third parties. In doing so, they face a number of challenges specific to the AI context, particularly when procuring AI for service delivery. This report identifies several of these challenges, including the lack of a common definition of AI, limited internal capacity to evaluate AI-driven systems and the vendors that provide them, and insufficient monitoring of contracts for AI systems after they have been executed. Additionally, limitations within existing federal procurement processes threaten to further impede the responsible procurement of AI. These include difficulties in understanding and evaluating bias, incorporating human oversight and intervention, and defining and implementing a process for redress in the event that an AI-driven system causes harm.

This report provides a number of recommendations to establish robust and sustainable AI procurement processes for federal agencies. The report is informed by interviews with current and former government employees and with experts representing perspectives from academia, industry, and civil society organizations. It recommends the following federal actions:

  • Incorporate responsible AI considerations at the acquisition planning stage. This should include encouraging agencies to build upon the processes in the Federal Acquisition Regulation to consider the potential socio-technical risks of AI to end-users and intended beneficiaries; developing an “AI Responsibility Questionnaire” for government agencies to use as part of procurement planning and market research; and encouraging agencies to require legal review for all contracts that involve AI to ensure equity.
  • Include references to AI risks in the Federal Acquisition Regulation. The references should explicitly call out and emphasize responsible AI practices in areas such as acquisition planning, market research, privacy protection, and quality assurance.
  • Equip agencies to perform pre-award vendor evaluation and post-award vendor monitoring. Federal agencies would benefit from guidance on how to make broader use of their authority to conduct pre-award evaluations of AI models and on how to further develop standards or certifications for responsible AI, similar to efforts like the Federal Risk and Authorization Management Program (FedRAMP), as well as support and resources for building independent auditing into the acquisition process.
  • Clarify and strengthen transparency, reporting, and oversight requirements, and issue guidance to facilitate compliance. Cross-government bodies such as Congress, OMB, and the Government Accountability Office (GAO) should take steps such as providing a consistent definition of AI systems; strengthening agencies’ “AI inventories” for greater transparency around agency AI use; adding specific reporting requirements regarding responsible AI to the Federal Information Technology Acquisition Reform Act (FITARA) scorecard; developing guidance on, and taking a consistent approach to, intellectual property provisions in vendor contracts; publishing an “oversight guide” for reviewing agency acquisition activities; strengthening requirements for agencies to conduct Algorithmic Impact Assessments (AIAs) and requiring agencies to publish them on their websites; and encouraging the National Institute of Standards and Technology (NIST) to adopt a standard for AIAs.
  • Increase federal workforce capacity to ensure agencies are prepared to evaluate and manage risks throughout AI procurement. Agencies should develop training modules and incorporate them into the existing procurement curriculum, and should support the growth of digital-services teams within government that have experience designing and deploying responsible AI.

While this report focuses on federal AI procurement policy, the federal government can also ensure that taxpayer dollars are used responsibly by establishing requirements for, and oversight of, grants that support state, local, and private-sector uses of AI. In addressing federal AI procurement, this report is intended to provide a framework for enabling procurers of AI within the federal government to acquire systems that strengthen and improve agency operations while protecting the people those agencies are meant to serve.

Read the full report.