The Center for Democracy & Technology opposes the U.S. Customs and Border Protection’s (CBP) efforts to expand the use of facial recognition technology in airports, because of serious concerns about the technology’s inaccuracy, bias, and intrusion on privacy, as well as the program’s dramatic expansion of a database of sensitive biometric data.
CDT filed a comment opposing this proposal in December. CBP recently reopened the comment period for the same proposal, so CDT filed the attached supplement based on additional information that has since been made public. This week we also joined a coalition letter on this issue.
The text of the comments has been pasted below, and is available in PDF form here.
Re: Supplemental Comment of the Center for Democracy & Technology in Opposition to DHS Docket Number USCBP-2020-0062, Collection of Biometric Data from Aliens Upon Entry to and Departure from the United States; Re-opening of Comment Period
To Whom It May Concern:
The Center for Democracy & Technology is a nonpartisan, nonprofit technology policy advocacy organization dedicated to advancing individual rights in the digital age. Joined by many other organizations, we write to urge U.S. Customs and Border Protection (CBP) to withdraw the above-referenced notice of proposed rulemaking (NPRM). CBP proposes to dramatically expand its database of sensitive biometric data by requiring most travelers who cross the U.S. border to submit to a facial image collection and recognition program. In particular, the proposed rule would expand the category of “in-scope” travelers subject to a mandatory biometric collection and screening requirement to all non-U.S. citizens, including lawful permanent residents of the U.S. and children, and all U.S. citizens who do not “opt out” of such screening. Yet the record amply demonstrates that CBP’s pilot program for facial image screening has failed to address concerns about its inaccuracy, bias, and intrusion on privacy. As a result, CBP’s proposed expansion of the facial recognition program is unjustified.
CDT’s initial comments urging withdrawal of this proposal in 2020 are attached to this supplemental comment. In brief, we raised a number of privacy, civil rights, and civil liberties concerns with facial image collection and screening at ports of entry, and highlighted limitations in the program’s ability to achieve its desired goals and to perform as described. For example, although U.S. citizens have a right to opt out of facial recognition screening at ports of entry, in practice this right has been difficult to exercise. And for all those who must submit to such screening, the pilot program has not demonstrated the technology’s ability to work as intended and to provide an equitable travel experience for all, including people of color, women, and young people. The consequences of an error in this context can be significant: travelers who are not accurately identified may be delayed, miss their flight, face a custodial interrogation, or worse. Additionally, CBP elected to center its biometric screening program on a particularly sensitive form of biometric data—the faces of travelers—even though it could have selected a less sensitive and perhaps more effective biometric identifier, such as fingerprints. CBP failed to justify this decision. CBP also failed to adopt privacy protections that would prevent the broad distribution and repurposing of facial images captured by the agency, so this proposed collection risks significantly enhancing the surveillance capabilities of many U.S. government entities. Finally, CBP’s expansion from its current pilots is premature, as the agency has yet to address outstanding privacy and security recommendations from the U.S. Government Accountability Office (GAO), which recently concluded a review of CBP’s program.
We would have preferred that CBP withdraw this NPRM based on the original comments it received from CDT and others, but we appreciate that the agency has provided the public an additional 30 days to weigh in on such a consequential program. Our ultimate recommendation based on the concerns we previously highlighted has not changed—CBP should scrap the proposed expansion of biometrics use. We file this supplemental comment to highlight additional information that has been made public since our original comments were filed—information that only further demonstrates that the proposed rule is unjustified.
In our initial comments, we noted that CBP had yet to demonstrate that its facial image screening technology can accurately capture traveler facial images and identify travelers. New information only heightens those concerns. CBP recently reported that in 2020 the agency processed more than 23 million travelers with a match rate of only about 97%. CBP did not disclose the mismatch rate. This means that over the course of a year about 690,000 people were not matched, and an undisclosed number of people were mismatched. CBP failed to disclose the demographic breakdown of unmatched travelers, the causes of the non-matches, or their consequences. Even without that information—which might reveal biases or other significant problems with the technology—it’s clear that problems with the program will impact hundreds of thousands of people annually, which alone should dissuade CBP from this premature expansion.
CBP’s recent attempts to justify this expensive and bloated biometric screening program are unconvincing. To tout its security benefit, CBP recently reported that “[s]ince the program’s inception, in 2018, CBP officers at U.S. airports have successfully intercepted seven impostors who were denied admission to the United States and identified 285 imposters on arrival in the land pedestrian environment.” When combined with information CBP previously disclosed, this means that in 2020 the biometric entry-exit system didn’t catch any impostors traveling through airports and caught fewer than 100 pedestrian impostors. CBP did not disclose whether these impostors would otherwise have been identified through manual document review. Nor did the agency disclose any information about the false match rate, or the number of people who were mistakenly cleared through security due to errors with the program. This information would also bear heavily on the security and integrity value of the program.
Additionally, CBP recently argued that “[u]sing biometric technology, air and sea partners can replace current check in, security, and boarding processes that involve long lines, heavy personal interaction, and the handling of travel documents. Facial biometric technology encourages contactless travel that involves minimal physical contact and promotes social distancing, which increases the safety of travelers, CBP officers, and port personnel.” If CBP is trying to argue that its biometric screening program mitigates the spread of COVID-19 by limiting the handling of documents, the agency is misrepresenting the risk of virus transmission via surfaces, such as from handling a traveler’s passport. In addition, minimizing physical contact between travelers and DHS personnel can be and has been achieved through other measures that do not impact traveler privacy as adversely as the proposed biometric screening does. Such measures include installing plexiglass where appropriate, providing personal protective equipment to airport staff and travelers, redesigning airport lines with clearly demarcated indicators of where travelers should stand to prevent crowding, and designating overflow areas for when many people are arriving at or departing from the airport. Perhaps automated identity verification could help manage airport traffic, but that could be accomplished with a biometric screening system that performs 1:1 automated comparisons between the traveler and their document, and wouldn’t require CBP to collect and retain a facial image.
In other words, CBP is attempting to move the goal posts it originally established for the use of facial recognition technology in airports. First, the goal was to improve security. Then, the agency pivoted to justifying facial recognition technology in airports as “improving the traveler experience,” and now as pandemic mitigation. These are distractions. The biometric entry-exit system was proposed and heavily funded to address security concerns related to visa overstays and fraudulent travel. CBP has not been able to demonstrate the program’s ability to meet the security objective, and has not shown that it adds value to the traveler experience. The agency shouldn’t move the goal posts closer to justify a biometric screening program that raises enormous privacy, civil liberties, and civil rights concerns.
We again urge CBP to withdraw this proposal.