For the past 18 months, CDT and several other consumer advocacy organizations have participated in the National Telecommunications and Information Administration’s (NTIA) multistakeholder process to design a code of conduct for the use of facial recognition technology. Today, we and those other consumer groups announced that we are withdrawing from this process because we do not believe it is likely to lead to sufficient consumer safeguards for facial recognition technology.
This was not a decision that we took lightly. Facial recognition raises very sensitive privacy issues that compromise our ability to maintain any degree of practical obscurity as we go about our lives. Only two states have any affirmative protections in place for this type of data; otherwise, the only limitations on comprehensive biometric tracking are technological — and those limitations are diminishing every day. The industry should voluntarily agree to some common sense limitations on the use of this technology, which would certainly be a benefit for consumers who might not want to be automatically recognized and identified every place they go.
Unfortunately, it has been clear for some time now that this forum was not likely to agree on meaningful protections. For months, consumer advocates have wanted the process to address the core issue of consent — to get the group to agree that certain uses of facial recognition technology should happen with the affirmative consent of the consumer. We previously agreed to give the process one more chance to see if we could make progress on that issue.
At last Thursday’s meeting, the group failed to reach agreement on any scenario in which facial recognition would be used only with consent. As an example, when you’re walking down a public street, you should be able to expect that you are not being tracked by dozens of companies with whom you have no relationship — surely in these sorts of spaces, companies should use facial recognition to track consumers for marketing purposes only with their permission. However, no company or trade association was prepared to make that concession at the meeting. It became clear that we weren’t going to get a code that’s consistent with consumer desires or expectations.
Part of the problem is that the NTIA processes were originally proposed as part of a federal privacy law under which industry-specific codes, if followed, would give companies a “safe harbor.” However, that law was never passed, so companies using facial recognition technology don’t have anything they need safe harbor from. The incentives just aren’t aligned to get companies to voluntarily adopt binding (and legally enforceable) codes of conduct.
That’s too bad, because a lot of companies using facial recognition today are doing so cautiously and with respect for individual privacy — including asking in many cases for affirmative permission. It may well be the case that market pressures and public accountability for questionable practices will limit some of the most objectionable uses of facial recognition (so long as those practices are detectable by consumers, advocates, and regulators). But it appears that the NTIA multistakeholder process is unlikely to lead to stronger, industry-wide protections. The administration deserves credit for trying to convince companies to voluntarily adopt a strong self-regulatory code, but it just wasn’t enough. As we’ve argued time and time again, ultimately the United States is going to have to adopt privacy legislation to require meaningful transparency about privacy practices and to empower individuals to make choices about how their personal data is collected and used — instead of just hoping for industries to self-regulate in response to emerging privacy threats.