

The Washington Privacy Act Raises Important Considerations for Comprehensive Privacy Proposals

A significantly altered substitute SB 5376 was introduced on February 14, 2019. It does not address the concerns listed below, and it raises additional questions about the scope of the privacy protections and the protections against data-driven discrimination at issue. On February 26, CDT sent a letter to the House and Senate urging changes to substitute SB 5376. On March 21, CDT sent a letter to the Washington House recommending changes to HB 1854.

As states across the country grapple with how to enshrine meaningful privacy protections into law, SB 5376 — the Washington Privacy Act — is a serious proposal to require companies to use data in equitable and legitimate ways. While many states have introduced consumer privacy legislation modeled after the California Consumer Privacy Act (CCPA), Washington state legislators have offered a positive alternative approach that still has room for improvement. Below, we highlight five key areas of SB 5376 that are potentially problematic and deserve additional attention from lawmakers. Specifically, we are concerned about:

  • The broad safe harbor for de-identified data;
  • The reliance on corporate risk assessments for restricting data collection, use, and sharing;
  • The lack of clarity with respect to identifying third parties with whom personal data is shared;
  • The need to provide additional resources for meaningful enforcement; and
  • The proposed rules for public and private use of facial recognition technologies (FRTs).

CDT believes SB 5376 contains some significant positive legislative developments. Not only is any proposal for rules governing both law enforcement and commercial use of FRTs ambitious and welcome, but SB 5376 also includes thoughtful provisions detailing (1) individuals’ rights to access, delete, and limit some sharing of their personal information; (2) restrictions on data brokers and automated processing/profiling; and (3) a study on the privacy practices of government bodies to ensure they are held to the same level of responsibility as private industry. If these protections are coupled with the improvements we identify below, this legislation could provide meaningful protections to the people of Washington State.

1. A Broad Safe Harbor for De-identified Data Can Ignore Meaningful Privacy Risks

The most important definition in any privacy law is the scope of information that is covered by that law. Three terms — “personal data,” “identified or identifiable natural person,” and “deidentified data” — are connected in the Washington state proposal. The bill applies to “any information relating to an identified or identifiable natural person,” which can include “online identifiers” and “specific geolocation data” that could directly or indirectly identify an individual. Critically, however, the proposed law largely excludes de-identified data from its scope, undermining the law’s goal that “personal data will be protected appropriately.”

De-identification is the use of technical and administrative processes to prevent an individual’s identity from being connected with other information. Support for de-identification depends upon the theory that removing identifying information allows businesses or researchers to share information without any privacy risk – but there is no standard for ensuring that such processes will actually prevent information from being re-identifiable. De-identification is a moving target, and we have seen time and time again that information claimed to be de-identified was easily re-identifiable. New York City officials, for example, accidentally revealed the detailed comings and goings of individual taxi drivers in a public release of data that was poorly de-identified. But even in datasets that are thoroughly de-identified and never released publicly, just a handful of location data points is enough to uniquely identify an individual 95% of the time. Location data is often especially challenging to de-identify, but “anonymous” medical research databases and even Twitter metadata have been used to identify individuals.
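To make one common failure mode concrete: when a release obscures identifiers with an unsalted hash over a small, structured identifier space (as reportedly happened in the New York City taxi release), an attacker can simply hash every possible value and match the results back. The Python sketch below is illustrative only; the medallion format, hash choice, and sample value are assumptions, not the actual release pipeline.

```python
# Illustrative sketch: reversing an unsalted hash over a small identifier space.
# The medallion format, hash choice, and sample value are assumptions.
import hashlib

def build_lookup():
    """Hash every plausible medallion in one common format: digit, letter, two digits."""
    lookup = {}
    for digit in "123456789":
        for letter in "ABCDEFGHJKLMNPRSTUVWXYZ":
            for number in range(10, 100):
                medallion = f"{digit}{letter}{number}"
                lookup[hashlib.md5(medallion.encode()).hexdigest()] = medallion
    return lookup

LOOKUP = build_lookup()  # only tens of thousands of entries; built in well under a second

def reidentify(hashed_id):
    """Map an 'anonymized' hash straight back to the original identifier, if it matches."""
    return LOOKUP.get(hashed_id)

# A "de-identified" value from a public release maps straight back:
sample = hashlib.md5(b"5H77").hexdigest()
print(reidentify(sample))  # -> 5H77
```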

As a result, de-identification has become a tremendous point of contention in privacy debates. Not only do de-identification techniques fail, but “anonymous” information can also present real risks to individuals and communities. While companies reasonably want an escape valve from having to give the same level of protection to all information at all times, industry can be quick to claim information is sufficiently de-identified or even anonymous when it is not.

There are two possible definitions of de-identified data in SB 5376, and both create broad loopholes for a general-purpose privacy law. First is data that “cannot be linked . . . without additional information kept separately.” Unfortunately, there are already a wide variety of public databases filled with personal information. If the additional information that serves as the key to re-identifying real individuals behind the data is public, or trivial to obtain, the fact that it is held separately provides no meaningful privacy protection.
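The weakness of this first definition is easy to demonstrate with a hypothetical example. In the sketch below (all names, columns, and values are invented for illustration), no one ever needs the separately held re-identification key: an ordinary public record that shares a few quasi-identifiers with the “de-identified” release is enough.

```python
# Hypothetical illustration: re-linking "de-identified" records to named
# individuals via quasi-identifiers that also appear in a public dataset.
import pandas as pd

# "De-identified" release: direct identifiers removed, quasi-identifiers retained.
deidentified = pd.DataFrame({
    "zip": ["98101", "98052"],
    "birth_date": ["1984-03-02", "1990-07-19"],
    "sex": ["F", "M"],
    "diagnosis": ["asthma", "diabetes"],
})

# Publicly available records (e.g., a voter roll) with names and the same fields.
public_records = pd.DataFrame({
    "name": ["Alice Example", "Bob Sample"],
    "zip": ["98101", "98052"],
    "birth_date": ["1984-03-02", "1990-07-19"],
    "sex": ["F", "M"],
})

# No secret key required: a simple join re-identifies every row.
reidentified = deidentified.merge(public_records, on=["zip", "birth_date", "sex"])
print(reidentified[["name", "diagnosis"]])
```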

Second, de-identified data is data that (1) “has been modified to a degree that the risk of reidentification is small,” (2) is subject to contractual requirements not to re-identify, and (3) has “one or more” legal, administrative, technical, or contractual controls in place. This definition raises three critical questions: who determines when a re-identification risk is small; what qualifies as “small,” and who sets that standard; and what controls should suffice to ensure data is sufficiently de-identified?

In order for SB 5376 to meet its stated intentions to “keep pace as technology and business practices evolve to protect businesses and consumers,” there must be a higher standard with respect to de-identification – or de-identification should not be used as a safe harbor at all. To the extent the law would consider the use of de-identification as part of a more holistic risk assessment of data use, a higher standard is still warranted to clearly signify to covered entities that they must take into consideration the nature of the data, the evolving state of the art, and advances in re-identification. At a minimum, CDT recommends that privacy legislation require companies to disclose their methods of de-identifying personal information.

2. Relying on Secret Risk Assessments to Determine Data Use

SB 5376 is largely silent about appropriate uses of data and instead directs companies to conduct a risk assessment and balance their own interests against the interests of their users and the public at large. While these risk assessments must be made available to the state Attorney General, they are exempted from public records laws and therefore will likely never become public. Relying on risk assessments alone to protect privacy doubles down on the current model: entrust businesses to collect, use, and share data, catch some of them in unfair practices after something goes wrong, and assume companies will be accountable for their activities.

Risk assessments are flexible to a fault. Currently, Section 8 requires an assessment just once per year, and companies are given no categorization of risks to consider. Much more guidance is needed. The EU General Data Protection Regulation, which inspired this requirement, limits “data protection impact assessments” to certain activities involving sensitive categories of data, automated processing, and systematic monitoring of individuals. SB 5376 does, however, provide some welcome clarification as to what types of profiling and automated processing are problematic because of their “legal or similarly significant effect” on individuals. It cautions companies to place special safeguards around data uses that include the “denial of consequential services or support, such as financial and lending services, housing, insurance, education enrollment, criminal justice, employment opportunities, and health care services.”

That is progress, but to advance meaningful privacy protections, SB 5376 must include additional, clearer direction about what constitutes an appropriate use of data. The bill sponsors have already made such decisions with respect to facial recognition, and the legislature should consider what other types of data or data uses need affirmative protections rather than simply warranting extra consideration in a multifactor balancing test. For example, the bill acknowledges that some types of personal data are inherently “sensitive,” such as genetic data, biometric data, and data concerning health or sex life, among others, but this designation is only relevant in the context of a business’s own risk assessment. CDT recommends that lawmakers simply prohibit such data from being collected, used, or shared if it is not necessary to provide a product or service that an individual wants.

3. Clarifying Whether Individuals Can Obtain the Identities of Third Parties With Access to Personal Data  

Currently, most companies provide only limited information about how data is shared, referring to relationships with “affiliates,” “business partners” and “marketing partners,” and even law enforcement. The rights provided by SB 5376 to access, correct, and otherwise control personal information cannot be exercised meaningfully without requiring companies to release the actual names of businesses to which any personal information is sold or licensed. While the law’s general transparency provision speaks of providing “categories of third parties” with whom information is shared in a public-facing notice, Section 6(8) also requires businesses to “inform the consumer about such third-party recipients, if any, if the consumer requests such information.” The bill should be further clarified to ensure that individuals can obtain the names of all entities with whom their information is shared outside of a tightly-controlled service provider relationship.

4. The Washington Attorney General Must Be Provided with the Resources and Authority to Give SB 5376 Teeth

The exclusive enforcement mechanism for SB 5376 is via the Washington Attorney General. Certainly, state Attorneys General are eager to see more privacy enforcement authorities, but the practical reality is that a broad-based privacy law that lacks a private cause of action and is designed to be enforced only by state regulators requires the allocation of significant additional resources and personnel.

In California, for example, the Attorney General’s office has sought additional funding and personnel to enforce the CCPA. A separate legislative proposal allocated $700,000 and five staff positions to the California Attorney General’s office to aid in the development of the required regulations. Comparable federal proposals would authorize the FTC to hire 175 new lawyers and technical experts to assist in enforcement. As states enact privacy laws that govern everything from mobile apps to machine learning, lawmakers must consider whether their Attorney General has the existing resources and expertise necessary to enforce the law. In order for the law to be effective, SB 5376 should provide additional support for the state Attorney General.

Further, Section 12(3) provides companies with a right to cure violations of certain requirements in the bill before being subject to any potential fine. Companies should not be given free passes for violating the law. Contrary to arguments from industry, SB 5376’s overarching instructions to provide mandatory transparency and undertake risk assessments are not the types of legal obligations that are open to interpretation, and companies should be expected to comply with them from the outset.

5. Both Commercial and Law Enforcement Use of Facial Recognition Demands Additional Restrictions

Heeding calls from numerous stakeholders, SB 5376 attempts to establish rules for the use of facial recognition technologies (FRTs) by both commercial actors and government officials. However, in order to be effective, the bill should do more. Specifically, it needs to take a direct position on the issue of consent. While the bill proposes a general requirement that individuals’ consent be obtained prior to the use of FRTs, this is undermined by embracing implied consent through a store’s mere placement of signage suggesting the use of FRTs. As CDT has explained previously, it is unclear what meaningful or “conspicuous” notice of FRTs can or will be provided by businesses. Merely walking into a store cannot be said to imply an individual’s consent to have their face tracked, analyzed, characterized, or verified against some database of images. If a retailer wants to enroll an individual’s face into a facial recognition system, it should ask.

The bill also fails to cover some newer uses of the technology. One issue with Washington State’s existing biometric privacy law is that it specifically exempted any information derived from photographs or video, including face geometry, an exemption that arguably sweeps in most applications of FRTs. SB 5376 resolves this by creating rules for facial recognition, but the bill does not apply to all types of face tracking technologies. (Facial analysis, though also recently under scrutiny, does not appear to be captured by the bill’s definition, which is limited to technology that “analyzes facial features and is used for the unique personal identification.”)

On the positive side, Section 14 of the bill requires companies that use facial recognition to employ “meaningful human review” before using FRTs to make decisions that have legal or other significant effects on individuals. Meanwhile, companies that offer FRTs must provide clear documentation of any limitations, contractually prohibit customers from using their products for unlawful discrimination, and provide an API or other mechanism for independent researchers to test FRTs for accuracy and problematic bias. These are useful controls and should be expected of companies that provide or deploy FRTs.
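As a rough illustration of what independent testing through such an API might involve, the sketch below computes a per-group false-match rate against a hypothetical vendor verification endpoint. The `client.verify` call, the test-pair format, and the group labels are assumptions; the bill does not prescribe any particular interface.

```python
# Hypothetical sketch of independent bias testing against a vendor FRT API.
from collections import defaultdict

def false_match_rates(client, test_pairs):
    """Compute the false-match rate per demographic group.

    test_pairs: iterable of (image_a, image_b, same_person: bool, group: str).
    client.verify(a, b) is a hypothetical vendor call returning True for a claimed match.
    """
    impostor_attempts = defaultdict(int)
    false_matches = defaultdict(int)
    for image_a, image_b, same_person, group in test_pairs:
        predicted_match = client.verify(image_a, image_b)
        if not same_person:  # only impostor pairs can produce false matches
            impostor_attempts[group] += 1
            if predicted_match:
                false_matches[group] += 1
    return {group: false_matches[group] / impostor_attempts[group]
            for group in impostor_attempts}

# Large gaps between groups' false-match rates are the kind of problematic bias
# that independent researchers would be expected to surface and document.
```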

Finally, regarding government use of these technologies, the bill starts a needed conversation on the proper use of FRTs by law enforcement. In a very positive step, Section 15 generally prohibits the use of FRTs to engage in “ongoing surveillance” of individuals in public spaces. To conduct such surveillance, law enforcement must obtain a court order, or there must be an imminent danger or risk of death or serious physical injury. These are important limitations on the ability of government agencies to indiscriminately use FRTs in day-to-day life, but a probable cause standard would be most appropriate. This constitutional standard means that a search should happen only when it is more likely than not to result in evidence of a crime.

These major issues aside, SB 5376 is ultimately an important addition to the ongoing U.S. privacy debate. Like the CCPA before it, Washington’s proposal is a crucial step towards recognizing that privacy is a multi-layered right that necessitates real protections for individuals. CDT hopes the Washington legislature further strengthens the bill before it – but in any event, states across the country now have another option to pursue broad-based data processing protections for their citizens.