What’s the Harm? CDT Comments to FTC Highlight Informational Injury Considerations

On Friday, CDT submitted comments to the Federal Trade Commission in advance of its December 2017 workshop exploring the contours of informational injury. Privacy violations are often highly contextual, making the resulting injuries difficult for individuals to evaluate and for regulators like the FTC to quantify. Despite this practical challenge, the Commission can harness its existing tools to protect individuals from privacy harm; in our comments, we argue that the FTC should aggressively use its Section 5 unfairness powers to police business practices that create informational injury.

While the Commission has historically relied on its deception authority to go after misstatements in company privacy policies, CDT believes that unfairness authority may be better equipped to address structurally problematic privacy practices and informational injuries writ large. For example, the Commission has cited unfairness authority as appropriate for policing conduct that involves vulnerable consumers, opaque third-party business practices, or information asymmetries. Under the FTC Act, an unfair privacy practice must (1) cause substantial injury to consumers that (2) consumers cannot reasonably avoid and that (3) is not offset by countervailing benefits to consumers. The Act also provides that larger public policy considerations, including state laws and self-regulatory guidance, may play a role in this analysis. CDT has previously argued that unfairness authority may be the best vehicle, absent a baseline privacy regime, for fully enforcing the Fair Information Practice Principles in a fashion that truly protects individual interests.

Our comments reiterate this position and highlight two important points. First, a user’s ability to control their personal information must inform the Commission’s analysis of the avoidability of a data practice. The Commission assumes individuals are generally in a position to “survey the available alternatives, choose those that are most desirable, and avoid those that are inadequate or unsatisfactory.” This assumption is especially ill-suited to today’s digital ecosystem. To illustrate this, we highlight the difficulties faced by an expectant mother trying either to hide or to control information about her pregnancy from marketers and miscellaneous third parties.

CDT’s comments also advise the Commission to explore the utility of including data access and portability rights in its analysis, akin to provisions in the EU’s General Data Protection Regulation. As a result of the GDPR, commercial entities will be offering European Union residents more control over their data, and we argue that companies should grant Americans the same controls.

While expanded individual rights to access and control information serve as an essential counterbalance to the risk of privacy harms, information asymmetries in the digital ecosystem fundamentally limit an individual’s ability to make truly informed decisions about their privacy and security. A pregnant woman, for example, who is told to “carefully consider the privacy and security tradeoffs” before deciding to use an app that tracks fertility or pregnancy stages, might not be given enough information to adequately evaluate the privacy or security risks of an app sharing her data with third parties. It’s not clear how anyone, expectant mother or otherwise, could gauge the privacy and security risks that could emerge over time.

If the FTC continues to hold its long-standing position that consumers should have options that comport with their privacy preferences, it must first acknowledge that individual privacy preferences are shaped by numerous factors beyond knowledge about privacy protections and business practices. An individual’s decisions regarding their data might take into account their risk of identity theft, stalking, or an error-ridden credit report or consumer profile. An individual’s race, gender, and socioeconomic class, as well as attitudes toward government and law enforcement, also play a role in the decisions they make about personal data sharing. The FTC must modernize its enforcement guidelines and investigatory efforts to reflect the realities of today’s digital ecosystem and provide the necessary context for understanding informational injury.