NYC May Be at the Vanguard of Algorithmic Accountability in 2018

The New York City Council has taken a proactive step by enacting a bill establishing a task force to explore fairness, accountability, and transparency in automated decision-making systems operated by the city. This is a big deal. The use of these technologies by city governments has real impacts on citizens. Today, in New York City, algorithms have been used to assign children to public schools, evaluate teachers, target buildings for fire inspections, and make policing decisions. However, public insight into how these systems work and how these decisions are being reached is inadequate.

Read More

What’s the Harm? CDT Comments to FTC Highlight Informational Injury Considerations

On Friday, CDT submitted comments to the Federal Trade Commission in advance of its December 2017 workshop exploring the contours of informational injury. Privacy violations are often highly contextual, making injury resulting from them difficult for individuals to evaluate and regulators like the FTC to quantify. Despite this practical challenge, the Commission can harness its existing tools to protect individuals from privacy harm; in our comments, we argue that the FTC should aggressively use its Section 5 unfairness powers to police business practices that create informational injury.

Read More

Can Data Collaboratives Improve Nonprofit Data Governance?

While nonprofits are just as susceptible to the data risks, threats, and pitfalls that for-profit companies routinely trip over, it can be easy to view dollars spent on privacy and security as money diverted from other important areas. Building on a report by GovLab, here are several of CDT's recommendations for how "data collaboratives" can help nonprofits improve their privacy and security practices.

Read More

Financial Dashboards: Enhancing User Control Outside a Traditional “Privacy Dashboard”

Privacy dashboards are often put forward as a potential solution to the vexing problem of offering individuals control over their personal information. Industry actors have been iterating on the concept for years, but regulators of all stripes are well-positioned to provide useful guidance and best practices to improve dashboards as a form of user control.

Read More

Ninth Circuit Issues Ruling on Spokeo: Inaccuracies Create Concrete Harms

Over seven years ago, CDT filed a complaint with the Federal Trade Commission against the people-search company Spokeo, alleging that the company and other data brokers were not protecting consumers as required by the Fair Credit Reporting Act (FCRA). A class action lawsuit filed against Spokeo in 2011, led by lead plaintiff Thomas Robins, raised a host of new issues about the nature of privacy harms, the actual protections provided by federal privacy laws, and the use of litigation as a vehicle for protecting consumers’ privacy. According to this week’s ruling by the Ninth Circuit, the accuracy of this type of information is “directly and substantially related” to the goals of the FCRA.

Read More

Fast-moving House Bills on Autonomous Vehicles May Undercut Privacy and Security Regulation

The House Energy and Commerce Committee is poised to introduce a package of fourteen bills that aim to spur deployment of autonomous vehicles on U.S. roadways. Legislative action is warranted, but several of the legislative proposals may have unintended consequences as policymakers grapple with the privacy and security issues posed by data-fueled autonomy.

Read More

Uber’s Fingerprinting Foibles and the Costs of Not Complying with Industry Self-Regulation

No stranger to privacy kerfuffles, Uber is once again in the news for its business practices and invasive use of technology. This time, the headlines focus on Uber's intentional circumvention of Apple's developer rules, which prohibit apps from collecting certain technical identifiers from iPhones. The larger challenge this raises is determining whether Uber's violation of Apple's developer terms could or should draw regulatory ire. Sanctions should be tailored to fit the offense, but when it comes to privacy and security mishaps with technology, consumers and their advocates are left in the dark.

Read More