CDT appreciates this opportunity to comment on the Department of Commerce's draft Report on enhancing resilience against botnets and other automated, distributed threats. Because there is an explicit tension between allowing companies to take voluntary but automated action against devices and accounts, and permitting consumers to control their digital footprint, we propose that the National Institute of Standards and Technology (NIST) convene a dedicated process for discussing the implications for privacy and freedom of expression.
This paper explains the capabilities and limitations of tools for analyzing the text of social media posts and other online content, focusing specifically on natural language processing (NLP) tools. It is intended to help policymakers understand and evaluate these tools and the potential consequences of using them.
The City of New York has an obligation to understand, scrutinize, and explain how its algorithms make decisions affecting New Yorkers. At minimum, the city should ensure and demonstrate to the public that NYC's algorithmic decision-making tools (1) are aligned with the city's policy goals and the public interest; (2) work as intended; (3) do not use data to marginalize minority or vulnerable populations or exacerbate inequality; and (4) provide meaningful transparency to New Yorkers so that they can appeal and seek remedy for automated decisions that are incorrect, unjust, or contrary to law.