

I* Newsletter: Content moderation through internet infrastructure, private Web advertising

“I*: Navigating Internet Governance and Standards” was a monthly newsletter distributed by the Center for Democracy & Technology (CDT) and compiled by the Public Interest Technology Group (PITG), a group of expert technologists who work across a complex landscape of internet standards development (“I*”) organizations that convene in the public interest.

The newsletter highlighted emerging internet infrastructure issues that affect privacy, free expression, and more, clearly explaining their technical underpinnings.

# What does EU law (already) say about content moderation in end-to-end encrypted systems?: “In July 2021, the European Parliament and EU Council agreed upon temporary rules to allow webmail and messenger services to scan everyone’s private online communications. In 2022, the European Commission will propose a long-term version of these rules. In the first installment of this EDRi blog series on online ‘CSAM’ detection, we explore the history of the file, and why it is relevant for everyone’s digital rights.” “EDRi’s goal… therefore, is to make sure that any proposals to detect online CSAM are in line with the EU’s fundamental rights obligations, in particular that measures are lawful, targeted, as well as clearly and objectively proportionate to their stated goal.”

# More private advertising for the Web: Online advertising has been an important and ubiquitous method of monetization supporting publishers and other online services. But surveillance of users, for behavioral targeting or other ad-related purposes, has also become a widespread harm to user privacy online.

Standard-setting venues are seeing more proposals for technology that would support advertising use cases while mitigating the privacy costs. Some techniques address attribution — so advertisers can measure the effect of a campaign without tracking which individual customer saw a particular ad. Other proposals address targeting, for example, letting users explicitly choose the topics of ads they do or don’t like to see, rather than inferring those interests by surveilling browsing history.
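To make the attribution idea a bit more concrete, here is a minimal, purely illustrative sketch (not tied to any specific proposal’s API; all names here are invented for illustration) in which only noisy, campaign-level conversion totals are ever reported, so no individual user’s ad view or purchase is exposed:

```typescript
// Illustrative sketch of aggregate attribution: conversions are counted per
// campaign, and only a noise-protected total is ever reported. This mirrors
// the general idea behind private measurement proposals, not any actual API.

const conversionCounts = new Map<string, number>();

// Called locally when a conversion is attributed to a campaign.
function recordConversion(campaignId: string): void {
  conversionCounts.set(campaignId, (conversionCounts.get(campaignId) ?? 0) + 1);
}

// Laplace noise (inverse-CDF sampling) for differential-privacy-style
// protection of a count with sensitivity 1.
function laplaceNoise(scale: number): number {
  const u = Math.random() - 0.5;
  return -scale * Math.sign(u) * Math.log(1 - 2 * Math.abs(u));
}

// Report only the noisy aggregate per campaign, never per-user events.
function reportAggregates(epsilon = 1.0): Array<{ campaignId: string; noisyCount: number }> {
  return [...conversionCounts.entries()].map(([campaignId, count]) => ({
    campaignId,
    noisyCount: Math.max(0, Math.round(count + laplaceNoise(1 / epsilon))),
  }));
}
```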

These proposals add to others we’ve seen as part of Google’s Privacy Sandbox initiative, which includes a just-introduced Topics API. Topics would be similar to the previously tested FLoC, with in-browser analysis and classification of a user’s browsing history, but here Chrome proposes to infer a few user-understandable and reviewable ad interest categories (e.g. “Pickup Trucks”, “Nail Care Products”, “Vegan Cuisine”, or “Adoption”) to be shared with advertisers. Topics, FLoC, the status quo in various browsers, and other alternatives differ in both their privacy properties and their likely market impact, and those differences are worth comparing carefully.
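As a rough illustration of what this would look like to a web page, the sketch below assumes the document.browsingTopics() surface described in the Topics explainer; the exact method name, return shape, and availability are still subject to change, and the surrounding handling code is hypothetical.

```typescript
// Hypothetical sketch based on the shape proposed in the Topics API explainer;
// method name, return type, and availability may change as the proposal evolves.
async function fetchAdTopics(): Promise<void> {
  const doc = document as any;

  // Feature-detect: only browsers implementing the proposal expose this call.
  if (typeof doc.browsingTopics !== "function") {
    console.log("Topics API not available; a caller would fall back to contextual ads.");
    return;
  }

  // The browser returns a small set of coarse interest categories inferred
  // locally from recent browsing history, rather than the history itself.
  const topics = await doc.browsingTopics();
  console.log("Topics shared with this caller:", topics);

  // A caller would forward these coarse topics to its ad server in place of
  // a cross-site identifier or raw browsing data.
}
```

The key contrast with FLoC, as described above, is that the shared values would come from a small, human-readable set of interest categories that users can review, rather than an opaque cohort identifier.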

The new Private Advertising Technology Community Group (PATCG) at W3C is organizing its first multi-day meeting to dive into these topics; it’s one of several venues where these discussions are taking place. Our Public Interest Technology Group (PITG) is organizing to consider the public interest impacts of these myriad proposals and the most effective ways for civil society to engage on these web advertising standards.

# One year after the storming of the U.S. Capitol, what have we learned about content moderation through internet infrastructure?: 2021 was a landmark year for the intersection of content moderation and internet infrastructure. Just two weeks into the year, Amazon, Google, and Apple all made headlines for cutting the social media platform Parler off from cloud and mobile app store hosting services in the wake of the January 6 attack on the U.S. Capitol.

One year after the storming of the U.S. Capitol, and Parler’s subsequent struggle to remain online, similar events continue to highlight the power infrastructure actors have to both enable and disrupt users’ access to the internet as a whole, and to specific websites and mobile content. These events call for a retrospective on what we learned in 2021 about the challenges of internet infrastructure providers acting as content moderators and online power brokers.

In this piece for Tech Policy Press, Corinne Cath and Jenna Ruddock outline the role of internet infrastructure providers in content moderation, the associated accountability deficits, and what concrete measures should be taken to ensure that content moderation decisions at the level of internet infrastructure — rather than at the app level — follow a transparent, human rights-informed framework.

# Everything you always wanted to know about the Russian internet but were afraid to ask: No longer simply a federal law, Russia’s “Sovereign Internet” regime now includes instructions for engineers on how to configure a sovereign domain name system and how to report number resources and banned traffic to federal regulators. A video presentation from researcher Alexander Isavnin provides an update on Russian internet regulatory practices.

# New leadership announced at the IETF: Recently, members of the incoming leadership of the Internet Engineering Task Force (IETF) were officially announced, including two new additions from the Public Interest Technology Group: Mallory Knodel will join the Internet Architecture Board (IAB), and Paul Wouters will join the Internet Engineering Steering Group (IESG) as a security area director. Their terms will start in March 2022.

Originally created by the Defense Advanced Research Projects Agency (DARPA), the IAB “provides long-range technical direction for Internet development, ensuring the Internet continues to grow and evolve as a platform for global communication and innovation.” It has produced documents foundational to the IETF’s policy stances on wiretapping and privacy considerations, the threat model for confidentiality in the face of pervasive surveillance (“Confidentiality in the Face of Pervasive Surveillance: A Threat Model and Problem Statement”), and, most recently, RFC 8890: The Internet is for End Users.

Alissa Cooper, one of Mallory Knodel’s predecessors at CDT, became the first member of the IAB to represent a civil society organization in 2011.

# Side note: This side note comes from Joe McNamee, who noticed a phrase in a European Parliament draft report on artificial intelligence: “AI… can be thought of as the fifth element after air, earth, water and fire.” He, and many others, have rightfully memed it unto its demise — it’s absent from the current text.