
Facebook Launches New Restrictions on Advertisers

Facebook has just revealed new restrictions on advertisers’ ability to personalize content based on “Ethnic Affinity,” following up on promises to address concerns raised by advocates and government officials last year. Those concerns arose in response to reports that the platform allowed advertisers to target people based on their “Ethnic Affinity,” a trait inferred by Facebook, even for ads related to credit, employment, and housing. (Facebook refers to “ethnic affinity” targeting as “multicultural advertising segments” in their latest post.) Specifically, journalists at ProPublica demonstrated that this targeting extended to housing-related ads, prompting the Congressional Black Caucus and the Federal Housing Administration to raise concerns that this type of targeting may violate civil rights laws. Today’s announcement includes screenshots demonstrating the steps the social network has taken to remedy the issue. Facebook’s blog post describes welcome changes, many of which reflect suggestions made by CDT last fall, including plans to leverage technical tools like machine learning.

CDT Recommendation: Alert advertisers to their legal obligations.

Facebook is rolling out policy changes and education initiatives addressing legal and ethical objections to targeting ads related to housing, employment, or credit in discriminatory ways. Last fall, CDT recommended that Facebook inform advertisers of their obligations to comply with federal law, including prohibitions on discrimination against protected classes. Facebook’s updated Advertising Policies include a broad list of prohibited categories and a warning that “ads must not discriminate, or encourage discrimination.” Their education efforts reference resources from several government agencies and civil society organizations, including the American Civil Liberties Union and the U.S. Equal Employment Opportunity Commission. Although Facebook is clear that none of this guidance replaces legal counsel, these alerts raise awareness of legal boundaries and provide information to advertisers looking to learn more about why their ads are restricted.

CDT Recommendation: Ask advertisers to identify advertisements for housing, jobs, or credit products.

Facebook also described plans to use technical tools to enforce its standards. While we had originally proposed asking advertisers to self-identify ads in these categories, Facebook plans to “test new technology that leverages machine learning to help [Facebook] identify ads that offer housing, employment, or credit opportunities.” While slightly different from what CDT proposed, this achieves the same effect and does so in a way that is easier to scale. A self-certification process raises awareness among advertisers; automatically recognizing this content permits Facebook to prompt the advertiser with a warning addressing their obligations, and to prevent them from using “multicultural advertising segments” to target content that provides information about products or opportunities linked to economic well-being. The insight gained from this machine-learning analysis could be leveraged on Facebook’s other properties to identify content that poses discrimination concerns before that content reaches the public.

Ads can be discriminatory because of their content, because of the way they are targeted, or a combination of both. Not all ads targeted to specific communities are discriminatory, and not all broadly targeted ads avoid troubling stereotypes. A methodology that identifies the key characteristics of problematic ads can help unravel this problem by providing examples of how, specifically, the content excludes or alienates people. And advertisers who take advantage of the offer to “Request Manual Review” of their content will improve the dataset by reducing false positives. This work has the potential to be broadly useful to other institutions looking to address similar issues.
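To make this concrete, here is a minimal, hypothetical sketch of how a classify-then-enforce flow like the one described above could work, written in Python with scikit-learn. The toy training examples, the “review_ad” helper, and the “multicultural_affinity” targeting field are our own illustrative assumptions, not Facebook’s actual system or data.

```python
# Hypothetical sketch (not Facebook's system): classify ad copy as offering
# housing, employment, or credit, then warn the advertiser and block
# "multicultural affinity" targeting for those categories.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labeled examples; a real system would train on far more data.
ads = [
    ("Spacious two-bedroom apartment available for rent downtown", "housing"),
    ("Newly renovated condos for lease, utilities included", "housing"),
    ("We're hiring: apply today for a warehouse associate position", "employment"),
    ("Join our sales team, full-time roles with benefits", "employment"),
    ("Low-interest personal loans with fast approval", "credit"),
    ("Apply for our rewards credit card and earn cash back", "credit"),
    ("Try our new seasonal latte flavors this weekend", "other"),
    ("Stream the latest season of your favorite show tonight", "other"),
]
texts, labels = zip(*ads)

# Simple text classifier: TF-IDF features + logistic regression.
classifier = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
classifier.fit(texts, labels)

RESTRICTED = {"housing", "employment", "credit"}

def review_ad(ad_text: str, targeting: dict) -> dict:
    """Classify the ad and enforce the targeting restriction."""
    category = classifier.predict([ad_text])[0]
    decision = {"category": category, "allowed": True, "warnings": []}
    if category in RESTRICTED:
        decision["warnings"].append(
            "This ad appears to offer housing, employment, or credit: "
            "ads must not discriminate, or encourage discrimination."
        )
        if targeting.get("multicultural_affinity"):
            # Block the restricted targeting rather than the ad itself.
            decision["allowed"] = False
            decision["warnings"].append(
                "Multicultural affinity segments cannot be used for this ad; "
                "request manual review if this is a false positive."
            )
    return decision

print(review_ad("Charming bungalow for rent near the park",
                {"multicultural_affinity": ["segment_a"]}))
```

In a real deployment, the classifier would be trained on far more data and paired with human review; the point of the sketch is simply that a flagged ad can trigger both an advertiser-facing warning and a targeting restriction before the ad runs.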

Attempting Fairness by Design

Facebook acknowledged several organizations (including CDT) for their contributions to the company’s efforts to address the concerns raised about this kind of targeting. The changes announced today demonstrate the value of dialogue between private industry and civil society and, although there are still improvements to be made, they show the promise and potential of pursuing fairness by design. Using sophisticated techniques to help identify and address problematic ads allows Facebook to quickly evaluate new ad campaigns for discriminatory effects and prevent users from seeing them, rather than waiting for the content to be flagged once it is in circulation. This may seem trivial or obvious, but technology is not magic. It has to be built, deployed, and maintained by engineers. Facebook should be commended for dedicating resources to this endeavor, and we encourage other companies to incorporate fairness into the design of their products.