Working with Airbnb to Use Data to Fight Discrimination

This week, Airbnb announced an initiative to use data to combat discrimination on the platform – an effort nearly two years in the making. At the Center for Democracy & Technology, we were pleased to partner with Asian Americans Advancing Justice – AAJC, Color Of Change, League of United Latin American Citizens – LULAC, National Action Network, The Leadership Conference, and Upturn to help the platform work through key anti-discrimination and privacy issues. At the heart of Airbnb’s efforts are two key initiatives. The first is engaging with affected communities to learn about problems on the platform, and the second is devoting real technical and legal resources to test-driving solutions, adopting those that work, and iterating on those that don’t.

While these two ideas seem straightforward, it is notable that Airbnb is deploying its engineers and lawyers to take a proactive approach to racial justice issues. Past changes Airbnb has made to its platform with the goal of reducing bias and discrimination include removing guest profile photos from the booking request process and increasing the use of Instant Book, which requires no back-and-forth between hosts and guests. But it is crucial to understand whether such changes are effective, since an idea that makes intuitive sense may not actually be the best approach. The initiative creates a feedback loop between Airbnb, advocacy groups, and directly affected communities, who can highlight issues and suggest changes as new problems arise on the platform. When those issues have few proven solutions, Airbnb can use its resources to examine how users are affecting each other, and use the results to make further improvements.

One of the key approaches Airbnb announced yesterday is that it will soon begin collecting perceived race data, the company’s best guess about how a host perceives a guest’s race. This data will not be linked to any individual account, will be aggregated, and will be used only as part of the company’s anti-discrimination efforts. Individual guests can also opt out. Aggregated analysis of perceived or actual race data is a particularly important approach for data-driven companies. Much of the way they make progress is by testing, and one of the necessary ways to protect privacy is to work with aggregated, anonymized, use-limited data. Without measuring the racial impact of their products in this privacy-protective context, their understanding of how well changes are working is limited.
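To make the idea of aggregated, use-limited measurement concrete, here is a minimal sketch in Python of what such an analysis could look like. This is purely illustrative and is not Airbnb’s actual methodology: the field names, the perception labels, and the minimum group size threshold are all assumptions.

```python
# Illustrative sketch only (not Airbnb's pipeline): aggregated analysis of
# booking outcomes grouped by perceived race. Groups below a minimum size
# are suppressed so no small, potentially identifying cohort is reported.
from collections import defaultdict

MIN_GROUP_SIZE = 100  # assumed reporting threshold for aggregation

def acceptance_rates(bookings):
    """Return the booking acceptance rate per perceived-race group,
    dropping any group smaller than MIN_GROUP_SIZE."""
    counts = defaultdict(lambda: {"requests": 0, "accepted": 0})
    for b in bookings:
        # "perceived_race" is a perception label, never tied back to a user ID
        group = b["perceived_race"]
        counts[group]["requests"] += 1
        if b["accepted"]:
            counts[group]["accepted"] += 1
    return {
        group: c["accepted"] / c["requests"]
        for group, c in counts.items()
        if c["requests"] >= MIN_GROUP_SIZE  # suppress small groups
    }

# Hypothetical example: a gap between groups flags a disparity to investigate.
sample = (
    [{"perceived_race": "A", "accepted": True}] * 150
    + [{"perceived_race": "A", "accepted": False}] * 50
    + [{"perceived_race": "B", "accepted": True}] * 90
    + [{"perceived_race": "B", "accepted": False}] * 60
)
print(acceptance_rates(sample))  # {'A': 0.75, 'B': 0.6}
```

The design choice worth noting is that only group-level rates ever leave the analysis, and the threshold keeps rare combinations from singling anyone out; a real deployment would pair this with strict use limitations like those Airbnb describes.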

At CDT, we’re pleased that Airbnb will be conducting this research and has published its methodology online. We hope this type of engagement will become a model that other companies can replicate as they begin to think about how to make progress on the significant challenges facing platforms.