The Ethics of Design: Unintended (But Foreseeable) Consequences

When data becomes divorced from its human origins, it loses context and disassociates companies from their actions, enabling decisions that defy expectations and ethics. From Fitbit and Jawbone to Uber and Netflix, tech companies have faced widespread backlash as a consequence of this disconnect. The fitness social network Strava is the latest example: the company publicly released a heat map of aggregate user locations that inadvertently revealed U.S. military bases and personnel around the world.

At the heart of what might be called the “unintended consequences” problem is that so many digital platforms were designed absent any user-centric values, such as privacy and security. Instead, they were built to accommodate growth and monetization, strategies that render human data disposable and interchangeable. Put simply, platforms are optimized for economic gain, which creates systems that prioritize maximum data collection while ignoring what might be in the best interest of users.

Consider location data. An activity tracker designed around user-centric values would treat highly sensitive information like geolocation as private by default, including any use of it in aggregate, because location is so easily associated with individuals. But Strava, like many apps and services, instead created an environment in which a user’s location is public by default, with a setting to turn off its transmission. Strava effectively obtained consent to invade the privacy of individual users, and of whole communities, seemingly without understanding the wide spectrum of risks that could result. Ignoring privacy risks isn’t unique to Strava; in this case, it simply had particularly visible results.
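
To make the contrast concrete, here is a minimal sketch of what privacy-by-default location settings might look like in code. The names and structure are hypothetical and do not reflect Strava’s actual API; the point is simply that both public sharing and inclusion in aggregate products start off and require an explicit opt-in.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class LocationPrivacySettings:
    """Hypothetical per-user settings: nothing is shared unless the user opts in."""
    share_location_publicly: bool = False   # public activity maps, leaderboards, etc.
    include_in_aggregates: bool = False     # heat maps and other "aggregate" products

def point_for_public_feed(settings: LocationPrivacySettings,
                          point: Tuple[float, float]) -> Optional[Tuple[float, float]]:
    """Release a location to public-facing features only on explicit opt-in."""
    return point if settings.share_location_publicly else None

def point_for_heatmap(settings: LocationPrivacySettings,
                      point: Tuple[float, float]) -> Optional[Tuple[float, float]]:
    """Aggregate uses are treated as sensitive too: they also require opt-in."""
    return point if settings.include_in_aggregates else None
```

Under defaults like these, a heat map would simply receive no points from users who never opted in, rather than relying on each user to discover and disable a public-by-default setting.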

Ethical products and services cannot rely on “checking a box” for the use of customer data because such blanket consent ignores difficult questions about user expectations and the unique risks that data exposure might create for one individual or group over another. Today’s notice-and-consent mechanisms have become compliance tools for avoiding liability. Where Strava’s heat map is highly visible, the app’s underlying data flows are opaque to users. Opacity is a common feature of design that is detached from user-centric values. It is by design that it is nearly impossible for most people to know what happens to their information in digital systems, no doubt because many people express intense discomfort when they learn how and why companies use their data. It is telling that even the U.S. military, an organization for which the privacy of information can be a matter of life or death, was seemingly blindsided by the implications of Strava’s heat map. Opacity allows company practices to drift out of alignment with user expectations, as well as societal and community norms.

Companies often defend their data practices by claiming that much of their information is anonymized or compiled in aggregate; Strava, for example, gives itself broad leeway to do whatever it wants with so-called “aggregated and de-identified information.” The problem with this defense is that the sheer volume of data in commercial hands has made these obfuscation methods less and less effective. This is especially true of geolocation information, which, as the Strava example shows, provides a goldmine of valuable, sensitive, and intimate inferences about human activity. Neither consent nor aggregation should be a blank check for the unconstrained use of customer data.
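
One way to see why aggregation alone is a weak safeguard for location data is to look at how a heat map is typically built: points are bucketed into grid cells and counted, and a “hot” cell in a remote area with only a handful of contributors is aggregate in name only. The sketch below uses hypothetical names and an illustrative threshold, not Strava’s actual pipeline; it suppresses any cell with fewer than a minimum number of distinct users, a k-anonymity-style floor.

```python
from typing import Dict, Iterable, Set, Tuple

GridCell = Tuple[int, int]

def to_cell(lat: float, lon: float, cell_deg: float = 0.01) -> GridCell:
    """Bucket a GPS point into a coarse grid cell (roughly 1 km at this resolution)."""
    return (int(lat // cell_deg), int(lon // cell_deg))

def users_per_cell(points: Iterable[Tuple[str, float, float]]) -> Dict[GridCell, Set[str]]:
    """Map each grid cell to the set of distinct users who recorded activity there."""
    cells: Dict[GridCell, Set[str]] = {}
    for user_id, lat, lon in points:
        cells.setdefault(to_cell(lat, lon), set()).add(user_id)
    return cells

def publishable_counts(cells: Dict[GridCell, Set[str]], k: int = 20) -> Dict[GridCell, int]:
    """Release a cell's count only if at least k distinct users contributed to it.

    Cells below the threshold are suppressed: a bright cluster in an otherwise
    empty region with two or three contributors identifies those people in all
    but name.
    """
    return {cell: len(users) for cell, users in cells.items() if len(users) >= k}
```

Without some floor of this kind, the brightest pixels in sparsely populated regions are exactly the ones that reveal individual routines, which is the pattern the Strava heat map exposed around military bases.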

When platforms collect and use customer information absent a foundation of ethics and user-centric values, they often develop products that run counter to their users’ interests and that will invariably produce backlash for the company. The data ecosystem has given commercial entities unprecedented power over individuals, whether it’s through exploiting our neural vulnerabilities to grab and hold our attention, filtering the content we see based on our past actions, or limiting our ability to make truly autonomous choices about uses of our data. Companies should balance this power dynamic by creating digital products that prioritize human rights, like privacy, over profit margins, and that make stewardship, not opacity or obfuscation, the most prominent design feature in their products.