The Big Questions About Privacy That Need Answers
Written by Nuala O’Connor
I had a teacher who once said, “When the stuff is hitting the fan, there are three questions to ask: What’s important? What’s missing? And what’s next?” Members of Congress will have their day with Mark Zuckerberg this week, but I’m more interested in unpacking these three questions – and moving towards their answers.
First, what’s important? It’s understandable that people want to focus on the impact on the election, on who saw what political ad, how Facebook is responding, and what Sheryl Sandberg and Mark Zuckerberg have said. While all of these things are relevant, what’s more important are the lessons of the current debacle around Facebook and Cambridge Analytica. This is about our data – and the decisions made about us based on that data – decisions that ultimately affect our agency and autonomy both on- and offline.
We are seeing unexpected data collection and sharing. The answers we give to an app that asks for our favorite color or dog breed are being analyzed and shared with researchers and others, without our full comprehension and knowing assent. Such data use is now so targeted that little things about us can be used to determine big things, including our very place in the world. Even more surprising is that our friends’ data is being collected, shared, and analyzed as well, extending the ramifications for our online actions beyond ourselves – and into the lives of our friends and loved ones.
Second, this week shows us what’s been missing: oversight, boundaries, appropriate scrutiny, and any real understanding and action by government regulators. Even more, these past weeks have shown us what’s specifically missing in industry: meaningful internal controls of the kind that exist in other sectors to address issues from money laundering to supply chain management and environmental compliance. Finally, what is missing is restraint – the willingness to forgo some profit in service of the larger enterprise, including customer trust and the greater good.
And I have a new, additional question: what’s different? The Facebook-Cambridge Analytica situation is substantially different from the many other data-use concerns and data breaches we’ve seen. It doesn’t involve Social Security numbers, credit card numbers, or other sensitive financial data that would trigger the breach-disclosure laws that now exist in all 50 states. (The last of those state laws passed only in recent weeks.) And yet the drumbeat of concern continues, showing that people are disturbed by the insidious sharing of seemingly trivial or even mundane data – and newly aware that such data can have a meaningful effect.
So, what’s next? This is where I have some hope. I sometimes hear from policymakers and people inside companies that the rules seem to have changed, or that public sentiment has shifted rapidly, and I don’t think that’s true. I think that ordinary end users – otherwise known as customers or citizens – are realizing the growing importance of these tools and platforms in their daily lives, and thus their increasing intrusion into them. People do care about “privacy,” although how they define it and how they act on it do not fit a one-size-fits-all framework. One thing people do seem to agree on is that there should be a certain amount of freedom and independence in their daily lives, enabled by reasonable boundaries of privacy.
What’s next is a national conversation on whatever we’re going to call it: digital dignity, data autonomy, or privacy. With a whole new set of citizens who’ve grown up with these technologies and understand the power and pitfalls of life online, we’re going to get this right eventually. But first we need to move beyond the frameworks of the past, even as we are mindful of them: the OECD guidelines, the Fair Information Practice Principles (FIPPs), the endless work of the Federal Trade Commission, and the Consumer Privacy Bill of Rights.
From these, I see three themes that companies and Congress need to get right, no matter what words we use to describe the concepts:
- Agency. People want to know that their decisions, self-depictions, and communications are their own. We are wrestling with the sense that computers are making decisions for us – about what we see, in terms of both content and advertising, especially political advertising. Human agency and autonomy are fundamental.
- Control. Individuals want to be able to exercise control over what is collected, with whom it is shared, and how it is used to make decisions about them. They want to control their alphanumeric data (e.g., their name and phone number) and also their visual images (e.g., photographs). They see these images as extensions of self.
- Transparency. Another element missing from these recent events is transparent communication about the norms and practices that affect individual customers, citizens, and users. Without real transparency into company or institutional policy and practice, individuals are left with the sense that something is suspicious or wrong. And the act of creating transparent systems has a clarifying effect on organizations, forcing them to consider the impact of certain data collections or behaviors. Longer privacy policies are not what I’m talking about here. Instead, we need systems designed to offer meaningful communication in many forms and formats.
There’s a lot missing in the dialogue between individuals and companies in the digital age. We are at a unique inflection point, where individuals are calling for more transparency and, frankly, accountability. What form that dialogue takes – and what outcomes it produces – is up to all of us.