

Think Differentially: Apple’s Forward-Thinking Approach to Privacy in iOS 10

In the technology world, good design is driven by data, and companies often have to think critically about trade-offs between usability and privacy. They sometimes need to review very personal information about users, such as the places they go, the websites they visit, and the search queries they make, in order to iterate on and improve their products effectively. For example, knowing the words and phrases typed by users is crucial to enhancing an autocorrect feature. But there are inherent privacy concerns in tracking things people type – passwords, anyone? This presents a conundrum: to improve its products and services, a company needs to understand how those products are being used; but to get more data, it must ask users to accept privacy compromises.

Apple has been consistently innovative in its product design and in its approach to privacy. On the heels of CEO Tim Cook announcing that the company considers privacy a human right, Apple announced that it will require HTTPS for iOS apps by the end of 2016, that it will build encryption into its new filesystem, and that it will deploy a technique known as differential privacy in iOS.

While differential privacy is a hot topic in academia, it hasn’t been used much in private industry; Apple’s move makes it only the second major deployment of the technique. (Google built differential privacy into Chrome’s analytics platform in 2014.) Differential privacy is a mathematical technique that can be used to deal with what is called “the property of intersection” – information such as age, gender, and ZIP code may be entirely innocuous when examined separately, but the combination of these data points can be uniquely identifying (e.g., 63% of the US population can be uniquely identified by birth date, gender, and ZIP code). Differential privacy slightly changes a small portion of the data points in a data set so that, while inferences can still be drawn about the population as a whole, it becomes much harder to identify any individual. A good example outside of the private sector comes from the US Census Bureau, which has used differential privacy to add “noise” to data sets it releases to researchers.
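
Apple has not published the details of its mechanism, but a minimal sketch of one classic local approach – randomized response – illustrates how noise can be injected at the level of individual answers (the Swift below is our illustration, not Apple’s code):

```swift
/// Randomized response: a classic local differential privacy mechanism
/// (a sketch for illustration; Apple has not published its implementation).
/// Each user answers a yes/no question truthfully only some of the time,
/// so no single report can be trusted, yet aggregate statistics survive.
func randomizedResponse(truth: Bool) -> Bool {
    // First coin flip: heads, report the truth.
    if Bool.random() {
        return truth
    }
    // Tails: flip again and report that random answer instead.
    return Bool.random()
}
```

Because any individual “yes” might simply be the result of a coin flip, a single report reveals almost nothing about that user – the plausible deniability at the heart of differential privacy.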

In Apple’s case, the company can use differential privacy to protect sensitive user data in iOS 10 while still retaining plenty of “signal” in that data to learn helpful tidbits about how people are using its software. Individual users can feel secure that their privacy is not being violated: when the user data is examined in aggregate, trends can be spotted and used to inform the design of iOS, but this information doesn’t reveal anything about a specific individual’s behavior. As we’ve said previously, this is the privacy equivalent of having your cake and eating it too, and a demonstration that user experience improvement and privacy don’t have to be at odds.
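
To see how aggregate trends survive the noise, here is a toy simulation built on the randomized-response sketch above (again our illustration with made-up numbers, not Apple’s pipeline):

```swift
// Toy simulation: 100,000 users, 30% of whom truly answer "yes",
// each reporting through the coin-flip randomized response above.
let trueRate = 0.3
let reports = (0..<100_000).map { _ -> Bool in
    let truth = Double.random(in: 0..<1) < trueRate
    return Bool.random() ? truth : Bool.random()  // randomized response
}

// Under this scheme, E[observed yes rate] = 0.25 + 0.5 * trueRate,
// so an analyst can invert that relationship and estimate the true rate
// from the aggregate even though no single report is trustworthy.
let observedRate = Double(reports.filter { $0 }.count) / Double(reports.count)
let estimatedTrueRate = 2.0 * (observedRate - 0.25)
print(estimatedTrueRate)  // ≈ 0.3
```

The estimate comes out close to the real 30% even though every individual report was deliberately corrupted – the population-level trend is preserved while each user’s actual answer stays hidden.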

Apple has publicly outlined only a few aspects of its differential privacy technique, stipulating that it is opt-in and concentrated on uses involving predictive text and search (i.e., making sure it can suggest the right words as you type). We at CDT applaud this step forward, and are eager to see more details of the implementation, since there are so few differentially private systems deployed in the wild. By incorporating differential privacy to improve its products in a privacy-preserving manner, Apple will signal that companies can and should design software that is transparent and privacy-protective while still being driven by user data.