

Tech & Inclusion Series: Mobile App Privacy is About Equal Rights, Too

2014-11-20 diversity

This is the third post in our Tech & Inclusion series, which explores how diversity shapes technology from sociological, business, and design perspectives. We hope to unearth how personal bias and worldview can influence tech products and services, and to examine ways technology might mitigate social disparities and lead to a more egalitarian world – both online and off.

Privacy is most often tied to notions of freedom and autonomy; the right to privacy is indeed fundamental to safeguarding individuality. However, privacy in today’s digital world is as much about equality as it is about liberty. When certain demographic groups’ digital interactions afford them fewer privacy protections than others, those groups ultimately have less control over how information about them is collected, stored, and used. This risk is especially striking in the mobile context.

Research has repeatedly shown that Black- and Hispanic-Americans use smartphones and mobile applications more frequently than white Americans: only 17% of whites go online mostly using their cell phones, compared to 38% of Blacks and Hispanics. (This is likely because more white Americans have Internet access at home.) Additionally, Black- and Hispanic-Americans use smartphones for a wider range of activities, including accessing social media, emailing, and mobile banking.

Mobile technology presents unique privacy risks. For one, mobile apps tend to over-collect user information, and this collection is under-regulated. The majority of consumers prefer to use “free” ad-supported applications; however, the tradeoff for free services is often ubiquitous tracking and monitoring. This is particularly problematic because mobile apps arguably collect more sensitive data than other digital platforms. For example, mobile phones — unlike laptops and desktop computers, which are primarily accessed from one location — are carried by their owners throughout the day. Therefore, mobile apps can collect geolocation data on a person at essentially any time and place — even when that person is not interacting with the app.

Secondly, mobile application privacy policies and security are often inadequate. A September 2014 survey of mobile applications found that apps generally fail to adequately explain data collection and use practices. Many of these apps allow for data collection beyond what is needed for functionality, and almost a third of the apps studied provided no privacy information at all. Further, over 40% of apps with privacy policies did not offer a mobile version; instead, users had to read the information in small type on a webpage from their phones. Additionally, security protocols are more robust online than in mobile environments. Consequently, the massive amounts of data shared with a mobile device and/or mobile app are at higher risk of access by unauthorized parties. What this means for Hispanic- and Black-Americans is that they may be more vulnerable to harms associated with data collection, such as hacking and unlawful surveillance.

Mobile app developers must commit to implementing “inclusiveness by design,” and this commitment should not stop at hiring or at the creation of a diversity and inclusion policy. Inclusiveness by design should inspire developers to meaningfully engage with their users’ backgrounds, usage habits, and privacy and security expectations to ensure that a digital service protects all users equally, regardless of how they access it. And if a service is mobile-only, it should still provide its users with the highest privacy and security standards.

Steps app developers should take include (but are not limited to):

  • Providing clear privacy policies and just-in-time notices (reminders that the app will collect certain information, shown just before the user agrees to provide it); see the sketch after this list.
  • Obtaining affirmative express consent for certain categories of data collection and dissemination as appropriate.
  • Conducting research on a target or existing audience. This would include an assessment of that audience’s demographic makeup and any particular user expectations resulting from that makeup. For example, if many Spanish speakers use a service, the app should offer a Spanish translation of the service’s privacy policy. This research could also periodically solicit feedback from the audience on how the application could meet users’ expectations more effectively.
  • Avoiding application design that may inherently marginalize certain groups. For example, an app generally should not default to only “male” or “female” gender options.
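
To make the first recommendation concrete, here is a minimal sketch of a just-in-time notice for a hypothetical web app that wants a user’s location. The function name, the wording, and the bare-bones dialog are our own assumptions rather than a prescribed implementation; the point is simply that the explanation and the choice appear at the moment of collection.

```typescript
// Minimal sketch of a just-in-time notice for geolocation (hypothetical app).
// The copy shown to the user and the use of window.confirm are placeholders;
// a real app would use its own dialog component and translated strings.

async function requestLocationWithNotice(): Promise<GeolocationPosition | null> {
  // Explain the collection at the moment it happens, in plain language,
  // and give the user a real choice before any data is requested.
  const userAgreed = window.confirm(
    "This feature uses your current location to show nearby branches. " +
      "Your location is not stored or shared. Allow access?"
  );
  if (!userAgreed) {
    // Respect the decline: no collection, and the rest of the app still works.
    return null;
  }

  // Only now ask the browser, which shows its own permission prompt.
  return new Promise((resolve) => {
    navigator.geolocation.getCurrentPosition(
      (position) => resolve(position),
      () => resolve(null) // A browser-level denial is treated the same way.
    );
  });
}
```

A native Android or iOS app could follow the same pattern around its platform’s runtime permission request: explain first, ask second, and degrade gracefully if the user says no.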

Implementing “inclusiveness by design” will be useful, if not vital, to the growth of digital platforms in the near future. The Internet has greatly improved communication, yet many groups are unable to access these advancements because they have subpar or nonexistent Internet connectivity. Mobile technology opens the Internet to groups, such as minorities and low-income families, who have historically been shut out. If mobile technology use remains less private and less secure than computer use, those who rely on mobile devices to access digital services will be forced to either surrender their privacy rights or forgo these platforms – and ultimately, mobile technology will reflect the inequalities we see in the non-digital world.