Mind Matters: Mental Privacy in the Era of Brain Tracking Technologies

By Ebie Quinn, CDT Intern

In an age when personal data is regularly collected and tracked online, it can feel like the brain is the last truly private place. While people are generally aware that their clicks, likes, and scrolls are recorded and stored, many take solace in the idea that their thoughts remain private. Yet our “neural” or “mental” privacy is being threatened by the introduction of commercial products that measure and track our brain signals. These brain-based technologies record electrical activity in the brain, including signals associated with motor function, which companies might use to try to discern information about our emotions, preferences, and mental health.

Despite the novelty of these products, technology that measures brain activity is not new. Brain-based technologies have primarily been deployed in health care and research settings, where they are used to diagnose, treat, and monitor patients with brain-related diseases. Progress has been made in treating patients with paralysis or other mobility-limiting diseases through implanted Brain-Computer Interfaces (BCIs), an “invasive” form of brain tracking technology. Additionally, a range of brain-based technologies are being developed to address mental health disorders, including depression and anxiety, through neurofeedback and similar treatments.

Commercially, brain tracking technologies are growing. In 2022, a man with ALS successfully used a computer independently after receiving a BCI that converts his neural activity into cursor movement. Elon Musk’s company Neuralink is also pursuing this technology, and in January 2024, the company’s first patient received the implant.

Most commercial brain tracking technologies, however, remain non-invasive, appearing in the form of wearable headbands and headphones. Estimates show that the burgeoning neurotech industry is growing at an annual rate of 12% and is expected to reach $21 billion by 2026. Companies like Muse and Brainbit have developed headbands that collect brain activity to improve meditation and sleep. Further, NeurOptimal developed EEG sensors and ear clips designed to assist users with their golf game through “brain training,” and Emotiv developed EEG headphones that claim to monitor attention in the workplace. Just one year ago, Apple patented a design for AirPods that measure and collect brain signals from the wearer, an indication that these technologies are becoming increasingly mainstream.

Commercial brain tracking presents new privacy risks. While brain data in the medical setting is protected by the Health Insurance Portability and Accountability Act (HIPAA), these protections do not apply in a commercial context, which is generally governed at the federal level by the Federal Trade Commission (FTC). As a result, consumer brain data is vulnerable to being collected, stored, and sold with little oversight. Specifically, the collection of this data could lead to harms arising from practices such as unwanted disclosure of sensitive health information and diagnoses, increased surveillance and productivity monitoring in the workplace, and targeted advertising.

Neural data collected in a commercial setting may be used to make inferences about an individual’s health without their knowledge or understanding. Brain data may reveal whether a person has epilepsy, anxiety, depression, or another mental disorder, as certain patterns of brain activity, called “biomarkers” or “neuromarkers,” can be linked to specific mental health conditions. Furthermore, the link between these brain activity patterns and health conditions might be used to make predictions about a person’s future, including the development of mental health disorders and diseases, learning styles, and alcohol and drug use. While this form of data collection might provide significant benefits in a medical context by helping guide treatment approaches, in a commercial setting individuals risk the unwanted use and disclosure of sensitive health information.

Additionally, brain-based technologies serve as the next frontier in workplace surveillance, an area full of risks, as CDT has previously explained. Tech company Emotiv has promoted its EEG headphones as a solution to wandering attention at work, boasting that the product reads employees’ cognitive states in real time and provides this data to both employee and employer to boost workplace productivity. The use of brain data to assess employee productivity poses significant risks, including potential discrimination and the erosion of employee trust.

Finally, the increased collection of consumer neural data will supercharge targeted advertising. Advertisers already recognize the utility of insights from neural data in designing and marketing products, a strategy known as “neuromarketing,” in which a person’s brain activity is leveraged to inform marketing decisions. Widespread use of brain-tracking technologies, however, would enable advertisers to target individuals based on, for instance, their unique responses to stimuli, by combining data about what appears on a person’s screen with data about their brain activity at that same moment. Such targeting would lead to the sale and commodification of brain data, a problematic extension of targeted advertising that results in consumer harms like manipulation, discrimination, and invasion of privacy.

Recognizing the potential risks associated with brain tracking data, policymakers have begun to respond. In April 2024, Colorado became the first state to pass legislation explicitly protecting neural data, expanding the Colorado Privacy Act’s definition of “sensitive data” to include “biological data,” which encompasses data generated by the brain. California and Minnesota have each introduced similar legislation, marking a positive step in recognizing the emerging privacy concerns around consumer neurotechnologies. The most recent version of the American Privacy Rights Act in the House of Representatives also included “neural data” in its definition of “sensitive data.”

These policies, though well-intentioned, don’t go far enough. Neural privacy advocate Nita Farahany points out that the Colorado law applies only when the biological data is collected for identification purposes. However, many companies developing consumer neurotechnologies are not aiming to identify individuals but instead to make inferences about their mental state, mood, and productivity, or perhaps to train artificial intelligence systems. This language therefore might make the law inapplicable to a wide swath of the commercial activity that it was intended to reach. 

Moving forward, it is important to better understand the risks presented by brain tracking technology and to respond to those risks with meaningful protections. Without them, we risk ceding essential mental privacy and autonomy.