Regular Checkup readers will be familiar with some of the burgeoning uses of “mind-reading” technologies. We can track brain activity with all sorts of devices, some of which measure the brain’s electrical activity (brain waves) while others track blood flow. Scientists have been able to translate this data into signals to help paralyzed people move their limbs or even communicate by thought alone.
But this data also has uses beyond health care. Today, consumers can buy headsets that allow them to learn more about how their brains work and help them feel calm. Employers use devices to monitor how alert their employees are, and schools use them to check if students are paying attention.
Brain data is precious. It’s not the same as thought, but it can be used to work out how we’re thinking and feeling, and reveal our innermost preferences and desires. So let’s look at how California’s law might protect mental privacy — and how far we still have to go.
The new bill amends the California Consumer Privacy Act of 2018, which grants consumers rights over personal information that is collected by businesses. The term “personal information” already included biometric data (such as your face, voice, or fingerprints). Now it also explicitly includes neural data.
The bill defines neural data as “information that is generated by measuring the activity of a consumer’s central or peripheral nervous system, and that is not inferred from nonneural information.” In other words, data collected from a person’s brain or nerves.
The law prevents companies from selling or sharing a person’s data and requires them to make efforts to deidentify the data. It also gives consumers the right to know what information is collected and the right to delete it.
“This new law in California will make the lives of consumers safer while sending a clear signal to the fast-growing neurotechnology industry there are high expectations that companies will provide robust protections for mental privacy of consumers,” Jared Genser, general counsel to the Neurorights Foundation, which cosponsored the bill, said in a statement. “That said, there is much more work ahead.”
Genser hopes the California law will pave the way for national and international legislation that protects the mental privacy of individuals all over the world. California is a good place to start — the state is home to plenty of neurotechnology companies, so there’s a good chance we’ll see the effects of the bill ripple out from there.
But some proponents of mental privacy aren’t satisfied that the law does enough to protect neural data. “While it introduces important safeguards, significant ambiguities leave room for loopholes that could undermine privacy protections, especially regarding inferences from neural data,” Marcello Ienca, an ethicist at the Technical University of Munich, posted on X.
One such ambiguity concerns the meaning of “nonneural information,” according to Nita Farahany, a futurist and legal ethicist at Duke University in Durham, North Carolina. “The bill’s language suggests that raw data [collected from a person’s brain] may be protected, but inferences or conclusions — where privacy risks are most profound — might not be,” Farahany wrote in a post on LinkedIn.
Ienca and Farahany are coauthors of a recent paper on mental privacy. In it, they and Patrick Magee, also at Duke University, argue for broadening the definition of neural data to what they call “cognitive biometrics.” This category could include physiological and behavioral information along with brain data — in other words, pretty much anything that could be picked up by biosensors and used to infer a person’s mental state.
After all, it’s not just your brain activity that gives away how you’re feeling. An uptick in heart rate might indicate excitement or stress, for example. Eye-tracking devices can reveal your intentions, such as a choice you’re likely to make or a product you might opt to buy. These kinds of data are already being used to reveal information that might otherwise be extremely private. Recent research has used EEG data to predict volunteers’ sexual orientation or whether they use recreational drugs. And others have used eye-tracking devices to infer personality traits.
Given all that, it’s vital we get it right when it comes to protecting mental privacy. As Farahany, Ienca, and Magee put it: “By choosing whether, when, and how to share their cognitive biometric data, individuals can contribute to advancements in technology and medicine while maintaining control over their personal information.”
© Copyright 2024 Technology Review, Inc. Distributed by TRIBUNE CONTENT AGENCY, LLC.