Brain-computer interface (BCI) companies are charging ahead with devices and services that attempt to understand and manipulate human neural pathways. Some medical-focused neurotechnology companies, like Synchron, are using surgically implanted devices that read signals from paralyzed patients’ brains so they can control computers and other devices. Other consumer-focused firms are using chunky helmets and relatively normal-looking smart headphones to measure their users’ brain signals.
Though technology like this is still relatively nascent, neural rights activists and concerned lawmakers want to be ready for when it is more widespread. Critics warn that companies may already possess the ability to “decode” the data captured in consumers’ brain scans and translate it into written text.
That decoded data can reveal highly sensitive details about an individual’s mental and physical wellness and their cognitive state. Researchers have already shown they can use AI models to read the brain data of patients watching videos and roughly reproduce the scenes those patients saw. This decoding process could become far easier, and more accurate, as ever more powerful generative AI models are deployed.
There’s also little preventing current neurotechnology companies from misusing that data or selling it to the highest bidder. All but one (96%) of the neurotechnology companies analyzed in a recent report by the Neurorights Foundation appear to have had access to consumers’ neural data, which can include signals from an individual’s brain or spine. The Foundation claims those companies provide no meaningful limitations on access to that neural data. More than half (66.7%) of the companies explicitly mention sharing consumers’ data with third parties.
A first-of-its-kind US law passed in Colorado this week could shift that dynamic by offering stricter, consumer-focused protections for all neural data sucked up by companies. The law, which gives consumers much greater control over how neurotechnology companies collect and share neural data, could add momentum to similar bills making their way through other state legislatures. Lawmakers, both in the US and abroad, are in the middle of a race to set meaningful standards around neural data before these technologies enter the mainstream.
Keeping personal neural data private
The Colorado law, officially dubbed HB 24-1058, will expand the term “sensitive data” in Colorado’s Privacy Act to include neural data. Neural data here refers to signals generated by the brain, the spine, or the network of nerves running through the body. In this context, neurotechnology companies typically access this data through a wearable or implantable device. These can range from relatively standard-looking headphones to wires jacked directly into a patient’s central nervous system. The expanded definition will apply the same protections to neural data as are currently afforded to fingerprints, face scans, and other biometric data. As with biometric data, businesses will now need to obtain consent before collecting neural data and take steps to limit the amount of unnecessary information they scoop up.
Coloradans, thanks to the law, will have the right to access, correct, or delete their neural data. They can also opt out of the sale of that data. Those provisions are essential, the bill’s authors write, because of the large amounts of unintentional or unnecessary neural data likely collected through neurotechnology services. Only 16 of the 30 companies surveyed in the Neurorights Foundation report said consumers can withdraw their consent to data processing under certain conditions.
“The collection of neural data always involves involuntary disclosure of information,” the Colorado bill reads. “Even if individuals consent to the collection and processing of their data for a narrow use, they are unlikely to be fully aware of the content or quantity of information they are sharing.”
Supporters of stricter neural data protections, like Neurorights Foundation Medical Director Sean Pauzauskie, praised Colorado’s action during a recent interview with The New York Times.
“We’ve never seen anything with this power before—to identify, codify people and bias against people based on their brain waves and other neural information,” Pauzauskie said.
Who else protects neural data?
Colorado’s law could set the standard for other states to follow. On the national level, the US currently lacks any federal legislation limiting how consumer companies access or use neural data. Outside of The Centennial State, similar bills are under consideration in Minnesota and California. The California legislation stands out since many of the biggest names exploring brain-computer interfaces, like Neuralink and Meta, are headquartered within that jurisdiction. Other countries have stepped ahead of the US on neural data regulation. In 2021, Chile became the first country to legally protect neural rights when it added them to its national constitution. Since then, Brazil, Spain, Mexico, and Uruguay have also passed their own legislation.
All of this simmering regulatory interest may seem unusual for an industry that still appears relatively nascent. BCI users likely won’t be telepathically messaging their thoughts to friends any time soon, and medical applications for paralyzed or otherwise injured people remain reserved for a select few. But supporters of these early neural regulations hope the preemptive efforts can set standards and help steer the growing neurotechnology industry toward a more privacy-conscious future. And if recent debates over social media regulation are any guide, retroactively applying new rules to products and services once they’ve become staples of modern life is often easier said than done. When it comes to dystopian-tinged mind-reading tech, Pandora’s box is still mostly closed, but it’s beginning to crack open.