
Colorado Expands Privacy Laws to Protect Brainwave Data from Neurotech Devices

Charles M. Walls | April 23, 2024

Image: a person wearing a sleek, modern EEG headset, a futuristic representation of neurotechnology and privacy.

Using biometric security like fingerprint and face recognition is now commonplace, but what about technology that dives deeper, tapping into your brainwaves? With neurotech wearables, that's exactly where we're headed.

Last Wednesday, Colorado Governor Jared Polis signed legislation that brings brain activity data under the protection of the state's privacy laws. The new law classifies neural data, essentially your brainwaves, as biological data, putting it in the same protected category as fingerprints, facial scans, and DNA, all of which technology companies are increasingly capturing.

Neurotechnology measures brain activity using methods such as electroencephalography (EEG), which reads electrical signals through electrodes placed on or in the head. Invasive forms, like the implants developed by Neuralink or Synchron, are regulated as medical devices. Non-invasive neurotech, including the many wearable devices that rely on EEG, faces no comparable oversight because it is sold as a consumer product.
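To make that abstraction concrete, the short sketch below shows what "neural data" from a consumer EEG wearable looks like once it is digitized: a stream of voltage samples that can be reduced to summary features such as band power. This is purely an illustration, not code from any vendor's SDK; the 256 Hz sampling rate, the single synthetic channel, and the frequency bands are assumptions chosen for the example.

# Illustrative sketch only: what consumer EEG "neural data" looks like as numbers.
# Sampling rate, channel, and band definitions are assumptions for this example.
import numpy as np

FS = 256                       # assumed sampling rate in Hz
DURATION_S = 4                 # a short recording window
t = np.arange(0, DURATION_S, 1 / FS)

# Synthetic one-channel EEG: a 10 Hz alpha rhythm buried in noise (microvolts).
signal = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * np.random.randn(t.size)

# Power spectrum of the windowed signal via FFT.
spectrum = np.abs(np.fft.rfft(signal * np.hanning(signal.size))) ** 2
freqs = np.fft.rfftfreq(signal.size, d=1 / FS)

def band_power(lo_hz, hi_hz):
    """Total spectral power between lo_hz and hi_hz."""
    mask = (freqs >= lo_hz) & (freqs < hi_hz)
    return spectrum[mask].sum()

# A few summary features a wearable might compute and upload.
features = {
    "theta (4-8 Hz)": band_power(4, 8),
    "alpha (8-13 Hz)": band_power(8, 13),
    "beta (13-30 Hz)": band_power(13, 30),
}
for name, power in features.items():
    print(f"{name}: {power:.3e}")

Even a handful of numbers like these, logged continuously, can hint at a wearer's attention, relaxation, or fatigue, which is exactly the kind of derived neural data the new law now treats as sensitive.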

These consumer EEG devices have been built into products ranging from athletic performance trackers to meditation aids, with companies like Emotiv and NeuroSky, and tech giants such as Meta, Apple, and Snap, exploring the space. The data they collect, however, remains largely unregulated, and the privacy implications are vast.

The introduction of this legislation in Colorado is a reaction to increasing privacy concerns, especially around consumer-grade brain-computer interfaces (BCIs). The law highlights the sensitivity of neural data, which can disclose intimate details about an individual's health, emotional state, and cognitive abilities.

A report by The Neurorights Foundation has raised alarms, finding that nearly all of the neurotech companies it reviewed could access consumers' neural data with few meaningful restrictions. Other research adds to the concern, suggesting that brain recordings might eventually be decoded into something resembling thoughts, a serious privacy risk if safeguards are not in place.

As AI permeates more sectors, the need for robust data protection has grown more pressing. The AI and tech industries face minimal regulation in the U.S., while Europe has taken more stringent measures. Colorado's legislation is a step forward, and California and Minnesota are advancing similar proposals, but a federal standard for neural data is still missing.

This move by Colorado might be small, but it signals a growing recognition of the need for comprehensive data privacy laws to keep pace with technological advancements, ensuring that individuals' innermost thoughts remain private and protected.
