Scientists have been collecting neural data from the brain for medical reasons for years, with myriad regulatory constraints in place. But in 2024, technologies are moving fast and furiously into the realm of consumer products.
The NeuroRights Foundation (NRF) reported in April that implantable technology can already decode language and emotions from the brain, and wearable devices are not far behind. Consumer product companies—and indeed, employers—are already able, or will soon be able, to monitor brain waves through wearable devices such as headphones, or to allow an employee to type without touching a keyboard or mouse. As the NRF report notes, at least 30 so-called neurotechnology products are available for purchase by the public.
While undoubtedly profitable for those companies, as these technologies develop—and especially when neurotechnologies are combined with artificial intelligence (AI)—ethical and privacy concerns are paramount. As noted in Neurotechnology and the Law: Privacy and Security Concerns, "[W]ith the emergence of neurotechnology, it may now be possible to tap into someone's brain and read [their] thoughts." Moreover, as neurotechnology is predicted to become a significant market with substantial economic benefits—$17.1 billion globally by 2026—legislators are beginning to take notice.
On April 17, Colorado's governor signed CO HB 24-1058, protecting the privacy of individuals' biological data, including neural data, and expanding the scope of the Colorado Privacy Act (CPA). The CPA applies to legal entities conducting business or producing products and services that are intentionally targeted to Colorado residents and that either (1) control or process personal data of more than 100,000 consumers per calendar year or (2) derive revenue from the sale of personal data and control or process the neural data of at least 25,000 consumers.
Sensitive Data
Neurotechnologies, as the new legislation recognizes, are beneficial in medical settings. A recent New York Times article on the Colorado law reported that neurotechnology is helping a man with paralysis to communicate by imagining, and a woman with paralysis to convey speech and facial expressions, through a computer. As CO HB 24-1058 notes, both invasive and noninvasive neurotechnologies used in medical settings are typically regulated as medical tools that produce health information and are also regulated by health data privacy laws.
Outside of a medical setting, things can get a little scary. With respect to noninvasive neurotechnologies, as the bill states: "they are generally considered consumer products and operate without regulation or data protection standards."
The Colorado General Assembly therefore determined to expand the CPA's definition of "sensitive data" to include:
- Biological data, which is "generated by the technological processing, measurement, or analysis of an individual's biological, genetic, biochemical, physiological or neural properties, compositions or activities, or of an individual's body or bodily functions, which data is used or intended to be used, singly or in combination with other personal data, for identification purposes"; and
- Neural data, a subset of biological data, which is information "generated by the measurement of the activity of an individual's central or peripheral nervous systems and that can be processed by or with the assistance of a device."
The CPA, meanwhile:
- Provides consumers the right to access, correct and delete personal data, as well as the right to opt out of the sale of personal data and the collection and use of personal data;
- Imposes affirmative obligations on companies to safeguard personal data and to provide clear, understandable and transparent information to consumers about how their personal data are used, and strengthens compliance and accountability by requiring data protection assessments for the collection and use of personal data; and
- Empowers the attorney general and district attorneys to access and evaluate a company's data protection assessments, to impose penalties where violations occur and to prevent future violations.
Adding biological data and neural data to the definition of "sensitive data" in the CPA will require covered entities to be responsible custodians of these types of data as they continue to innovate. The CPA imposes a duty on "controllers" (person(s) who "determine the purposes for, and the means of, processing personal data") to refrain from processing a consumer's sensitive data without first obtaining the consumer's consent, and from conducting processing that presents a heightened risk of harm to a consumer without conducting and documenting a data protection assessment.
This could prevent companies, for example, from collecting brainwave data for ad targeting, according to Ars Technica. The new law will take effect following the expiration of a 90-day period after final adjournment of the Colorado General Assembly unless a referendum petition is filed.
Neuro Rights
In 2023, Minnesota introduced H.F. No. 1904, relating to data privacy and establishing neurodata rights. The legislation would have, first, established rights to mental privacy and cognitive liberty, prohibiting government entities from collecting data transcribed directly from brain activity without informed consent, or from interfering with the free and competent decision making of an individual when making neurotechnology decisions.
The legislation would have also afforded rights to mental integrity and psychological continuity, requiring companies responsible for recording and storing data to (1) provide notice of potential uses and of the third parties with which the data will be shared and (2) obtain consent from individuals for each use and each third party before the data may be used or shared, each time an individual connects to a brain-computer interface. (In case you were wondering, a prohibition on "consciousness bypass" bars companies from using a brain-computer interface to bypass conscious decision making by an individual: "Consent obtained by using a consciousness bypass is not informed consent.") Had the bill passed, companies violating those provisions would have been subject to a civil penalty of up to $10,000 per incident.
A pending California bill on consumer privacy, sensitive personal information and neural data, SB 1223, meanwhile, would include neural data in the definition of "sensitive personal information" for purposes of the California Consumer Privacy Act of 2018 (CCPA), and also grant various rights to consumers with respect to personal information collected by a business.
Takeaways
Colorado's HB 24-1058 marks the start of legal frameworks governing neurotechnology in nonmedical settings, especially as neurotechnology intersects with AI. More recently, on May 17, Colorado Governor Jared Polis signed into law SB 24-205—concerning consumer protections in interactions with artificial intelligence systems—while expressing reservations about the law's potential impact on innovation. The law will take effect Feb. 1, 2026. This makes Colorado "among the first in the country to attempt to regulate the burgeoning artificial intelligence industry on such a scale," Polis said in a letter to the Colorado General Assembly.
Minnesota, which could have been a trailblazer in 2023, will undoubtedly revisit the topic in the future. While such protections are novel in the United States, Chile amended its constitution in 2021, declaring that the law shall "especially protect brain activity, as well as the information from it." According to the Future of Privacy Forum, Mexico seeks to follow, and Brazil may not be far behind.
Questions will continue to arise concerning consent. The Minnesota legislation would have required consent in situations including when a government entity collects data from brain activity, or when a company uses or shares the data; the California legislation contemplates regulations regarding consent to the sale of a consumer's personal information.
And with new legislation on the horizon, case law cannot be far behind. In February, a federal district court judge in Massachusetts allowed a lawsuit to go forward brought by a candidate for employment subjected to an artificial intelligence technique gauging "integrity and honor" in a job interview. Brendan Baker brought a class action complaint after his interview was uploaded, without notice, to an AI company that analyzes facial and vocal expressions.
As the case law advances in this area, particularly into matters involving the brain, we certainly expect to see more jurisdictions following the lead of Colorado and other states in protecting brain wave data. Jurisdictions may also find themselves, one day, within the purview of the proposed American Data Privacy and Protection Act, a 2022 initiative establishing requirements for how companies handle personal data.
Ethical issues and constraints in this area will continue to arise—especially in connection with AI and data sharing—as well as questions of regulation, as neurotechnologies move from the realm of medicine and regulated medical devices to everyday consumer products. We anticipate addressing many novel questions in this area in future articles.
Meanwhile, the uses of these evolving technologies continue to astonish. On May 10, Science, The Boston Globe and other publications reported that researchers are now using AI models to, for example, map a one-cubic-millimeter sample of an epilepsy patient's brain in extraordinary, formerly unavailable detail, allowing humans to identify "previously unknown aspects of the human temporal cortex." The project, which took place over 10 years, involved the staining, slicing and imaging of the fragment by Harvard and the restitching of the fragment by Google.
The Globe reported: "Google and others expect to use the findings to improve their ability to invent artificial intelligence algorithms modeled on the human brain." Clearly, we need to reap all the benefits that these advancements have to offer—while proceeding thoughtfully with respect to regulation.
* * * *
This article was written with the assistance of Epstein Becker Green staff attorney Ann W. Parks.
Reprinted with permission from the May 22, 2024, edition of the "New York Law Journal" © 2024 ALM Global Properties, LLC. All rights reserved. Further duplication without permission is prohibited, contact 877-256-2472 or asset-and-logo-licensing@alm.com.