Neurable, a consumer neurotechnology startup, has partnered with the Air Force to study whether electrode-studded headphones can track service members’ cognitive fitness, much like Garmin smartwatches have monitored Space Force members’ physical fitness, company and government officials said this month.
The $1.2 million project adds to the Pentagon’s growing investment in the $3 billion market for brain-computer interfaces (BCIs) — helmets, earbuds, other wearable devices and medical implants that use artificial intelligence to make sense of brain signals.
Elon Musk’s Neuralink, perhaps the most high-profile implantable example, has enabled clinical trial patients who cannot move or speak to type words that they imagine. Since at least the 1970s, the Defense Department has funded the development of BCIs that, among other things, restore eyesight and control prosthetic arms.
Now, advances in AI enable BCIs to interpret or change brain activity with unprecedented accuracy and speed. So much so that some scientists and neuroethicists say the capacity to monitor signals that reveal a warfighter’s or a veteran’s mental states, visual perceptions or inner dialogue may outpace the ability to shield that data from misuse.
“One could certainly imagine how enforced use of such devices could create a very dystopian basis for behavioral control,” warned James Giordano, director of the National Defense University’s Center for Disruptive Technology and Future Warfare and Georgetown University Medical Center’s former chief of neuroethics studies.
Consequently, any accessing or use of neural data must require “ongoing, active informed consent,” so active-duty and retired service members have the chance to opt in or out without penalty, he said.
Giordano, who was sharing his personal views and not speaking on behalf of any government agency, added, “The security of the neurological data used for that individual — and in that individual’s wellbeing only — is paramount. It goes beyond simple HIPAA.”
He was referring to the 1996 Health Insurance Portability and Accountability Act, a law regulating healthcare providers that predates global Wi-Fi and does not cover most app and algorithm developers, commercial cloud providers or consumer wearable designers.
Neurable and Air Force officials say that the non-invasive cognitive fitness tracker will build upon technology in the firm’s consumer BCI product — Master & Dynamic luxury headphones containing fabric electroencephalogram, or EEG, sensors and electrically conductive inks to gauge mental focus.
The $500 device, which launched last June, deploys an algorithm that maps brainwaves to ground-truth markers of focus — users’ speed and accuracy in finishing tasks, measured both with and without distractions. When the wearer’s brainwaves resemble those tied to flagging attention, the headphones sync with an app that alerts the user: “You’ve earned a Brain Break.”
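Neurable has not published its algorithm, but the approach the company describes — pairing EEG features with performance-based labels — can be sketched in a few lines. The sampling rate, frequency bands, spectral features and logistic-regression classifier below are illustrative assumptions, not the product’s actual pipeline.

```python
# Illustrative sketch only: a generic EEG "focus" classifier trained on
# performance-based labels, NOT Neurable's proprietary algorithm.
# Assumed: multi-channel EEG at 256 Hz; each window labeled 1 (fast, accurate,
# undistracted task block) or 0 (slow, error-prone, distracted block).
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression

FS = 256  # Hz, assumed sampling rate
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # common EEG bands

def band_powers(window: np.ndarray) -> np.ndarray:
    """Average power per band, across channels, for one EEG window of shape
    (channels, samples)."""
    freqs, psd = welch(window, fs=FS, nperseg=FS)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean())  # mean over channels and band bins
    return np.array(feats)

def train_focus_model(windows, labels):
    """Fit a simple classifier mapping band powers to the performance label."""
    X = np.vstack([band_powers(w) for w in windows])
    return LogisticRegression().fit(X, labels)

def check_focus(model, window, threshold=0.4):
    """Return an alert string when the predicted probability of focus drops."""
    p_focused = model.predict_proba(band_powers(window).reshape(1, -1))[0, 1]
    return "You've earned a Brain Break." if p_focused < threshold else None
```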
The Air Force-funded BCI project will use other headset models, such as noise protectors or helmets, said Neurable co-founder Adam Molnar, who helped secure $4 million for prior military research and development work.
The study will measure cognitive performance with and without pressure — for instance, memory skill with and without sleep deprivation — against brainwaves to identify brain activity associated with optimal and suboptimal performance, according to Neurable.
The idea is that alerting service members to significant brain signal changes throughout the day — or, providing “neurofeedback” — will train them to adjust their behavior and brainwaves to hit peak performance.
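The neurofeedback loop itself can be pictured as a comparison against a personal baseline. In the minimal sketch below, the cognitive-state score, the two-standard-deviation threshold and the function names are placeholders rather than the study’s actual protocol.

```python
# Illustrative sketch of the neurofeedback idea: compare readings taken under
# pressure against a per-person baseline (e.g., collected while rested) and
# flag large deviations. Score, threshold and names are assumptions only.
import statistics

def build_baseline(rested_scores: list[float]) -> tuple[float, float]:
    """Summarize baseline cognitive-state scores as mean and standard deviation."""
    return statistics.mean(rested_scores), statistics.stdev(rested_scores)

def neurofeedback_alert(score: float, baseline: tuple[float, float], k: float = 2.0) -> str | None:
    """Alert when the current score drifts more than k standard deviations
    from the wearer's own baseline, in either direction."""
    mean, std = baseline
    if abs(score - mean) > k * std:
        return "Brain activity has shifted significantly from your baseline."
    return None

# Example: baseline from well-rested sessions, then a sleep-deprived reading.
baseline = build_baseline([0.62, 0.58, 0.65, 0.60, 0.63])
print(neurofeedback_alert(0.31, baseline))
```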
“This is all within this concept of readiness. How do we make sure that we have a ready and able defense?” said Molnar, who anticipates wrapping development within two years.
Molnar added that the Space Force has a “really cool program where they’re using the Garmin watch,” embedded with heart rate, blood oxygen and other wrist-worn sensors, “to help improve cardio fitness and physical activity,” but “we don’t really have that for the brain.”
A comparable neurofeedback tool, he said, may spur “more robust brain health practices, not just for defense, but also for possibly identifying Parkinson’s or Alzheimer’s disease indicators a decade before you feel your first symptom.”
Air Force Research Laboratory officials said that Neurable’s fabric-based sensing technology may aid multiple operations, inside and outside of the cockpit.
“I suspect that there are a large number of military members for whom a real-time fatigue or cognitive state monitor would be appealing if they thought it would make them more effective at executing the mission,” William Aue, cognitive neuroscience section chief at the Air Force Research Laboratory’s 711th Human Performance Wing, said by email.
Let them ‘make their mind up’
Some neuroethicists, however, cautioned that the use of brain wearables may also invade troops’ privacy and foster prejudice.
“We need to be much more reflective on, much more concerned about and much more focused on the implications of altering a person’s brainwaves,” through neurofeedback training, “as compared to having them work out more or do more pull-ups,” said Jared Genser, a lawyer and co-founder of the Neurorights Foundation, a group advocating for the legal protection of people’s brain data.
Similarly, Giordano said, “The concept of personalized medicine must be applied here,” meaning that military leaders must tailor training and performance metrics to the limits of each individual’s brain and body, so “that you don’t get plateaus, and you’re not beginning to see performance decrement or performance fatigue.”
The Neurorights Foundation’s medical director, Sean Pauzauskie, a neurologist in the UCHealth University of Colorado system, added that cognitive fitness standards may conflict with the proposed right to freedom from algorithmic bias based on interpretations of neural data.
“If the warfighter’s brain is incapable of the appropriate plasticity to achieve the desired [mental] state,” through neurofeedback training, “they could be discriminated” against by their superiors, Pauzauskie said.

Also, EEG data flowing into and out of consumer wearables can reveal several diseases, including epilepsy and mild cognitive impairment, as well as a range of emotions.
Via email, Molnar responded to such concerns, saying, “Neurable’s focus is on supporting human performance and safety, not enforcing neurological conformity.”
“Any feedback provided by our systems is designed to be informational and user-centered, helping individuals understand factors like cognitive workload or fatigue,” he added, “much like a heart-rate monitor provides insight into physical exertion.”
Neurable officials said the system’s sensors filter out all EEG data not needed by its algorithms and encrypt the limited amount of necessary data in transit.
“Even if someone hacked someone’s brain data, they would not be able to make sense of it,” Molnar said.
The Air Force’s Aue confirmed that, during experiments, the Air Force lab anonymizes EEG data streaming from Neurable’s wearables, adding that the information is “challenging to interpret without the algorithms we use to process the raw data.”
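Taken together, those protections follow a familiar pattern: minimize on the device, strip identity, and encrypt before transmission. The sketch below illustrates that pattern with assumed summary features, a random study ID and off-the-shelf symmetric encryption; it is not Neurable’s or the lab’s actual implementation.

```python
# Illustrative data-minimization pipeline: keep only summary features, tag them
# with a random study ID, and encrypt before they leave the device. Feature
# choices and key handling are assumptions for illustration only.
import json
import uuid
import numpy as np
from cryptography.fernet import Fernet

KEY = Fernet.generate_key()          # in practice, provisioned per device
PARTICIPANT_ID = uuid.uuid4().hex    # random study ID, no name or service number

def minimize(raw_eeg: np.ndarray) -> dict:
    """Keep only the summary features the algorithm needs; the raw signal
    never leaves the device."""
    return {
        "participant": PARTICIPANT_ID,
        "mean_power": float(np.mean(raw_eeg ** 2)),   # stand-in feature
        "peak_to_peak": float(np.ptp(raw_eeg)),
    }

def encrypt_for_transit(features: dict) -> bytes:
    """Encrypt the minimized record before it is sent to the companion app."""
    return Fernet(KEY).encrypt(json.dumps(features).encode())

record = encrypt_for_transit(minimize(np.random.randn(4, 256)))
```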
But such protections may not stop adversaries from decoding wearable-based intelligence in the future, as AI becomes smarter and devices draw on more EEG data.
Already, in the laboratory, algorithms can narrow a person’s identity down to a “mindprint” of their mental processing or a “brainprint” of their brain structures, based on EEG data from a medical headcap, Pauzauskie noted.
Likewise, Genser explained, EEG data that is out in the open, if separated from a person’s name and identity, will remain deidentified only for so long. The problem is that, as generative AI-powered software develops, “you are going to be able to decode a lot more information from the data that escaped, enabling the data to be reidentified and traced back to you personally.”
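The re-identification worry can be illustrated with a toy example: if an attacker already holds labeled EEG feature “templates,” a nominally anonymous recording can simply be matched to the nearest one. The features and distance metric below are placeholders, not a real attack.

```python
# Toy sketch of the "brainprint" matching concern: a de-identified EEG feature
# vector is assigned to the closest stored template. Illustrative only.
import numpy as np

def nearest_template(unlabeled_features: np.ndarray,
                     templates: dict[str, np.ndarray]) -> str:
    """Return the identity whose stored template is closest to the new sample."""
    return min(templates,
               key=lambda name: np.linalg.norm(templates[name] - unlabeled_features))

# Example: templates built from past labeled sessions, then one "anonymous" sample.
templates = {
    "subject_A": np.array([0.8, 0.2, 0.5]),
    "subject_B": np.array([0.3, 0.7, 0.1]),
}
print(nearest_template(np.array([0.75, 0.25, 0.45]), templates))  # -> subject_A
```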
In an email, Molnar acknowledged that as “AI continues to accelerate research and development, it is prudent to anticipate how capabilities might evolve and to establish safeguards early.”

He suggested additional ethical review, clear boundaries on acceptable uses, and dialogue across technologists, ethicists, policymakers and technology users.
Aue, who has been working on other research with Neurable for four years, said in an email that research study participants “are fully informed about the nature of the work before participating.”
That said, as the technology matures, conversations “need to occur about the ethical use of the technology and what military doctrine might entail,” he added.
Giordano said that the key to any adoption of AI-based neurotech in the military is mandatory informed consent during and after service.
Personnel must receive updates on any adverse effects and information on “the availability or non-availability of continuity of care if things go wrong,” he added. “That way, they can make their mind up.”
Reading the minds of wounded warriors?
Thanks to AI, implantable brain devices, much like non-invasive brain wearables, now offer troops and veterans novel insights into themselves — and possibly the same intel to adversaries, too.
For instance, new eyesight neuroprostheses rely on AI to “bridge artificial vision and neural tissue,” said Christopher Steele, chief of strategy at the Pentagon-backed nonprofit Medical Technology Enterprise Consortium. The consortium provided $2 million to help develop a prototype.
The camera-equipped BCI uses AI to decode brain signals that convey the landscape a person’s eyes are trying to see. AI then translates data from the camera and other external sensors focused on that view into artificial signals that the mind understands as shapes and motion.
Finally, AI compares the desired and perceived scenes and adjusts the artificial signals to show more detail, said Steele, a former director of the Army Medical Research and Development Command.
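In rough terms, the loop Steele describes compares the scene the camera captures with the scene the wearer actually perceives, then nudges the stimulation to shrink the gap. The sketch below is a heavily simplified illustration of that idea; every name, feature representation and gain value is a placeholder, not the real prosthesis.

```python
# Heavily simplified sketch of a closed visual-prosthesis loop: compare the
# camera's view with the decoded percept and proportionally adjust stimulation.
# All names and numbers are placeholders.
import numpy as np

GAIN = 0.2  # proportional correction step (placeholder value)

def adjust_stimulation(camera_scene: np.ndarray,
                       decoded_percept: np.ndarray,
                       stimulation: np.ndarray) -> np.ndarray:
    """One closed-loop step: nudge stimulation toward reducing the mismatch
    between what the camera sees and what the wearer perceives."""
    error = camera_scene - decoded_percept
    return stimulation + GAIN * error

# Note that nothing in this loop itself verifies the correction is faithful:
# a corrupted camera_scene or decoded_percept silently steers what the wearer
# sees -- the risk discussed below.
stim = np.zeros(3)
stim = adjust_stimulation(np.array([1.0, 0.4, 0.0]), np.array([0.6, 0.4, 0.2]), stim)
```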
This self-adjusting AI, while a potential breakthrough for visually impaired warfighters, also introduces risks, Giordano said.
“The AI that provides continuous closed-loop signals between the eye and the brain is not monitored,” he explained.
As such, it may self-adapt in a way that falsifies images, omits certain elements of the view or delays transmission of an image, any of which may result in a loss of situational awareness, Giordano said.
Compounding the problem, when the AI “system itself is vulnerable to hacking,” an adversary can monitor or corrupt the warfighter’s visual input, he added.
“If we deploy this device downrange” in a combat zone, “can it be hacked by an enemy?” Steele said. “We are concerned about the hacking of any medical capability.”
Currently, the system only operates in clinical trials that are subject to HIPAA, and other protections include hardware isolation, encryption and AI guardrails that prevent unsafe adaptation, he added.
Yet, as Giordano cautioned, self-learning AI, by design, can learn good or bad behavior that may override guardrails.
Another military funding arm, the Amyotrophic Lateral Sclerosis Research Program, is underwriting clinical trials to evaluate BrainGate, a forerunner to Musk’s Neuralink implant that translates words a patient imagines into audible speech.
The Defense Department’s $2.3 million contribution aims to help people with ALS — a disease associated with military service — who cannot move or talk.
This advance is “foundationally fantastic” for people with various neuro-motor conditions who can think of speech but not generate motor output to move their mouths, meaning “they’re communicatively locked in,” Giordano said. The device “allows these individuals the autonomy of communication.”
At the same time, because the device reads signals in the motor cortex that show attempts at moving the mouth, the tool allows for “what might be regarded as mind-reading,” Giordano said.
Currently, human beings, military or not, have no legal right to be free of mind-reading.
Neither international human rights law nor federal regulations safeguard mental privacy, mental autonomy (“free will”) or neural data, according to the Neurorights Foundation.
Only four states afford protection to “neural data” within their state consumer data privacy laws: California, Colorado, Connecticut and Montana. And one country, Chile, has adopted a constitutional amendment to protect “mental integrity.”
As the Foundation’s Genser notes, one of the only countries with a national-level set of principles on the ethical use of BCIs, China, may be the least likely to follow them.
He is pressing for “clear limits” on the purposes for developing and using neurotech and neural data in the military, subject to strong oversight that safeguards the health and autonomy of service members.
“If we don’t have any tools or standards around the national security implications of emerging neurotechnologies, then how is our military different than any other military?” Genser posited.
As an illustration of unregulated brain alteration, he pointed to the CIA’s MKUltra program, a Cold War-era mind-control experiment that tested LSD and other psychedelic drugs on soldiers and others, sometimes with permanent psychological and physical harm.
“The last thing we want is for neurotech to be the next version of MKUltra,” Genser said, “not using drugs, but using neuro-stimulation or other neural approaches to alter brain activity.”
Aliya Sternstein is an investigative journalist who covers technology, cognition and national security. She is also a research analyst at Georgetown Law. Her writing on the intersection of public health and constitutional rights has appeared in the Stanford Law Review, Arizona Law Review, and other law and health academic journals.

