By Clayton Aldern | November 12th 2012 03:27 PM
I took a stab at conversation—“Synapse Gel, eh? Ha. Brain-computer interface firm; I see what you did there.”—but I was quickly chided for smearing the moisturizer with the motion from my speech. I wondered why my aforementioned glance around the room had not provoked the same chastisement.
Tiffany began to give the same conditioning treatment to the four contacts of the EEG headband on the table between us. The headband, which looked like an inflated rubber wristwatch, wriggled under her shiatsu. I glanced around again, now certain that an inappropriately high degree of vigorous kneading of an inanimate object was taking place before me. This scan around the hall was equally fruitless. Nobody was paying attention to the impish rubdown occurring at the Intific booth. All eyes were on the screen. My dweeby instincts were getting ready to articulate some Orwellian parallels, but Tiffany, EEG headband in hands, was suddenly lunging for my unwary head. I must have flinched a bit, because she saw fit to mention that the device would not shock me. I nodded my understanding and bowed my head. After a bit of fumbling around the ears and some awkward eye contact, Tiffany plugged a small cord into an iPad that was more or less pasted to her red-panted hip. Jagged blue waves pounced onto the screen.
“Signal looks good,” she said, shooting me a thumbs-up.
I was ready to play.
EEG stands for electroencephalography. Despite a syllable count that almost breaks into double digits, form follows function quite neatly. Electro- is electricity, enceph- is brain, and -graphy is recording. EEG measures oscillatory electrical activity from the brain via electrode contacts placed on the scalp, summing up and spitting out a coarse representation of the huge populations of neurons that dance their electrochemical jig on the other side of the skull. Neuroscientists love EEG like the French love Édith Piaf. It makes sense; the technology is objectively mind-blowing. A research-grade EEG recording net has up to 256 electrodes, each capable of noninvasively aggregating the electrical output of thousands to millions of brain cells. By recording from subjects performing cognitive tasks, researchers can build a quantitative constellation that maps how the outer few layers of the brain—the cortex—cooperate during the experiment. Researchers use these maps to understand how the brain does what it does and what might be going wrong when it fails at a given task. Other applications of the technology include epileptic seizure prediction and brain tumor detection.
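For the technically curious, here is a minimal sketch of the arithmetic sitting underneath all of this: estimating power in the classic EEG frequency bands from a single electrode. The sampling rate, helper names, and stand-in data are my assumptions, not anything a vendor or research lab publishes.

```python
# Minimal sketch: estimating EEG band power with NumPy/SciPy.
# Assumes `samples` is a 1-D array from a single electrode at 256 Hz;
# real recordings need artifact rejection, re-referencing, and more care.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz

def band_power(samples, low, high, fs=FS):
    """Average spectral power between `low` and `high` Hz."""
    freqs, psd = welch(samples, fs=fs, nperseg=fs * 2)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

# Classic EEG bands: delta, theta, alpha, beta
samples = np.random.randn(FS * 10)  # ten seconds of stand-in "data"
for name, (lo, hi) in {"delta": (1, 4), "theta": (4, 8),
                       "alpha": (8, 13), "beta": (13, 30)}.items():
    print(name, band_power(samples, lo, hi))
```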
Further down the rabbit hole of EEG analysis sub-disciplines, investigators stumble upon the sci-fi-esque quirk of a practice called “decoding.” Decoders seek to read out behavioral correlates from raw EEG signals alone. Given a few seconds of data, they want to predict whether you are looking at a dog, thinking about breakfast, or attempting to stand up and leave the room. Forty-niners in the gold rush of mind reading, decoders use complex algorithmic pans to sift through their cortical muck. As algorithms improve, decoders’ ability to seize the cognitive correlates hidden in a raw signal becomes more robust.
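Stripped of the gold-rush metaphor, a decoder boils down to features extracted from EEG windows, a classifier, and cross-validation to keep everyone honest. The sketch below uses simulated stand-ins for the feature matrix and labels; a real decoder works on epoched, artifact-cleaned recordings.

```python
# Hedged sketch of "decoding": train a classifier to map EEG features
# to behavioral labels. Features and labels here are simulated noise.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Pretend each row is band power per electrode for one 2-second window...
X = rng.normal(size=(200, 16))
# ...and each label says what the subject was doing during that window.
y = rng.integers(0, 2, size=200)  # 0 = "rest", 1 = "imagined movement"

decoder = LinearDiscriminantAnalysis()
print(cross_val_score(decoder, X, y, cv=5).mean())  # ~0.5 on pure noise
```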
Neuroengineers refer to devices that employ such algorithms as brain-computer interfaces (BCIs). The EEG headband that was strapped around my head was an example of a commercial BCI targeting the entertainment industry. For stroke patients and paralyzed individuals, however, research-grade BCIs occupy the promised land of modern rehabilitation. Neuroprosthetics are BCIs that restore motor function using thoughts alone: Implanted or superficial electrodes read out electrical activity from a patient’s motor cortex, a computer translates the signal into a behavioral category—say, reach—and a robotic limb completes the desired motion on behalf of the patient. Where traumatic brain injury and stroke sever the link between thought and action, BCIs reconnect the circuit.
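That closed loop is easier to see as a stub of code than as prose. Everything below is a hypothetical placeholder of my own devising; the point is the architecture of read, decode, act, not any actual implementation.

```python
# Schematic of the neuroprosthetic loop described above. All stand-ins:
# the decoder is a stub and the "robot" just prints its marching orders.
import numpy as np

LABELS = {0: "rest", 1: "reach", 2: "grasp"}

class StubDecoder:
    """Pretend classifier; a real one is trained on motor-cortex data."""
    def predict(self, window):
        return int(np.mean(window) > 0) + int(np.max(window) > 2)

def control_step(window, decoder):
    """Decode one EEG window and forward the intent to the limb."""
    intent = LABELS[decoder.predict(window)]
    if intent != "rest":
        print(f"robot arm: executing '{intent}'")  # limb completes the motion
    return intent

# One decoded 500-ms window of simulated motor-cortex signal:
control_step(np.random.randn(128), StubDecoder())
```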
Intific, along with competing private BCI companies bearing similarly dystopian names, like NeuroSky and Emotiv, manufactures consumer-grade BCIs to sell brain-powered video games. The headsets are cheap and dirty. One hundred dollars will get you four to fourteen electrodes that ostensibly decode the difference between brain states like “focus” and “rest.” A theoretical increase in a user’s attention to the task at hand may beef up the vigor of their avatar boxer. It might give an extra boost of nitrous to their avatar racer. Whatever the game, the algorithms exploit the neuronal dichotomy between attention and relaxation to enhance a player’s gaming experience. And with more futuristic experiences come higher revenues for Intific.
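The vendors do not publish their algorithms, so this is strictly an assumption on my part, but a crude version of a “focus” score might be nothing more than the ratio of beta-band to alpha-band power:

```python
# One common (and crude) way a consumer headset might score "focus":
# beta power over alpha power. This is an assumption on my part; the
# vendors do not disclose what their firmware actually computes.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz

def band_power(samples, low, high, fs=FS):
    freqs, psd = welch(samples, fs=fs, nperseg=fs * 2)
    return psd[(freqs >= low) & (freqs <= high)].mean()

def focus_score(samples):
    """Higher beta relative to alpha is often read as attention."""
    return band_power(samples, 13, 30) / band_power(samples, 8, 13)

print(focus_score(np.random.randn(FS * 2)))  # two seconds of stand-in data
```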
Prior to stumbling upon the Intific booth, I had been unaware of the company’s presence in the consumer BCI sphere. The brain game demo had turned a small portion of the 30,000-strong Society for Neuroscience annual conference into a slightly reticent, slightly ogling swarm. The rubberneckers around me were halfheartedly itching to see how their own brain waves affected their digital performance. I had stepped forward and was about to find out.
Speaking more to my headband than to my ears, a firm voice emerging from between a swath of thick sideburns told me the rules of “NeuroStorm.” I was a neurobot flying through the brain, and it was my job to charge up the neurons that needed charging up. With my neurobot laser, I could wholeheartedly zap the forest of brain cells and watch the signals propagate along their axons. All I had to do was keep up my “focus level.” There was no time for my questions—What was the frequency of the laser that was mounted on the neurobot’s chassis? Was this supposed to be an optogenetics simulation? How were we reconciling membrane conductivity with the heat of the laser? Why, out of the heaping set of all horrible pseudoscientific terms, did they choose to call it a neurobot? What, in the name of Fourier, was a focus level?—because Ron with the sideburns shoved a controller into my hands and pressed the start button. It was time to focus.
As I mashed the A-button, I watched my neurobot guide itself through the network of neurons. All I had to do was press A and concentrate. Or think, or something. Focus. Every zap garnered a handful of points, and my focus multiplier could increase that handful by up to a factor of five. Focus.
“Focus” seemed a touch ambiguous.
I tried mentally concentrating on the act of pressing the A-button on the controller—no change in my focusometer at the bottom of the screen. I meditated on EEG analysis, about which, as a student of neuroscience, I knew a small but sufficient amount. I focused on the unfathomable discomfort of the wet headband. My focus level remained stagnantly red. I thought about the act of thinking. Freud and Jung drifted in and out of consciousness. And then I thought about cats, because I often think about cats.
The signal skyrocketed. The next neuron I lasered pulsed with a lime-green five-times score multiplier. A slow smile crept onto my face. I thought a bit more about cats. I thought a bit about LOLcatz. My focus level was soaring in the green. Calico! Zap! Kittens! Zap! I’m really neurobotting now! If anyone wanted to know what four naïve electrodes saw while observing my cat-thoughts, all they had to do was take a peek at Tiffany’s iPad. Albeit untranslated and rather grimy, the readout was a Rosetta Stone of raw waveform. It was all right there: the A-button, Sigmund and Carl, kitty central, iCanHazCheezburger. With enough mathematical dexterity—a wavelet decomposition here, some principal component analysis there—the thought categories would pop right out. It would be the signal-processing equivalent of translating the Stone. Once a reliable thought signature emerged for a given type of thought, it could be used to identify all future occurrences of that category. I paused. (My focus level dropped.) That last thought made me a little nervous.
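To make the hand-waving slightly less hand-wavy, here is roughly what that dexterity looks like in code: wavelet energies as features, PCA to squash them into a space where thought categories could cluster. The epochs are simulated and the parameters are guesses; treat it as a sketch of the pipeline, not of anyone’s actual decoder.

```python
# Sketch: wavelet features plus PCA over many epochs of raw EEG.
import numpy as np
import pywt
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
epochs = rng.normal(size=(100, 512))  # 100 epochs x 512 samples each

def wavelet_features(epoch, wavelet="db4", level=4):
    """Energy of each wavelet sub-band: a compact spectral signature."""
    coeffs = pywt.wavedec(epoch, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

X = np.vstack([wavelet_features(e) for e in epochs])
reduced = PCA(n_components=2).fit_transform(X)  # 2-D space for clustering
print(reduced.shape)  # (100, 2)
```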
I am not the first person made nervous by reasonable-resolution commercial BCIs. Earlier this year, Oxford University computer scientist Ivan Martinovic and colleagues at UC Berkeley and the University of Geneva explored the question at the heart of the anxiety: Using nothing but these comparatively simple devices and their respective software, could someone hack the brain? The researchers described a hypothetical scenario in which a malicious third-party developer—a brain hacker—writes an application that, unbeknownst to the commercial BCI user, reveals and transmits private information: things like ATM PINs, banking locations, geographical residences, and knowledge of people the user might know.
The theoretical prospect is disturbingly sound. Development for Intific, Emotiv, and NeuroSky games is open to third parties—much like the Apple App Store—and anyone with knowledge of the programming language can write an app. If brain hackers wrote a clever enough program, they could absolutely extract the information without the subject’s knowledge. Martinovic and colleagues set out to do just that. The team designed a mock application, compatible with an Emotiv headset, that flashed images on the screen while test subjects wore the BCI. The goal was to identify whether there was enough information in the Emotiv signal to capture users’ knowledge about and familiarity with various streams of digits, banking firms, geographic locations, and famous people.
Cue sound effect from Inception. The mock brain hack worked—somewhat.
In all cases, there was a slight increase in the probability that the system could predict a user’s familiarity with a given piece of information. Some information types were more susceptible to brain hacking than others. The system, for example, could predict a subject’s month of birth at 60 percent accuracy on the first guess, and at 95 percent accuracy by the sixth. By no means was the system perfect, but the proof of concept was very real. Digital signal-processing algorithms similar to Martinovic’s were likely being used in NeuroStorm to pick out the points in time at which I was in a state of heightened focus. Under the right conditions, the same field of math that converted cat-thoughts to focus could be capable of fishing out my PIN. That was what was so unsettling. It was all the same medium. The life-altering technology of neuroprosthetic BCIs shared an industry with the ultimate skin-crawl of mind robbery. Technological improvements in one branch would inform the other. Policy directed at one could affect both.
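The attack leans on event-related potentials, most famously the P300, a positive deflection roughly 300 milliseconds after a stimulus the subject recognizes. A toy version of that analysis, on simulated data, looks something like this:

```python
# Sketch of the side-channel idea: average the EEG around each stimulus
# onset and compare responses to "familiar" vs "unfamiliar" images. A
# larger deflection near 300 ms post-stimulus (the P300) betrays
# recognition. Simulated data; a real attack needs many trials and
# careful statistics.
import numpy as np

FS = 256  # assumed sampling rate in Hz
rng = np.random.default_rng(2)

def erp(trials):
    """Average over trials to suppress noise and reveal the evoked response."""
    return np.mean(trials, axis=0)

familiar = rng.normal(size=(40, FS))    # 1-second epochs after known images
unfamiliar = rng.normal(size=(40, FS))  # epochs after unknown images
t300 = int(0.3 * FS)                    # sample index near 300 ms
print(erp(familiar)[t300] - erp(unfamiliar)[t300])  # recognition signature?
```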
As my submarine-like neurobot cruised past the last neuron on the screen, it was left suspended in a purplish abyss. A score tabulator appeared onscreen and informed me that I had set a new record. (!) Tiffany and her red pants gave the slightest of fist pumps; a bushel of sideburns billowed beside me in the convention center air. An ethereal hand emanating from behind the video screen handed me a malleable stress brain. Until that point, I had been unaware that anyone else on the Intific team had been watching my performance. Carefully peeling off my headband, Tiffany began explaining the exciting new partnership between Intific and Advanced Brain Monitoring, Inc. that had allowed for the creation of NeuroStorm. Both teams were quite excited to be entering the market, especially because their Synapse Gel-aided wet electrodes offered greater resolution than the dry electrodes at NeuroSky and Emotiv.
“We don’t even consider them competitors,” she explained, showing some teeth.
I asked her if she was aware of the Martinovic et al. paper from August 2012. She was not.
Tiffany told me that NeuroStorm had debuted at a recent Google tradeshow. I nodded; she nodded back. Together, we shared maybe three-quarters of a smile. I looked at the spongy cerebral stress ball in my hand and shifted my weight from one leg to another. I averted my eyes, uttered a lackluster “thanks and good luck,” and scampered off with the pamphlets she had somehow slipped into my hands. Damn, they were sneaky.
But other companies are even sneakier. Japanese tech designer Neurowear is the developer of Necomimi, a set of motorized plush ears strapped to a dry-electrode BCI. The ears perk up during focus and turn down otherwise. They come in a rainbow of colors. They are tremendously cute. Next in the pipeline from Neurowear is Shippo (an extension of Necomimi), a motorized tail that wags a bit more spiritedly during focus in addition to updating the social network of your choice with a mood and a geotag. If all goes according to plan—for Neurowear, Intific, and the rest—they will know where you are and how you are feeling at any given point in the day, assuming you are sporting the ultra-posh animal ears that are currently selling like hotcakes. Give decoders another ten years and they will likely know what you are thinking about, too. Sure, the claim is unnecessarily inflammatory, but buried beneath the sensationalism is a kernel of scientific truth that is difficult to ignore. Open-sourcing the software (and the hardware—any institutionally affiliated researcher can pick up an Emotiv headset for free) is an invitation to experimentation. Technology is only going to improve.