Tuesday, April 23, 2013

These Brain-Scanning Neuro-Toys Are About To Change Everything

Via http://www.fastcompany.com/3008499/tech-forecast/these-brain-scanning-neuro-toys-are-about-change-everything

 
New technology that lets users control game avatars and music playlists with their brainwaves could give stroke patients and the profoundly disabled new ways to communicate.
A small but growing industry of inventors, neurologists, and investors is betting on consumers controlling smartphones, music players, and even desktop computers with their brains.
Innovations in software development kits (SDKs) alongside cheap, ever more sophisticated brainwave readers mean people with money to spend can play computer games through thought alone. But the products they are using--and the patents behind them--could change the world for the neurologically impaired in a decade or two.

Last month at SXSW, Canadian neuroscientist and artist Ariel Garten showed off her commercial brainchild. The Muse is a $200 sensor-enabled headband that connects to PCs and Macs and allows users to control games with their thoughts or engage in rudimentary neurofeedback. Garten spoke about the Muse and her company, InteraXon, at a TEDx talk in Toronto in late 2012, which went viral thanks to a discussion of where the technology behind the Muse could lead. Headbands are expected to ship to customers in late 2013.


Using the Muse was an interesting experience. I had the opportunity to test a prototype, and the headband slipped on easily--no sterile environment or special electrode setup was required. The headband was accompanied by a number of games and apps, all of which turn brainwaves into data input through embedded electroencephalograph (EEG) sensors. Although the games were dead simple, they were controlled by my thoughts: I was able to manipulate my avatar's motions on screen by thinking happy, sad, or anxious thoughts. Whenever I tried to throw the interface a curveball, it seemed to react reasonably well to whatever line of thought or emotion I was engaged in.

Garten told Fast Company that she first began experimenting with brain-computer interfaces in 2003. Along with InteraXon co-founder Chris Aimone, she created public art installations where people's brainwaves could change the art. “We started by creating concerts where 48 people at a time could control a musician's output, which would then affect people's brain states when they heard it, in a regenerative cycle. We went on to create more musical performances, where musicians could be jamming along to music directly with their brains. It was tons of fun,” Garten said.
EEG-reading headbands aren't used only for consumer games, either. Another product making the rounds at SXSW was the Zen Tunes app from Japanese firm Neurowear. Neurowear, which was featured in Co.Design a few years ago for its cosplay brain-powered cat ears (really), built an integrated prototype headset and iOS app combo that generates playlists tailored to a user's brainwaves. Neurowear customers put on an EEG-enabled headset and load songs from their music library onto a playlist. Once the songs are playing, algorithms within the Zen Tunes app analyze brainwaves for EEG patterns associated with focus and relaxation. These patterns are then used to sort music into playlists that, ideally, will match the user's specific moods.
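Neurowear hasn't published how Zen Tunes actually scores songs, but the idea it describes maps to a simple sorting step. The Python sketch below is a hypothetical illustration only: it assumes each song has already been tagged with average EEG-derived focus and relaxation scores recorded while the user listened, and buckets the library accordingly.

```python
# Hypothetical sketch of Zen Tunes-style playlist sorting. Neurowear has not
# published its algorithm; we assume each song carries average EEG-derived
# "focus" and "relaxation" scores recorded while the user listened to it.

from dataclasses import dataclass

@dataclass
class Song:
    title: str
    focus_score: float       # assumed 0.0-1.0: higher = more focused listening
    relaxation_score: float  # assumed 0.0-1.0: higher = more relaxed listening

def build_playlists(library: list[Song]) -> dict[str, list[Song]]:
    """Bucket each song by whichever EEG-derived state dominated while it played."""
    playlists: dict[str, list[Song]] = {"focus": [], "relax": []}
    for song in library:
        key = "focus" if song.focus_score >= song.relaxation_score else "relax"
        playlists[key].append(song)
    # Strongest-scoring songs first within each playlist.
    playlists["focus"].sort(key=lambda s: s.focus_score, reverse=True)
    playlists["relax"].sort(key=lambda s: s.relaxation_score, reverse=True)
    return playlists
```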

Brain-computer interfaces (BCIs) have been around since the 1970s, when clunky EEG readers were used in laboratory settings for rudimentary neurofeedback and biofeedback programs. Although the readings themselves have not changed significantly over the past forty-odd years, the equipment used to capture them has. Instead of requiring a university laboratory or a quiet, sound-isolated room to avoid false positives, and instead of requiring nurses or lab technicians to assist with setup, EEG readers have become consumer technology. The Muse headband, Neurowear's floppy animal ears, and competing products from firms like Axio are easy-to-use diversions for anyone with a few hundred dollars to burn. Today's brain-reading headbands require no medical training, have a tiny learning curve, and are, frankly, a ton of fun to use.

As Garten put it, “The main concept in brain-machine interfaces is that changes in your brain are reflected in changes in some signal, in our case EEG, which can then be used as a kind of control action to a machine, without the need of using any physical action or command. Your brainwaves (EEG) are small electrical potentials on your scalp as a result of neurons firing in your brain, and Muse's four electrodes record this fluctuating voltage several hundred times per second. These voltages are converted to digital signals and a stream of numbers is sent to your PC or Mac via Bluetooth.”
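Garten's description amounts to a straightforward acquisition pipeline: sample analog scalp voltages on four channels a few hundred times per second, digitize them, and stream the numbers to a computer. The Python sketch below mimics that flow; the 256 Hz rate, the microvolt scaling, and the use of random noise as a stand-in for real electrodes are all assumptions, not Muse's actual protocol.

```python
# Illustrative sketch of the pipeline Garten describes: analog scalp voltages
# sampled a few hundred times per second on four channels, digitized, and
# streamed to a computer as numbers. Rate, scaling, and layout are assumed.

import random  # stands in for real electrode input in this sketch

SAMPLE_RATE_HZ = 256   # "several hundred times per second" (assumed exact value)
NUM_CHANNELS = 4       # the Muse has four electrodes

def read_electrodes() -> list[float]:
    """Placeholder for one analog reading per electrode, in microvolts."""
    return [random.gauss(0.0, 20.0) for _ in range(NUM_CHANNELS)]

def eeg_stream():
    """Yield one digitized multi-channel sample per tick, like a Bluetooth feed."""
    while True:
        yield read_electrodes()

# A consumer (a game or neurofeedback app) pulls samples off the stream:
stream = eeg_stream()
window = [next(stream) for _ in range(SAMPLE_RATE_HZ)]  # one second of data
```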
The SDK lets developers turn these EEG signals into data, which can then control program avatars. Alternatively, developers could use the SDK to write neurofeedback software that lets users view their brain activity and trace patterns related to hyperactivity or anxiety. According to Garten, Muse's SDK will also provide some preliminary analysis tools that let you extract more meaningful interpretations of the data, such as the power of "alpha" or "beta" frequencies, and use that as control signals to various devices.
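Extracting that kind of band power is a standard signal-processing step. Here is a minimal sketch, assuming NumPy and the conventional band edges (alpha roughly 8-12 Hz, beta roughly 13-30 Hz); it is not InteraXon's actual SDK code, just an illustration of the technique Garten names.

```python
# Minimal sketch of alpha/beta band-power extraction from one channel's window
# of EEG samples. Band edges follow common convention; this is an illustration,
# not InteraXon's SDK.

import numpy as np

def band_power(samples: np.ndarray, fs: float, lo: float, hi: float) -> float:
    """Total spectral power between lo and hi Hz, via an FFT periodogram."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs < hi)
    return float(spectrum[mask].sum())

def control_signal(samples: np.ndarray, fs: float = 256.0) -> float:
    """Relative alpha power: rises as the wearer relaxes, falls as they focus."""
    alpha = band_power(samples, fs, 8.0, 12.0)   # alpha band, ~8-12 Hz
    beta = band_power(samples, fs, 13.0, 30.0)   # beta band, ~13-30 Hz
    return alpha / (alpha + beta + 1e-12)        # epsilon avoids divide-by-zero
```

A game could then map this single number to avatar behavior, for example lifting the avatar as relative alpha power (relaxation) climbs.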

However, the real future potential for brain-computer interfaces is in healthcare. When I spoke to Garten, she was forthcoming about everything except the potential repurposing of Muse's technology for clinical applications. There's a reason for that. Brain-computer interfaces, and higher-end versions of the sensors used in consumer headbands like the Muse, have world-changing ramifications for traumatic brain injury patients, stroke victims, and individuals with physical disabilities.

Kamran Fallahpour is a psychologist at New York's Brain Resource Center who has worked with brain-computer interfaces for more than 20 years. The Brain Resource Center uses brain mapping and brain-computer interfaces with patients dealing with everything from mood disorders to traumatic brain injuries. Other patients are professional musicians or actors seeking brain mapping in the course of peak performance training. When these patients visit the office, they essentially use a more sophisticated version of Neurowear's and Muse's software input kits.
According to Fallahpour, the big innovation in brain-computer interfaces is the ever-increasing capability of computers. Even an iPhone has the processing power to parse streams of data points that a 1990s-vintage 486 could not. The human mind is immensely complicated, and neurologists understand very little of it; nonetheless, even the small slice of information that brain-computer interfaces transform into bits and bytes overwhelmed past computers. Advances in technology mean it's now possible to have basic home mind-reading headbands for your smartphone or laptop--something that was science fiction until quite recently.

But while it's fascinating to control avatars in Angry Birds-type games with your thoughts, it's still all fun and games. According to Fallahpour, most consumer EEG headbands and brain-computer interfaces are “toys” that lack the capabilities of research- and clinical-grade systems. Their inexpensive sensors generate a large number of artifacts, produce only coarse readings, can be affected by physical movement, and detect only basic states like stress and relaxation. The more sophisticated versions of these commercial brain-computer interfaces are now being used in hundreds of laboratories nationwide in neurofeedback projects that treat post-traumatic stress disorder, hyperactivity, and a host of other conditions.
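Artifacts are the crux of Fallahpour's “toy” distinction: a blink or a head movement can swamp the microvolt-scale EEG signal. One crude mitigation, far simpler than what clinical-grade systems do, is to discard any window of samples whose peak amplitude is implausibly large. The sketch below illustrates the idea; the 100-microvolt cutoff is an assumed value.

```python
# Toy artifact rejection: drop any window whose peak amplitude suggests a
# blink or movement rather than genuine EEG. The cutoff is an assumption.

import numpy as np

ARTIFACT_THRESHOLD_UV = 100.0  # assumed cutoff; scalp EEG rarely exceeds this

def is_clean(window: np.ndarray) -> bool:
    """Keep a window only if its peak amplitude looks like genuine EEG."""
    return float(np.max(np.abs(window))) < ARTIFACT_THRESHOLD_UV
```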

Other brain-computer interfaces, however, are far more sophisticated. Back in 2010, Fast Company reported on an early project to type into computers using brain-computer interfaces. Since then, such systems have grown even more capable--and they could change the world for disabled patients. Researchers at Drexel University College of Medicine in Pennsylvania are currently studying brain-computer interfaces for ALS patients. Using laboratory-quality EEG headsets, scientists hope to see whether individuals with ALS with “extreme loss of neuromuscular control and severe communication impairments” can make selections on a computer screen with their brains alone. In similar projects, patients were able to type short text messages using only their brain waves.
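The article doesn't describe Drexel's actual selection method, but one common pattern in studies like this is dwell selection: options on screen are highlighted in turn, and an option is chosen when the user's EEG-derived control signal stays above a threshold for a sustained period. A hypothetical sketch, with assumed threshold and timing values:

```python
# Dwell-based on-screen selection, sketched under assumptions (the article
# does not describe Drexel's actual method). Options are highlighted in turn;
# one is chosen after the control signal stays high for a full dwell period.

from collections.abc import Iterator

def select_option(options: list[str],
                  signals: Iterator[float],
                  threshold: float = 0.7,       # assumed control-signal cutoff
                  dwell_windows: int = 5,       # sustained windows needed to pick
                  windows_per_option: int = 10  # how long each option stays lit
                  ) -> str:
    """Cycle a highlight through options; pick one after sustained intent."""
    consecutive = 0
    prev_index = 0
    for tick, value in enumerate(signals):
        index = (tick // windows_per_option) % len(options)  # current highlight
        if index != prev_index:
            consecutive = 0  # the highlight moved on; restart the dwell count
            prev_index = index
        consecutive = consecutive + 1 if value > threshold else 0
        if consecutive >= dwell_windows:
            return options[index]
    raise RuntimeError("signal stream ended before a selection was made")
```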

Drexel is currently recruiting participants with ALS in the Philadelphia metropolitan area for the study.
