Tuesday, April 23, 2013

Samsung Demos a Tablet Controlled by Your Brain

Via MIT Tech Review http://www.technologyreview.com/news/513861/samsung-demos-a-tablet-controlled-by-your-brain/

An easy-to-use EEG cap could expand the number of ways to interact with your mobile devices.

One day, we may be able to check e-mail or call a friend without ever touching a screen or even speaking to a disembodied helper. Samsung is researching how to bring mind control to its mobile devices with the hope of developing ways for people with mobility impairments to connect to the world. The ultimate goal of the project, say researchers in the company’s Emerging Technology Lab, is to broaden the ways in which all people can interact with devices.

In collaboration with Roozbeh Jafari, an assistant professor of electrical engineering at the University of Texas at Dallas, Samsung researchers are testing how people can use their thoughts to launch an application, select a contact, select a song from a playlist, or power a Samsung Galaxy Note 10.1 up or down. While Samsung has no immediate plans to offer a brain-controlled phone, the early-stage research, which involves a cap studded with EEG-monitoring electrodes, shows how a brain-computer interface could help people with mobility issues complete tasks that would otherwise be impossible.

Brain-computer interfaces that monitor brainwaves through EEG have already made their way to the market. NeuroSky’s headset uses EEG readings as well as electromyography to pick up signals about a person’s level of concentration to control toys and games (see “Next-Generation Toys Read Brain Waves, May Help Kids Focus”). Emotiv Systems sells a headset that reads EEG and facial expression to enhance the experience of gaming (see “Mind-Reading Game Controller”).

To use EEG-detected brain signals to control a smartphone, the Samsung and UT Dallas researchers monitored well-known brain activity patterns that occur when people are shown repetitive visual patterns. In their demonstration, the researchers found that people could launch an application and make selections within it by concentrating on an icon that was blinking at a distinctive frequency.

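The article doesn't detail the team's detection algorithm, but concentrating on an icon that flickers at a known rate is the signature of a steady-state visually evoked potential (SSVEP) interface: attending to a flickering stimulus boosts EEG power at that stimulus's frequency. The sketch below is a minimal, hypothetical illustration of that idea, comparing spectral power at each icon's flicker frequency and picking the strongest. The sampling rate, channel choice, and frequencies are all assumptions, not Samsung's parameters.

```python
import numpy as np
from scipy.signal import welch

# All parameters here are assumptions for illustration; the article does
# not describe Samsung's actual setup.
FS = 256                               # EEG sampling rate in Hz (assumed)
ICON_FREQS = [8.0, 10.0, 12.0, 15.0]   # flicker rate of each on-screen icon (assumed)

def classify_ssvep(eeg_window):
    """Guess which flickering icon the user is attending to.

    eeg_window: 1-D array of samples from a single (e.g., occipital) EEG
    channel. Returns the index of the icon whose flicker frequency shows
    the strongest spectral power in the window.
    """
    freqs, power = welch(eeg_window, fs=FS, nperseg=2 * FS)
    scores = []
    for f in ICON_FREQS:
        band = (freqs >= f - 0.5) & (freqs <= f + 0.5)  # narrow band around the flicker rate
        scores.append(power[band].mean())
    return int(np.argmax(scores))

# A synthetic 5-second window dominated by a 10 Hz response should select
# the second icon (index 1) most of the time despite the added noise.
t = np.arange(0, 5, 1 / FS)
window = np.sin(2 * np.pi * 10.0 * t) + 0.8 * np.random.randn(t.size)
print(classify_ssvep(window))
```
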
Robert Jacob, a human-computer interaction researcher at Tufts University, says the project fits into a broader effort by researchers to find more ways for communicating with small devices like smartphones. “This is one of the ways to expand the type of input you can have and still stick the phone in the pocket,” he says.

Finding new ways to interact with mobile devices has driven the project, says Insoo Kim, Samsung’s lead researcher. “Several years ago, a small keypad was the only input modality to control the phone, but nowadays the user can use voice, touch, gesture, and eye movement to control and interact with mobile devices,” says Kim. “Adding more input modalities will provide us with more convenient and richer ways of interacting with mobile devices.”

Still, it will take considerable research for a brain-computer interface to become a new way of interacting with smartphones, says Kim. The team's initial focus was to develop signal-processing methods that could extract, from weak and noisy EEG signals, the information needed to control a device, and to get those methods running on a mobile device.
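
Those methods aren't published in the article, but a typical first stage in any EEG pipeline is to bandpass-filter the signal to the band of interest and discard windows swamped by blink or movement artifacts. A minimal sketch, with an assumed sampling rate, band edges, and threshold:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256  # assumed EEG sampling rate in Hz

def preprocess(raw_window, low=5.0, high=30.0, artifact_threshold=100.0):
    """Bandpass-filter one window of raw EEG, then reject it outright if
    its amplitude suggests a blink or movement artifact. Band edges and
    the artifact threshold (in microvolts) are illustrative guesses."""
    b, a = butter(4, [low / (FS / 2), high / (FS / 2)], btype="band")
    filtered = filtfilt(b, a, raw_window)
    if np.max(np.abs(filtered)) > artifact_threshold:
        return None  # caller should skip this window
    return filtered

# Usage: clean = preprocess(raw_window); None means the window was rejected.
```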

Jafari’s research is addressing another challenge—developing more convenient EEG sensors. Classic EEG systems have gel or wet contact electrodes, which means a bit of liquid material has to come between a person’s scalp and the sensor. “Depending on how many electrodes you have, this can take up to 45 minutes to set up, and the system is uncomfortable,” says Jafari. His sensors, however, do not require a liquid bridge and take about 10 seconds to set up, he says. But they still require the user to wear a cap covered with wires.

The concept of a dry EEG is not new, and it can carry the drawback of lower signal quality, but Jafari says his group is improving the system’s processing of brain signals. Ultimately, if reliable EEG contacts were convenient to use and slimmed down, a brain-controlled device could look like “a cap that people wear all day long,” says Jafari.

Kim says the speed with which a user of the EEG-control system can control the tablet depends on the user. In the team’s limited experiments, users could, on average, make a selection once every five seconds with an accuracy ranging from 80 to 95 percent.
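
Selection time and accuracy are the two numbers BCI researchers usually fold into a single throughput figure, the Wolpaw information transfer rate. The article doesn't say how many selectable targets were on screen, so the calculation below assumes four, purely for illustration:

```python
import math

def wolpaw_itr(n_targets, accuracy, seconds_per_selection):
    """Wolpaw information transfer rate in bits per minute, a standard
    summary of BCI throughput given accuracy and selection speed."""
    p, n = accuracy, n_targets
    bits = math.log2(n) + p * math.log2(p)
    if p < 1.0:
        bits += (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * (60.0 / seconds_per_selection)

# Assuming four on-screen targets (the article does not say how many):
for acc in (0.80, 0.95):
    print(f"accuracy {acc:.0%}: {wolpaw_itr(4, acc, 5.0):.1f} bits/min")
# -> roughly 11.5 bits/min at 80% accuracy, 19.6 bits/min at 95%
```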

“It is nearly impossible to accurately predict what the future might bring,” says Kim, “but given the broad support for initiatives such as the U.S. BRAIN initiative, improvements in man-machine interfaces seem inevitable” (see “Interview with BRAIN Project Pioneer: Miyoung Chun”).
