Alisa Brownlee, ATP, CAPS blog offers recent articles and web information on ALS, assistive technology (including augmentative and alternative communication, or AAC), computer access, and other electronic devices that can impact and improve the quality of life for people with ALS. Email: abrownlee@alsa-national.org. Any views or opinions presented on this blog are solely those of the author and do not necessarily represent those of the ALS Association.
Monday, March 6, 2017
Brain-controlled Robots
https://www.ft.com/content/0b5a1fdc-0259-11e7-aa5b-6bb07f5c8e12
US researchers have taken a step toward telepathic communication from people to robots, by developing a mind-reading device that allows humans to correct a machine instantly with nothing more than brainwaves.
The prototype brain-computer interface developed by the Massachusetts Institute of Technology enables a human observer to transmit an immediate error message to a robot, telling it to fix a mistake when it does something wrong.
Technology that allows humans to interact with robots intuitively through their thoughts could have a range of industrial and medical applications, from robotic limbs to self-driving vehicles.
“Imagine being able to instantaneously tell a robot to do a certain action, without needing to type a command, push a button or even say a word,” said Daniela Rus, director of MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL). “A streamlined approach like that would improve our abilities to supervise factory robots, driverless cars and other technologies we haven’t even invented yet.”
The MIT prototype uses an electroencephalography (EEG) cap to record the human observer’s brain activity.
While it was designed to handle simple binary choice activities — in this case sorting objects into two categories — Professor Rus hopes that after further research it will enable people to interact with more complex robots.
Brain-computer interfaces are one of the hottest research fields in science.
Other labs are developing them to let disabled people operate robotic limbs or to give communications ability to locked-in patients who are too paralysed even to blink their eyes.
The problem is that these systems generally need either an electronic implant or, when using an EEG, considerable training to get the computer to recognise the user’s brainwaves.
The MIT team, working with Boston University neuroscientists, focused on brain signals called error-related potentials, or ErrPs, which the brain generates when it notices a mistake. They were able to detect a characteristic neural pattern in the observer’s brainwaves, which the team’s machine-learning algorithm picked up within a hundredth of a second.
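To make that mechanism concrete, below is a minimal sketch of how single-trial ErrP detection can be framed as a binary classification problem over EEG epochs. It is written in Python with scikit-learn and synthetic data; the channel count, window length, sampling rate, and choice of classifier are illustrative assumptions, not the MIT team's actual pipeline.

# Minimal sketch: ErrP detection framed as binary classification of EEG epochs.
# All numbers (channels, window, sampling rate) and the classifier choice are
# illustrative assumptions, not the MIT team's actual pipeline.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

N_EPOCHS = 200    # EEG epochs, each time-locked to one robot action
N_CHANNELS = 48   # hypothetical EEG cap channel count
N_SAMPLES = 100   # e.g. a 200 ms window sampled at 500 Hz

# Synthetic data: label 1 = observer saw the robot err, 0 = correct action.
labels = rng.integers(0, 2, N_EPOCHS)
epochs = rng.normal(size=(N_EPOCHS, N_CHANNELS, N_SAMPLES))
# Inject a weak ErrP-like deflection into "error" epochs so the demo has signal.
epochs[labels == 1, :, 40:60] += 0.3

# Flatten each epoch into a feature vector and classify with shrinkage LDA,
# a common baseline for single-trial event-related potential detection.
X = epochs.reshape(N_EPOCHS, -1)
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
print("cross-validated accuracy:", cross_val_score(clf, X, labels, cv=5).mean())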
“As you watch the robot, all you have to do is mentally agree or disagree with what it is doing,” explains Professor Rus. “You don’t have to train yourself to think in a certain way. The machine adapts to you and not the other way around.”
The robot, called Baxter, recognised error signals from 12 volunteers who had no training or previous EEG experience. When the human observer saw Baxter moving to put an object into the wrong box, the brain signal came in time for the robot to correct its manoeuvre.
As ErrP signals can be quite faint, the system can also pick up a stronger “secondary error” brain message if the robot has not corrected its mistake.
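The two-stage corrective loop implied here can be sketched in a few lines. The following Python sketch is a hypothetical illustration only; detect_errp(), detect_secondary_errp(), and the robot object are invented placeholders, not a real BCI or robot API.

# Hypothetical sketch of the two-stage correction policy described above.
# detect_errp(), detect_secondary_errp(), and `robot` are invented
# placeholders, not a real BCI or robot API.
def supervise_binary_sort(robot, item, predicted_bin,
                          detect_errp, detect_secondary_errp):
    """Let the robot act on its binary choice, flipping it on error signals."""
    robot.begin_move(item, target=predicted_bin)

    # Primary check: a faint ErrP shortly after the move starts.
    if detect_errp(window_ms=500):
        robot.redirect(item, target=1 - predicted_bin)
        return

    # Fallback: if the mistake stands, the observer's brain emits a stronger
    # "secondary error" signal, which is easier to detect reliably.
    if detect_secondary_errp(window_ms=1000):
        robot.redirect(item, target=1 - predicted_bin)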
“Today’s robots aren’t great communicators... Think of the potential if they could read our thoughts,” said Prof Rus, who sees the development as heralding a coming age of what she calls “pervasive robotics”. The technology could, for example, enable passengers in self-driving cars to become effective “back-seat drivers”.
Wolfram Burgard, professor of computer science at the University of Freiburg, who was not involved in the project, said: “This work brings us closer to developing effective tools for brain-controlled robots and prostheses. Given how difficult it can be to translate human language into a meaningful signal for robots, work in this area could have a truly profound impact on the future of human-robot collaboration.”
The MIT team’s results will be presented at the IEEE International Conference on Robotics and Automation in Singapore in May.