The Alisa Brownlee, ATP, CAPS blog offers recent articles and web information on ALS and assistive technology, including augmentative and alternative communication (AAC), computer access, and other electronic devices that can improve the quality of life for people with ALS. Email: abrownlee@alsa-national.org. Any views or opinions presented on this blog are solely those of the author and do not necessarily represent those of the ALS Association.
Thursday, March 2, 2017
Steering A Turtle With Your Thoughts
Researchers have developed a technology that can remotely control an animal’s movement with human thought.
Asian Scientist Newsroom | March 2, 2017 | Technology
Researchers at the Korea Advanced Institute of Science and Technology (KAIST) have developed a brain-computer interface (BCI) that can control a turtle using human thought. Their findings have been published in the Journal of Bionic Engineering.
Unlike previous research—most notably in insects—that has tried to control animal movement through invasive methods, Professors Lee Phill-Seung and Jo Sungho of KAIST propose a conceptual system that can guide an animal's path by triggering its instinctive escape behavior.
They chose a turtle because of its cognitive abilities as well as its ability to distinguish different wavelengths of light. Specifically, turtles can recognize a white light source as an open space and so move toward it. They also show specific avoidance behavior to things that might obstruct their view.
Turtles also move toward and away from obstacles in their environment in a predictable manner.

The entire human-turtle setup is as follows: a head-mounted display (HMD) is combined with a BCI to immerse the human user in the turtle's environment. The human operator wears the BCI-HMD system, while the turtle carries a 'cyborg system'—consisting of a camera, Wi-Fi transceiver, computer control module, and battery—mounted on its upper shell. Also mounted on the shell is a black semi-cylinder with a slit, which forms the 'stimulation device.' This can be turned ±36 degrees via the BCI.

The human operator receives real-time video from the camera mounted on the turtle and uses it to decide where the turtle should move. The operator's thought commands are recognized by the wearable BCI system as electroencephalography (EEG) signals. The BCI can distinguish between three mental states: left, right, and idle. The left and right commands activate the turtle's stimulation device via Wi-Fi, turning it so that it obstructs the turtle's view on one side. This invokes the turtle's natural instinct to move toward light, changing its direction.
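The control loop described above can be sketched in a few lines of Python. This is only an illustration of the idea, not the researchers' software: the class name, the simple state-to-angle mapping, and the ±36-degree step are stand-ins for whatever the real KAIST system does internally.

```python
STEP_DEG = 36  # the slit in the stimulation device turns +/-36 degrees via the BCI


class StimulationDevice:
    """Hypothetical model of the shell-mounted semi-cylinder with a slit.

    Rotating the slit blocks the turtle's view on one side; the turtle
    instinctively turns toward the remaining open, lit side.
    """

    def __init__(self):
        self.slit_angle = 0  # degrees; 0 = centered, view unobstructed

    def command(self, mental_state):
        # The BCI classifies EEG into one of three mental states.
        if mental_state == "left":
            self.slit_angle = -STEP_DEG   # obstruct the right side -> turtle turns left
        elif mental_state == "right":
            self.slit_angle = +STEP_DEG   # obstruct the left side -> turtle turns right
        else:  # "idle": leave the view unobstructed, turtle keeps going
            self.slit_angle = 0
        return self.slit_angle


# A short simulated session: the operator steers right, coasts, then steers left.
device = StimulationDevice()
for state in ["right", "idle", "left"]:
    print(state, device.command(state))
```

In the real system, each `command` would be sent to the shell-mounted control module over Wi-Fi, and the operator would issue the next mental command after seeing the result in the camera feed.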
Finally, the human acquires updated visual feedback from the camera mounted on the shell and in this way continues to remotely steer the turtle's trajectory. The researchers demonstrated the animal-guiding BCI in a variety of environments, with turtles moving indoors and outdoors on many different surfaces, such as gravel and grass, and tackling a range of obstacles, including shallow water and trees. This technology could be developed further by integrating positioning systems and improved augmented- and virtual-reality techniques, enabling applications such as devices for military reconnaissance and surveillance.
The article can be found at: Kim et al. (2016) Remote Navigation of Turtle by Controlling Instinct Behavior via Human Brain-computer Interface
Source: Korea Advanced Institute of Science and Technology. Disclaimer: This article does not necessarily reflect the views of AsianScientist or its staff. Read more from Asian Scientist Magazine at: https://www.asianscientist.com/2017/03/tech/turtle-human-brain-computer-interface/