Alisa Brownlee, ATP, CAPS blog offers recent articles and web information on ALS, assistive technology--augmentative alternative communication (AAC), computer access, and other electronic devices that can impact and improve the quality of life for people with ALS.
Any views or opinions presented on this blog are solely those of the author and do not necessarily represent those of the ALS Association.
A team of South African scientists has managed to hook up a human brain to the internet in real time for the first time. Dubbed Brainternet, the project is a significant first step toward a plethora of brain-controlled IoT applications.
The researchers used a tiny Raspberry Pi computer paired with an EEG, a device that receives electrical information from electrodes placed on a person’s head. The device sent information instantly to a website, where any person or device was able to access it in real time.
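The article does not publish Brainternet's actual software, but the idea described above — a small computer packaging live EEG samples for a website — can be sketched roughly. Everything here (the channel name, sample rate, and JSON format) is an assumption for illustration only:

```python
import json
import math
import random

def sample_eeg(n_samples, rate_hz=250):
    """Simulate one EEG channel: a 10 Hz alpha rhythm plus noise.
    A stand-in for real electrode data, since the Brainternet
    hardware and protocol are not described in detail."""
    return [
        math.sin(2 * math.pi * 10 * t / rate_hz) + random.gauss(0, 0.2)
        for t in range(n_samples)
    ]

def make_payload(samples, channel="Fp1"):
    """Package samples as JSON, the way a small relay on a
    Raspberry Pi might send them to a web server for live display."""
    return json.dumps({"channel": channel, "rate_hz": 250, "samples": samples})

payload = make_payload(sample_eeg(4))
```

In a real system the payload would be posted continuously to a server; here it simply shows the shape such a stream might take.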
The Brainternet team is led by Adam Pantanowitz, electrical engineer, researcher and lecturer at Wits University in Johannesburg.
“Brainternet is a new frontier in brain-computer interface systems […] Brainternet seeks to simplify a person’s understanding of their own brain and the brains of others; it does this through continuous monitoring of brain activity as well as enabling some interactivity.”
In other words, the Brainternet demonstration is still limited to a display of brain activity, but researchers around the globe are working on turning EEG electrical signals into actionable commands that computers and other internet-enabled devices can understand and follow; such systems are called brain-computer interfaces (BCIs).
Pairing a BCI with a Brainternet-like device is the step forward that will begin erasing the boundaries between the human brain and machines, allowing us to envision a very near future where our knowledge and physical reach can expand as we have instant access to the internet.
As mobile internet becomes more widespread and companies proliferate around interfaces for human interaction, such as Google Glass, Oculus Rift, and other virtual and augmented reality technologies, it is not a stretch to envision that a tight integration of thoughts and machines will be common within our lifetimes.
Wikipedia, news, social media and all kinds of online content are getting closer to becoming seamlessly integrated with the human brain’s electrical activity. Met someone you like? Find them on Instagram or Facebook right away. Met a Silicon Valley entrepreneur or talent? Connecting to Lifograph would give you all the relevant info you need to decide whether they are a good fit for your company or not, in a matter of seconds.
Furthermore, as we move on from the IoT to the Tactile Internet, the flow of real-time information from your brain to internet-connected devices could let you start your coffee brewing with a thought from bed, or enable a surgeon to remotely control a robotic arm and save a child in Africa.
The possibilities are endless, as are the profound changes that BCI and Brainternet-like technologies imply for our tech-dependent existence.
For people who are paralyzed or have mobility issues due to injury or illnesses, using a mouse, keyboard, or touchscreen can be difficult, if not impossible. And computer interfaces designed for them can be prohibitively expensive.
Giving people with disabilities better access to the world around them is the aim of the Neil Squire Society, a nonprofit in Burnaby, B.C., Canada. The organization is named after a man who became paralyzed in the early 1980s from injuries he suffered in a car accident. To help him communicate, Squire’s cousin, an engineer, connected an Apple IIe computer to a Morse code transmitter that Squire could control using “sip and puff” technology, which sends electrical signals by sipping and puffing on a tube. The device became known as the Joust.
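The sip-and-puff Morse idea above can be sketched in a few lines. Note this is a hypothetical reconstruction, not the Joust's actual encoding: here a sip is treated as a dot, a puff as a dash, and a pause as the end of a letter; only the Morse alphabet itself is standard.

```python
# International Morse code table (letters only, for brevity).
MORSE = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

def decode(events):
    """Turn a list of 'sip', 'puff', and 'pause' tokens into text.
    A pause closes out the current letter; unknown patterns become '?'."""
    text, letter = [], []
    for e in events:
        if e == "sip":
            letter.append(".")
        elif e == "puff":
            letter.append("-")
        elif e == "pause" and letter:
            text.append(MORSE.get("".join(letter), "?"))
            letter = []
    if letter:  # flush a trailing letter with no final pause
        text.append(MORSE.get("".join(letter), "?"))
    return "".join(text)
```

With this mapping, a single sip followed by a pause spells "E", and a puff-then-sip spells "N" — slow, but enough to communicate letter by letter.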
Last year the society teamed up with the Google Foundation to build a more advanced version of Joust: the LipSync, a joystick that allows a person to control a computer cursor with minimal head and neck movements. A hollow mouthpiece is attached to a sensor on the joystick that requires only slight pressure to move a cursor on a computer screen. Users can “click” the left and right mouse buttons by inhaling or exhaling into the mouthpiece.
“While similar solutions exist for desktop computers, they can cost up to US $1,500 and do not work well on mobile devices,” says Chad Leaman, the society’s director of innovation. He’s also the founder of its Makers Making Change initiative, which connects tinkerers with people who have disabilities and need assistive technologies.
“What makes the LipSync unique is that we are releasing it open-source, so it can be built for about $300 worth of parts by anyone who wants to volunteer their time to put the unit together.”
Makers Making Change has held hackathons throughout the year in which students and other makers help build the LipSyncs. The organization sends engineers to the events to guide the participants through the building process. They provide instructions as well as 3D printers and soldering irons as needed. The LipSyncs are then donated to rehabilitation centers, disability organizations, or directly to those in need.
At the society’s first hackathon, held in January in Vancouver, volunteers built a system that could help a paralyzed man alert his wife, who is hard of hearing, if he’s in distress. The LipSync connected with a vibrating alarm outfitted with flashing lights that could alert her even if she’s sleeping.
The society recently held a make-a-thon in partnership with TOM: Alberta and the University of Calgary. The participants, including a team of IEEE student members, built more than 20 LipSyncs during the weekend. Events also were held in June in Philadelphia and Seattle as part of the National Week of Making.
“This year alone, with the help of the maker community, we’ve built 150 LipSync devices,” Leaman says. “Our goal is to build 1,500 more over the next year and a half. We need engineers and students across Canada and the United States to help bring the device into their communities.”