Alisa Brownlee, ATP, CAPS blog offers recent articles and web information on ALS, assistive technology--augmentative alternative communication (AAC), computer access, and other electronic devices that can impact and improve the quality of life for people with ALS. Email--abrownlee@alsa-national.org. Any views or opinions presented on this blog are solely those of the author and do not necessarily represent those of the ALS Association.
Friday, August 24, 2012
Using an App as Prescribed
From: The New York Times - 08/19/2012
By: Joshua Brustein
Before long, your doctor may be telling you to download two apps and call her
in the morning.
Smartphone apps already fill the roles of television remotes, bike
speedometers and flashlights. Soon they may also act as medical devices,
helping patients monitor their heart rate or manage their diabetes, and be
paid for by insurance.
The idea of medically prescribed apps excites some people in the health care
industry, who see them as a starting point for even more sophisticated
applications that might otherwise never be built.
Read the entire article at: http://www.nytimes.com/2012/08/20/technology/coming-next-doctors-prescribing-apps-to-patients.html
Links:
Happtique
http://www.happtique.com/
WellDoc
http://www.welldoc.com/
New Mobile App Aims to Monitor Diabetes
http://www.welldoc.com/LinkClick.aspx?fileticket=BIlRqU6iupU%3d&tabid=104
Continua Health Alliance
http://www.continuaalliance.org/index.html
Coming Soon: Artificial Limbs Controlled by Thoughts
From: Scientific American - 08/21/2012
By: Miguel A. Nicolelis
The idea that paralyzed people might one day control their limbs just by
thinking is no longer a Hollywood-style fantasy.
Brain waves can now control the functioning of computer cursors, robotic arms
and, soon, an entire suit: an exoskeleton that will allow a paraplegic to
walk and maybe even move gracefully.
Sending signals from the brain's outer rindlike cortex to initiate movement
in the exoskeleton represents the state of the art for a number of
bioelectrical technologies perfected in recent years.
The 2014 World Cup in Brazil will serve as a proving ground for a
brain-controlled exoskeleton if, as expected, a handicapped teenager delivers
the ceremonial opening kick.
Read the entire article at: http://www.scientificamerican.com/article.cfm?id=artificial-limbs-controlled-by-thought
Links:
First Steps toward a Robotic Leg Suit for Paraplegics (with video 3:08)
http://www.scientificamerican.com/article.cfm?id=exoskeleton-first-steps-towards-robotic-leg-suit-for-paraplegics
Mind Out of Body: Controlling Machines with Thought
http://www.scientificamerican.com/article.cfm?id=mind-out-of-body
Nicolelis Lab
http://www.nicolelislab.net/
Computing with Neural Ensembles
http://www.neuro.duke.edu/faculty/nicolelis/
Wednesday, August 22, 2012
New study uncovers brain's code for pronouncing vowels
August 21, 2012 in Neuroscience
Scientists have unraveled how our brain cells encode the pronunciation of individual vowels in speech. The discovery could lead to new technology that verbalizes the unspoken words of people paralyzed by injury or disease.
Read more at: http://medicalxpress.com/news/2012-08-uncovers-brain-code-pronouncing-vowels.html#jCp
Diagnosed with Lou Gehrig's disease at 21, British physicist Stephen Hawking, now 70, relies on a computerized device to speak. Engineers are investigating the use of brainwaves to create a new form of communication for Hawking and other people suffering from paralysis. - Daily Mail
Scientists at UCLA and the Technion, Israel's Institute of Technology, have unraveled how our brain cells encode the pronunciation of individual vowels in speech. Published in the Aug. 21 edition of Nature Communications, the discovery could lead to new technology that verbalizes the unspoken words of people paralyzed by injury or disease. "We know that brain cells fire in a predictable way before we move our bodies," explained Dr. Itzhak Fried, a professor of neurosurgery at the David Geffen School of Medicine at UCLA. "We hypothesized that neurons would also react differently when we pronounce specific sounds. If so, we may one day be able to decode these unique patterns of activity in the brain and translate them into speech."
Fried and Technion's Ariel Tankus, formerly a postdoctoral researcher in Fried's lab, followed 11 UCLA epilepsy patients who had electrodes implanted in their brains to pinpoint the origin of their seizures. The researchers recorded neuron activity as the patients uttered one of five vowels or syllables containing the vowels.
With Technion's Shy Shoham, the team studied how the neurons encoded vowel articulation at both the single-cell and collective level. The scientists found two areas—the superior temporal gyrus and a region in the medial frontal lobe—that housed neurons related to speech and attuned to vowels. The encoding in these sites, however, unfolded very differently.
Neurons in the superior temporal gyrus responded to all vowels, although at different rates of firing. In contrast, neurons that fired exclusively for only one or two vowels were located in the medial frontal region. "Single neuron activity in the medial frontal lobe corresponded to the encoding of specific vowels," said Fried. "The neuron would fire only when a particular vowel was spoken, but not other vowels."
At the collective level, neurons' encoding of vowels in the superior temporal gyrus reflected the anatomy that made speech possible–specifically, the tongue's position inside the mouth. "Once we understand the neuronal code underlying speech, we can work backwards from brain-cell activity to decipher speech," said Fried. "This suggests an exciting possibility for people who are physically unable to speak. In the future, we may be able to construct neuro-prosthetic devices or brain-machine interfaces that decode a person's neuronal firing patterns and enable the person to communicate."
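The decoding step Fried describes, working backwards from a pattern of neuronal firing rates to the vowel that produced it, can be pictured as a simple nearest-centroid classifier: average the firing-rate vectors recorded for each vowel, then assign a new recording to whichever average pattern it most resembles. A minimal sketch follows; the neuron counts and firing rates are invented for illustration and this is not the study's actual method.

```python
# Hypothetical sketch: decoding a spoken vowel from neuronal firing rates
# with a nearest-centroid classifier. All numbers are invented; a real
# decoder would be trained on recorded electrode data.

def train_centroids(trials):
    """Average the firing-rate vectors recorded for each vowel."""
    grouped = {}
    for vowel, rates in trials:
        grouped.setdefault(vowel, []).append(rates)
    return {v: [sum(col) / len(col) for col in zip(*rs)]
            for v, rs in grouped.items()}

def decode(centroids, rates):
    """Pick the vowel whose average pattern is closest to the recording."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda v: dist(centroids[v], rates))

# Toy training data: (vowel, firing rates in Hz from three imagined neurons).
trials = [
    ("a", [30.0, 5.0, 12.0]), ("a", [28.0, 6.0, 11.0]),
    ("i", [8.0, 25.0, 14.0]), ("i", [9.0, 27.0, 13.0]),
    ("u", [10.0, 7.0, 31.0]), ("u", [11.0, 6.0, 29.0]),
]
centroids = train_centroids(trials)
print(decode(centroids, [29.0, 5.5, 12.5]))  # prints: a
```

The medial-frontal neurons described above, which fire for only one or two vowels, would make such a decoder especially simple, since a single cell's activity already narrows the candidate set.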
Tuesday, August 21, 2012
I Think, Therefore I Spell
From: Scientific American - 09/2012 - page 28
By: Ferris Jabr
Researchers are developing new ways to help the paralyzed communicate with
their thoughts alone. Many of the new techniques rely on computers that
analyze patients' brain activity and translate it into letters or other
symbols. In a study published online in June in Current Biology, Bettina
Sorger of Maastricht University in the Netherlands and her colleagues taught
six healthy adults to answer questions by selecting letters on a computer
screen with their thoughts.
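One common pattern in such spellers is to decode many noisy readings and commit to a letter only once the evidence is strong enough. The sketch below is a hypothetical illustration of that idea, with a made-up stream of decoder outputs; it is not the protocol Sorger's team used.

```python
# Hypothetical sketch: committing to a letter after repeated noisy decodes.
# The decoded stream below is invented for illustration.

def select_letter(readings, threshold=5):
    """Tally decoded letters until one reaches the evidence threshold."""
    votes = {}
    for letter in readings:
        votes[letter] = votes.get(letter, 0) + 1
        if votes[letter] >= threshold:
            return letter
    return None  # stream ended before any letter was confident enough

# Simulated decoder output while the user focuses on "Y": mostly correct,
# with occasional misclassifications.
decoded = ["Y", "N", "Y", "Y", "A", "Y", "Y", "B", "Y"]
print(select_letter(decoded))  # prints: Y
```

Raising the threshold trades speed for accuracy, a trade-off any thought-based speller must balance for its users.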
Read the preview (omits last two paragraphs) at:
http://www.sciamdigital.com/index.cfm?fa=Products.ViewIssuePreview&ARTICLEID_CHAR=11595E82-237D-9F22-E82489D23CAAFDF3
Links:
New Brain-Machine Spelling Device Could Help the Paralyzed Communicate
http://www.scientificamerican.com/article.cfm?id=fmri-spelling-device
Bettina Sorger
http://www.psychology.unimaas.nl/Base/Medewerkerspersonal/Bettinasorger_extended.htm
Neuronal ensemble control of prosthetic devices by a human with tetraplegia
http://www.nature.com/nature/journal/v442/n7099/abs/nature04970.html
Mind Reading Computer System May Help People with Locked-in Syndrome
http://www.nsf.gov/news/special_reports/science_nation/brainmachine.jsp
Subscribe to:
Posts (Atom)