
Wednesday, February 26, 2014

32 and 16 Years Ago - Computer Help




From: IEEE Computer - January 2014 - page 16

 

Personal computers aid the handicapped

From: IEEE Computer - January 1982

By: Ware Myers - Computer staff

 

"The personal computer provides a new kind of leverage for bringing aid to the handicapped," declared Paul L. Hazan, director of the First National Search for Applications of Personal Computing to Aid the Handicapped. The search was conducted by the Applied Physics Laboratory of the Johns Hopkins University with funding provided by the National Science Foundation and Radio Shack, a division of Tandy Corporation. The Computer Society was a program associate of the effort.

 

"Over the years a great deal of worthy research has gone on," Hazan continued. Unfortunately the end result of much of the earlier research in the field was costly special-purpose equipment that sometimes ran as much as

$70,000 to $100,000 per individual helped. Consequently, it was difficult to find funds to get the products into the marketplace. Moreover, continued maintenance of special-purpose equipment was difficult and expensive. The final payoff - the number of handicapped helped - was therefore limited.

 

Personal computer leverage. The advent of mass marketed and reasonably priced computers brings with it the potential for change in the existing situation, Hazan pointed out. He mentioned that although it has long been recognized that computers extend an individual's mental reach, in the case of the handicapped (with restricted physical capabilities), the possibility also exists to extend the physical reach of this group of users.

 

If the personal computer can be brought to bear on this problem, there are a number of built-in advantages. First, it is now low enough in cost for the handicapped themselves, or their families and friends, to afford; alternatively, in the workplace an employer can finance it for a potential employee. This makes a large, centrally financed support program unnecessary.

 

Secondly, the infrastructure for the application of personal computers already exists. There is a nationwide - even worldwide - network of dealers, maintenance, and training, and arrangements for the distribution of programs are growing.

 

Finally, personal computers to aid the handicapped constitute a significant business opportunity, both for the makers and marketers of personal computers and for those who construct and sell peripherals and input/output devices.

Given 20 million handicapped in the United States (a conservative estimate), Hazan calculates that if only two percent of them acquire a personal computer, they would create a potential market of 400,000 buyers - a figure in the same ball park as the total number of personal computers sold to date.

Assuming that the average price for the units is $2000, including peripherals, input/output devices, and programs, the actual dollar value of this market is $800,000,000. "Enough for industry to pay attention," Hazan noted. And the two percent market is just a guess. No doubt it will ultimately be much more. The point is that while the personal computer may be just a hobby for the able-bodied, with the proper applications it can become a necessity for people with a variety of disabilities. 

 

The search for applications. The First National Search, announced in November 1980, was an effort to bring grassroots initiatives to bear on the task of finding a variety of methods to apply the personal computer to the needs of the handicapped. It was highlighted by a national competition for ideas, devices, methods, and computer programs to help handicapped people overcome difficulties in learning, working, and successfully adapting to home and community settings.

 

In the spring, orientation workshops were held at major rehabilitation centers throughout the United States to bring together potential "inventors," handicapped people, and professionals in the educational, technical, and rehabilitation fields. Over 900 entries were received by the June 30, 1981 deadline.

 

In August regional exhibits were held in ten cities - Boston, New York, Baltimore, Atlanta, Chicago, Houston, Kansas City, Denver, San Francisco, and Seattle. Awards were made to over 100 regional winners. From the pool of regional winners a national panel of judges selected 30 entrants to exhibit their work in Washington, DC. Of these, 28 made it to the Great Hall of the National Academy of Sciences on October 31 and November 1, attracting substantial numbers of the handicapped and those who work with them, as well as three or four television news crews. One of the reporters, himself blind, represented National Public Radio. 

 

The next day the winners were honored at a banquet in the Mayflower Hotel. This banquet was also attended by government and industry representatives with an interest in the subject. At the dinner the three top-place winners and seven honorable mention recipients were named (see photos and box).

 

During the following two days the 28 winners explained their developments at a workshop held at the Applied Physics Laboratory, near Washington. Proceedings of this conference, containing almost 100 papers - all the regional and national winners - are available from the Computer Society.

 

What next? The Applied Physics Laboratory has a National Science Foundation grant to study the feasibility of setting up a data base to hold application programs for the handicapped. The search turned up a number of excellent programs and some means of making them available to handicapped users is needed. If the idea is feasible and funding becomes available, a potential user could dial up the data base, select programs of interest from a menu, view a demonstration of the program he selects, and ultimately download it into his own equipment.

 

The First National Search is now history, and the word first implies a second. There seems to be general agreement that the making of inventions is too time-consuming for an annual search to be practical. The receipt of inquiries from 19 countries also suggests that something more than "national" is needed. Hazan expects another search to follow, but there is much work to be done and funds to be raised before it can be launched.

 

Photo Caption:

Lewis F. Kornfeld (left), retired president of Radio Shack, presents the first prize of $10,000 to Harry Levitt of the City University of New York for his Portable Telecommunicator for the Deaf. Levitt programmed a TRS 80 pocket computer to send and receive messages over the telephone via a TRS interface, enabling the deaf to communicate with each other or with their normal-hearing friends.

 

The other award winners were

Second Prize ($3000): Mark Friedman, Mark Dzmura, Gary Kiliany, and Drew Anderson - Eye Tracker

 

Third Prize ($1500): Robin L. Hight - Lip Reader Trainer

 

Honorable Mention Awards ($500):

Joseph T. Cohn - Augmentative Communication Devices
Randy W. Dipner - Micro-Braille System
Sandra J. Jackson - Programs for Learning Disabled
David L. Jaffe - Ultrasonic Head Control for Wheelchair
Raymond Kurzweil - Reading Machine for Blind
Paul F. Schwejda - Firmware Card and Training Disk
Robert E. Stepp III - Braille Word Processor

 


(IEEE membership required)

For Those Unable To Talk, A Machine That Speaks Their Voice


From: KPLU

             

Carl Moore, a former helicopter mechanic, was diagnosed with ALS 20 years ago. He has had unusual longevity for someone with ALS but expects someday to rely on his wheelchair and speech-generating device.

It's hard to imagine a more devastating diagnosis than ALS, also called Lou Gehrig's disease. For most people, it means their nervous system is going to deteriorate until their body is completely immobile. That also means they'll lose their ability to speak.


So Carl Moore of Kent, Wash., worked with a speech pathologist to record his own voice to use later — when he can no longer talk on his own.


Most ALS patients live only a few years after diagnosis, but Moore, a former helicopter mechanic, is the exception — he was diagnosed 20 years ago. At the beginning, he lost use of his hands, and it wasn't until years later that he found that the symptoms were affecting his speech.


Carl Moore shows some of the phrases he's recorded in his own voice and stored on his speech-generating device.


"You can hear my three-shots-of-tequila speech," he says. "And it does get worse as I get tired."
So several years ago, before that slur crept in, he recorded hundreds of messages and uploaded them to the speech device he'll someday rely on. The machine looks like a chunky tablet computer, and it would normally sound like a robot. But now, instead, it will sound like Moore.


"It's almost like preserving a piece of yourself," he says. "I've taken auditory pictures of who I am."
Moore's banked messages range from the practical ("I feel tired") to the absurd ("You know what? Your driving sucks") and somewhere in the middle ("Hey, my butt itches. Would you give it a bit of a scratch?").

Moore is kind of a snarky guy — some of his messages can't be played in decent company. It's a part of his personality that he's rescuing from the disease.


And it's not just for his own benefit. Message banking also helps his caregiver: his wife, Merilyn.
"If it's a computer voice, I think it's harsh," she says, "whereas if it's his own voice, I can feel like he's actually speaking those words."


John Costello, a speech pathologist at Boston Children's Hospital, is credited with inventing the clinical use of voice banking. He says it can make a big difference in people's quality of life.
"If you wanted to say something like, 'You're the love of my life,' having that in synthetic speech is devastating," Costello says.


One patient's wife, he says, contacted him shortly after her husband's death. "She wrote to me that the work that we did was the only bright forward movement. Everything was about loss, except the possibility of communication."


"It gives the patient something to do when they have no control over the disease," she says.
Yet for all its benefits, in Kelley's clinic, only a fraction of patients actually do it.


"The ones that don't do it can't deal with it," she says. "They don't want to think about using an electronic piece of equipment to talk. So most of them nod, smile and do nothing."


Heartbreakingly, many come back hoping to record their voices after it's too late.


Carl, on the other hand, brings a mechanic's pragmatism to the project, and he's clearly having some fun too. Besides letting him razz Merilyn for years to come, the recordings will become an archive for her.


"I see this also as a legacy, which will feel like his presence with me even after he's gone," she says.
So Merilyn wants to make sure Carl has banked the really important things — which raises a question: Where, among the witty barbs and the practical lines, were the messages of tenderness, of intimacy?


"My conversations are mostly sarcastic," he says. "She asked me before we left if I had the phrase 'I love you,' and I realized I didn't."


He says he'll make more recordings at some point — sooner rather than later. The trouble, he says, is his voice has already gone downhill.


"We'll see how it works out. I'm not comfortable with recording my voice as it is," he says.
"I think that it's important that we capture you as you are now," Merilyn says. "We love you as you are now just as much as eight years ago."


"So I will record, 'Yes, dear.' "


Later, Carl dug back through his hard drive and discovered that he had, indeed, recorded himself saying "I love you." He added it to the device that will someday speak for him.

Monday, February 24, 2014

The Eyes Have It-- Eye Gaze Technology

by David Harraway

Eye Gaze technology may provide some people with disabilities with effective access to required communication and computer control functions where other methods prove too difficult or inefficient for them.

Current commercially available systems consist of an eye tracking camera plus a software interface, which allows the person to bring their own computer hardware. Speech generating devices with integrated Eye Gaze units are also available.


Eye Gaze is an Assistive Technology (AT) area that has undergone significant advances in recent years. Two current leading systems available in Australia are Tobii PCEye Go (manufactured by Tobii Technologies, Sweden) and Inteligaze CAM30NT (from Alea Technologies, Germany). Both are sold with advanced mouse control interaction and basic popup onscreen keyboards. When coupled with additional specialized and fully integrated software, these systems can offer increased independence with required communication and computer control functions.


New mainstream eye tracking options are now available and have generated some interest online in Assistive Technology discussion forums, largely because of the lower cost of the hardware. My Gaze is one of these and was shown at the Eurogamer Expo 2013 to positive reviews. Q4 2013 saw the arrival of the Eye Tribe tracker (theeyetribe.com). Just last month, Tobii featured the Tobii Eye X at the Las Vegas Consumer Electronics Show. Both Eye Tribe and Tobii Eye X are on the market as "developer releases" and are available online for between US $100-200.
It needs to be stated clearly from the outset that the primary purpose of a developer release is to allow programmers to make their new and existing applications compatible with Eye Gaze access. These systems do not and should not be directly compared to fully-integrated assistive technology solutions, as they lack the necessary features and setting adjustments that are normally required to make them work for a person with a disability. Both Tobii and Eye Tribe offer a software development kit (SDK) which allows developers to link the camera to their specific application.
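As an illustration of what linking a camera to an application through such an SDK involves, here is a minimal Python sketch: the tracker streams gaze samples and the application maps them onto its own screen coordinates. The GazeSample type and on_gaze callback are hypothetical stand-ins for illustration, not part of either vendor's actual SDK.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float           # normalized horizontal gaze position, 0.0-1.0
    y: float           # normalized vertical gaze position, 0.0-1.0
    confidence: float  # tracker's estimate of sample quality, 0.0-1.0

def to_screen(sample, screen_w, screen_h):
    """Map a normalized gaze sample to pixel coordinates."""
    return round(sample.x * screen_w), round(sample.y * screen_h)

def on_gaze(sample, screen_w=1920, screen_h=1080, min_confidence=0.5):
    """Example of the per-sample callback an SDK might invoke.

    Low-confidence samples (blinks, momentary loss of tracking) are
    dropped rather than passed on to the application.
    """
    if sample.confidence < min_confidence:
        return None
    return to_screen(sample, screen_w, screen_h)
```

In a real integration, the SDK delivers these samples continuously (dozens of times per second), and the application decides what an on-screen gaze position means: moving a cursor, highlighting a cell, or triggering a selection.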


Key differences between a ‘developer’ camera and a fully supported commercial AT system are noted here:
  1. Developer releases ship with minimal software. The Eye Tribe model, for example, offers an eye mouse moving option, but does not provide the sophisticated computer interaction utilities available in Tobii's Gaze Interaction or Alea's Desktop 2.0.
  2. Both units ship with only limited support from the manufacturer. Eye Tribe have a user forum for developers and users to ask questions. At the time of writing there have been several questions asked and resolved.
  3. There are hardware and operating system requirements that must be satisfied. For example, the Eye Tribe tracker connects via USB 3 only (usually a blue socket), and operating systems below Windows 7 are not supported. However, there is hope for other OS support. In 2013, Eye Tribe showed an earlier version of their system working on an Android tablet (YouTube), their current promotional video demonstrates what appears to be Apple OSX functionality, and the Eye Tribe site states that an OSX SDK will be available in April 2014.
  4. To purchase a developer kit, you must agree to develop for the product in the licensing agreement. I purchased my Eye Tribe tracker with the intent of learning to code basic Windows applications. Microsoft Visual Studio has a free version and there are tutorials online for learning the programming languages supported. The Eye Tribe developer forum has links to one or two applications already made eye gaze compatible (Fruit Ninja being one).
  5. Eye tracking cameras are known to be subject to environmental effects. Chief among these are the level and direction of light in the room. Other known factors to consider when investigating eye gaze access include: user movement (head control), visual and perceptual skills, and level of intentionality to the task.
This last sub-point is of particular interest among AT professionals and other team members, as at least some of the learning we have had in the past few years has been directed towards the possibility of using Eye Gaze systems with people who may present with conditions resulting in an Intellectual Disability. As some people with ID have significant communication challenges, the standard methods for setting up systems (instruct the person to track a calibration plot around the screen and fixate on it until it moves on) may not be relevant. However, software developers such as Sensory Software have made programs such as Look to Learn that provide a series of engaging activities which grade up from the most basic (glance past the target) to more complex (choice making and built sequences of actions). A demo version is available with sample activities to explore. As noted elsewhere on this blog, it is also possible to use Clicker 6 with Eye Gaze access, as the program includes dwell click interaction (look at a target and hold until the pre-set dwell time is achieved). Tobii's Eye Gaze Learning Curve is an excellent model of how progression through this AT area might look, and provides resources (including video tutorials) and suggested activities.
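The dwell click interaction mentioned above (look at a target and hold until the pre-set dwell time is achieved) can be sketched in a few lines. This is an illustrative Python sketch of the general technique, not any vendor's actual implementation; the class name, tolerance radius, and defaults are all assumptions for the example.

```python
import time

class DwellClicker:
    """Fires a click when gaze stays near one spot for a set dwell time."""

    def __init__(self, dwell_seconds=1.0, radius_px=40):
        self.dwell_seconds = dwell_seconds  # how long the user must hold their gaze
        self.radius_px = radius_px          # tolerance circle for natural eye jitter
        self._anchor = None                 # (x, y) where the current dwell started
        self._start = None                  # timestamp when the current dwell started

    def update(self, x, y, now=None):
        """Feed one gaze sample; return True when a dwell click fires."""
        now = time.monotonic() if now is None else now
        if self._anchor is None:
            self._anchor, self._start = (x, y), now
            return False
        ax, ay = self._anchor
        # If gaze drifted outside the tolerance circle, restart the dwell timer.
        if (x - ax) ** 2 + (y - ay) ** 2 > self.radius_px ** 2:
            self._anchor, self._start = (x, y), now
            return False
        if now - self._start >= self.dwell_seconds:
            self._anchor, self._start = None, None  # reset after firing
            return True
        return False
```

The tolerance radius matters because eyes are never perfectly still; without it, the small involuntary movements of normal fixation would keep resetting the timer and the click would never fire.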

Review of Eye Tribe Development Kit

The kit ships with :

  • Eye Tribe eye tracker camera
  • 2m USB 3 ribbon cable
  • Small adjustable tripod stand
  • Instructions and link to download the software drivers and SDK
Software installation was simple and progressed without a hitch. It should be noted that the software does not appear to ship with an uninstaller utility.
Once the software was installed, the camera was plugged in and the software run. On first run, a tutorial covers aspects such as positioning the eye tracker camera, followed by an initial calibration. A calibration score is obtained (I received perfect!) and then an interface screen with options is shown:
Eye Tribe interface screen with options


Options include :
  • API console (shows the eye tracking events in real time data)
  • Online Help (links to the Developer section of the Eye Tribe site)
  • Start in demo mode (shows setup screens at start up)
  • Mouse Gaze (enables eye tracking mouse function)
  • Mouse Smooth (reduces shudder effect in Eye Mouse)
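The "Mouse Smooth" option above damps cursor shudder. One common way such smoothing is implemented (though not necessarily Eye Tribe's actual algorithm) is an exponential moving average over incoming gaze points; a minimal Python sketch:

```python
class GazeSmoother:
    """Exponential moving average over gaze points to damp cursor shudder."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha  # lower alpha = smoother but laggier cursor
        self._x = None
        self._y = None

    def smooth(self, x, y):
        """Blend each new gaze point with the running average."""
        if self._x is None:
            self._x, self._y = float(x), float(y)
        else:
            self._x += self.alpha * (x - self._x)
            self._y += self.alpha * (y - self._y)
        return self._x, self._y
```

The trade-off is inherent: heavier smoothing makes the cursor steadier over a target but slower to follow a deliberate glance across the screen, which is why such options are user-adjustable.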
A track status window displays whether the user's eyes are in the correct position (indicated by colour and the presence of the eye graphics).
The Calibration screen is also accessible from here:
Eye Tribe Calibration screen  with options


This screen allows the setup of the calibration related functions. Included options are:
  • Number of points of calibration (9, 12, or 16; more points usually correlates with a more accurate result)
  • Sample rate (time each point is held before the next one is offered)
  • Monitor being calibrated (as in the case with multimonitor setups or an external display)
  • Vertical and horizontal alignment
  • Area size (area in which calibration and tracking occurs)
  • Background and Point colours (preferred contrast /colours, custom colours available)
In mouse move mode, I found the tracking to be consistent across both screens tested (17” widescreen and 13” widescreen). There was a mild amount of tracking shudder when I paused over targets. I used Sensory Software Dwell Clicker 2 to generate mouse click events. This worked effectively, and I was able to select targets down to the usual 20mm x 20mm level (an established threshold of accurate tracking for eye gaze technology).


I also tested the tracking in mouse mode with both the Tobii Communicator 4 Sono Key and The Grid 2 Fast Talker 2 pagesets set to dwell click and direct mouse selection. As the Eye Tribe tracker is new, it does not show up in the list of internally supported trackers for either program. At the time of testing, the Eye Tribe tracker did not work with The Grid 2 on our machine.


The Eye Tribe tracker is compatible with a basic free AAC program called Gaze Talk, which was developed for COGAIN (www.cogain.org), a collaboration between EU Eye Gaze manufacturers, academics, users, and other interested parties.


In conclusion, the Eye Tribe tracker is currently of interest primarily to technical people and hobbyists, and for the potential of its business model (release to software developers) to drive innovation in making applications eye-gaze accessible. Crowd-sourced funding, and the direct relationships with developers now possible because of social media, offer some promise of new directions in Assistive Technology that may make it more accessible and affordable to people wishing to explore these options.




http://www.spectronicsinoz.com/blog/tools-and-resources/the-eyes-have-it/
----------------------------------
About the Writer:
David is an Occupational Therapist at ComTEC Yooralla, a Victorian statewide assistive technology advisory and information service. He assists people with disabilities and their teams to problem solve solutions in the areas of equipment for communication, computer access, mounting, and environmental control. David is also a Clinical Advisor to the Statewide Equipment Program in the area of Environmental Control Units; and has presented at local, national and international conferences. He is passionate about the potential of Assistive Technology to make a difference in the lives of people with disabilities.