Definition of an Accessor
A personalized device that provides its user with his or her preferred interaction modalities. Options range from specialized keyboards and pointing devices through to sophisticated speech recognizers, head trackers and eye gaze trackers.
Speech recognition has been the most popular access strategy for people with physical disabilities. A speech accessor consists of a small PC running Dragon Dictate or Naturally Speaking speech recognition software, together with accessor software called the Bridge. A speech recognizer running in its own processor performs significantly better than one running in the same processor as the application being accessed.
Head Tracking Accessor
Head tracking provides a very natural way to move the mouse cursor while speech recognition is used to enter text and button commands. A variety of movement-sensing technologies was evaluated for head tracking.
A version of the Bridge software was developed for the Palm Pilot PDA. Standard Palm Graffiti is used to generate keyboard entries, and the touch screen is used as an x-y tablet to generate mouse movements. Mouse button operations are generated by tapping or pressing on the touch screen.
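The touch-to-mouse mapping described above can be sketched as follows. This is an illustrative reconstruction, not the actual Bridge code: the function name, event format, and the tap/press timing thresholds are all assumptions.

```python
# Hypothetical sketch of driving a mouse pointer from a touch screen, in
# the spirit of the Palm Bridge described above. The event format
# (t_ms, x, y, down) and the timing thresholds are illustrative
# assumptions, not the actual Bridge implementation.

TAP_MAX_MS = 200     # touches shorter than this count as a button click
PRESS_MIN_MS = 600   # stationary touches longer than this count as a press/hold

def touch_to_mouse(events):
    """Translate (t_ms, x, y, down) touch samples into mouse actions.

    While the stylus is down, successive positions generate relative
    mouse movements (the screen acts as an x-y tablet); a brief tap
    generates a click, and a long stationary press generates a hold.
    """
    actions = []
    last = None      # previous (x, y) while the stylus is down
    down_at = None   # time the current touch began
    moved = False
    for t, x, y, down in events:
        if down:
            if last is None:
                down_at, moved = t, False
            else:
                dx, dy = x - last[0], y - last[1]
                if dx or dy:
                    moved = True
                    actions.append(("move", dx, dy))
            last = (x, y)
        else:
            if down_at is not None and not moved:
                duration = t - down_at
                if duration <= TAP_MAX_MS:
                    actions.append(("click",))
                elif duration >= PRESS_MIN_MS:
                    actions.append(("hold",))
            last, down_at = None, None
    return actions
```

For example, a quick stationary touch yields a click, while dragging the stylus yields a stream of relative move events.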
Eye Tracking Accessor
Archimedes researchers started working with eye tracking while developing practical communication strategies for Gerry Lieberman, the retired Provost of Stanford, who had developed Lou Gehrig's disease. After a great deal of effort, we concluded that state-of-the-art eye tracking equipment did not deliver on the manufacturers' promises and that a practical installation was extremely difficult to set up unless the user was completely immobilized. Because Gerry was able to sit up in his wheelchair until the disease became very advanced, we were unable to make the system work reliably. By the time he was confined to bed, his eyes had deteriorated too much for him to be able to use the system. In contrast, we set up an eye tracker for the football coach Charlie Wedemeyer, who also had Lou Gehrig's disease and was completely immobilized in his bed. His eyes were still fully functional, and while the system was being set up and one of the researchers was fussing about with the screen, Charlie's first message was, "Get out of the way, you are blocking my view of the screen!"
Since these early eye tracking activities, Archimedes researchers have done several studies using a precision eye tracking system to study how people read information on a computer screen. One study for the Poynter Institute investigated the impact of advertising messages when people are reading news on a web page. Another study for Oracle investigated how sighted people interacted with web-based applications. Information from this study was used to guide the layout of application screens to make them easier for blind people to navigate and understand.
Commercial eye tracking equipment is very expensive: the systems we have purchased cost in the range of twenty to forty thousand dollars each. During 2000-2002, Archimedes researchers developed a low-cost eye tracking accessor that uses a neural network chip to analyze images from a low-cost web cam. The potential cost of an eye tracking accessor using this technology is about two hundred dollars.
Sonic Display Accessor
Archimedes researchers explored the concept of a blind person using a head tracker to navigate a GUI by having icons produce stereophonic musical chords and arpeggios when the head tracker points at them. An algorithmic process creates the chords from the characteristics of the icon. If the user does not recognize the chord, the system speaks the name of the icon after a short delay.
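A minimal sketch of the kind of algorithmic icon-to-chord mapping this explored. The specific rules below (icon kind selects chord quality, horizontal position sets stereo pan, size shifts the octave) are invented for illustration; they are not the original algorithm.

```python
# Illustrative icon-to-chord mapping for a sonic display. All mapping
# rules here are assumptions made for the sketch, not the original code.

MAJOR = (0, 4, 7)   # semitone offsets for a major triad
MINOR = (0, 3, 7)   # semitone offsets for a minor triad
QUALITY = {"folder": MAJOR, "document": MINOR}  # assumed icon kinds

def icon_chord(kind, x, screen_width, size):
    """Return (midi_notes, pan) for an icon.

    The icon kind selects the chord quality, the horizontal position
    sets the stereo pan (0.0 = hard left, 1.0 = hard right), and the
    icon size lowers the octave so larger icons sound deeper.
    """
    root = 60 - 12 * (size // 32)   # middle C, shifted down by icon size
    quality = QUALITY.get(kind, MAJOR)
    notes = [root + offset for offset in quality]
    pan = x / screen_width
    return notes, pan
```

The speech fallback described above would then sit on top of this: if the user dwells on an icon without acting, the system speaks the icon's name.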
There are many sources of spoken information that are currently inaccessible to deaf people. Multimodal computer presentations and telephone-based Web access systems are two examples in which this situation is becoming increasingly common. However, there are also many low-tech situations in which the same problems are experienced. Person-to-person communications using a telephone, making a court appearance, negotiating an insurance policy, visiting a doctor, meeting with an accountant, being advised about drug side effects by a pharmacist, or attending a lecture at school or university are all situations in which a deaf person would benefit from having a personal ASL accessor.
The fundamental problems facing deaf people fall into a number of broad categories.
Under a small grant from the National Science Foundation, Archimedes researchers demonstrated an ASL accessor that translated text messages into two-dimensional cartoon-like animations. Initial efforts focused on creating three-dimensional animations, but these were discarded in favor of 2D representations because of the disappointing images and distracting artifacts produced by the limitations of the 3D graphics software, particularly on low-powered computers. A professional animator was employed for several months to develop 2D cartoon-like animation primitives that can be pieced together to build ASL messages.
The strategy we adopted for translating text into ASL was not to attempt a true word-for-word translation but instead to determine the intention of the input message and build an ASL representation of that intention. We were unable to complete an intention detection system during the project, and the overall project was shelved. Since that time, however, we have invented a new Integration Manager and Natural Interaction Processor (IMNIP) that determines the intent of a user from natural spoken and gestural language. The IMNIP will enable the ASL accessor to be completed.
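The intention-based strategy can be illustrated with a toy example: classify the message's intent, then assemble the output from prebuilt animation primitives rather than translating word for word. The intent patterns and primitive names below are invented for illustration; they are not the IMNIP or the animator's actual primitive set.

```python
# Toy illustration of intention-based translation: detect the intent of a
# text message, then look up a prebuilt sequence of animation primitives.
# Intents, keywords, and primitive names are all assumptions for the sketch.

PRIMITIVES = {
    "greet": ["WAVE-HELLO"],
    "ask-location": ["WHERE", "POINT-QUESTION"],
}

def intent_of(text):
    """Crude keyword-based intent detection (a stand-in for real NLP)."""
    t = text.lower()
    if "where" in t:
        return "ask-location"
    if "hello" in t or "hi" in t:
        return "greet"
    return None

def to_asl(text):
    """Map a text message to a sequence of animation primitive names,
    falling back to fingerspelling when no intent is recognized."""
    intent = intent_of(text)
    return PRIMITIVES.get(intent, ["FINGERSPELL:" + text.upper()])
```

The point of the design is that the output is built from the intent, so two differently worded messages with the same intent produce the same ASL animation.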