Investigating Multimodal Strategies to Improve Access to Augmentative and Alternative Communication (AAC)

Individuals who have lost their ability to speak due to severe traumatic brain injury, brainstem stroke, or amyotrophic lateral sclerosis often use specialized computers to communicate. However, many also have severe physical limitations that make it difficult or impossible to access these computers with their hands. This project focuses on developing new and innovative methods of accessing computers to support communication. The long-term goal of this project is to study and develop multimodal access methods (e.g., combining eye and head tracking, touch access for large on-screen targets, and switch scanning for small or discrete targets) that are intuitive to the user and decrease the overall fatigue that individuals with severe physical impairments experience when using alternative access methods. This project is part of the Rehabilitation Engineering Research Center on Augmentative and Alternative Communication (RERC on AAC) funded through NIDILRR (grant # 90RE5017).
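
A minimal sketch (in Python) of how one such multimodal scheme could be organized, assuming a hypothetical on-screen keyboard divided into regions: a coarse pointer such as eye or head tracking selects a region, and a single switch then scans the keys within it. The layout, names, and selection logic below are assumptions for illustration, not the project's actual design.

```python
# Hypothetical multimodal access sketch: coarse pointing (eye/head tracking)
# narrows the target set to a screen region; a single switch then scans the
# discrete keys inside that region.

from dataclasses import dataclass

@dataclass
class Region:
    name: str
    x0: float; y0: float; x1: float; y1: float   # normalized screen bounds
    keys: list                                    # keys inside this region

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

# Invented 2x2 region layout for illustration only.
REGIONS = [
    Region("top-left", 0.0, 0.0, 0.5, 0.5, list("ABCDEF")),
    Region("top-right", 0.5, 0.0, 1.0, 0.5, list("GHIJKL")),
    Region("bottom-left", 0.0, 0.5, 0.5, 1.0, list("MNOPQR")),
    Region("bottom-right", 0.5, 0.5, 1.0, 1.0, list("STUVWX")),
]

def region_from_gaze(x: float, y: float):
    """Coarse selection: map an eye/head-pointer estimate to a region."""
    return next((r for r in REGIONS if r.contains(x, y)), None)

def scan_select(region: Region, switch_presses: int) -> str:
    """Fine selection: each switch press advances the scan; return the key reached."""
    return region.keys[switch_presses % len(region.keys)]

if __name__ == "__main__":
    region = region_from_gaze(0.7, 0.2)          # gaze lands in the top-right quadrant
    print(region.name, scan_select(region, 3))   # fourth key in that region -> 'J'
```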

Research Collaborators: Tom Jakobs, P.E., Invotek, Inc.; Susan Fager, Ph.D., CCC-SLP, Madonna Rehabilitation Hospital; David Beukelman, Ph.D., CCC-SLP, Madonna Rehabilitation Hospital; Janice Light, Ph.D., CCC-SLP, Penn State University; David Hershberger, Saltillo

Investigating the Visual Cognitive Processing Demands of Augmentative and Alternative Communication (AAC) Interfaces

Individuals who have lost their ability to speak due to conditions such as severe strokes or traumatic brain injuries often rely on specialized software to communicate. For some, the ability to spell has also been lost, requiring the use of pictures or digital images to relay messages. It is challenging for clinicians and families to know what picture content to provide in communication software to best support the communication needs of their patients and family members with severe impairments. This investigation uses eye tracking data to objectively study the visual cognitive processing demands of different kinds of visual content used in AAC interfaces. The results of this work will improve AAC interfaces for individuals with severe communication impairments. This project is part of the Rehabilitation Engineering Research Center on Augmentative and Alternative Communication (RERC on AAC) funded through NIDILRR (grant # 90RE5017).
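
As a rough illustration of the kind of eye-tracking summary measures such a study might compute, the sketch below derives dwell time and visit counts per area of interest from a gaze-sample stream. The sample format, sampling interval, and area-of-interest labels are hypothetical, not the project's actual protocol.

```python
# Illustrative eye-tracking measures for comparing visual processing demands
# across AAC display content (e.g., photographs vs. symbol grids).

from collections import defaultdict

# Each gaze sample: (timestamp in ms, area-of-interest label or None).
samples = [
    (0, "photo"), (17, "photo"), (34, "photo"),
    (51, "symbol_grid"), (68, "symbol_grid"),
    (85, None), (102, "photo"), (119, "photo"),
]

def dwell_time_ms(samples, sample_interval_ms=17):
    """Total looking time per AOI, approximated as samples * sampling interval."""
    totals = defaultdict(int)
    for _, aoi in samples:
        if aoi is not None:
            totals[aoi] += sample_interval_ms
    return dict(totals)

def visit_count(samples):
    """Number of separate visits (entries) to each AOI."""
    counts, previous = defaultdict(int), None
    for _, aoi in samples:
        if aoi is not None and aoi != previous:
            counts[aoi] += 1
        previous = aoi
    return dict(counts)

print(dwell_time_ms(samples))   # {'photo': 85, 'symbol_grid': 34}
print(visit_count(samples))     # {'photo': 2, 'symbol_grid': 1}
```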

Research Collaborators: Susan Fager, Ph.D., CCC-SLP, Madonna Rehabilitation Hospital; David Beukelman, Ph.D., CCC-SLP, Madonna Rehabilitation Hospital; Janice Light, Ph.D., CCC-SLP, Penn State University

Novel Sensor Technology for Environmental Control and Augmentative and Alternative Communication (AAC) Access

Individuals who have severe physical impairments due to conditions such as high-level spinal cord injury (SCI), brainstem stroke, or Guillain-Barré syndrome often experience difficulty accessing smartphones, computers, and the computerized technologies that control their environments (turning lights on and off, operating a TV remote control, opening and closing windows and doors, etc.). This project is developing a new access method that uses capacitive sensors. These sensors can be trained to recognize an individual's unique movements or gestures and use them to control the environment or a computer. The project will result in a new access technique that helps individuals with severe physical impairments regain control of their environments and computerized technologies. This grant is funded through the National Science Foundation (grant # IIS-1406626).
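
A minimal sketch, in Python, of the general idea of training on a user's own movements and then classifying new sensor readings by nearest-template match. The sensor channels, readings, and gesture labels are invented for illustration and do not reflect the project's actual hardware or algorithms.

```python
# Hypothetical gesture recognition from capacitive sensor readings:
# store a few labeled examples of the user's gestures, then classify
# new readings by the closest stored template.

import math

def distance(a, b):
    """Euclidean distance between two sensor-reading vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class GestureRecognizer:
    def __init__(self):
        self.templates = []   # list of (label, reading) pairs

    def train(self, label, reading):
        """Store a labeled example of the user's gesture."""
        self.templates.append((label, reading))

    def classify(self, reading):
        """Return the label of the closest stored template."""
        label, _ = min(self.templates, key=lambda t: distance(t[1], reading))
        return label

recognizer = GestureRecognizer()
# Invented 4-channel capacitive readings for two trained gestures.
recognizer.train("lights_on",  [0.9, 0.1, 0.2, 0.1])
recognizer.train("lights_off", [0.1, 0.8, 0.1, 0.2])

print(recognizer.classify([0.85, 0.15, 0.25, 0.05]))   # -> "lights_on"
```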

Research Collaborators: Nilanjan Banerjee, Ph.D., University of Maryland, Baltimore County; Ryan Robucci, Ph.D., University of Maryland, Baltimore County; Susan Fager, Ph.D., CCC-SLP, Madonna Rehabilitation Hospital.

Prosodic Control of Speech Synthesis for Assistive Communication in Severe Paralysis

Individuals who have lost their ability to speak often use communication software that speaks for them with synthesized, or computerized, voices. These computerized voices often lack the intonation present in natural speech, resulting in communication that feels unnatural and can be misinterpreted by the listener. This project will develop a method that allows the individual using the software to easily control the intonation, or prosody, of an intended message, resulting in more natural and understandable communicative interactions. This grant is funded through the National Science Foundation (grant # 1509791).
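
The project's specific method is not described here; as a generic illustration of prosody control, the sketch below wraps a message in W3C SSML prosody markup, which many speech synthesizers accept. The parameter values are arbitrary.

```python
# Generic illustration of prosody control via SSML: the same words can be
# rendered as a neutral statement or an emphatic question by adjusting
# pitch, rate, and emphasis. Values below are arbitrary examples.

def to_ssml(text, pitch="+0%", rate="medium", emphasis=None):
    """Wrap a message in SSML prosody markup so a synthesizer can vary intonation."""
    body = f'<emphasis level="{emphasis}">{text}</emphasis>' if emphasis else text
    return (
        '<speak>'
        f'<prosody pitch="{pitch}" rate="{rate}">{body}</prosody>'
        '</speak>'
    )

print(to_ssml("You are coming with us."))
print(to_ssml("You are coming with us?", pitch="+15%", rate="slow", emphasis="strong"))
```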

Research Collaborators: Susan Fager, Ph.D., CCC-SLP, Madonna Rehabilitation Hospital; Cara Stepp, Ph.D., Boston University.
