Translation Of 3-D Articulatory Signals Acquired By Electromagnetic Articulography To A Visual Display Of Lingual Movements For Biofeedback: Preliminary Results
Geralyn M. Schulz, James Hahn, Ge Jin, Jared Kiraly, Bahne Carstens, Brigitta Carstens

Speech production impairments following neurologic damage are extremely common, yet the evidence base for the efficacy of articulation remediation in neurogenic speech disorders is insufficient. Traditional techniques for re-training speech rely primarily on the adequacy of auditory feedback to shape articulatory movements of the tongue, lips, jaw, and soft palate. When such techniques fail to generalize or to be maintained, the cause may be neurological damage that impairs the ability to accurately use auditory feedback to shape articulator movements during speech re-learning. Visual (bio)feedback of lingual movement, of one’s own speech and/or that of others, might therefore be effective in establishing and promoting more accurate speech. However, one of the most difficult aspects of speech to convey visually is lingual movement within the oral cavity. The latest electromagnetic articulography system (AG500) can track articulatory movement in three dimensions. The purpose of this preliminary study was to demonstrate that lingual movement signals acquired by the AG500 can be translated into visual representations of lingual movement that subjects could use as biofeedback during speech (re)learning. We will discuss the development of the translation programs and present preliminary data collected from models and from several non-impaired speakers.
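The core translation step described above, turning 3-D sensor positions into a 2-D visual display, can be sketched as follows. This is a minimal illustrative example, not the authors' software: the function name, the millimetre-based coordinate frame, and the sensor layout are all assumptions, and the AG500 additionally reports orientation angles that are ignored here.

```python
# Hypothetical sketch of translating 3-D articulograph samples into
# 2-D screen coordinates for a midsagittal biofeedback display.
# All names and frame conventions are illustrative assumptions.

def project_midsagittal(samples, scale=4.0, origin=(320, 240)):
    """Map 3-D tongue-sensor positions (mm) to 2-D screen pixels.

    Projects onto the midsagittal (x-y) plane by discarding the
    lateral z component, then scales millimetres to pixels around
    a screen origin.
    """
    ox, oy = origin
    points = []
    for x, y, z in samples:
        px = ox + scale * x   # anterior-posterior axis -> horizontal
        py = oy - scale * y   # inferior-superior axis -> vertical
                              # (screen y grows downward, so negate)
        points.append((px, py))
    return points

# Example frame: three tongue sensors (tip, blade, dorsum) in mm.
frame = [(30.0, 10.0, 1.2), (15.0, 18.0, -0.5), (0.0, 22.0, 0.3)]
pixels = project_midsagittal(frame)
```

In a real-time display, each incoming sample frame would be projected this way and drawn as connected markers, giving the speaker a continuously updating side view of tongue shape.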
