Brain-Machine Interfaces for Real-time Speech Synthesis

Issue Date
2011-08
Author
Guenther, Frank H.
Brumberg, Jonathan S.
Publisher
Institute of Electrical and Electronics Engineers
Type
Article
Article Version
Scholarly/refereed, author accepted manuscript
Rights
© 2011 IEEE
Abstract
This paper reports on studies involving brain-machine interfaces (BMIs) that provide near-instantaneous audio feedback from a speech synthesizer to the BMI user. In one study, neural signals recorded by an intracranial electrode implanted in a speech-related region of the left precentral gyrus of a human volunteer with locked-in syndrome were transmitted wirelessly across the scalp and used to drive a formant synthesizer, allowing the user to produce vowels. In a second pilot study, a neurologically normal user was able to drive the formant synthesizer with imagined movements detected using electroencephalography. Our results support the feasibility of neural prostheses that have the potential to provide near-conversational synthetic speech for individuals with severely impaired speech output.
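The pipeline the abstract describes, decoded neural activity steering a formant synthesizer, can be sketched in miniature. This is an illustrative assumption, not the authors' implementation: the mapping function, formant ranges, and filter parameters below are all hypothetical, and a real BMI would decode firing rates or EEG features rather than the two normalized scalars used here.

```python
# Hypothetical sketch: two decoded neural control signals are mapped to
# formant frequencies (F1, F2), which drive a simple two-formant vowel
# synthesizer. All names and parameter ranges are illustrative
# assumptions, not the code from the paper.
import math

FS = 8000  # sample rate in Hz (assumed)

def decode_to_formants(x1, x2):
    """Map two normalized neural features (0..1) linearly onto
    typical adult F1/F2 ranges (assumed ranges)."""
    f1 = 300 + x1 * (800 - 300)    # F1 span: 300-800 Hz
    f2 = 900 + x2 * (2300 - 900)   # F2 span: 900-2300 Hz
    return f1, f2

def resonator(signal, fc, bw, fs=FS):
    """Second-order IIR resonator: a standard digital formant filter
    with center frequency fc and bandwidth bw (both in Hz)."""
    r = math.exp(-math.pi * bw / fs)
    theta = 2 * math.pi * fc / fs
    a1 = -2 * r * math.cos(theta)
    a2 = r * r
    b0 = 1 - r  # rough gain normalization
    y1 = y2 = 0.0
    out = []
    for x in signal:
        y = b0 * x - a1 * y1 - a2 * y2
        y2, y1 = y1, y
        out.append(y)
    return out

def synthesize_vowel(f1, f2, dur=0.1, f0=120):
    """Impulse-train glottal source passed through F1 and F2 resonators."""
    n = int(dur * FS)
    period = int(FS / f0)
    source = [1.0 if i % period == 0 else 0.0 for i in range(n)]
    return resonator(resonator(source, f1, 80), f2, 120)

# Mid-range neural activity maps to a mid vowel.
f1, f2 = decode_to_formants(0.5, 0.5)
samples = synthesize_vowel(f1, f2)
```

In a real-time system, `decode_to_formants` would run on each new window of neural data and the synthesizer would stream the resulting samples to the audio output, closing the feedback loop the paper emphasizes.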
Citation
Guenther, F. H., & Brumberg, J. S. (2011). Brain-Machine Interfaces for Real-time Speech Synthesis. Conference Proceedings : ... Annual International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE Engineering in Medicine and Biology Society. Conference, 2011, 5360–5363. http://doi.org/10.1109/IEMBS.2011.6091326