A Noninvasive Brain-Computer Interface for Real-Time Speech Synthesis: The Importance of Multimodal Feedback.
Issue Date
2018-04
Author
Brumberg, Jonathan S.
Pitt, Kevin M.
Burnison, Jeremy Dean
Publisher
Institute of Electrical and Electronics Engineers
Type
Article
Article Version
Scholarly/refereed, author accepted manuscript
Rights
© 2018 IEEE
Abstract
We conducted a study of a motor imagery brain-computer interface (BCI) using electroencephalography to continuously control a formant frequency speech synthesizer with instantaneous auditory and visual feedback. Over a three-session training period, sixteen participants learned to control the BCI for production of three vowel sounds (/i/ [heed], /ɑ/ [hot], and /u/ [who'd]) and were split into three groups: those receiving unimodal auditory feedback of synthesized speech, those receiving unimodal visual feedback of formant frequencies, and those receiving multimodal, audio-visual (AV) feedback. Audio feedback was provided by a formant frequency artificial speech synthesizer, and visual feedback was given as a 2-D cursor on a graphical representation of the plane defined by the first two formant frequencies. We found that combined AV feedback led to the greatest performance in terms of percent accuracy, distance to target, and movement time to target compared with either unimodal feedback of auditory or visual information. These results indicate that performance is enhanced when multimodal feedback is meaningful for the BCI task goals, rather than as a generic biofeedback signal of BCI progress.
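To make the feedback mapping described in the abstract concrete, the sketch below shows one plausible way to map a normalized 2-D cursor position onto first and second formant frequencies (F1, F2) and render a short vowel-like sound with two cascaded resonant filters. This is an illustrative assumption, not the authors' synthesizer: the frequency ranges, filter design, and the names cursor_to_formants, resonator, and synthesize_vowel are all hypothetical.

```python
# Minimal sketch (illustrative assumptions, not the published implementation):
# map a cursor position on the F1-F2 plane to formant frequencies and
# synthesize a vowel-like sound by exciting two cascaded resonators.
import numpy as np
from scipy.signal import lfilter

FS = 16000  # sample rate in Hz (assumed)

def cursor_to_formants(x, y, f1_range=(250, 850), f2_range=(600, 2500)):
    """Map a cursor position (x, y) in [0, 1]^2 to (F1, F2) in Hz."""
    f1 = f1_range[0] + y * (f1_range[1] - f1_range[0])
    f2 = f2_range[0] + x * (f2_range[1] - f2_range[0])
    return f1, f2

def resonator(freq, bandwidth, fs=FS):
    """Two-pole digital resonator (formant filter) coefficients."""
    r = np.exp(-np.pi * bandwidth / fs)
    theta = 2 * np.pi * freq / fs
    a = [1.0, -2.0 * r * np.cos(theta), r ** 2]
    b = [1.0 - r]  # rough gain normalization
    return b, a

def synthesize_vowel(f1, f2, f0=120.0, dur=0.3, fs=FS):
    """Excite cascaded F1/F2 resonators with an impulse train at pitch f0."""
    n = int(dur * fs)
    source = np.zeros(n)
    source[::int(fs / f0)] = 1.0  # glottal impulse train
    for freq, bw in ((f1, 80.0), (f2, 100.0)):
        b, a = resonator(freq, bw, fs)
        source = lfilter(b, a, source)
    return source / np.max(np.abs(source))

if __name__ == "__main__":
    # Low F1, high F2: an /i/-like target on the formant plane.
    f1, f2 = cursor_to_formants(0.9, 0.1)
    audio = synthesize_vowel(f1, f2)
    print(f"F1 = {f1:.0f} Hz, F2 = {f2:.0f} Hz, {audio.size} samples")
```

In a real-time setting, a loop of this kind would be driven continuously by decoded EEG output, updating the cursor position and the synthesized formants on each control cycle.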
Citation
Brumberg, J. S., Pitt, K. M., & Burnison, J. D. (2018). A noninvasive brain-computer interface for real-time speech synthesis: The importance of multimodal feedback. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 26(4), 874-881. https://doi.org/10.1109/TNSRE.2018.2808425