
Role of Brainwaves in Neural Speech Decoding


Abstract:

Neural speech decoding aims to decode speech directly from the brain to restore speech communication in patients with locked-in syndrome (fully paralyzed but aware). Despite recent progress, exactly which aspects of neural activity characterize the decoding process remains unclear. Neural oscillations are thought to play a key functional role in neural information processing and thus might provide significant insight into the decoding process. Previous research has investigated only a limited range of neural frequencies for decoding, usually high-gamma oscillations (70–200 Hz) in electrocorticography (ECoG) and lower-frequency waves (1–70 Hz) in electroencephalography (EEG). Hence, the exact contribution of specific frequency bands is still unclear. Magnetoencephalography (MEG) is a non-invasive method for directly measuring underlying brain activity and has the temporal resolution needed to investigate the role of cortical oscillations in speech decoding, which we attempted in this study. We used three machine learning classifiers (linear discriminant analysis (LDA), support vector machine (SVM), and an artificial neural network (ANN)) to classify different imagined and spoken phrases in order to determine the role of brainwaves in speech decoding. The experimental results showed a significant contribution of low-frequency Delta oscillations (0.1–4 Hz) to decoding, and the best performance was achieved when all the brainwaves were combined.
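
The abstract describes band-limited brain signals being fed to LDA, SVM, and ANN classifiers to compare the contribution of each frequency band. The sketch below is a rough illustration of that kind of pipeline, not the authors' code: the sampling rate, the band edges beyond those quoted in the abstract, the RMS-power features, and the network size are all assumptions.

```python
# Minimal sketch (assumed pipeline): band-pass MEG trials into canonical
# frequency bands and compare the three classifier families from the abstract.
import numpy as np
from scipy.signal import butter, sosfiltfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

FS = 1000  # assumed MEG sampling rate (Hz)
BANDS = {  # Delta edges follow the abstract; the rest are canonical assumptions
    "delta": (0.1, 4), "theta": (4, 8), "alpha": (8, 13),
    "beta": (13, 30), "gamma": (30, 70), "high_gamma": (70, 200),
}

def band_features(trials, band, fs=FS):
    """Band-pass each trial (n_trials, n_channels, n_samples) and return
    per-channel RMS power as a simple feature vector."""
    lo, hi = BANDS[band]
    sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, trials, axis=-1)
    return np.sqrt((filtered ** 2).mean(axis=-1))  # shape: (n_trials, n_channels)

def compare_classifiers(X, y):
    """5-fold cross-validated accuracy for LDA, SVM, and a small ANN."""
    models = {
        "LDA": LinearDiscriminantAnalysis(),
        "SVM": SVC(kernel="rbf"),
        "ANN": MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000),
    }
    return {name: cross_val_score(m, X, y, cv=5).mean() for name, m in models.items()}
```

Running `compare_classifiers(band_features(trials, "delta"), labels)` for each band, and again on the concatenation of all bands, would mirror the comparison summarized in the abstract (a single-band contribution versus all brainwaves combined).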
Date of Conference: 18-21 January 2021
Date Added to IEEE Xplore: 18 December 2020
Conference Location: Amsterdam, Netherlands


I. Introduction

Severe brain damage or amyotrophic lateral sclerosis (ALS) may cause locked-in syndrome, a state of paralysis with preserved cognitive awareness [1]. These patients lose their ability to communicate due to articulatory paralysis, leaving the neural pathway as the only medium for restoring a certain level of communication. Current brain-computer interface (BCI) spellers address this challenge by decoding attentional correlates from the brain while patients focus on selecting letters randomly displayed on a keyboard [2]. The slow communication rate (< 10 words/minute) of these BCIs is a major impediment to natural communication. Moving beyond these slow and laborious BCIs, current research is progressing towards fast communication by attempting to decode speech directly from the brain. Such neural speech decoding paradigms, or speech-BCIs, have the potential to offer real-time communication assistance, thereby improving the quality of life of these neurologically impaired patients.
