

PUBMED FOR HANDHELDS



  • Title: Decoding selective auditory attention with EEG using a transformer model.
    Author: Xu Z, Bai Y, Zhao R, Hu H, Ni G, Ming D.
    Journal: Methods; 2022 Aug; 204:410-417. PubMed ID: 35447360.
    Abstract:
    The human auditory system extracts relevant information in noisy environments while ignoring distractions, relying primarily on auditory attention. Studies have shown that the cerebral cortex responds differently to different sound source locations and that auditory attention is time-varying. In this work, we propose a data-driven encoder-decoder model for auditory attention detection (AAD), denoted the AAD-transformer. The model contains temporal self-attention and channel attention modules and reconstructs the speech envelope by dynamically assigning weights to the electroencephalogram (EEG) according to these attention mechanisms. In addition, the model operates in a data-driven manner and requires no additional preprocessing steps. The proposed model was validated on a binaural listening dataset in which the speech stimulus was Mandarin, and was compared with other models. The results showed that the decoding accuracy of the AAD-transformer with a 0.15-second decoding window was 76.35%, much higher than that of a linear model based on the temporal response function with a 3-second decoding window (an improvement of 16.27%). This work provides a novel auditory attention detection method whose data-driven character makes it convenient for neural-steered hearing devices, especially for users who speak tonal languages.
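The abstract above describes the architecture only at a high level: channel attention and temporal self-attention applied to EEG, a reconstructed speech envelope, and a decision about which speaker is attended within a short decoding window. The sketch below is a minimal, hypothetical PyTorch illustration of that general idea; the module names, layer sizes, pooling choices, and the correlation-based decision rule are assumptions made for demonstration and are not taken from the paper.

# Illustrative sketch only: a minimal envelope-reconstruction network combining
# channel attention and temporal self-attention, in the spirit of the abstract.
# All names, sizes, and hyperparameters are assumptions, not the authors' model.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Learns per-channel gating weights for the EEG input (assumed design)."""

    def __init__(self, n_channels: int):
        super().__init__()
        self.score = nn.Sequential(
            nn.Linear(n_channels, n_channels),
            nn.Sigmoid(),
        )

    def forward(self, x):  # x: (batch, time, channels)
        # Pool over time to get one descriptor per channel, then gate channels.
        weights = self.score(x.mean(dim=1))          # (batch, channels)
        return x * weights.unsqueeze(1)              # reweighted EEG


class EnvelopeDecoder(nn.Module):
    """Temporal self-attention over EEG frames, then a linear readout to a
    one-dimensional speech-envelope estimate per time step."""

    def __init__(self, n_channels: int = 64, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.channel_attn = ChannelAttention(n_channels)
        self.embed = nn.Linear(n_channels, d_model)
        self.temporal_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.readout = nn.Linear(d_model, 1)

    def forward(self, eeg):  # eeg: (batch, time, channels)
        x = self.channel_attn(eeg)
        x = self.embed(x)
        x, _ = self.temporal_attn(x, x, x)           # temporal self-attention
        return self.readout(x).squeeze(-1)           # (batch, time) envelope


# Usage example: pick the attended speaker by correlating the reconstructed
# envelope with each candidate speech envelope over a short decision window.
if __name__ == "__main__":
    model = EnvelopeDecoder()
    eeg = torch.randn(1, 128, 64)                    # 1 trial, 128 samples, 64 channels
    recon = model(eeg)
    env_a, env_b = torch.randn(1, 128), torch.randn(1, 128)
    corr = lambda a, b: torch.corrcoef(torch.stack([a.flatten(), b.flatten()]))[0, 1]
    attended = "A" if corr(recon, env_a) > corr(recon, env_b) else "B"
    print("attended speaker:", attended)

In a real AAD pipeline the model would be trained to regress the envelope of the attended speech from the EEG, and the correlation comparison would then be applied within each decoding window; the 0.15-second window reported in the abstract corresponds to that decision step, not to the training procedure shown here.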