Pubmed for Handhelds
Title: Recoding between two types of STM representation revealed by the dynamics of memory search.
Author: Leszczyński M, Myers NE, Akyürek EG, Schubö A.
Journal: J Cogn Neurosci; 2012 Mar; 24(3):653-63.
PubMed ID: 21812563.
Abstract: Visual STM (VSTM) is thought to be related to visual attention in several ways. Attention controls access to VSTM during memory encoding and plays a role in the maintenance of stored information by strengthening memorized content. We investigated the involvement of visual attention in recall from VSTM. In two experiments, we measured electrophysiological markers of attention in a memory search task with varying intervals between VSTM encoding and recall, and so we were able to track recoding of representations in memory. Results confirmed the involvement of attention in VSTM recall. However, the amplitude of the N2pc and N3rs components, which mark orienting of attention and search within VSTM, decreased as a function of delay. Conversely, the amplitude of the P3 and sustained posterior contralateral negativity components increased as a function of delay, effectively the opposite of the N2pc and N3rs modulations. These effects were only observed when verbal memory was not taxed. Thus, the results suggested that gradual recoding from visuospatial orienting of attention into verbal recall mechanisms takes place from short to long retention intervals. Interestingly, recall at longer delays was faster than at short delays, indicating that verbal representation is coupled with faster responses. These results extend the orienting-of-attention hypothesis by including an account of representational recoding during short-term consolidation and its consequences for recall from VSTM.