Pubmed for Handhelds
Title: Retrieval of visual, auditory, and abstract semantics.
Author: Noppeney U, Price CJ.
Journal: Neuroimage; 2002 Apr; 15(4):917-26.
PubMed ID: 11906232.

Abstract: Conceptual knowledge is thought to be represented in a large distributed network, indexing a range of different semantic features (e.g., visual, auditory, functional). We investigated the anatomical organization of these features, using PET, by contrasting brain activity elicited by heard words with (i) visual (e.g., blue), (ii) auditory (e.g., noise), or (iii) abstract (e.g., truth) meaning. The activation task was either repetition or semantic decision (e.g., does the meaning of the word relate to religion?). In the baseline conditions, the sound track of the words was reversed and subjects had to say "OK" (control for repetition) or make an acoustic decision (control for semantic decision). Irrespective of task, words relative to their corresponding controls activated the left posterior inferior temporal and inferior frontal cortices. In addition, semantic decisions on words with sensory (visual and auditory) meanings enhanced activation in a ventral region of the left anterior temporal pole. These results are consistent with neuropsychological studies showing that anterior temporal lobe damage can cause deficits for items that are mainly defined by their sensory features (i.e., concrete, particularly living items). Since modality-specific activation was observed only during the semantic decision task, we discuss whether it reflects retrieval of sensory semantics per se or the degree to which semantic associations are triggered during effortful retrieval.