Wednesday, May 19, 2010

CAPD and Underlying Perceptual and Neural Processes

CAPD (or APD) and Underlying Perceptual and Neural Processes
  • Sound Localization & Lateralization
  • Auditory Discrimination
  • Abnormal neurophysiologic representation of speech & nonspeech signals
  • Auditory Pattern Recognition
  • Temporal Processing
  • Auditory Performance With Competing/Degraded Acoustic Signals (a minimal degraded-signal sketch follows this list)
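As a rough illustration of the last item above, the sketch below mixes a target signal with competing noise at a chosen signal-to-noise ratio, which is the usual way degraded or competing-signal listening conditions are constructed. It is a minimal sketch assuming NumPy; the function name, the 1 kHz tone, and the +5 dB SNR are illustrative choices only.

    import numpy as np

    def mix_at_snr(target, noise, snr_db):
        """Scale `noise` so the target-to-noise power ratio equals `snr_db`,
        then return the mixture (a competing/degraded-signal stimulus)."""
        target = np.asarray(target, dtype=float)
        noise = np.asarray(noise, dtype=float)
        p_target = np.mean(target ** 2)   # average power of the target
        p_noise = np.mean(noise ** 2)     # average power of the masker
        # Gain that places the masker at the requested level relative to the target.
        gain = np.sqrt(p_target / (p_noise * 10 ** (snr_db / 10)))
        return target + gain * noise

    # Example: a 1 kHz tone in white noise at +5 dB SNR.
    fs = 16000
    t = np.arange(fs) / fs
    tone = np.sin(2 * np.pi * 1000 * t)
    degraded = mix_at_snr(tone, np.random.randn(fs), snr_db=5)

Lowering snr_db makes the masker relatively stronger, which is how speech-in-noise tests make the same target progressively harder to hear.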



Neurobiologic Origins of APD: Abnormal Neurophysiologic Representation of Auditory Stimuli
  • Abnormal neurophysiologic representation of speech & nonspeech signals
  • Interhemispheric transfer deficits (in children likely due to developmental delay in myelination)
  • Lack of appropriate hemispheric lateralization
  • Atypical hemispheric asymmetries
  • Atypical timing in CANS/imprecise synchrony of neural firing 
  • Decreased central inhibition 
Mapping the Auditory System
  • Auditory System is complex


    • The connectivity pattern of the subcortical auditory system subserves precise temporal processing


  • Auditory System Overlaps Other Systems


    • Sensory
    • Cognitive/Executive 
    • Motor Control
Neural Substrate of the Auditory System
  • Cortical: Temporal, Frontal, Parietal, Limbic & Insular Cortices
  • Corpus Callosum (interhemispheric commissural fibers)
  • Subcortical: Medial Geniculate Body (Thalamus), Striatum & Amygdala (Basal Ganglia)
  • Reticular Activating System (neurons project to CN; part of circuit controlling startle reflex) 
Tonotopicity

Tonotopicity is the spatial arrangement of where sounds of different frequencies are processed in the brain.  Sound reaching the cochlea sets the basilar membrane into motion; the membrane vibrates sympathetically with the frequencies in the sound and thereby encodes them.  The basilar membrane is a pseudo-resonant structure, much like the strings of a musical instrument, that varies along its length in width, stiffness, mass, and damping.  High-frequency sounds produce maximal vibration near the bottom (base) of the cochlea, and low-frequency sounds near the top (apex).  The fibers of the auditory (cochlear) branch of the vestibulocochlear nerve preserve this tonotopic frequency coding and transmit the sound information to the auditory cortex, the part of the brain that receives and interprets sound.
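To make the frequency-to-place mapping concrete, here is a minimal sketch of the Greenwood function, a commonly cited empirical frequency-position map for the human cochlea. The constants are the standard published human values; the positions printed are illustrative only.

    import numpy as np

    def greenwood_frequency(x):
        """Greenwood frequency-position function for the human cochlea.
        `x` is the fractional distance from the apex (0.0) to the base (1.0);
        returns the characteristic frequency in Hz at that place."""
        A, a, k = 165.4, 2.1, 0.88   # commonly cited human constants
        return A * (10 ** (a * np.asarray(x, dtype=float)) - k)

    # The apex (top) codes low frequencies; the base (bottom) codes high frequencies.
    for x in (0.0, 0.25, 0.5, 0.75, 1.0):
        print(f"{x:.2f} of the way from apex to base -> {greenwood_frequency(x):8.1f} Hz")

The output runs from roughly 20 Hz at the apex to about 20 kHz at the base, spanning the range of human hearing along the length of the basilar membrane.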

Listening in Noise 
  • Activates Auditory, Linguistic, Attention, Executive Control, Working Memory, Motor Planning Areas of Brain
  • Temporal Gyri, Thalamus, Cerebellum, Frontal Gyri
 Non-Modular and Interactive Nature of the Brain
  • Auditory system overlaps  sensory, cognitive, executive, & motor control systems.
  • Considerable evidence of multimodal convergence or interaction: sensory neurons respond to stimulation of different sensory modalities, and activity evoked by one modality modulates activity evoked by another


    • Many nuclei across the brainstem are sites of integration and multimodal sensory inputs
    •  Polysensory processing areas exist within the cortex.


  • Most domain-specific functions (including language) typically activate multiple areas across widespread regions of the brain.
  • Most neural regions support multiple functions (i.e., multiple cognitive domains can activate a single neural region)
  •  Increasing evidence that auditory cortex is under multisensory influence, AND
  • Suggestion that ALL sensory processing in neocortex might be multisensory
  • Sensory input is not only modulated by concurrent stimulation from other sensory modalities, but also modulated by top-down influences (e.g., attention, memory, or language)
  • Auditory processing is influenced by higher-order, nonmodality-specific decision processes, & by the underlying multimodal, crossmodal, & supramodal neural interfaces supporting performance of these behavioral tasks.
  • Emotion affects auditory processing of an unchanging auditory (speech) stimulus early in the sensory processing stream.
Attention and Auditory Processing: Cortical Level
  •  Top-down priming of sensory cortex: top-down inputs from other cortical areas form majority of input connections to both primary & secondary auditory cortex
  • Selective attention improves processing of high-priority stimuli in the environment at the expense of other, less relevant stimuli (e.g., the cocktail-party phenomenon)
  • Attentional effort is generally associated with greater task difficulty. When speech is degraded by noise, cortical areas related to attentional processing show increased activation, reflecting an increase in the relative importance of top-down influences in speech perception
  • Focusing attention to a given acoustic feature not only increases neural activity level, it also enhances neuronal selectivity to that feature in the particular part of the auditory cortex specialized in processing it. 
  • When subjects expect tones of a certain frequency, they tend to detect the expected frequency tones better than ones with an unexpected frequency in a continuous noise masker 
  • Attention activates regions of the auditory cortex that respond weakly or not at all to unattended tones. 
  • Attention does not just increase stimulus-dependent activation in auditory cortex– it leads to addition of activity in auditory cortex not activated by non-attended sounds. Attention activates neural populations separate from those processing the stimuli 
  • Auditory frequency discrimination learning induced by training with identical stimuli
  • Focused auditory attention selectively modulates sensory processing in cortex as early as 20 msec post-stimulus, as evidenced in ERPs & neuromagnetic fields
Attention and Auditory Processing: Subcortical and Cochlear Levels
  • Localization: attention influences which cue in time/intensity trading relationships is weighted more heavily (see the sketch after this list)
  • Auditory attention modulates OAEs: increased OAE amplitude suppression is seen in the contralateral ear at frequencies to which the subject's attention is focused (counting probes in noise), demonstrating that olivocochlear bundle (OCB) activity can be selectively enhanced or decreased when attention is directed to the ipsilateral ear.
  • Efferent olivocochlear activity predicts improvement in an auditory discrimination learning task
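As a rough illustration of the time cue in the time/intensity trading relationship mentioned above, the sketch below uses the classic Woodworth spherical-head approximation for the interaural time difference (ITD). The head radius and speed of sound are nominal assumed values; real cues also vary with frequency and individual anatomy.

    import math

    def itd_woodworth(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
        """Woodworth spherical-head approximation of the interaural time
        difference (ITD) for a source at `azimuth_deg` (0 = straight ahead,
        90 = directly to one side). Returns the ITD in seconds."""
        theta = math.radians(azimuth_deg)
        return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

    # The time cue grows from 0 at the midline to roughly 0.65 ms at 90 degrees;
    # attention can shift how heavily this cue is weighted against intensity.
    for az in (0, 30, 60, 90):
        print(f"{az:3d} deg -> {itd_woodworth(az) * 1e6:6.1f} microseconds")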
Auditory (Information) Processing
  •  Neither exclusively bottom-up (i.e., stimulus driven) nor top-down (i.e., strategy related including attentional)
  • Auditory cortex represents characteristics of both incoming auditory stimulus & of subjective sensory decisions.
  • Evidence counters traditional view of hierarchy of processing from more stimulus-driven (bottom-up) at early processing levels to more complex & cognitive processing (top-down) at higher levels.
  • Auditory processing consists of interactive networks, patterns of convergence & divergence, as well as parallel processing
  • The relative influence of top-down vs. bottom-up processing varies with changing listening demands (e.g., focused dichotic listening, attending to one ear first, elicits top-down executive & attentional processes); a minimal dichotic-stimulus sketch follows this list
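For readers unfamiliar with the term, a dichotic stimulus simply presents two different signals at the same time, one to each ear. The sketch below writes such a stimulus as a stereo WAV file; it assumes NumPy plus the standard-library wave module, and the two tones stand in for the digits or syllables used in clinical dichotic tests.

    import wave
    import numpy as np

    def write_dichotic_tones(path, f_left, f_right, fs=44100, dur=1.0):
        """Write a stereo WAV in which the left and right ears receive
        different, simultaneous tones: the basic dichotic arrangement."""
        t = np.arange(int(fs * dur)) / fs
        left = 0.5 * np.sin(2 * np.pi * f_left * t)
        right = 0.5 * np.sin(2 * np.pi * f_right * t)
        stereo = np.stack([left, right], axis=1)   # interleaved L/R frames
        pcm = (stereo * 32767).astype(np.int16)    # 16-bit PCM samples
        with wave.open(path, "wb") as w:
            w.setnchannels(2)
            w.setsampwidth(2)      # 2 bytes per sample = 16-bit
            w.setframerate(fs)
            w.writeframes(pcm.tobytes())

    # Example: 400 Hz to the left ear, 3000 Hz to the right ear.
    write_dichotic_tones("dichotic_demo.wav", f_left=400, f_right=3000)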
Comorbidity 
  • ADHD
  • Autism/Asperger’s 
  • Fragile X
  • Learning Disability (LD) 
  • Language Impairment 
  • Auditory Neuropathy/AD
  • Nonverbal LD 
  • Social-Emotional LD 
  • Fetal Alcohol Syndrome 
Why is there Co-morbidity?
  • Brain Organization & Information Processing 


    • Shared Neurophysiologic Substrate and Vasculature
    • Nonmodular, Temporally Coupled, Interfacing, Polymodal, Overlapping, Interconnected, Synchronized Networks


  • Brain Connectivity and Synchronization


    • Temporally Coupled Across Cortex, Modalities, & Hemispheres
    •  Deficient Timing (i.e., Prolonged Temporal Integration & Segmentation) Imposes Limitations That Spread or Cascade Across Modalities/Region


  • Nature of Insults: less than circumscribed; they extend across artificial boundaries
  • Shared Risk Factors: e.g., hyperbilirubinemia & prematurity are significant risk factors for AN/AD and CAPD; abnormal brainstem function is a possible risk factor contributing to learning problems
  • Shared Histories– e.g., Recurrent Otitis Media
  • Shared Genetics (e.g., ODD, CD, ADHD, & Reading Disorder; Pax 6 Mutation & working memory & interhemispheric transfer deficits; 73% of variation in dichotic listening due to genetic differences)
  • Shared Functional Deficits-- Working Memory Deficits??
