Beyond the right hemisphere: brain mechanisms mediating vocal emotional processing


Review TRENDS in Cognitive Sciences Vol.10 No.1 January 2006

Annett Schirmer 1 and Sonja A. Kotz 2
1 Department of Psychology, University of Georgia, Athens, Georgia, USA
2 Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany

Vocal perception is particularly important for understanding a speaker's emotional state and intentions because, unlike facial perception, it is relatively independent of speaker distance and viewing conditions. The idea, derived from brain lesion studies, that vocal emotional comprehension is a special domain of the right hemisphere has failed to receive consistent support from neuroimaging. This conflict can be reconciled if vocal emotional comprehension is viewed as a multi-step process with individual neural representations. This view reveals a processing chain that proceeds from the ventral auditory pathway to brain structures implicated in cognition and emotion. Thus, vocal emotional comprehension appears to be mediated by bilateral mechanisms anchored within sensory, cognitive and emotional processing systems.

Introduction

It is a long-established [1] and persistent notion [2] that the right hemisphere is specialized for processing the emotional information conveyed in a speaker's voice. This notion derives from research demonstrating that damage to the right hemisphere is more detrimental to an individual's ability to recognize vocal emotional expressions than is damage to the left hemisphere [3]. Despite its significance for current views of brain function, the right hemisphere model is not unchallenged. For example, evidence exists that emotionally relevant acoustic cues such as frequency and temporal information are differently lateralized in the brain [4,5]. Furthermore, some studies implicate subcortical structures such as the basal ganglia [6] and the amygdala [7].
However, rather than leading to a unified model of vocal emotional processing, these findings nourish opposing views that divide the field of vocal emotion research. One approach to integrating the seemingly conflicting findings is to consider vocal emotional comprehension as a multi-step process with individual sub-processes that are differentially represented in the brain. These sub-processes can be described as (i) analyzing the acoustic cues of vocalizations, (ii) deriving emotional significance from a set of acoustic cues, and (iii) applying emotional significance to higher order cognition. The work reviewed here addresses these sub-processes and elucidates their neuroanatomical and temporal underpinnings. Moreover, the findings are integrated into a working model of vocal emotional processing. (Corresponding author: Schirmer, A. (schirmer@uga.edu). Available online 29 November 2005.)

The sounds of emotion

Whether we think someone is scared or annoyed greatly depends on the sound of his or her voice. That the voice can betray these feelings is the result of vocal production being modulated by physiological parameters that change depending upon emotional state. Arousal-mediated changes in heart rate, blood flow and muscle tension, among other things, modulate the shape, functionality and sound of the vocal production system. For example, increased emotional arousal is accompanied by greater laryngeal tension and increased subglottal pressure, which increases a speaker's vocal intensity. Additionally, vocal emotional expressions reflect communicative intentions. For example, Darwin observed that angry utterances sound harsh and unpleasant because they are meant to strike terror into an enemy [8]. Together, physiological modulations of the vocal production system and communicative intentions shape the way we speak, making us sound frightened or frightening.
The acoustic cues that convey emotions comprise amplitude, timing and fundamental frequency (F0), the last of these being perceived as pitch. An additional cue to emotion is voice quality, the percept derived from the energy distribution of a speaker's frequency spectrum, which can be described using adjectives such as shrill, harsh, or soft. As some emotions are believed to have a unique physiological imprint, they are proposed to be expressed in a unique manner. For example, happiness is characterized by a fast speech rate, high intensity, and high mean F0 and F0 variability, sounding both melodic and energetic. By contrast, sad vocalizations are characterized by a slow speech rate, low intensity, and low mean F0 and F0 variability, but are high in spectral noise, resulting in the impression of a broken voice [9] (Figure 1). Thus, understanding a vocal emotional message requires the analysis and integration of a variety of acoustic cues.

© 2005 Elsevier Ltd. All rights reserved.
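The three cue families just described (timing, amplitude, F0) can be made concrete with a small signal-processing sketch. This is purely illustrative and not taken from the reviewed studies: the function name, the 40-ms frame length and the 75-400 Hz search range are arbitrary choices, and the F0 estimator is a bare autocorrelation method (real analyses would use a dedicated tool such as Praat).

```python
import numpy as np

def acoustic_profile(signal, sr):
    """Illustrative sketch: duration (timing), RMS (intensity) and
    frame-wise F0 statistics (pitch) for a mono signal at rate sr."""
    duration = len(signal) / sr
    rms = float(np.sqrt(np.mean(signal ** 2)))

    frame = int(0.04 * sr)                      # 40-ms analysis frames
    f0s = []
    for start in range(0, len(signal) - frame, frame):
        x = signal[start:start + frame]
        x = x - x.mean()
        # Autocorrelation peak within the assumed 75-400 Hz voice range
        ac = np.correlate(x, x, mode="full")[frame - 1:]
        lo, hi = int(sr / 400), int(sr / 75)
        lag = lo + int(np.argmax(ac[lo:hi]))
        f0s.append(sr / lag)
    f0s = np.array(f0s)
    return duration, rms, float(f0s.mean()), float(f0s.std())

# Toy check: a 2-s, 220-Hz tone at half amplitude
sr = 16000
t = np.arange(2 * sr) / sr
tone = 0.5 * np.sin(2 * np.pi * 220 * t)
dur, rms, f0_mean, f0_sd = acoustic_profile(tone, sr)
```

On real speech, comparing these four numbers across a happy and a sad rendition of the same sentence would reproduce, in miniature, the contrasts summarized above and in Figure 1.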

[Figure 1 panels: sentence spoken with happy vs. sad prosody; oscillograms and spectrograms with Frequency (Hz) and Time (s) axes.]

Figure 1. Oscillogram (top panel) and spectrogram (bottom panel) of the German sentence "Die ganze Zeit hatte ich ein Ziel." ("During all this time I had one goal."). The sentence is shorter when produced with a happy prosody (2 s) than with a sad prosody (2.2 s). Additionally, the speaker is louder, as can be seen by comparing the sound envelope illustrated in the oscillogram. This envelope is larger (i.e. it deviates more from baseline) for happy than for sad prosody. Spectral differences between happy and sad prosody are illustrated in the spectrogram. The dark shading indicates the energy of frequencies up to 5000 Hz. The superimposed blue lines represent the fundamental frequency contour, which is perceived as speech melody. This contour shows greater variability and a higher mean for happy than for sad prosody.

Sensory processing

Neuroanatomical and temporal underpinnings

The analysis of emotionally relevant acoustic cues is mediated by a pathway that runs from the ear through several stations in the brain stem up to the thalamus in the ipsilateral and contralateral hemispheres. Input to the contralateral thalamus is considerably larger than that to the ipsilateral thalamus, a pattern that continues at the level of the auditory cortex, located in the superior temporal lobe of each hemisphere. The auditory cortex is divided into a core or primary region that is surrounded by a belt or secondary region [10] (Figure 2). The core region has been shown to be tonotopically organized, which means that a specific frequency activates a specific group of neurons that is less likely to be activated by another frequency. Complex tones (e.g.
vocal expressions), which comprise several frequencies, are represented in the belt region of the auditory cortex, where input from the core is integrated [11]. The firing rate of neurons in both core and belt regions is modulated by sound intensity [12]. Additionally, increases in sound intensity are associated with an increase in the number of firing neurons [13], suggesting the existence of different intensity encoding mechanisms at the level of the auditory cortex. One way to specify the time course of frequency and intensity encoding is to measure event-related potentials (ERPs). Differences in frequency or sound intensity modulate the amplitude of an ERP negativity that peaks ~100 ms following stimulus onset and is referred to as the N1. The N1 is generated in bilateral secondary auditory

[Figure 2 panels: (a) stages from utterance through sensory processing (auditory cortex), integration (auditory "what" pathway: bilateral STG, anterior STS) and cognition (evaluative judgments: right IFG, OFC; semantic processing: left IFG), over a time axis in ms; (b) right sagittal brain view.]

Figure 2. (a) Three-stage working model for the processing of emotional prosody. Sensory processing (Stage 1): acoustic analysis is mediated by bilateral auditory processing areas. Integration (Stage 2): processing along the auditory "what" pathway integrates emotionally significant acoustic information to derive an emotional gestalt. This pathway projects from the superior temporal gyrus (STG) to the anterior superior temporal sulcus (STS) and might be lateralized to the right hemisphere (RH). Cognition (Stage 3): emotional information derived at the level of the STS is made available for higher-order cognitive processes.
For example, explicit evaluative judgments of emotional prosody are mediated by the right inferior frontal gyrus (IFG) and orbitofrontal cortex (OFC), whereas the integration of emotional prosody into language processing recruits the inferior frontal gyrus in the left hemisphere (LH). Contextual or individual significance might facilitate or enhance processing at any of the three stages. (b) Schematic presentation of brain areas implicated in vocal emotional processing in a right sagittal view: primary, secondary and tertiary auditory cortex (light blue) extending to the anterior portion of the superior temporal sulcus (dark blue), from where projections reach the inferior frontal gyrus and orbitofrontal gyrus (green). Arrows (yellow) indicate presumed processing directions (colors/numbers correspond to the processing stages outlined in (a); brain template adapted from [70]).

cortex [14], suggesting that this region encodes frequency and intensity within a hundred milliseconds. Likewise, early effects have been reported for the processing of changes in duration or temporal structure. For example, unexpected duration decrements and stimulus omissions elicit a brain response between 100 and 200 ms after stimulus onset, which is termed the mismatch negativity (MMN). This brain response reflects activity in secondary auditory cortex followed by activity in inferior frontal cortex [15,16]. This inferior frontal activity is thought to reflect attentional reallocation induced by acoustic change. This, together with the finding of increased N1 amplitudes and increased activity in the auditory cortex for attended as compared with unattended stimuli, indicates that attention modulates early acoustic processing [17,18].

Hemispheric lateralization

Hemispheric differences in the sensitivity to different kinds of acoustic information were proposed in the late 1970s [19]. Although this idea has not been consistently supported [20], recent studies using intracranial recordings [21] or fMRI/PET [22] have provided compelling evidence. This work showed that auditory processing areas in the left and right hemisphere differ in their temporal resolution. Compared with the right hemisphere, the left hemisphere seems to operate on a finer temporal scale and is thus better equipped for the analysis of rapidly changing information such as is present in speech. By contrast, the supposed lower temporal resolution of the right hemisphere has been associated with a specialization in pitch processing. Although current models of hemispheric specialization largely agree upon this pattern, differences exist as to the proposed underlying mechanisms and the auditory areas that are thought to show hemispheric specialization [22,23].
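The trade-off underlying these proposals is the one familiar from short-time Fourier analysis: a short analysis window resolves timing finely but frequency coarsely, and vice versa. A toy calculation (the sampling rate and window lengths are arbitrary illustrative values, not a claim about cortical mechanisms):

```python
# Frequency resolution of a DFT is the sampling rate divided by the
# window length: longer integration windows trade temporal precision
# for finer spectral detail.
sr = 16000                          # Hz, illustrative sampling rate
resolution = {}
for win_ms in (25, 250):            # short vs. long integration window
    win = int(sr * win_ms / 1000)   # window length in samples
    resolution[win_ms] = sr / win   # DFT bin spacing in Hz
```

Here the 25-ms window yields 40-Hz bins (good timing, coarse pitch), while the 250-ms window yields 4-Hz bins (fine pitch, poor timing), mirroring the proposed left/right asymmetry.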
A recent proposal [23] holds that auditory processing occurs on two separate time scales: one ranging from 25 to 50 ms and one ranging from 200 to 300 ms. Although the bilateral superior temporal gyrus (STG) operates on both of these scales, the output is believed to be differentially projected to the superior temporal sulcus (STS) in the left and right hemisphere. Compared with the left hemisphere, the right hemisphere seems to rely more strongly on the 200-300 ms scale, which allows a better estimate of dynamic pitch.

Integration of emotionally significant acoustic cues

A processing pathway for auditory objects

By analogy with the figure-ground grouping mechanisms that mediate the perception of visual objects [24], some researchers propose the existence of mechanisms that allow listeners to perceive acoustic events as perceptual entities or auditory objects [25]. Several models explain how these objects are processed by the auditory system [26,27]. One model that has received considerable support from monkey [28] and human data [29-31] holds that auditory objects are categorized ("what") and localized ("where") in segregated processing streams. The "what" processing stream is considered particularly important for the processing of speech and non-speech vocalizations and has been described as reaching from the auditory cortex to the lateral STG and the STS [32-34]. Processing is believed to become increasingly complex and integrative as it progresses towards the anterior STS, from where it feeds into higher order cognitive processes via connections to the prefrontal and middle temporal cortex. One approach to studying the processing of linguistic auditory objects (i.e. words) is to compare intelligible speech with spectrally rotated, unintelligible speech. Such a comparison is advantageous because the spectrally rotated control stimuli match the experimental stimuli in acoustic complexity but lack a memory representation or meaning.
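Spectral rotation itself can be sketched in a few lines. The cited studies used analogue rotation around roughly 2 kHz after low-pass filtering; the FFT-based version below is a simplified stand-in (the function name and 4-kHz cutoff are illustrative choices), but it shows the key property: the spectrum is flipped upside down within the band, preserving overall acoustic complexity while destroying intelligibility.

```python
import numpy as np

def spectrally_rotate(signal, sr, cutoff=4000):
    """Illustrative spectral rotation: flip the spectrum below
    `cutoff` end-for-end, so energy at frequency f moves to
    roughly (cutoff - f). Band energy is preserved."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), 1 / sr)
    band = freqs <= cutoff
    rotated = spec.copy()
    rotated[band] = spec[band][::-1]      # invert the band
    return np.fft.irfft(rotated, n=len(signal))
```

For example, with a 16-kHz sampling rate a pure 500-Hz tone comes out as a tone near 3500 Hz: the signal is equally "complex" but its spectral content is displaced, which is what makes rotated speech unintelligible yet acoustically matched.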
In accord with the assumption of an anteriorly directed "what" processing stream, both intelligible and unintelligible speech activate the left STG, whereas the left anterior STS responds to intelligible speech only [35]. As speech intelligibility fails to modulate STG and STS activity in the right hemisphere, it seems that the processing stream mediating the identification of linguistic auditory objects is lateralized to the left hemisphere [33]. By contrast, the comparison between intelligible and unintelligible non-speech vocal sounds (e.g. natural and frequency-scrambled moans) elicits activity in the right middle and anterior STS, suggesting that paralinguistic aspects of vocal processing (e.g. speaker gender, age, emotional state) are lateralized to the right hemisphere [36].

Emotional vocalization: a class of auditory objects

If one accepts the notion that emotional vocalizations are paralinguistic auditory objects, one might assume that the right STS contributes to the processing of vocal emotional expressions. Unfortunately, investigations addressing this question are rare. To our knowledge, there are only five published studies that compared emotional vocalizations with neutral or emotionally different vocalizations [7,37-40]. Two of these studies implicated bilateral middle [37] and posterior [38] STS, and two studies found activations in surrounding superior and middle temporal cortex in the left hemisphere [7,40]. Overall, these findings match only partially the predictions derived from work comparing intelligible and unintelligible speech and non-speech vocal sounds. This could be explained by the fact that the latter two studies used intelligible vocalizations as control stimuli. Such a comparison might be more sensitive to lower-level acoustic processes than the comparison with unintelligible emotional vocalizations and thus elicit more posterior activity in the auditory "what" processing stream.
Moreover, as both emotional and neutral vocalizations are meaningful auditory objects, both might activate the anterior STS, such that a contrast between the two conditions disguises the role of this region in processing the vocal cues to emotion. Support for the notion that the emotional significance of vocalizations is represented in the anterior STS, and that this representation is lateralized to the right hemisphere, comes from recent work by Kotz and colleagues (unpublished data), who compared intelligible and unintelligible emotional speech.

From acoustic cues to emotional significance in 200 ms

The timing of the processes that abstract emotional significance from a set of auditory cues can be estimated

based on ERP findings. Similar to the discrimination of basic acoustic cues such as stimulus duration, pitch and amplitude, emotional discrimination can be studied by means of the MMN [41], a correlate of preattentive auditory processing. The MMN is considered preattentive because it is elicited by acoustic change even when participants perform an unrelated visual task [42]. Work that used the MMN as a marker for preattentive vocal emotional processing presented emotionally and neutrally spoken syllables as standards and deviants in an oddball paradigm. Responses to stimuli that served as standards in one condition were subtracted from responses to physically identical stimuli that served as deviants in another condition. The MMN revealed by this subtraction was larger for emotional than for neutral vocalizations [41]. Given that the difference in MMN amplitude between emotional and neutral vocalizations peaked around 200 ms after stimulus onset, one can assume that at this point listeners have integrated primary acoustic information and derived emotional significance. Factors that modulate MMN amplitude to emotional vocalizations are contextual and individual relevance. For example, the individual relevance of emotional vocalizations seems to differ between men and women, which results in a differential emotional sensitivity (see Box 1). Note that although it is known that the MMN has generators in the superior temporal gyrus [43], it is unclear which brain regions generate the emotion effect associated with the MMN. Based on evidence implicating the right anterior STS in vocal processing [36,37], one might speculate that this region mediates the emotional modulation of the MMN.

Cognition

Cognitive evaluation of emotional significance

Most published neuroimaging studies on emotional-prosodic processing require participants to perform

Box 1.
Sex differences in vocal emotional processing

Gender stereotypes hold that women are more sensitive than men to the feelings of others. Behavioral research reviewed by Hall [65] in the late 1970s suggested that there is some truth to this belief. On average, women were found to be more accurate than men in recognizing emotions from faces, gestures and voices. More recent work further suggests sex differences in the neurophysiological correlates of vocal emotional processing. Women, but not men, show a larger mismatch negativity to emotional than to neutral vocalizations that are outside their attentional focus [66]. Furthermore, women, but not men, show an N400 effect and activity in the left IFG to words with incongruous as compared with congruous emotional prosody when only one of the two types of information is task-relevant [41,60] (see Figure I). However, if the task is to judge the congruence between emotional prosody and word meaning, processing in men and women is comparable [62]. These findings indicate that men and women differ in how automatically they access and integrate emotional-prosodic information into language processing. Evidence that women are better than men at remembering the emotional tone of an utterance further suggests that these processing differences result in gender-specific memory representations of social interactions (Mecklinger et al., unpublished data). Based on these findings, one can assume that emotional expressions, and thus social interactions, are of greater significance to women than to men. Evidence that supports this assumption comes from developmental research showing that girls more frequently select toys that elicit social play (e.g. dolls) than do boys. These preferences cannot be overruled by parental influences [67] and can be observed even in monkeys [68]. This suggests that sex differences in social perception and behavior are biologically mediated.
The action of steroids and neuropeptides offers one possible mechanism for this mediation. For example, oxytocin, although present in both sexes, is more important in women and is believed to increase women's affiliative behavior in stressful situations [69]. The greater interest in affiliation might make women more dependent than men on the emotional state of others, which might in turn enhance their perception of emotional cues.

[Figure I panels: (a) fMRI contrasts (z = 3.1) for female and male participants; (b) ERPs at electrode CZ (-5 µV scale, 800-ms epoch) for emotional prosody match vs. mismatch.]

Figure I. (a) fMRI contrast [41] and (b) ERP [60] for emotional words (e.g. "loved") spoken with congruous (e.g. happy; solid line) as compared with incongruous (e.g. angry; dotted line) emotional prosody when emotional prosody is task-irrelevant. The fMRI contrast reveals activity in the left inferior frontal gyrus in women but not in men. The ERP reveals an N400 effect in women but not in men. These findings suggest that listeners integrate vocal and verbal emotional information ~400 ms following word onset and that this integration is mediated by the left inferior frontal gyrus. Moreover, women seem to use emotional prosody more automatically for language processing than men. Reproduced with permission from [41,60].
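The ERP effects discussed in this box, like the MMN effects discussed in the main text, rest on the same condition-subtraction logic: average the single-trial epochs of each condition and subtract. A minimal numpy sketch with simulated rather than recorded data (all amplitudes, trial counts and the sampling rate are invented for illustration):

```python
import numpy as np

def difference_wave(epochs_a, epochs_b):
    """Condition averages and their difference. epochs_* have shape
    (n_trials, n_samples); returns the b-minus-a difference wave."""
    return epochs_b.mean(axis=0) - epochs_a.mean(axis=0)

# Simulated example: 100 trials per condition, 800-ms epochs at 500 Hz;
# the "incongruous" condition carries an extra negativity near 400 ms.
rng = np.random.default_rng(1)
n_trials, n_samples, sr = 100, 400, 500
t = np.arange(n_samples) / sr                    # time in seconds
n400 = -5.0 * np.exp(-((t - 0.4) / 0.06) ** 2)   # negative-going bump
congruous = rng.normal(0, 2, (n_trials, n_samples))
incongruous = rng.normal(0, 2, (n_trials, n_samples)) + n400
diff = difference_wave(congruous, incongruous)
peak_latency_ms = 1000 * t[np.argmin(diff)]      # near 400 ms
```

Averaging suppresses the trial-to-trial noise so that the subtraction recovers the condition-specific negativity, which is why effects like those in Figure I emerge only across many trials.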

emotional judgments. To make these judgments, participants have to attach a verbal label to a perceived vocal expression. This implies that, in addition to brain structures implicated in simple preattentive emotional discrimination, further brain structures have to be recruited.

Table 1. Brain structures activated by emotional judgments

Frontal, orbital (BA 11): [48]a,b; [47]
Frontal, inferior (BA 44): [45]a; [47]; [49]a; [48]b
Frontal, inferior (BA 45): [47]; [49]a; [48]b; [45]c; [44]
Frontal, inferior (BA 47): [49]a; [48]a,b; [47]; [49]b; [46]
Frontal, middle (BA 46): [47]; [49]a; [48]b; [45]c
Frontal, middle/superior (BA 9): [45]a,b; [47]; [49]a; [48]b
Temporal, superior (BA 22): [48]b; [45]a; [49]a; [49]b; [45]b
Temporal, superior (BA 41): [48]b; [49]a; [49]b
Temporal, superior (BA 42): [48]b; [49]a; [49]b
Parietal, inferior (BA 40): [45]a; [47]; [49]a; [45]b

Included structures were activated in three or more of the following studies: [39] prosodic vs. verbal emotion identification; [44] emotion vs. consonant detection; [45] (a) emotion identification vs. passive listening in Chinese participants, (b) emotion identification vs. passive listening in English participants, (c) emotion vs. intonation identification in Chinese participants, (d) emotion vs. intonation identification in English participants; [46] emotion identification vs. word repetition; [47] discrimination of expressiveness of sentence 1 vs. sentence 2; [48] (a) emotion vs. intonation identification, (b) emotion identification vs. rest; [49] (a) emotion identification vs. rest, (b) emotion vs. vowel identification. Each activation was further classified by lateralization: only LH (activity only in the left hemisphere), LH > RH (activity larger in the left than in the right hemisphere), LH/RH (activity comparable in the left and right hemisphere, or no information about lateralization), LH < RH, or only RH. The two structures implicated most frequently and most consistently across studies are BA 45 and 47.
These structures can be isolated by comparing emotional judgments against a resting baseline or another task (see Table 1). Out of seven studies that conducted such a comparison, six found activity in the right inferior frontal gyrus (IFG) [44-49] (but see [39]). Specifically, these activations were located in Brodmann areas (BA) 45 and 47, the latter reaching into orbitofrontal cortex [50,51]. Given cytoarchitectonic differences between BA 45 and 47 [52], their functional roles in emotional judgment tasks might also differ: whereas BA 45 seems to mediate more cognitive aspects of emotional judgments, such as labeling emotional expressions, BA 47 might be engaged in retrieving their reward value [53,54]. Given that activations in BA 45 and 47 show up not only in contrasts with a resting baseline but also with another task, they are unlikely to reflect task-induced mnemonic or attentional demands.

Integration of vocal and verbal information

Once listeners have evaluated the emotions conveyed in a speaker's voice, this information can be integrated into other co-occurring processes. For example, the processing of spoken utterances in social interactions relies crucially on the integration of vocal and verbal information. The importance of this integration is best illustrated by utterances such as "You moron!" or "Great job!", which have to be interpreted as banter or sarcasm, respectively, if tone of voice contradicts verbal meaning. Interpreting spoken utterances as banter or sarcasm requires additional processing effort, which is reflected by increased activity in the left IFG for words spoken with incongruous as compared with congruous emotional prosody [41]. That this additional effort impinges on semantic processing can be inferred from evidence that IFG activation correlates with the effort of semantic selection [55], retrieval [56] and integration [57]. An estimate of when in time emotional prosody modulates semantic processing can be obtained from ERP measures.
Specifically, the N400, a negativity peaking around 400 ms following word onset, reflects semantic processing effort. This negativity is larger for words presented in incongruous than in congruous sentence contexts [58] (e.g. "I drink my coffee with cream and socks/sugar."). Similar N400 differences can be observed between words presented with incongruous as compared with congruous emotional prosody [59-62], suggesting that emotional prosody modulates similar aspects of semantic processing as does sentence context. However, unlike sentence context, emotional prosody shows effects that depend on task relevance, on its temporal availability relative to the onset of crucial verbal information [61], and on inter-individual variables (e.g. gender [60]; see Box 1), indicating that the use of emotional prosody for language processing is more variable in that it is context- and person-specific. Note that, as for the MMN, one has to be careful in relating temporal and neuroanatomical information revealed by ERP and fMRI/PET, respectively. However, based on research that has localized the source of the N400 elicited by semantic incongruities, one can draw tentative conclusions. This work implicates several temporal and frontal lobe structures, including the left IFG [63], suggesting that activity in these regions observed with fMRI (for a review see [41]) occurs with a latency of ~400 ms. Given that the generators of the emotional-prosodic N400 effect are unknown, one can only speculate as to whether this effect is mediated by the IFG.

Towards a model of emotional-prosodic processing

Together, the evidence reviewed above supports the idea that vocal emotional comprehension comprises sub-processes that are differentially represented in the brain. Concluding from this evidence, emotional-prosodic comprehension can be envisioned as a hierarchical process, and a working model can be derived as a starting point for future research.
According to this model, the auditory cortex mediates the analysis of acoustic information (Stage 1 in Figure 2a). This brain region codes frequency and amplitude information, as well as their temporal envelope, within the first 100 ms following stimulus onset. Hemispheric differences in temporal resolution mediate a right hemisphere lateralization for spectral processing

and a left hemisphere lateralization for temporal processing. Following basic acoustic processing, vocal emotional expressions recruit areas along the auditory "what" processing stream that encode the emotional significance of vocalizations (Stage 2 in Figure 2a). During this processing stage, the different acoustic cues that convey emotional information are integrated as processing progresses towards the anterior STS. Activity at the level of the STS seems lateralized to the right hemisphere and occurs with a latency of ~200 ms. Emotional significance derived at the level of the anterior STS is then available for higher order cognitive processes, such as evaluative judgments, mediated by the right inferior frontal and orbitofrontal cortex, or effortful semantic processing associated with banter and sarcasm, mediated by the left inferior frontal cortex (Stage 3 in Figure 2a).

Box 2. Questions for future research

- What sub-processes of vocal-emotional comprehension are emotional and therefore would be expected to show a correlation between neuronal activity and other physiological measures such as heart rate and electrodermal activity?
- What are the similarities and differences in the brain mechanisms that mediate the representation of auditory and visual emotional objects?
- Does the processing of emotional expressions differ depending on whether these expressions reflect a speaker's emotional state or intentions?
- How are different types of vocal emotional expressions (e.g. happiness, anger) represented in the brain?
- Does the right anterior STS discriminate between emotional and other types of paralinguistic vocal information (e.g. speaker gender, age)?
- How are non-vocal emotional sounds (e.g. gun shots) represented in the brain?
- Can we further specify the influence of contextual (e.g. cognitive demands) and individual (e.g. personality, age, gender) variables on emotional-prosodic processing?
Drawing on evidence that vocal emotional processing is modulated by attentional focus and inter-individual variables (Box 1), the working model also proposes that the significance of emotional prosody in a given situation, and for a given person, can facilitate or enhance individual sub-processes. Such primacy effects might be induced by top-down mechanisms that increase the attention that listeners intentionally direct to emotional prosody. Additionally, bottom-up mechanisms, triggered because certain emotional expressions are unexpected (e.g. abrupt changes in speaker tone) or have intrinsic relevance (e.g. signals of threat), might lead to increased processing efforts. This increase could be mediated by emotionally relevant structures such as the amygdala [64] or the ventral striatum [6] and reflect the need for behavioral adjustments (e.g. fight/flight).

Conclusions

The brain mechanisms that allow us to infer a speaker's emotional state or intentions are highly complex and represent an important asset in the history of humankind. They evolved as an adaptation to life in social groups and are anchored within more basic neural systems devoted to sensation and emotion. The model proposed here relates vocal emotional processing to these systems as well as to higher order cognition. As such, it views vocal emotional comprehension as being composed of several sub-processes that are differentially represented in the brain and that draw on sensation, emotion, and cognition, respectively. Although the evidence from which this model is derived is still fragmentary (Box 2), it challenges the classic right hemisphere model and highlights the significance of a more analytic view of vocal emotional comprehension. Such a view seems vital for improving our understanding of both normal and dysfunctional social perception and for effectively treating the social perception impairments observed in several neurological and psychiatric conditions.

Acknowledgements

S.A.K.
is supported by the German Research Foundation (DFG-FO499).

References

1 Van Lancker, D. and Fromkin, V.A. (1973) Hemispheric specialization for pitch and 'tone': Evidence from Thai. J. Phonetics 1,
2 Mitchell, R.L. and Crow, T.J. (2005) Right hemisphere language functions and schizophrenia: the forgotten hemisphere? Brain 128,
3 Shamay-Tsoory, S.G. et al. (2004) Impairment in cognitive and affective empathy in patients with brain lesions: anatomical and cognitive correlates. J. Clin. Exp. Neuropsychol. 26,
4 Van Lancker, D. and Sidtis, J.J. (1992) The identification of affective-prosodic stimuli by left- and right-hemisphere-damaged subjects: all errors are not created equal. J. Speech Hear. Res. 35,
5 Schirmer, A. (2004) Timing speech: A review of lesion and neuroimaging findings. Brain Res. Cogn. Brain Res. 21,
6 Pell, M. and Leonard, C.L. (2003) Processing emotional tone from speech in Parkinson's disease: A role for the basal ganglia. Cogn. Affect. Behav. Neurosci. 3,
7 Phillips, M.L. et al. (1998) Neural responses to facial and vocal expressions of fear and disgust. Proc. Biol. Sci. 265,
8 Darwin, C., ed. (1872) The Expression of the Emotions in Man and Animals, John Murray
9 Banse, R. and Scherer, K.R. (1996) Acoustic profiles in vocal emotion expression. J. Pers. Soc. Psychol. 70,
10 Kaas, J.H. and Hackett, T.A. (2000) Subdivisions of auditory cortex and processing streams in primates. Proc. Natl. Acad. Sci. U. S. A. 97,
11 Rauschecker, J.P. et al. (1995) Processing of complex sounds in the macaque nonprimary auditory cortex. Science 268,
12 Recanzone, G.H. et al. (2000) Frequency and intensity response properties of single neurons in the auditory cortex of the behaving macaque monkey. J. Neurophysiol. 83,
13 Hart, H.C. et al. (2003) The sound-level-dependent growth in the extent of fMRI activation in Heschl's gyrus is different for low- and high-frequency tones. Hear. Res. 179,
14 Engelien, A. et al.
(2000) A combined functional in vivo measure for primary and secondary auditory cortices. Hear. Res. 148, Tse, C.Y. et al. Event-related optical imaging reveals the temporal dynamics of right temporal and frontal cortex activation in preattentive change detection. Neuroimage (in press) 16 Kircher, T.T.J. et al. (2004) Mismatch negativity responses in schizophrenia: A combined fmri and whole-head MEG study. Am. J. Psychiatry 161, Alho, K. et al. (1994) Strongly focused attention and auditory eventrelated potentials. Biol. Psychol. 38, Rinne, T. et al. Modulation of auditory cortex activation by sound presentation rate and attention. Hum. Brain Mapp. (in press) 19 Mills, L. and Rollman, G.B. (1979) Left hemisphere selectivity for processing duration in normal subjects. Brain Lang. 7, Gandour, J. et al. (2002) A cross-linguistic FMRI study of spectral and temporal cues underlying phonological processing. J. Cogn. Neurosci. 14,

7 30 Review TRENDS in Cognitive Sciences Vol.10 No.1 January Liégeois-Chauvel, C. et al. (2004) Temporal envelope processing in the human left and right auditory cortices. Cereb. Cortex 14, Zatorre, R.J. and Belin, P. (2001) Spectral and temporal processing in human auditory cortex. Cereb. Cortex 11, Boemio, A. et al. (2005) Hierarchical and asymmetric temporal sensitivity in human auditory cortices. Nat. Neurosci. 8, Haxby, J.V. et al. (1991) Dissociation of object and spatial visual processing pathways in human extrastriate cortex. Proc. Natl. Acad. Sci. U. S. A. 88, Griffiths, T.D. and Warren, J.D. (2004) What is an auditory object? Nat. Rev. Neurosci. 5, Zatorre, R.J. et al. (2002) Where is where in the human auditory cortex? Nat. Neurosci. 5, Hall, D.A. (2003) Auditory pathways: are what and where appropriate? Curr. Biol. 13, R406 R Tian, B. et al. (2001) Functional specialization in the rhesus monkey auditory cortex. Science 292, Warren, J.D. et al. (2005) Analysis of the spectral envelope of sounds by the human brain. Neuroimage 24, Arnott, S.R. et al. (2004) Assessing the auditory dual-pathway model in humans. Neuroimage 22, Kraus, N. and Nicol, T. (2005) Brainstem origins for cortical what and where pathways in the auditory system. Trends Neurosci. 28, Liebenthal, E. et al. Neural substrates of phonemic perception. Cereb. Cortex (in press) 33 Parker, G.J.M. et al. (2005) Lateralization of ventral and dorsal auditory-language pathways in the human brain. Neuroimage 24, Scott, S.K. and Wise, R.J.S. (2004) The functional neuroanatomy of prelexical processing in speech perception. Cognition 92, Scott, S.K. et al. (2000) Identification of a pathway for intelligible speech in the left temporal lobe. Brain 123, Belin, P. et al. (2004) Thinking the voice: neural correlates of voice perception. Trends Cogn. Sci. 8, Grandjean, D. et al. (2005) The voices of wrath: brain responses to angry prosody in meaningless speech. Nat. Neurosci. 8, Kotz, S.A. et al. 
(2003) On the lateralization of emotional prosody: An event-related functional imaging study. Brain Lang. 86, Mitchell, R.L.C. et al. (2003) The neural response to emotional prosody, as revealed by functional magnetic resonance imaging. Neuropsychologia 41, Morris, J.S. et al. (1999) Saying it with feeling: neural responses to emotional vocalizations. Neuropsychologia 37, Schirmer, A. et al. (2004) Gender differences in the activation of inferior frontal cortex during emotional speech perception. Neuroimage 21, Winkler, I. et al. (2005) Preattentive binding of auditory and visual stimulus features. J. Cogn. Neurosci. 17, Liebenthal, E. et al. (2003) Simultaneous ERP and fmri of the auditory cortex in a passive oddball paradigm. Neuroimage 19, Buchanan, T.W. et al. (2000) Recognition of emotional prosody and verbal components of spoken language: an fmri study. Brain Res. Cogn. Brain Res. 9, Gandour, J. et al. (2003) A cross-linguistic fmri study of perception of intonation and emotion in Chinese. Hum. Brain Mapp. 18, George, M.S. et al. (1996) Understanding emotional prosody activates right hemisphere regions. Arch. Neurol. 53, Wildgruber, D. et al. (2002) Dynamic brain activation during processing of emotional intonation: influence of acoustic parameters, emotional valence, and sex. Neuroimage 15, Wildgruber, D. et al. (2004) Distinct frontal regions subserve evaluation of linguistic and emotional aspects of speech intonation. Cereb. Cortex 14, Wildgruber, D. et al. (2005) Identification of emotional intonation evaluated by fmri. Neuroimage 24, Hoesen, G.W. et al. (2000) Orbitofrontal cortex pathology in Alzheimer s disease. Cereb. Cortex 10, Öngür, D. et al. (2003) Architectonic subdivision of the human orbital and medial prefrontal cortex. J. Comp. Neurol. 460, Amunts, K. and Zilles, K. (in press) Broca s region re-visted: a multmodal analysis of its structure and function. In Broca s Region (Grodzinski, Y. 
and Amunts, K., eds), Oxford University Press 53 Hornak, J. et al. (2003) Changes in emotion after circumscribed surgical lesions of the orbitofrontal and cingulate cortices. Brain 126, Rolls, E.T. (2000) The orbitofrontal cortex and reward. Cereb. Cortex 10, Thompson-Schill, S.L. (2003) Neuroimaging studies of semantic memory: inferring how from where. Neuropsychologia 41, Wagner, A.D. et al. (2001) Recovering meaning: Left prefrontal cortex guides controlled semantic retrieval. Neuron 31, Kotz, S.A. et al. (2002) Modulation of the lexical-semantic network by auditory semantic priming: an event-related functional MRI study. Neuroimage 17, Kutas, M. and Hillyard, S.A. (1980) Reading senseless sentences: Brain potentials reflect semantic incongruity. Science 207, Bostanov, V. and Kotchoubey, B. (2004) Recognition of affective prosody: Continuous wavelet measures of event-related brain potentials to emotional exclamations. Psychophysiology 41, Schirmer, A. and Kotz, S.A. (2003) ERP evidence for a gender specific Stroop effect in emotional speech. J. Cogn. Neurosci. 15, Schirmer, A. et al. (2002) Sex differentiates the role of emotional prosody during word processing. Brain Res. Cogn. Brain Res. 14, Schirmer, A. et al. (2005) On the role of attention for the processing of emotions in speech: sex differences revisited. Brain Res. Cogn. Brain Res. 24, Halgren, E. et al. (2002) N400-like magnetoencephalography responses modulated by semantic context, word frequency, and lexical class in sentences. Neuroimage 17, Sander, D. et al. (2003) The human amygdala: an evolved system for relevance detection. Rev. Neurosci. 14, Hall, J.A. (1978) Gender effects in decoding nonverbal cues. Psychol. Bull. 4, Schirmer, A. et al. (2005) Sex differences in the pre-attentive processing of vocal emotional expressions. Neuroreport 16, Pasterski, V.L. et al. 
(2005) Prenatal hormones and postnatal socialization by parents as determinants of male-typical toy play in girls with congenital adrenal hyperplasia. Child Dev. 76, Alexander, G.M. and Hines, M. (2002) Sex differences in response to children s toys in nonhuman primates (Cercopithecus aethiops sabaeus). Evol. Hum. Behav. 23, Taylor, S.E. et al. (2000) Biobehavioral responses to stress in females: tend-and-befriend, not fight-or-flight. Psychol. Rev. 107, Haines, D.E. (2004) Neuroanatomy, an Atlas of Structures, Sections and Systems, 6th edn, Lippincott Williams & Wilkins Reproduction of material from Elsevier articles Interested in reproducing part or all of an article published by Elsevier, or one of our article figures? If so, please contact our Global Rights Department with details of how and where the requested material will be used. To submit a permission request on-line, please visit: Alternatively, please contact: Elsevier Global Rights Department Phone: (+44) permissions@elsevier.com

More information

Exam 1 PSYC Fall 1998

Exam 1 PSYC Fall 1998 Exam 1 PSYC 2022 Fall 1998 (2 points) Briefly describe the difference between a dualistic and a materialistic explanation of brain-mind relationships. (1 point) True or False. George Berkely was a monist.

More information

AUDL GS08/GAV1 Signals, systems, acoustics and the ear. Pitch & Binaural listening

AUDL GS08/GAV1 Signals, systems, acoustics and the ear. Pitch & Binaural listening AUDL GS08/GAV1 Signals, systems, acoustics and the ear Pitch & Binaural listening Review 25 20 15 10 5 0-5 100 1000 10000 25 20 15 10 5 0-5 100 1000 10000 Part I: Auditory frequency selectivity Tuning

More information

Event-Related fmri and the Hemodynamic Response

Event-Related fmri and the Hemodynamic Response Human Brain Mapping 6:373 377(1998) Event-Related fmri and the Hemodynamic Response Randy L. Buckner 1,2,3 * 1 Departments of Psychology, Anatomy and Neurobiology, and Radiology, Washington University,

More information

CNS composed of: Grey matter Unmyelinated axons Dendrites and cell bodies White matter Myelinated axon tracts

CNS composed of: Grey matter Unmyelinated axons Dendrites and cell bodies White matter Myelinated axon tracts CNS composed of: Grey matter Unmyelinated axons Dendrites and cell bodies White matter Myelinated axon tracts The Brain: A Quick Tour Frontal Lobe Control of skeletal muscles Personality Concentration

More information

SLHS1402 The Talking Brain

SLHS1402 The Talking Brain SLHS1402 The Talking Brain What are neuroscience core concepts? Neuroscience Core Concepts offer fundamental principles that one should know about the brain and nervous system, the most complex living

More information

Attention: Neural Mechanisms and Attentional Control Networks Attention 2

Attention: Neural Mechanisms and Attentional Control Networks Attention 2 Attention: Neural Mechanisms and Attentional Control Networks Attention 2 Hillyard(1973) Dichotic Listening Task N1 component enhanced for attended stimuli Supports early selection Effects of Voluntary

More information

Brain Imaging of auditory function

Brain Imaging of auditory function Who am I? Brain Imaging of auditory function Maria Chait I do (mostly) MEG functional brain imaging of auditory processing. Now, I am at the EAR Institute, University College London & Equipe Audition,

More information

ERP Measures of Multiple Attention Deficits Following Prefrontal Damage

ERP Measures of Multiple Attention Deficits Following Prefrontal Damage INO056 10/18/04 6:34 PM Page 339 CHAPTER 56 ERP Measures of Multiple Attention Deficits Following Prefrontal Damage Leon Y. Deouell and Robert T. Knight ABSTRACT Maintaining a goal-directed behavior requires

More information

ANALYZING EVENT-RELATED POTENTIALS

ANALYZING EVENT-RELATED POTENTIALS Adavanced Lifespan Neurocognitive Development: EEG signal processing for lifespan research Dr. Manosusos Klados Liesa Ilg ANALYZING EVENT-RELATED POTENTIALS Chair for Lifespan Developmental Neuroscience

More information

Define functional MRI. Briefly describe fmri image acquisition. Discuss relative functional neuroanatomy. Review clinical applications.

Define functional MRI. Briefly describe fmri image acquisition. Discuss relative functional neuroanatomy. Review clinical applications. Dr. Peter J. Fiester November 14, 2012 Define functional MRI. Briefly describe fmri image acquisition. Discuss relative functional neuroanatomy. Review clinical applications. Briefly discuss a few examples

More information

Human Paleoneurology and the Evolution of the Parietal Cortex

Human Paleoneurology and the Evolution of the Parietal Cortex PARIETAL LOBE The Parietal Lobes develop at about the age of 5 years. They function to give the individual perspective and to help them understand space, touch, and volume. The location of the parietal

More information

Sperling conducted experiments on An experiment was conducted by Sperling in the field of visual sensory memory.

Sperling conducted experiments on An experiment was conducted by Sperling in the field of visual sensory memory. Levels of category Basic Level Category: Subordinate Category: Superordinate Category: Stages of development of Piaget 1. Sensorimotor stage 0-2 2. Preoperational stage 2-7 3. Concrete operational stage

More information

This article was originally published in the Encyclopedia of Neuroscience published by Elsevier, and the attached copy is provided by Elsevier for the author's benefit and for the benefit of the author's

More information

11, in nonhuman primates, different cell populations respond to facial identity and expression12, and

11, in nonhuman primates, different cell populations respond to facial identity and expression12, and UNDERSTANDING THE RECOGNITION OF FACIAL IDENTITY AND FACIAL EXPRESSION Andrew J. Calder* and Andrew W. Young Abstract Faces convey a wealth of social signals. A dominant view in face-perception research

More information

Auditory fmri correlates of loudness perception for monaural and diotic stimulation

Auditory fmri correlates of loudness perception for monaural and diotic stimulation PROCEEDINGS of the 22 nd International Congress on Acoustics Psychological and Physiological Acoustics (others): Paper ICA2016-435 Auditory fmri correlates of loudness perception for monaural and diotic

More information

Psychology Formative Assessment #2 Answer Key

Psychology Formative Assessment #2 Answer Key Psychology Formative Assessment #2 Answer Key 1) C 2) B 3) B 4) C 5) D AP Objective: Discuss the influence of drugs on neurotransmitters 6) E AP Objective: Discuss the influence of drugs on neurotransmitters

More information

(SAT). d) inhibiting automatized responses.

(SAT). d) inhibiting automatized responses. Which of the following findings does NOT support the existence of task-specific mental resources? 1. a) It is more difficult to combine two verbal tasks than one verbal task and one spatial task. 2. b)

More information

Neuroscience Tutorial

Neuroscience Tutorial Neuroscience Tutorial Brain Organization : cortex, basal ganglia, limbic lobe : thalamus, hypothal., pituitary gland : medulla oblongata, midbrain, pons, cerebellum Cortical Organization Cortical Organization

More information

Neocortex. Hemispheres 9/22/2010. Psychology 472 Pharmacology of Psychoactive Drugs. Structures are divided into several section or lobes.

Neocortex. Hemispheres 9/22/2010. Psychology 472 Pharmacology of Psychoactive Drugs. Structures are divided into several section or lobes. Neocortex Psychology 472 Pharmacology of Psychoactive Drugs 1 Is the most developed in Humans Has many folds and fissures The folds of tissue are called gyri or a gyrus (single) The fissures or valleys

More information

Report. Direct Recordings of Pitch Responses from Human Auditory Cortex

Report. Direct Recordings of Pitch Responses from Human Auditory Cortex Current Biology 0,, June, 00 ª00 Elsevier Ltd. Open access under CC BY license. DOI 0.0/j.cub.00.0.0 Direct Recordings of Pitch Responses from Human Auditory Cortex Report Timothy D. Griffiths,, * Sukhbinder

More information

Contents. Boxes xii Preface xiii Acknowledgments. Background and Methods

Contents. Boxes xii Preface xiii Acknowledgments. Background and Methods Contents Boxes xii Preface xiii Acknowledgments xv PARTI Background and Methods 1 A Brief History of Cognitive Neuroscience 2 A Historical Perspective 4 The Brain Story 5 The Psychological Story 10 The

More information

Music, Epilepsy and the Brain. Stephen Brown

Music, Epilepsy and the Brain. Stephen Brown Music, Epilepsy and the Brain Stephen Brown this may increase your IQ by a few points for a few minutes (or not) Music What s it for? Mood & music Musical morbidity Cognitive aspects What s it for? Music

More information

Mirror Neurons in Primates, Humans, and Implications for Neuropsychiatric Disorders

Mirror Neurons in Primates, Humans, and Implications for Neuropsychiatric Disorders Mirror Neurons in Primates, Humans, and Implications for Neuropsychiatric Disorders Fiza Singh, M.D. H.S. Assistant Clinical Professor of Psychiatry UCSD School of Medicine VA San Diego Healthcare System

More information

SUPPLEMENTARY INFORMATION

SUPPLEMENTARY INFORMATION doi:10.1038/nature14066 Supplementary discussion Gradual accumulation of evidence for or against different choices has been implicated in many types of decision-making, including value-based decisions

More information

Homework Week 2. PreLab 2 HW #2 Synapses (Page 1 in the HW Section)

Homework Week 2. PreLab 2 HW #2 Synapses (Page 1 in the HW Section) Homework Week 2 Due in Lab PreLab 2 HW #2 Synapses (Page 1 in the HW Section) Reminders No class next Monday Quiz 1 is @ 5:30pm on Tuesday, 1/22/13 Study guide posted under Study Aids section of website

More information

Systems Neuroscience November 29, Memory

Systems Neuroscience November 29, Memory Systems Neuroscience November 29, 2016 Memory Gabriela Michel http: www.ini.unizh.ch/~kiper/system_neurosci.html Forms of memory Different types of learning & memory rely on different brain structures

More information

Decision neuroscience seeks neural models for how we identify, evaluate and choose

Decision neuroscience seeks neural models for how we identify, evaluate and choose VmPFC function: The value proposition Lesley K Fellows and Scott A Huettel Decision neuroscience seeks neural models for how we identify, evaluate and choose options, goals, and actions. These processes

More information

Twenty subjects (11 females) participated in this study. None of the subjects had

Twenty subjects (11 females) participated in this study. None of the subjects had SUPPLEMENTARY METHODS Subjects Twenty subjects (11 females) participated in this study. None of the subjects had previous exposure to a tone language. Subjects were divided into two groups based on musical

More information

D. Debatisse, E. Fornari, E. Pralong, P. Maeder, H Foroglou, M.H Tetreault, J.G Villemure. NCH-UNN and Neuroradiology Dpt. CHUV Lausanne Switzerland

D. Debatisse, E. Fornari, E. Pralong, P. Maeder, H Foroglou, M.H Tetreault, J.G Villemure. NCH-UNN and Neuroradiology Dpt. CHUV Lausanne Switzerland Vegetative comatose and auditory oddball paradigm with Cognitive evoked potentials (CEPs) and fmri: Implications for the consciousness model of Damasio and Guerit D. Debatisse, E. Fornari, E. Pralong,

More information