International Journal of Psychophysiology
International Journal of Psychophysiology 86 (2012)
Contents lists available at SciVerse ScienceDirect
Processing of prosodic changes in natural speech stimuli in school-age children
R. Lindström a, T. Lepistö a,b, T. Makkonen a,c, T. Kujala a,d
a Cognitive Brain Research Unit, Cognitive Science, Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland
b Department of Child Neurology, Helsinki University Central Hospital, Helsinki, Finland
c Finnish Center of Excellence in Interdisciplinary Music Research, University of Jyväskylä, Finland
d Cicero Learning, University of Helsinki, Helsinki, Finland
Article history: Received 4 July 2012; received in revised form 2 September 2012; accepted 26 September 2012; available online 3 October 2012.
Keywords: Children; Emotional prosody; Speech; Event-related potentials; Mismatch negativity; Facial electromyography
Abstract
Speech prosody conveys information about important aspects of communication: the meaning of the sentence and the emotional state or intention of the speaker. The present study addressed the processing of emotional prosodic changes in natural speech stimuli in school-age children (mean age 10 years) by recording the electroencephalogram (EEG), facial electromyography (EMG), and behavioral responses. The stimulus was a semantically neutral Finnish word uttered with four different emotional connotations: neutral, commanding, sad, and scornful. In the behavioral sound-discrimination task, reaction times were fastest for the commanding stimulus and longest for the scornful stimulus, and faster for the neutral than for the sad stimulus. EEG and EMG responses were measured during a non-attentive oddball paradigm. Prosodic changes elicited a negative-going, fronto-centrally distributed neural response peaking at about 500 ms from the onset of the stimulus, followed by a fronto-central positive deflection peaking at about 740 ms.
For the commanding stimulus, a rapid negative deflection peaking at about 290 ms from stimulus onset was also elicited. No reliable stimulus-type-specific rapid facial reactions were found. The results show that prosodic changes in natural speech stimuli activate pre-attentive neural change-detection mechanisms in school-age children. However, the results do not support the suggestion of automaticity of emotion-specific facial muscle responses to non-attended emotional speech stimuli in children. © 2012 Elsevier B.V. All rights reserved.
1. Introduction
Speech prosody conveys information about the meaning of the sentence and the intention and emotional state of the speaker, all of which are important aspects of communication. Speech prosody is defined as the simultaneous variation of three main auditory features of speech: parallel intensity, fundamental frequency (F0), and duration changes (Botinis et al., 2001). Prosody perception is fundamental to speech comprehension: in spoken language, the identification of word boundaries is based on prosodic cues, and the meaning of a sentence can change dramatically if the prosody changes (Kuhl, 2004). Even newborns are capable of extracting information from prosodic features and automatically respond to some prosodic cues, suggesting that speech prosody is a key aspect of human communication (Sambeth et al., 2008). Even though learning to interpret the mood of the speaker is an important developmental task during childhood, very little is known about the psychophysiological basis of children's ability to process emotional prosodic changes in natural speech.
Corresponding author at: Cognitive Brain Research Unit, Cognitive Science, Institute of Behavioural Sciences, P.O. Box 9, FIN-00014 University of Helsinki, Finland. E-mail address: riikka.h.lindstrom@helsinki.fi (R. Lindström).
As speech prosody is based on rapid physical changes in speech stimuli, auditory event-related potentials (ERPs) are relevant tools for studying such basic processing. Auditory ERPs are also feasible tools for studying children, as the recordings can be done even in the absence of the participant's attention and task performance (Taylor and Baldeweg, 2002). The mismatch negativity (MMN) is an ERP component that reflects automatic, pre-conscious detection of violations of auditory regularities, and it is elicited by any discriminable change in the auditory flow (for reviews, see Baldeweg, 2007; Näätänen et al., 2012, 2007; Sussman, 2007). It is classically recorded in an oddball paradigm, in which frequent standard stimuli are occasionally replaced by deviant stimuli (Winkler, 2003). It is maximally negative over the fronto-central scalp areas. As the MMN is generated mainly in the auditory cortices, it reverses polarity at the electrodes below the level of the Sylvian fissure (when the nose is used as a reference; Alho, 1995; Näätänen et al., 2007). In children, the MMN is usually followed by a late discriminative negativity (LDN, or late MMN) (Ceponiené et al., 1998; Korpilahti et al., 2001, 1995; for a review, see Cheour et al., 2001). The function of the LDN is not clearly known, but it is hypothesized to reflect more cognitive aspects of auditory change detection than the MMN (Ceponiené et al., 2004). If the stimulus deviation catches the subject's attention, the MMN is often followed by a positive deflection (P3a and/or P3b) reflecting further attentive discrimination of the stimulus deviation (for reviews, see Escera and Corral, 2007; Polich, 2007).
Prosodic cues such as stress patterns are automatically processed in adults (Honbolygó et al., 2004; Ylinen et al., 2009). Adult data also suggest that the prosodic features related to the emotional state of the speaker in natural speech are detected at the early, pre-attentive level of auditory processing, as reflected by the MMN and its magnetic counterpart, the MMNm (Kujala et al., 2005; Thönnessen et al., 2010). In the Kujala et al. (2005) study, MMN responses were elicited (at about 200–300 ms from the onset of the sound) by the stimuli with emotional connotations, which served as deviant stimuli. Thönnessen et al. (2010) also reported MMNm responses to happy and angry deviants even in a sequence of randomly acoustically and phonetically changing disyllabic pseudowords, and showed that the MMNm was elicited by violations of the abstract emotional category representations, not solely by changes in the acoustical features of the stimuli. There is some evidence that school-age children, too, can detect emotional prosodic changes in natural speech stimuli at the pre-attentive level of auditory processing. Korpilahti et al. (2007) showed both early (at about 200 ms) and late (at about 650 ms) MMNs to a one-word utterance pronounced with a tender or commanding voice. Grossman et al. (2005) reported ERPs to natural speech stimuli with different emotional prosody even in 7-month-old infants, suggesting that the human brain starts to detect emotionally loaded words very early during development. In addition to ERP recordings, another way to study the human psychophysiological correlates of processing emotional speech stimuli is facial electromyography (facial EMG). Rapid facial reactions, occurring within 1000 ms, are commonly observed in adults after seeing facial expressions (Dimberg et al., 2000; Dimberg, 1990a; McIntosh, 2006).
A specific muscle-area activation is associated with a specific emotion category: the zygomaticus major (the muscle responsible for raising the cheek in a smile) area with happy emotions and the corrugator supercilii (the muscle responsible for knitting the brows in a scowl) with negative emotions, for example (Dimberg and Thunberg, 1998; Dimberg, 1990a; McIntosh, 2006). Rapid facial reactions to stimulus types other than facial expressions have also been reported in adults. For example, Hietanen et al. (1998) have shown rapid facial reactions to emotional speech stimuli and Magneé et al. (2007) to emotional face–voice pairs (for other EMG studies using sound stimuli, see Dimberg, 1990b, 1990c; Jäncke et al., 1996; Kjellberg et al., 1994). Rapid facial reactions are described as automatic and unconscious because they occur rapidly (even within 300–400 ms after stimulus presentation; Dimberg and Thunberg, 1998; Dimberg, 1997), the subjects cannot avoid producing them (Dimberg et al., 2002), and the reactions can occur even if the stimuli are backward masked (Dimberg et al., 2000). There are two possible underlying mechanisms for these rapid facial reactions: 1) motor mimicry (an automatic tendency to mimic another person's emotional behavior, probably mediated by the mirror neuron system) and 2) an affective, initial reaction to the emotional stimulus (Beall et al., 2008; Magneé et al., 2007; Moody et al., 2007; Williams et al., 2001). Although these rapid facial reactions are well documented in adults, only a few studies have been conducted with children. Recently, Beall et al. (2008) showed rapid facial reactions to facial expressions in 7–12-year-old children and demonstrated that the reactions reflected motor mimicry of the stimuli as well as emotional reactivity towards the stimuli (for consistent results in adults, see also Moody et al., 2007). However, to our knowledge, no previous studies have tested whether children show rapid facial reactions to natural speech stimuli.
The aim of our study was to explore how school-age children process prosodic changes in naturally articulated words uttered with different emotional connotations. To this end, we recorded the ERPs, rapid facial muscle reactions, and behavioral responses to different types of emotional prosodic changes of words in 7–12-year-old children. These natural stimuli have high ecological validity, enabling the inspection of responses that closely reflect reactions in real-life situations. As the rapid facial reactions to facial expressions are reported to be automatic and unconscious (Dimberg et al., 2000), we wanted to study whether natural speech stimuli would also elicit such involuntary facial reactions. Therefore, we recorded the facial EMG together with the auditory ERPs during a non-attended experimental paradigm. By combining these approaches, we aimed at obtaining comprehensive information about children's neural processing of prosodic changes in natural speech stimuli. In addition, our aim was to determine how emotional prosodic changes are discriminated at the attentive level, as reflected by behavioral measures. On the basis of the studies by Grossman et al. (2005) and Korpilahti et al. (2007), we hypothesized that school-age children would detect emotional prosodic changes in natural speech at the pre-attentive level of auditory processing, as reflected by the MMN. However, on the basis of Kujala et al. (2005), we expected that the more salient prosodic changes (the commanding deviant) would be more easily pre-attentively detected, as reflected by an earlier and stronger MMN.
2. Materials and methods
2.1. Participants
Eighteen healthy children participated in the experiment. Their mean age was 10 years (sd 1.4 years; range 7–12 years; 9 girls; 2 left-handed). The children's cognitive abilities were assessed with the Wechsler Intelligence Scale for Children III (WISC-III; Wechsler, 1991), using three verbal and three performance subtests.
The mean full IQ of the group was 111 (sd 11.1) and the mean performance IQ 108.4 (sd 10.2); the sd of the verbal IQ was 16.4. All the children were monolingual Finnish speakers. According to parental reports, they all had normal hearing; no present or past neurological disorders, language or learning difficulties, or emotional problems; and no family history of developmental or psychiatric disorders. The children were recruited from three elementary schools in the Helsinki area. Parents received an introductory letter about the experiment through the teachers and contacted the researcher if they were interested in the study. The parents signed an informed consent and the children gave verbal assent prior to the experiment. The experiment was carried out according to the Declaration of Helsinki, and it was approved by the Ethical Committee of the Helsinki University Central Hospital.
2.2. Stimuli and procedure
The same stimuli and paradigm were used as in Kujala et al. (2005). The stimulus was a Finnish word, the female name Saara (none of the children participating in this study had a family member with this name), uttered by a female speaker with four different emotional connotations: neutral (stimulus length 577 ms), commanding (538 ms), sad (775 ms), and scornful (828 ms) (for acoustic waveforms, see Fig. 1). The stimuli were originally produced for a study investigating the identification of utterances with different emotional connotations (Leinonen et al., 1997). The stimuli were categorized by 73 male and female subjects in the Leinonen et al. (1997) study, and all were identified correctly more often than as representing another optional emotion. The mean percentages of correctly identified stimuli in the Kujala et al. (2005) study were 75% (sd 38) for the neutral stimulus, 91% (sd 19) for the commanding stimulus, 66% (sd 48) for the scornful stimulus, and 63% (sd 42) for the sad stimulus.
The peak loudness of the stimuli was adjusted to vary randomly within 5 dB (Leinonen et al., 1997), but no other basic acoustic parameters of the stimuli were modified, to keep them as natural as possible. For each stimulus, the F0 and intensity at different time points, as well as the mean F0 and intensity values, were calculated from intensity and F0 contours determined for the whole stimulus duration using Praat software (Paul Boersma & David Weenink, version 5.3.5). The basic acoustic parameters of the stimuli are presented in Table 1 and Fig. 1.
Fig. 1. Acoustic features of the stimuli. The upper part of the figure presents the acoustic waveforms of the stimuli. F0 (left axis, dotted line) as well as intensity (right axis, solid line) contours of the stimuli are presented below the acoustic waveforms.
During the electroencephalogram (EEG) and facial electromyography (EMG) recordings, the subjects were presented with eight stimulus blocks, each containing 268 stimuli. Each block consisted of a frequently presented standard stimulus (the neutral one; 79%) that was occasionally replaced with a commanding (7%), sad (7%), or scornful (7%) deviant. The stimuli were presented pseudorandomly: each deviant stimulus was preceded by at least two standard stimuli. The stimulus onset asynchrony (SOA) was 1300 ms. The stimuli were presented via OWI-22 loudspeakers (OWI Inc., CA, USA) at 56 dB (SPL). The loudspeakers were located in front of the child, on the left and right sides of the screen the children were watching during the recordings (at a 157 cm distance from the subject's head, 18 cm apart from each other). During the experiment, the children sat in an armchair watching a self-chosen soundless video in an electrically and acoustically shielded room. The children were accompanied by a parent if necessary and were video-monitored during the whole experiment. The children were instructed to concentrate on the video and not to pay attention to the sounds. The whole experiment took about an hour, including breaks. Behavioral sound-discrimination abilities were evaluated after the ERP and EMG recordings. Children were presented with three blocks, each with 40 stimulus pairs.
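The pseudorandom sequencing described above (a 79% neutral standard with three 7% deviants, each deviant preceded by at least two standards) can be sketched as follows. The function name, the fixed two-standard spacing, and the end-of-block padding are illustrative assumptions, not the authors' actual randomization code:

```python
import random

def make_oddball_block(n_stimuli=268, deviants=("commanding", "sad", "scornful"),
                       deviant_prop=0.07, seed=0):
    """Sketch of one oddball block: shuffle the deviant tokens, then emit each
    deviant followed by two neutral standards, so every deviant is preceded
    by at least two standards; remaining standards are appended at the end."""
    rng = random.Random(seed)
    n_per_deviant = round(n_stimuli * deviant_prop)  # about 19 of each deviant
    pool = [d for d in deviants for _ in range(n_per_deviant)]
    rng.shuffle(pool)
    block = ["neutral", "neutral"]  # begin with standards
    for dev in pool:
        block.extend([dev, "neutral", "neutral"])
    block.extend(["neutral"] * max(0, n_stimuli - len(block)))  # pad to length
    return block[:n_stimuli]
```

A real stimulation script would also jitter the positions of the padding standards; the sketch only guarantees the preceded-by-two-standards constraint.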
In 50% of the pairs the stimuli were identical, and in 50% different (the second stimulus being a commanding, scornful, or sad deviant). The first sound of the stimulus pair was always a standard stimulus (neutral). The first sound of the stimulus pair was followed by the second sound 1770 ms after the onset of the first sound. The between-pair stimulus onset asynchrony was 3900 ms. Stimuli were presented via loudspeakers at 56 dB (SPL). Children were instructed to press a response button according to whether the sounds were the same or different. They were instructed to press the response key before the next trial started, but no other instruction about response speed or accuracy was given. The same and different trials were presented randomly, and the order of the blocks was balanced between the subjects. Before the test, 12 training trials were presented in order to familiarize the child with the task.
Table 1
Basic acoustic parameters of the stimuli. This table presents the fundamental frequency (F0) in hertz (Hz) and intensity values in decibels (dB) at the beginning and end of each stimulus, the peak F0 and intensity values with the peak latencies in milliseconds (ms) in parentheses, and the stimulus onset to offset mean F0 and intensity values. The surviving peak latencies are: F0 peaks at 454 ms (neutral), 114 ms (commanding), 224 ms (scornful), and 163 ms (sad); intensity peaks at 427 ms, 12 ms, 27 ms, and 26 ms, respectively.
2.3. Analysis of behavioral responses
The button presses starting 200 ms after the presentation of the second stimulus of the stimulus pair and occurring before the beginning of the next trial were included in the analysis. Hit rates and reaction times (RTs) were analyzed separately with one-way repeated-measures ANOVAs. The factor was stimulus type (neutral/scornful/commanding/sad). Mauchly's test of sphericity served to test the assumption of equality of variances. Degrees of freedom were corrected with the Greenhouse–Geisser correction when the assumption of sphericity was violated.
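The inclusion window for button presses (from 200 ms after the second sound of a pair until the start of the next trial) amounts to a simple filter over press timestamps. This sketch uses hypothetical parameter names; timestamps are in milliseconds:

```python
def valid_reaction_times(press_times_ms, second_sound_onset_ms,
                         next_trial_onset_ms, min_rt_ms=200):
    """Keep only presses made at least 200 ms after the second sound of a
    pair and before the next trial begins; return RTs relative to the
    second sound's onset (illustrative sketch of the inclusion rule)."""
    return [t - second_sound_onset_ms
            for t in press_times_ms
            if second_sound_onset_ms + min_rt_ms <= t < next_trial_onset_ms]
```

For example, with the second sound at 1770 ms and the next trial at 5670 ms, a press at 2720 ms yields an RT of 950 ms, while presses before 1970 ms or after trial end are discarded.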
Significant main effects were analyzed with Fisher's LSD post-hoc comparisons.
2.4. ERP recordings and analysis
The EEG was continuously recorded (DC–104 Hz; sampling rate 512 Hz) with a BioSemi ActiveTwo Mk1 amplifier with a 64-channel active electrode set-up (BioSemi B.V., Amsterdam, The Netherlands) and with additional electrodes placed at the left and right mastoids. Vertical and horizontal eye movements were monitored with electrodes placed above and at the outer canthus of the left eye. The off-line reference electrode was attached to the tip of the nose. The recordings were later downsampled by a factor of 2. The continuous EEG was divided into 1100-ms epochs, including a 100-ms pre-stimulus period serving as the baseline. Epochs with EEG changes exceeding ±100 μV at any electrode were discarded from the analysis. ERPs were averaged in each condition separately for the standard and deviant stimuli and were digitally filtered (band-pass 1–20 Hz). Difference waves were calculated by subtracting the ERPs elicited by the standard stimuli from the ERPs elicited by the deviant stimuli. The final data set consisted of, on average, 117 accepted deviant trials. The mean amplitudes of the brain responses were measured as follows. The MMN is typically maximal at the fronto-central electrodes and the P3a/P3b at the centro-parietal electrodes (Duncan et al., 2009). Therefore, and since these responses were maximal at the central scalp areas in the present study, the peak latencies of the responses were first identified from the grand-mean difference waves at the central Cz electrode. Thereafter, the mean amplitudes from the individual-subject ERPs were calculated over a ±25-ms time period centered at this grand-mean peak latency. The individual-subject peak latencies were calculated from the difference waves at the Cz electrode with the following time windows: the negative deflection as the maximum negative peak between 400 and 600 ms and the positive deflection as the maximum positive peak between 600 and 900 ms.
The peak latencies were identified using the software's peak search function and double-checked by the experimenter. In addition, for the commanding deviant, the mean amplitudes and latencies of the rapid negative (within 290 ms from sound onset) and positive (within 370 ms from sound onset) deflections were measured correspondingly. The individual-subject peak latencies were calculated with the following time windows: the rapid negative deflection as the maximum negative peak between 254 and 328 ms and the rapid positive deflection as the maximum positive peak between 330 and 399 ms. The statistical significance of each deflection was analyzed with one-sample t-tests by comparing the mean amplitudes to zero at Cz (Table 3). Differences in the amplitudes and scalp distributions of the negative and positive deflections were analyzed separately with three-way repeated-measures ANOVAs. The factors were deviant type (scornful/commanding/sad), anterior–posterior electrode location (F3, Fz, F4/C3, Cz, C4/P3, Pz, P4), and lateral electrode location (F3, C3, P3/Fz, Cz, Pz/F4, C4, P4). Significant main effects and interactions were analyzed with Fisher's LSD post-hoc comparisons. The latencies of the negative and positive deflections were analyzed separately with one-way repeated-measures ANOVAs. The factor was deviant type (scornful/commanding/sad). Statistically significant main effects were further analyzed with LSD post-hoc comparisons. The latency difference between the rapid negative deflection elicited by the commanding deviant and the negative deflection elicited by the sad and scornful deviants was also analyzed with a one-way repeated-measures ANOVA (factor: deviant type, scornful/commanding/sad) and LSD post-hoc comparisons. For each ERP mean-amplitude and latency ANOVA, Mauchly's test of sphericity served to test the assumption of equality of variances.
Degrees of freedom were corrected with the Greenhouse–Geisser correction when the assumption of sphericity was violated.
2.5. EMG recordings and analysis
Electromyographic signals were recorded bipolarly using a BioSemi Active Mk1 amplifier with a sampling rate of 512 Hz (bandwidth DC–104 Hz). Electrodes were placed on the corrugator supercilii and zygomaticus major areas following Tassinary et al. (1989). The EMG signals were filtered offline (30–200 Hz, FFT). All epochs having extreme values over a limit of ±100 μV in the EOG channel (filtered 1–30 Hz) were excluded from the analysis. The signals were rectified and then averaged in 100-ms chunks. The resulting values were log10-transformed. Finally, the data were standardized (i.e., expressed as z-scores) within each subject and baseline-corrected (100-ms pre-stimulus baseline). The EMG data were analyzed in several ways. 1) A peak-detection algorithm was used to extract the EMG reactions. The EMG activity was analyzed from 100 ms to 1000 ms post-stimulus onset for the deviant and standard stimuli preceded by at least two standard stimuli.
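The preprocessing chain just described (rectify, average in 100-ms chunks, log10-transform, z-score within subject, subtract the pre-stimulus baseline) might look like the following sketch; the function name, single-trial interface, and within-trial z-scoring are assumptions (the authors standardized within each subject):

```python
import numpy as np

def preprocess_emg(rectified_uv, chunk_ms=100, sfreq=512, baseline_chunks=1):
    """Average a rectified EMG trace in 100-ms chunks, log10-transform,
    z-score, and subtract the mean of the pre-stimulus baseline chunk(s).
    Returns baseline-corrected z-scores, one value per chunk."""
    x = np.asarray(rectified_uv, dtype=float)
    spc = int(sfreq * chunk_ms / 1000)            # samples per chunk
    n_chunks = len(x) // spc
    chunks = x[:n_chunks * spc].reshape(n_chunks, spc).mean(axis=1)
    logged = np.log10(chunks)                     # compress amplitude range
    z = (logged - logged.mean()) / logged.std()   # standardize
    return z - z[:baseline_chunks].mean()         # baseline-correct
```

The first returned value is the (zeroed) baseline chunk; subsequent values express post-stimulus activity in baseline-corrected z-units.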
The parameters of the peak-detection algorithm were based on a previous study by McIntosh (2006; see also Dimberg et al., 2000): to be counted as a response, the activity had to rise by at least 0.1 z, be sustained for at least 100 ms, and then fall by at least 0.1 z. The number of responses for each stimulus type was then divided by the presentation rate of each stimulus type (peak ratio). The statistical significance of the responses was analyzed with one-sample t-tests by comparing the peak ratio to 0.5. This analysis was based on the assumption that if the EMG responses are elicited randomly, the expected value of the peak ratio is 0.5; a peak ratio significantly differing from 0.5 suggests stimulus-related EMG activity. Differences in EMG responses were then analyzed with a one-way repeated-measures ANOVA for each muscle separately. The factor was stimulus type (neutral/scornful/commanding/sad). In the ANOVA, a main effect of stimulus type would indicate the presence of stimulus-type-selective EMG responses. 2) Changes in the muscle activity (expressed as z-scores, as in Beall et al., 2008) across nine post-stimulus windows from 100 ms to 1000 ms were analyzed. The first 150 trials for each stimulus type were included in this analysis. Differences in EMG activity were then analyzed with a two-way repeated-measures ANOVA for each deviant separately. The factors were muscle (zygomaticus/corrugator) and time (100–200 ms/200–300 ms/300–400 ms/400–500 ms/500–600 ms/600–700 ms/700–800 ms/800–900 ms/900–1000 ms). As in Beall et al. (2008), either a significant main effect of muscle or a muscle × time interaction would indicate the presence of EMG responses. 3) As an alternative analysis, the post-stimulus EMG signal was expressed as a percentage of the mean pre-stimulus baseline amplitude (as in Magneé et al., 2007).
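The McIntosh (2006) peak-counting rule, as described above, can be sketched on 100-ms z-scored chunks, so that "sustained for at least 100 ms" means at least one full chunk between the rise and the fall. This is one illustrative reading of the rule, not the authors' implementation:

```python
def count_emg_peaks(z_chunks, rise=0.1, fall=0.1):
    """Count EMG responses in a sequence of 100-ms z-scored chunks:
    a response must rise by >= 0.1 z from the previous chunk, stay up
    for at least one chunk, and then fall by >= 0.1 z (sketch)."""
    peaks, i = 0, 1
    while i < len(z_chunks) - 1:
        if z_chunks[i] - z_chunks[i - 1] >= rise:
            j = i + 1  # scan forward for the required fall
            while j < len(z_chunks) and z_chunks[i] - z_chunks[j] < fall:
                j += 1
            if j < len(z_chunks) and j - i >= 2:  # sustained >= one chunk
                peaks += 1
                i = j
        i += 1
    return peaks
```

The peak ratio of the text would then be the count divided by the number of presentations of that stimulus type, tested against the chance value of 0.5 with a one-sample t-test.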
Epochs with a length of 1300 ms were extracted from the filtered and rectified EMG signal, (a) starting at each stimulus onset, preceded by at least two standard stimuli (1300-ms post-stimulus mean activity), and (b) starting at the onset of the preceding standard stimulus (1300-ms pre-stimulus mean activity). The ratio of post-stimulus mean activity to pre-stimulus mean activity was calculated for each stimulus type correspondingly. The first 150 trials were included in this analysis. Differences in the EMG activity for the different stimulus types were then analyzed with a one-way repeated-measures ANOVA for each muscle separately. The factor was stimulus type (neutral/scornful/commanding/sad). A significant main effect of stimulus type would indicate the presence of stimulus-type-selective EMG responses. 4) The EMG signal was analyzed as above, but it was averaged in 100-ms chunks and expressed as a percentage of the mean 100-ms pre-stimulus baseline amplitude. Six post-stimulus windows from 400 ms to 1000 ms were analyzed (see Beall et al., 2008). Differences in the EMG activity were then analyzed with a two-way repeated-measures ANOVA for each muscle separately. The factors were stimulus type (neutral/scornful/commanding/sad) and time (400–500 ms/500–600 ms/600–700 ms/700–800 ms/800–900 ms/900–1000 ms). Either a significant main effect of stimulus type or a stimulus type × time interaction would indicate the presence of stimulus-type-selective EMG responses. For each EMG ANOVA, Mauchly's test of sphericity served to test the assumption of equality of variances. Degrees of freedom were corrected with the Greenhouse–Geisser correction when the assumption of sphericity was violated.
3. Results
3.1. Sound discrimination test
The RTs were significantly different for the different deviant types (F(2.17, 37) = 30.72, p < .001, ηp² = .64) (Table 2). The RTs were faster for the commanding stimulus than for the neutral (p < .01), scornful (p < .01), or sad (p < .01) stimulus.
In addition, RTs were faster for the neutral stimulus than for the sad stimulus (p < .05). RTs were longer for the scornful stimulus than for the neutral (p < .01) or sad (p < .01) stimuli. Thus, the children were fastest in discriminating the commanding stimulus and slowest with the scornful stimulus, whereas the neutral stimulus was discriminated faster than the sad stimulus (Table 2). The overall hit rates were high (Table 2), with no significant differences in the hit rates for the different stimulus types (F(3, 52) = .47, p = .70, ηp² = .03).
Table 2
Results of the sound discrimination task. This table presents the hit rates as percentages (%) and the mean reaction times in milliseconds (ms), with standard deviations (sd) in parentheses.
Stimulus class | Hit rate % (sd) | Reaction time ms (sd)
Neutral | 97.1 (3.2) | 15 (18)
Scornful | 95.5 (5.9) | 111 (191)
Commanding | 96.8 (4.9) | 99 (182)
Sad | 96.0 (6.9) | 154 (19)
3.2. ERP results
All the deviant stimuli elicited a negative-going neural response peaking at about 500 ms from the onset of the deviant stimulus (Figs. 2 and 3). It was fronto-centrally distributed, showing polarity inversion at the mastoids for the sad and scornful deviants. A fronto-central positive deflection, peaking at about 740 ms, followed this negative deflection for all the deviants. The amplitudes of the negative and positive deflections for each deviant type were statistically significant (Table 3). In addition, the commanding deviant elicited a statistically significant negative-going response peaking at about 290 ms from stimulus onset. The polarity of this rapid negative response was reversed at the mastoids. The rapid negative response was followed by a significant positive response at about 370 ms from the onset of the commanding stimuli (Table 3). No such rapid negative (within 290 ms) or positive (within 370 ms) deflections were elicited by the sad or scornful deviants.
Therefore, the differences in the amplitudes and latencies of the rapid negative and positive deflections elicited by the commanding deviant were not analyzed further. Overall, the brain reactions were fastest for the commanding deviant: the latency difference between the first negative-going neural response elicited by the commanding deviant (within 290 ms) and by the sad and scornful deviants (within 500 ms) was statistically significant (F(2, 34) = 206.53, p < .001, ηp² = .92; commanding vs. sad p < .01, commanding vs. scornful p < .01).
3.2.1. The negative deflections
A significant main effect of deviant type was found (F(2, 34) = 9.36, p < .01, ηp² = .36): the amplitude of the negative deflection was larger for the scornful than for the commanding deviant (p < .01) and larger for the sad than for the commanding deviant (p < .01), whereas no amplitude difference between the responses to the sad and scornful deviants was found (p = .56). There was also a significant main effect of anterior–posterior electrode location (F(1.26, 21.48) = 7.86, p < .01, ηp² = .32). The amplitude of the negative deflection was larger in the central than in the posterior scalp region (p < .01). No significant amplitude differences between the central and frontal (p = .7) or between the frontal and posterior (p = .9) scalp regions were observed. There was no significant main effect of lateral electrode location (F(1.33, 34) = .53, p = .53, ηp² = .03). A significant deviant type × anterior–posterior electrode location interaction was found (F(2.2, 37.54) = 5.76, p < .01, ηp² = .25). For the scornful deviant, the amplitude of the negative deflection was larger in the frontal than in the posterior scalp region (p < .01) and in the central than in the posterior scalp region (p < .01), but no amplitude difference between the frontal and central scalp regions (p = .9) was observed. Also for the sad deviant, the amplitude of the negative deflection was larger in the frontal than in the posterior (p < .05) and in the central than in the posterior (p < .01) scalp regions, with no amplitude difference between the frontal and central scalp regions (p = .36). Instead, for the commanding deviant there
were no amplitude differences between the frontal and central (p = .8), frontal and posterior (p = .53), or central and posterior (p = .44) scalp regions. The lateral electrode location × anterior–posterior electrode location interaction was also significant (F(2.75, 46.78) = 7.9, p < .01, ηp² = .28). Post-hoc comparisons showed that, in the frontal area, the amplitude of the negative deflection was larger at the right hemisphere than at the midline (p < .05) or left hemisphere (p < .05). In addition, in the parietal scalp region, the amplitude of the negative deflection was larger at the midline than at the left (p < .05) or right (p < .05) hemisphere, with no amplitude difference between the left and right hemispheres (p = .34). However, in the central scalp area, no amplitude differences between the midline and the left (p = .98) or right (p = .93) hemisphere or between the hemispheres (p = .94) were observed. Neither the deviant type × lateral electrode location (F(2.91, 49.47) = 1.43, p = .25, ηp² = .08) nor the deviant type × anterior–posterior × lateral electrode location (F(4.15, 70.6) = 1.62, p = .17, ηp² = .09) interaction was significant. There were significant differences in the latencies of the negative deflection (F(2, 34) = 5.98, p < .01, ηp² = .26). Post-hoc comparisons showed that the latency of the negative deflection was shorter for the sad deviant than for the commanding (p < .05) or scornful (p < .05) deviant.
Fig. 2. Grand-average ERPs to the neutral standard (dotted line) and the scornful, commanding, and sad deviants (solid lines). The onset of the sound is at 0 ms. The acoustic waveforms of the deviant stimuli, as well as of the neutral standard stimulus (above the deviant stimuli), are presented under the ERPs.
3.2.2. The positive deflection
A three-way ANOVA on the mean amplitudes showed a significant main effect of anterior–posterior electrode location (F(1.22, 20.69) = 4.72, p < .05, ηp² = .22).
Post-hoc comparisons showed that the positive deflection was larger in the central scalp region than in the frontal (p<.01) or parietal (p<.01) scalp regions. There was also a significant deviant type × lateral electrode location interaction (F[3.19, 54.3]=3.3, p<.05, ηp²=.16). For the sad deviant, the amplitude of the positive deflection was larger in the central scalp region than over the right hemisphere (p<.01), but no amplitude differences between the central scalp region and the left hemisphere (p=.11) or between the left and right hemispheres (p=.18) were observed. For the scornful deviant, no amplitude differences between the central scalp region and the right (p=.45) or left (p=.27) hemisphere, or between the hemispheres (p=.99), were observed. Also for the commanding deviant, no amplitude differences between the central areas and the right (p=.85) or left (p=.98) hemisphere, or between the hemispheres (p=.87), were observed. No significant main effects of deviant type (F[2, 34]=.6, p=.56, ηp²=.034) or lateral electrode location (F[1.49, 25.27]=1.8, p=.18, ηp²=.10) were found. There were also no significant deviant type × anterior-posterior electrode location (F[2.65, 44.97]=2.25, p=.10, ηp²=.12), anterior-posterior × lateral electrode location (F[4, 68]=1.53, p=.20, ηp²=.08), or deviant type × anterior-posterior electrode location × lateral electrode location (F[4.9, 83.25]=.56, p=.73, ηp²=.03) interactions. No significant differences were observed in the latencies of the positive deflection (F[2,34]=.13, p=.88, ηp²=.01).

EMG results

Analysis 1 (see Method section 2.5): the EMG peak ratios for each stimulus type are presented in Table 4. For all stimulus types and muscles, the EMG peak ratio differed significantly from .5 (see Table 4). Thus, the EMG responses were not elicited randomly after stimulus presentation, suggesting that both the corrugator and
zygomaticus muscles showed stimulus-related activity during the experiment. However, neither the zygomaticus (F[3,51]=.75, p=.53, ηp²=.04) nor the corrugator (F[3,51]=.48, p=.70, ηp²=.03) muscle showed a significant main effect of stimulus type, indicating that the muscle reactions to the different emotions did not differ from each other.

Fig. 3. Grand-mean difference waves for the scornful, commanding, and sad deviants at the nine electrodes used in the statistical analysis (F3, Fz, F4, C3, Cz, C4, P3, Pz, P4). The onset of the sound is at 0 ms.

Analysis 2: neither a significant main effect of muscle nor a significant muscle × time interaction was observed for the neutral (F[1,17]=.75, p=.40, ηp²=.04; F[4.19, 71.32]=1.2, p=.32, ηp²=.07), scornful (F[1,17]=.05, p=.83, ηp²=.00; F[4.92, 83.65]=.74, p=.59, ηp²=.04), commanding (F[1,17]=.07, p=.79, ηp²=.00; F[4.29, 73.0]=.78, p=.55, ηp²=.04), or sad stimuli (F[1,17]=.33, p=.57, ηp²=.02; F[4.8, 81.45]=1.1, p=.36, ηp²=.06), suggesting that there

Table 3
Mean latencies and amplitudes of the brain responses at the Cz electrode, as well as the t-values and degrees of freedom of one-sample t-tests.

Deflection               Deviant      Mean latency ms (sd)  Mean amplitude µV (sd)
Fast negative response   Commanding   (12.94)               1.81 (2.56)**
Fast positive response   Commanding   (52.6)                1.94 (3.8)*
Negative                 Scornful     (16.37)               4.52 (2.28)***
Negative                 Commanding   503.9 (72.75)         1.94 (1.96)**
Negative                 Sad          (27.15)               3.85 (2.6)***
Positive                 Scornful     (71.93)               2.25 (2.99)**
Positive                 Commanding   (76.51)               1.56 (1.61)**
Positive                 Sad          (75.48)               2.87 (2.59)***

This table presents the mean latencies in milliseconds (ms) and the mean amplitudes in microvolts (µV) calculated from the difference waves of each deflection. The standard deviations (sd) are presented in brackets.
The amplitudes significantly differing from zero are marked with asterisks: *p<.05, **p<.01, and ***p<.001.

were no differences in the rapid facial reactions to the different emotions.

Analysis 3: no significant main effect of stimulus type was observed for either the zygomaticus (F[3, 51]=1.35, p=.27, ηp²=.07) or the corrugator (F[2.1, 35.79]=.97, p=.39, ηp²=.05) muscle, suggesting no stimulus-type-selective EMG activity.

Analysis 4: no significant main effect of stimulus type was observed for either the zygomaticus (F[3,51]=.17, p=.91, ηp²=.01) or the corrugator (F[3,51]=.17, p=.91, ηp²=.01) muscle. Also, no stimulus type × time interaction was observed for the zygomaticus (F[15,255]=.7, p=.78, ηp²=.04) or the corrugator (F[15,255]=.7, p=.78, ηp²=.04) muscle.

4. Discussion

In the present study our aim was to determine, by using auditory ERPs, behavioral measures, and facial muscle reactions, how 7- to 12-year-old children process emotional prosodic changes in naturally articulated words. We found distinct and statistically significant ERPs to natural speech stimuli with different emotional connotations.

Table 4
EMG peak ratio for each stimulus type and muscle, with the standard deviation (sd) in brackets.

Stimulus class   Zygomaticus   Corrugator
Neutral          .57 (.06)     .61 (.06)
Scornful         .57 (.06)     .60 (.05)
Commanding       .58 (.06)     .61 (.05)
Sad              .58 (.04)     .61 (.07)

The EMG peak ratio for each stimulus type was calculated by combining the total number of EMG reactions (peaks) to each stimulus type with the presentation rate of the stimulus type.
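The table note describes the peak-ratio measure only loosely, so the following is a hypothetical sketch of one plausible reading: each detected EMG peak counts as stimulus-locked if it falls in a post-stimulus window covering half of the 1300-ms SOA, so that randomly timed peaks give an expected ratio of .5, the chance level tested against. The function name and windowing are assumptions, not the authors' implementation.

```python
import numpy as np

def emg_peak_ratio(peak_times, stim_times, window=(0.0, 0.65)):
    """Proportion of detected EMG peaks falling in a post-stimulus window.

    Hypothetical reading of the peak-ratio measure: with a window spanning
    half of the 1.3-s SOA, randomly timed peaks yield an expected ratio of .5.
    Times are in seconds.
    """
    peak_times = np.asarray(peak_times, dtype=float)
    stim_times = np.asarray(stim_times, dtype=float)
    locked = sum(
        1 for t in peak_times
        if np.any((t - stim_times >= window[0]) & (t - stim_times < window[1]))
    )
    return locked / len(peak_times)

# Two of four peaks fall within 650 ms of a stimulus onset:
print(emg_peak_ratio([0.3, 0.9, 1.5, 2.0], [0.0, 1.3, 2.6]))  # 0.5
```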
Although the children were not attending to the stimuli, all prosodic changes activated auditory change-detection mechanisms, eliciting a negative deflection peaking at about 500 ms after stimulus onset. In addition, for the commanding deviant a rapid negative deflection peaking at about 290 ms was elicited. These results suggest that school-age children are able to detect acoustical changes of prosody already at the pre-attentive level. The negative deflection was followed by a positive deflection peaking at about 740 ms, presumably reflecting an attention switch towards the prosodic change. All stimuli were also accurately discriminated in the behavioral test, as suggested by the high hit rates. The RTs were shortest for the commanding stimulus, second shortest for the neutral stimulus, third shortest for the sad stimulus, and longest for the scornful stimulus. However, even though there was some evidence of stimulus-related facial muscle activity, no reliable stimulus-type-specific rapid facial reactions were found.

All the deviant stimuli elicited significant negative deflections within 500 ms from stimulus onset. The deflections were elicited before the offset of the deviant sounds, suggesting that they reflect processing of the acoustic features underlying prosody and not duration differences between the stimuli (see Fig. 2). The amplitude of this negative deflection was larger in the frontal and central than in the parietal scalp regions for the sad and scornful deviants, but for the commanding deviant no such anterior-posterior scalp distribution effect was found. Instead, only the commanding deviant elicited a rapid negative deflection within 290 ms of stimulus onset, which inverted its polarity to a rapid positive deflection within 370 ms. Consistent with this, in the behavioral test the commanding deviant was discriminated the fastest.
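The deflection tests summarized in Table 3 can be illustrated with a small synthetic sketch: per-subject difference waves (deviant minus standard) are averaged in a window around the deflection peak, and the resulting amplitudes are tested against zero with a one-sample t-test. Apart from the subject count, all numbers below (sampling rate, window, effect size) are illustrative assumptions, not the study's recording parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sub, n_samp = 18, 650      # 18 children; 1.3-s epochs at an assumed 500 Hz

# Synthetic per-subject average ERPs (µV) to the standard and a deviant;
# a negative deflection of -3 µV is injected around 500 ms (samples 225-275).
standard = rng.normal(0.0, 1.0, size=(n_sub, n_samp))
deviant = rng.normal(0.0, 1.0, size=(n_sub, n_samp))
deviant[:, 225:275] -= 3.0

# Deviant-minus-standard difference wave for each subject.
diff_waves = deviant - standard

# Mean amplitude in a 100-ms window around the peak, one value per subject.
amp = diff_waves[:, 225:275].mean(axis=1)

# One-sample t-test of the window amplitude against zero (cf. Table 3).
t_val = amp.mean() / (amp.std(ddof=1) / np.sqrt(n_sub))
t_crit = 2.110               # two-tailed .05 critical value for df = 17
significant = abs(t_val) > t_crit
```

Computing the t statistic by hand keeps the sketch dependency-free; in practice a library routine such as a one-sample t-test function would give the p-value directly.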
Taken together, the topography and time-course differences mentioned above suggest that the commanding deviant was processed partly differently from the scornful and sad deviants. As the overall brain reactions were fastest for the commanding deviant, it seems that the children's auditory system detected its salient prosodic changes most efficiently. Since the natural stimuli we used are very complex and may trigger multiple successive ERP components varying in polarity (e.g., MMN, P3a, LDN), it is difficult to say which components contribute to the deflections observed. The advantage of the present stimuli was that they provided a natural and thus ecologically highly valid tool for studying emotional speech prosody processing in children. However, with these highly acoustically variable stimuli one cannot specify which specific sound features were reflected in the ERPs. On the other hand, matching the acoustic parameters further would presumably remove the emotional information conveyed by the prosody and make the stimuli less natural.

The present results are consistent with those of Korpilahti et al. (2007), who also found MMNs to emotional prosodic changes in school-age children while they ignored the stimuli by watching movies. Korpilahti et al. (2007) used as the prosodic deviant a word uttered in a commanding manner. Their deviant stimulus elicited a biphasic MMN response peaking at around 200 ms and 650 ms. Also in the present study the commanding deviant elicited negative deflections peaking both early (290 ms) and late (500 ms). The lack of an early change-related response to the scornful and sad deviants in the present study presumably results from these deviants having slower acoustic changes than the commanding one (see Table 1). Kujala et al. (2005) found MMN responses in adults to the same deviant stimuli as used in the present study, with latencies of about 190 ms (commanding stimulus) and 320 ms (sad and scornful stimuli) (see also Thönnessen et al., 2010).
In the present study, children's ERPs to the same deviant stimuli had longer latencies: 290 ms and 500 ms for the commanding, 494 ms for the scornful, and 456 ms for the sad stimuli. Thus, the present results show that when the prosodic features are salient and very distinctive (as in the commanding deviant of the present study), 7- to 12-year-old children process prosodic information nearly as fast as adults do. However, when the prosodic features are harder to detect (as suggested by the slower RTs for the scornful and sad stimuli than for the commanding stimulus), children's processing is prolonged, which is also reflected in slower brain reactions.

The second aim of our study was to see whether typically developing school-age children show rapid facial reactions to the current stimuli. Rapid facial reactions have usually been recorded during attentive paradigms in which the subjects are instructed to look at the presented pictures or listen to the presented sounds (Beall et al., 2008; Dimberg et al., 2000; Hietanen et al., 1998). Our interest was to see whether these reactions to natural speech stimuli are automatic and involuntary, so that they would also occur when the subject's attention is directed away from the stimuli. Therefore, we recorded the EMG activity during our non-attentive EEG experiment. We found that the EMG peak ratio differed significantly from .5 (chance). Thus, we found evidence for stimulus-related facial muscle activity to the natural speech stimuli during a non-attentive paradigm. These reactions, however, seem to be unselective for stimulus type, as no main effect of stimulus type was found.

There are some possible explanations for these unselective facial EMG reactions to natural speech stimuli with different emotional connotations. First, although natural speech stimuli have been shown to elicit rapid facial reactions in adults (Hietanen et al., 1998), the autonomic nervous system of children might not react similarly.
Second, the deviants might not have been sufficiently distinctive to elicit clear rapid facial reactions during a non-attentive paradigm. Third, unlike in previous studies, in the present study the subjects were ignoring the prosodic stimuli by watching videos. It is possible that the withdrawal of attention from the stimuli prevents reactions of the autonomic nervous system. Fourth, it is also possible that the present paradigm, primarily designed for an ERP study, was not adequate for investigating facial muscle reactions to the prosodic changes. The fast presentation rate of the stimuli required for MMN recording (Duncan et al., 2009; Kujala et al., 2007; Näätänen et al., 2007) might have inhibited the rapid facial reactions in the present study. In previous EMG studies the stimuli have typically been presented with rather long interstimulus intervals (about 6 to 40 s; e.g., Beall et al., 2008; Dimberg et al., 2002, 2000; Hietanen et al., 1998), whereas in the present study the stimulus onset asynchrony was only 1300 ms. However, as rapid facial reactions are reported to be elicited very fast, even 300 to 400 ms from stimulus onset, the 1300-ms SOA should have been long enough for measuring such reactions. It is also possible that the repetitive presentation of the stimuli is another reason for not obtaining stimulus-specific EMG reactions. The rapid facial reactions to the repetitive neutral stimuli were, however, assumed to be weak, and the reactions to the occasional emotional deviant stimuli should therefore have been detectable. To further diminish the repetition effect, we additionally analyzed the EMG responses for the first 15 trials per stimulus type (analyses 2 to 4). Since this did not change the results, it is quite possible that even with fewer repetitions the repetitiveness might still have had an effect on the rapid facial reactions.
In future studies it should be determined how attention modulates rapid facial reactions to speech sounds, by comparing facial EMG recorded in non-attended and attended conditions. It would also be important to examine the effects of a longer SOA (and thus a longer baseline window), as well as of stimulus repetition, on the facial EMG signal.

In conclusion, the present study shows that ERP recordings are a suitable approach for studying even the early, pre-attentive stages of prosody perception in children with natural and thus ecologically valid stimuli. The auditory system of 7- to 12-year-old children reacts automatically to prosodic changes, and salient prosodic features were processed more efficiently than less salient features. The pattern of 7- to 12-year-old children's ERP reactions is consistent with that of adults (Kujala et al., 2005), but the timing of the reactions is delayed in children, suggesting that maturation still affects the cortical processing of emotional speech prosody at school age. However, our
results do not support the suggestion of automaticity of facial muscle responses to unattended repetitive emotional speech stimuli in children.

Disclosure statement

The authors declare that the research was conducted in the absence of any commercial, financial or personal relationships that could be construed as a potential conflict of interest.

Role of the funding source

This research was supported by the Finnish Cultural Foundation, the Rinnekoti Research Centre, and the Academy of Finland (grant number 12884). The funding sources had no involvement in the study design, in the data collection or analysis, in the writing of the article, or in the decision to submit the article for publication.

Acknowledgments

We express our warmest thanks to all the children and their parents for their participation. We would like to thank M.A. Lilli Kimppa for her assistance in data collection.

References

Alho, K., 1995. Cerebral generators of mismatch negativity (MMN) and its magnetic counterpart (MMNm) elicited by sound changes. Ear and Hearing 16.
Baldeweg, T., 2007. ERP repetition effects and mismatch negativity generation: a predictive coding perspective. Journal of Psychophysiology 21.
Beall, P., Moody, E., McIntosh, D., Hepburn, S., Reed, C., 2008. Rapid facial reactions to emotional facial expressions in typically developing children and children with autism spectrum disorder. Journal of Experimental Child Psychology 101.
Botinis, A., Granström, B., Möbius, B., 2001. Developments and paradigms in intonation research. Speech Communication 33.
Ceponiené, R., Cheour, M., Näätänen, R., 1998. Interstimulus interval and auditory event-related potentials in children: evidence for multiple generators. Electroencephalography and Clinical Neurophysiology 108.
Ceponiené, R., Lepistö, T., Soininen, M., Aronen, E., Alku, P., Näätänen, R., 2004. Event-related potentials associated with sound discrimination versus novelty detection in children. Psychophysiology 41.
Cheour, M., Korpilahti, P., Martynova, O., Lang, A.H., 2001. Mismatch negativity and late discriminative negativity in investigating speech perception and learning in children and infants. Audiology & Neuro-Otology 6.
Dimberg, U., 1990a. Facial electromyography and emotional reactions. Psychophysiology 27.
Dimberg, U., 1990b. Facial electromyographic reactions and autonomic activity to auditory stimuli. Biological Psychology 31.
Dimberg, U., 1990c. Perceived unpleasantness and facial reactions to auditory stimuli. Scandinavian Journal of Psychology 31.
Dimberg, U., 1997. Facial reactions: rapidly evoked emotional responses. Journal of Psychophysiology 11.
Dimberg, U., Thunberg, M., 1998. Rapid facial reactions to the different emotionally relevant stimuli. Scandinavian Journal of Psychology 39.
Dimberg, U., Thunberg, M., Elmehed, K., 2000. Unconscious facial reactions to emotional facial expressions. Psychological Science 11.
Dimberg, U., Thunberg, M., Grunedal, S., 2002. Facial reactions to emotional stimuli: automatically controlled emotional responses. Cognition and Emotion 16.
Duncan, C., Barry, R., Connolly, J., Fischer, C., Michie, P., Näätänen, R., Polich, J., Reinvang, I., Van Petten, C., 2009. Event-related potentials in clinical research: guidelines for eliciting, recording, and quantifying mismatch negativity, P300, and N400. Clinical Neurophysiology 120.
Escera, C., Corral, M.J., 2007. Role of mismatch negativity and novelty-P3 in involuntary auditory attention. Journal of Psychophysiology 21.
Grossmann, T., Striano, T., Friederici, A., 2005. Infants' electric brain responses to emotional prosody. Neuroreport 16.
Hietanen, J., Surakka, V., Linnankoski, I., 1998. Facial electromyographic responses to vocal affect expressions. Psychophysiology 35.
Honbolygó, F., Csépe, V., Ragó, A., 2004. Suprasegmental speech cues are automatically processed by the human brain: a mismatch negativity study. Neuroscience Letters 363.
Jäncke, L., Vogt, J., Musial, F., Lutz, K., Kalveram, K., 1996. Facial EMG responses to auditory stimuli. International Journal of Psychophysiology 22.
Kjellberg, A., Sköldström, B., Tesarz, M., Dallner, M., 1994. Facial EMG responses to noise. Perceptual and Motor Skills 79.
Korpilahti, P., Lang, A.H., Aaltonen, O., 1995. Is there a late-latency mismatch negativity (MMN) component? Electroencephalography and Clinical Neurophysiology 95, 96P.
Korpilahti, P., Krause, C., Holopainen, I., Lang, H., 2001. Early and late mismatch negativity elicited by words and speech-like stimuli in children. Brain and Language 76.
Korpilahti, P., Jansson-Verkasalo, E., Mattila, M.L., Kuusikko, S., Suominen, K., Rytky, S., Pauls, D., Moilanen, I., 2007. Processing of affective speech prosody is impaired in Asperger syndrome. Journal of Autism and Developmental Disorders 37.
Kuhl, P., 2004. Early language acquisition: cracking the speech code. Nature Reviews Neuroscience 5.
Kujala, T., Lepistö, T., Nieminen-von Wendt, T., Näätänen, P., Näätänen, R., 2005. Neurophysiological evidence for cortical discrimination impairment of prosody in Asperger syndrome. Neuroscience Letters 383.
Kujala, T., Tervaniemi, M., Schröger, E., 2007. The mismatch negativity in cognitive and clinical neuroscience: theoretical and methodological considerations. Biological Psychology 74.
Leinonen, L., Hiltunen, T., Linnankoski, I., Laakso, M.J., 1997. Expression of emotional-motivational connotations with a one-word utterance. Journal of the Acoustical Society of America 102.
Magnée, M., Stekelenburg, J., Kemner, C., de Gelder, B., 2007. Similar facial electromyographic responses to faces, voices, and body expressions. Neuroreport 18.
McIntosh, D., 2006. Spontaneous facial mimicry, liking and emotional contagion. Polish Psychological Bulletin 37.
Moody, E., McIntosh, D., Mann, L., Weisser, K., 2007. More than mere mimicry? The influence of emotion on rapid facial reactions to faces. Emotion 7.
Näätänen, R., Paavilainen, P., Rinne, T., Alho, K., 2007. The mismatch negativity (MMN) in basic research of central auditory processing: a review. Clinical Neurophysiology 118.
Näätänen, R., Kujala, T., Escera, C., Baldeweg, T., Kreegipuu, K., Carlson, S., Ponton, C., 2012. The mismatch negativity (MMN): a unique window to disturbed central auditory processing in ageing and different clinical conditions. Clinical Neurophysiology 123.
Polich, J., 2007. Updating P300: an integrative theory of P3a and P3b. Clinical Neurophysiology 118.
Sambeth, A., Ruohio, K., Alku, P., Fellman, V., Huotilainen, M., 2008. Sleeping newborns extract prosody from continuous speech. Clinical Neurophysiology 119.
Sussman, E., 2007. A new view on the MMN and attention debate: the role of context in processing auditory events. Journal of Psychophysiology 21.
Tassinary, L., Cacioppo, J., Geen, T., 1989. A psychometric study of surface electrode placements for facial electromyographic recordings: I. The brow and cheek muscle regions. Psychophysiology 26.
Taylor, M.J., Baldeweg, T., 2002. Application of EEG, ERP and intracranial recordings to the investigation of cognitive functions in children. Developmental Science 5.
Thönnessen, H., Boers, F., Dammers, J., Chen, Y.H., Norra, C., Mathiak, K., 2010. Early sensory encoding of affective prosody: neuromagnetic tomography of emotional category changes. NeuroImage 50.
Wechsler, D., 1991. Wechsler Intelligence Scale for Children (WISC-III). Psychological Corporation, San Antonio, Texas.
Williams, J.H.G., Whiten, A., Suddendorf, T., Perrett, D.I., 2001. Imitation, mirror neurons and autism. Neuroscience and Biobehavioral Reviews 25.
Winkler, I., 2003. Change detection in complex auditory environment: beyond the oddball paradigm. In: Polich, J. (Ed.), Detection of Change: Event-Related Potential and fMRI Findings. Kluwer Academic Publishers, Boston.
Ylinen, S., Strelnikov, K., Huotilainen, M., Näätänen, R., 2009. Effects of prosodic familiarity on the automatic processing of words in the human brain. International Journal of Psychophysiology 73.
Psychophysiology, 42 (25), 361 368. Blackwell Publishing Inc. Printed in the USA. Copyright r 25 Society for Psychophysiological Research DOI: 1.1111/j.1469-8986.25.295.x BRIEF REPORT Event-related potentials
More informationPERCEPTION OF UNATTENDED SPEECH. University of Sussex Falmer, Brighton, BN1 9QG, UK
PERCEPTION OF UNATTENDED SPEECH Marie Rivenez 1,2, Chris Darwin 1, Anne Guillaume 2 1 Department of Psychology University of Sussex Falmer, Brighton, BN1 9QG, UK 2 Département Sciences Cognitives Institut
More informationEffects of discrepancy between imagined and perceived sounds on the N2 component of the event-related potential
Psychophysiology, 47 (2010), 289 298. Wiley Periodicals, Inc. Printed in the USA. Copyright r 2009 Society for Psychophysiological Research DOI: 10.1111/j.1469-8986.2009.00936.x Effects of discrepancy
More informationCongruency Effects with Dynamic Auditory Stimuli: Design Implications
Congruency Effects with Dynamic Auditory Stimuli: Design Implications Bruce N. Walker and Addie Ehrenstein Psychology Department Rice University 6100 Main Street Houston, TX 77005-1892 USA +1 (713) 527-8101
More informationBiomedical Research 2013; 24 (3): ISSN X
Biomedical Research 2013; 24 (3): 359-364 ISSN 0970-938X http://www.biomedres.info Investigating relative strengths and positions of electrical activity in the left and right hemispheres of the human brain
More informationThe auditory P3 from passive and active three-stimulus oddball paradigm
Research paper Acta Neurobiol Exp 2008, 68: 362 372 The auditory P3 from passive and active three-stimulus oddball paradigm Eligiusz Wronka 1,2 *, Jan Kaiser 1, and Anton M.L. Coenen 2 1 Institute of Psychology,
More informationDissociable neural correlates for familiarity and recollection during the encoding and retrieval of pictures
Cognitive Brain Research 18 (2004) 255 272 Research report Dissociable neural correlates for familiarity and recollection during the encoding and retrieval of pictures Audrey Duarte a, *, Charan Ranganath
More informationNeuropsychologia 47 (2009) Contents lists available at ScienceDirect. Neuropsychologia
Neuropsychologia 47 (2009) 218 229 Contents lists available at ScienceDirect Neuropsychologia journal homepage: www.elsevier.com/locate/neuropsychologia Maturation of cortical mismatch responses to occasional
More informationMULTI-CHANNEL COMMUNICATION
INTRODUCTION Research on the Deaf Brain is beginning to provide a new evidence base for policy and practice in relation to intervention with deaf children. This talk outlines the multi-channel nature of
More informationHuman Brain Institute Russia-Switzerland-USA
1 Human Brain Institute Russia-Switzerland-USA CONTENTS I Personal and clinical data II Conclusion. III Recommendations for therapy IV Report. 1. Procedures of EEG recording and analysis 2. Search for
More informationTranscranial direct current stimulation modulates shifts in global/local attention
University of New Mexico UNM Digital Repository Psychology ETDs Electronic Theses and Dissertations 2-9-2010 Transcranial direct current stimulation modulates shifts in global/local attention David B.
More informationAuditory information processing during human sleep as revealed by event-related brain potentials
Clinical Neurophysiology 112 (2001) 2031 2045 www.elsevier.com/locate/clinph Auditory information processing during human sleep as revealed by event-related brain potentials Mercedes Atienza a, *, José
More informationDual Mechanisms for the Cross-Sensory Spread of Attention: How Much Do Learned Associations Matter?
Cerebral Cortex January 2010;20:109--120 doi:10.1093/cercor/bhp083 Advance Access publication April 24, 2009 Dual Mechanisms for the Cross-Sensory Spread of Attention: How Much Do Learned Associations
More informationActive suppression after involuntary capture of attention
Psychon Bull Rev (2013) 20:296 301 DOI 10.3758/s13423-012-0353-4 BRIEF REPORT Active suppression after involuntary capture of attention Risa Sawaki & Steven J. Luck Published online: 20 December 2012 #
More informationSpeech Cue Weighting in Fricative Consonant Perception in Hearing Impaired Children
University of Tennessee, Knoxville Trace: Tennessee Research and Creative Exchange University of Tennessee Honors Thesis Projects University of Tennessee Honors Program 5-2014 Speech Cue Weighting in Fricative
More informationTemporal dynamics of amygdala and orbitofrontal responses to emotional prosody using intracerebral local field potentials in humans
Temporal dynamics of amygdala and orbitofrontal responses to emotional prosody using intracerebral local field potentials in humans Andy Christen, Didier Grandjean euroscience of Emotion and Affective
More informationImplicit, Intuitive, and Explicit Knowledge of Abstract Regularities in a Sound Sequence: An Event-related Brain Potential Study
Implicit, Intuitive, and Explicit Knowledge of Abstract Regularities in a Sound Sequence: An Event-related Brain Potential Study Titia L. van Zuijen, Veerle L. Simoens, Petri Paavilainen, Risto Näätänen,
More informationComparing the Affectiva imotions Facial Expression Analysis
Comparing the Affectiva imotions Facial Expression Analysis Software with EMG Louisa Kulke 1,2 *, Dennis Feyerabend 1, Annekathrin Schacht 1,2 1 Göttingen University, Department for Affective Neuroscience
More informationElectrophysiological Substrates of Auditory Temporal Assimilation Between Two Neighboring Time Intervals
Electrophysiological Substrates of Auditory Temporal Assimilation Between Two Neighboring Time Intervals Takako Mitsudo *1, Yoshitaka Nakajima 2, Gerard B. Remijn 3, Hiroshige Takeichi 4, Yoshinobu Goto
More informationAn ERP Examination of the Different Effects of Sleep Deprivation on Exogenously Cued and Endogenously Cued Attention
Sleep Deprivation and Selective Attention An ERP Examination of the Different Effects of Sleep Deprivation on Exogenously Cued and Endogenously Cued Attention Logan T. Trujillo, PhD 1 ; Steve Kornguth,
More informationERROR PROCESSING IN CLINICAL POPULATIONS IN CONTRAST TO ADHD J.J.VAN DER MEERE
ERROR PROCESSING IN CLINICAL POPULATIONS IN CONTRAST TO ADHD J.J.VAN DER MEERE 1 THEORETICAL FRAMEWORK OF ERROR PROCESSING Mean RT x error x 500 ms x x x RTe+1 2 CEREBRAL PALSY an umbrella term covering
More informationSpeech conveys not only linguistic content but. Vocal Emotion Recognition by Normal-Hearing Listeners and Cochlear Implant Users
Cochlear Implants Special Issue Article Vocal Emotion Recognition by Normal-Hearing Listeners and Cochlear Implant Users Trends in Amplification Volume 11 Number 4 December 2007 301-315 2007 Sage Publications
More informationFigure 1. Source localization results for the No Go N2 component. (a) Dipole modeling
Supplementary materials 1 Figure 1. Source localization results for the No Go N2 component. (a) Dipole modeling analyses placed the source of the No Go N2 component in the dorsal ACC, near the ACC source
More informationNegative emotional context enhances auditory novelty processing
COGNITIVE ROSCIENCE AND ROPSYCHOLOGY ROREPORT Negative emotional context enhances auditory novelty processing Judith Dom nguez-borra' s, Manuel Garcia-Garcia and Carles Escera Cognitive Neuroscience Research
More informationDevelopment of auditory sensory memory from 2 to 6 years: an MMN study
Journal of Neural Transmission, 2008; Online First DOI 10.1007/s00702-008-0088-6 Die Originalarbeit ist verfügbar unter: www.springerlink.com Development of auditory sensory memory from 2 to 6 years: an
More informationSperling conducted experiments on An experiment was conducted by Sperling in the field of visual sensory memory.
Levels of category Basic Level Category: Subordinate Category: Superordinate Category: Stages of development of Piaget 1. Sensorimotor stage 0-2 2. Preoperational stage 2-7 3. Concrete operational stage
More informationAn EEG-based Approach for Evaluating Audio Notifications under Ambient Sounds
An EEG-based Approach for Evaluating Audio Notifications under Ambient Sounds Yi-Chieh Lee 1, Wen-Chieh Lin 1, Jung-Tai King 2, Li-Wei Ko 2,3, Yu-Ting Huang 1, Fu-Yin Cherng 1 1 Dept. of Computer Science,
More informationJan Kaiser, Andrzej Beauvale and Jarostaw Bener. Institute of Psychology, Jagiellonian University, 13 Golcbia St., ?
The evoked cardiac response as 0.0 1 1. a runction or cognitive load in subjects differing on the individual difference variable of reaction time Jan Kaiser, Andrzej Beauvale and Jarostaw Bener Institute
More informationTemporal integration: intentional sound discrimination does not modulate stimulus-driven processes in auditory event synthesis
Clinical Neurophysiology 113 (2002) 1909 1920 www.elsevier.com/locate/clinph Temporal integration: intentional sound discrimination does not modulate stimulus-driven processes in auditory event synthesis
More informationAttention Capture: Studying the Distracting Effect of One s Own Name
Umeå University Department of Psychology Bachelor s thesis in cognitive science, vt11 Attention Capture: Studying the Distracting Effect of One s Own Name Erik Marsja Supervisor: Jessica Körning-Ljungberg
More informationPHYSIOLOGICAL RESEARCH
DOMAIN STUDIES PHYSIOLOGICAL RESEARCH In order to understand the current landscape of psychophysiological evaluation methods, we conducted a survey of academic literature. We explored several different
More informationGeneralization Effect of Conditioned Sadness: An Innovation of the Approach to Elicit Implicit Sadness
Review in Psychology Research December 2012, Volume 1, Issue 1, PP.20-31 Generalization Effect of Conditioned Sadness: An Innovation of the Approach to Elicit Implicit Sadness Li Chen 1,2, Qinglin Zhang
More informationThe attentional selection of spatial and non-spatial attributes in touch: ERP evidence for parallel and independent processes
Biological Psychology 66 (2004) 1 20 The attentional selection of spatial and non-spatial attributes in touch: ERP evidence for parallel and independent processes Bettina Forster, Martin Eimer School of
More informationMENTAL WORKLOAD AS A FUNCTION OF TRAFFIC DENSITY: COMPARISON OF PHYSIOLOGICAL, BEHAVIORAL, AND SUBJECTIVE INDICES
MENTAL WORKLOAD AS A FUNCTION OF TRAFFIC DENSITY: COMPARISON OF PHYSIOLOGICAL, BEHAVIORAL, AND SUBJECTIVE INDICES Carryl L. Baldwin and Joseph T. Coyne Department of Psychology Old Dominion University
More informationBehavioural and electrophysiological measures of task switching during single and mixed-task conditions
Biological Psychology 72 (2006) 278 290 www.elsevier.com/locate/biopsycho Behavioural and electrophysiological measures of task switching during single and mixed-task conditions Philippe Goffaux a,b, Natalie
More informationTowards natural human computer interaction in BCI
Towards natural human computer interaction in BCI Ian Daly 1 (Student) and Slawomir J Nasuto 1 and Kevin Warwick 1 Abstract. BCI systems require correct classification of signals interpreted from the brain
More informationAUTOCORRELATION AND CROSS-CORRELARION ANALYSES OF ALPHA WAVES IN RELATION TO SUBJECTIVE PREFERENCE OF A FLICKERING LIGHT
AUTOCORRELATION AND CROSS-CORRELARION ANALYSES OF ALPHA WAVES IN RELATION TO SUBJECTIVE PREFERENCE OF A FLICKERING LIGHT Y. Soeta, S. Uetani, and Y. Ando Graduate School of Science and Technology, Kobe
More informationResearch Article Mindfulness Trait Predicts Neurophysiological Reactivity Associated with Negativity Bias: An ERP Study
Evidence-Based Complementary and Alternative Medicine Volume 5, Article ID 368, 5 pages http://dx.doi.org/.55/5/368 Research Article Mindfulness Trait Predicts Neurophysiological Reactivity Associated
More informationNeuro Q no.2 = Neuro Quotient
TRANSDISCIPLINARY RESEARCH SEMINAR CLINICAL SCIENCE RESEARCH PLATFORM 27 July 2010 School of Medical Sciences USM Health Campus Neuro Q no.2 = Neuro Quotient Dr.Muzaimi Mustapha Department of Neurosciences
More informationThe impact of numeration on visual attention during a psychophysical task; An ERP study
The impact of numeration on visual attention during a psychophysical task; An ERP study Armita Faghani Jadidi, Raheleh Davoodi, Mohammad Hassan Moradi Department of Biomedical Engineering Amirkabir University
More informationAUDITORY ERPS IN CHILDREN WITH DEVELOPMENTAL COORDINATION DISORDER
... ANS: Journal for Neurocognitive Research Journal Homepage: www.activitas.org SHORT COMMUNICATION... AUDITORY ERPS IN CHILDREN WITH DEVELOPMENTAL COORDINATION DISORDER Irena Holeckova, 1 Ladislav Cepicka,
More informationIn what way does the parietal ERP old new effect index recollection?
Ž. International Journal of Psychophysiology 35 2000 81 87 In what way does the parietal ERP old new effect index recollection? Edward L. Wilding School of Psychology, Cardiff Uni ersity, Cardiff, CF10
More informationRAPID COMMUNICATION Scalp-Recorded Optical Signals Make Sound Processing in the Auditory Cortex Visible
NeuroImage 10, 620 624 (1999) Article ID nimg.1999.0495, available online at http://www.idealibrary.com on RAPID COMMUNICATION Scalp-Recorded Optical Signals Make Sound Processing in the Auditory Cortex
More informationTHE ROLE OF VISUAL SPEECH CUES IN THE AUDITORY PERCEPTION OF SYNTHETIC STIMULI BY CHILDREN USING A COCHLEAR IMPLANT AND CHILDREN WITH NORMAL HEARING
THE ROLE OF VISUAL SPEECH CUES IN THE AUDITORY PERCEPTION OF SYNTHETIC STIMULI BY CHILDREN USING A COCHLEAR IMPLANT AND CHILDREN WITH NORMAL HEARING Vanessa Surowiecki 1, vid Grayden 1, Richard Dowell
More informationAUDL GS08/GAV1 Signals, systems, acoustics and the ear. Pitch & Binaural listening
AUDL GS08/GAV1 Signals, systems, acoustics and the ear Pitch & Binaural listening Review 25 20 15 10 5 0-5 100 1000 10000 25 20 15 10 5 0-5 100 1000 10000 Part I: Auditory frequency selectivity Tuning
More informationREHEARSAL PROCESSES IN WORKING MEMORY AND SYNCHRONIZATION OF BRAIN AREAS
REHEARSAL PROCESSES IN WORKING MEMORY AND SYNCHRONIZATION OF BRAIN AREAS Franziska Kopp* #, Erich Schröger* and Sigrid Lipka # *University of Leipzig, Institute of General Psychology # University of Leipzig,
More informationThe neural code for interaural time difference in human auditory cortex
The neural code for interaural time difference in human auditory cortex Nelli H. Salminen and Hannu Tiitinen Department of Biomedical Engineering and Computational Science, Helsinki University of Technology,
More informationMusic training enhances the rapid plasticity of P3a/P3b event-related brain potentials for unattended and attended target sounds
Atten Percept Psychophys () 7:6 6 DOI.78/s--7-9 Music training enhances the rapid plasticity of Pa/Pb event-related brain potentials for unattended and attended target sounds Miia Seppänen & Anu-Katriina
More informationNeural correlates of short-term perceptual learning in orientation discrimination indexed by event-related potentials
Chinese Science Bulletin 2007 Science in China Press Springer-Verlag Neural correlates of short-term perceptual learning in orientation discrimination indexed by event-related potentials SONG Yan 1, PENG
More informationMental representation of number in different numerical forms
Submitted to Current Biology Mental representation of number in different numerical forms Anna Plodowski, Rachel Swainson, Georgina M. Jackson, Chris Rorden and Stephen R. Jackson School of Psychology
More informationThe combined perception of emotion from voice and face de Gelder, Bea; Böcker, K.B.E.; Tuomainen, J.; Hensen, M.; Vroomen, Jean
Tilburg University The combined perception of emotion from voice and face de Gelder, Bea; Böcker, K.B.E.; Tuomainen, J.; Hensen, M.; Vroomen, Jean Published in: Neuroscience Letters Publication date: 1999
More informationManuscript under review for Psychological Science. Direct Electrophysiological Measurement of Attentional Templates in Visual Working Memory
Direct Electrophysiological Measurement of Attentional Templates in Visual Working Memory Journal: Psychological Science Manuscript ID: PSCI-0-0.R Manuscript Type: Short report Date Submitted by the Author:
More informationTrait aspects of auditory mismatch negativity predict response to auditory training in individuals with early illness schizophrenia
Biagianti et al. Neuropsychiatric Electrophysiology (2017) 3:2 DOI 10.1186/s40810-017-0024-9 RESEARCH ARTICLE Open Access Trait aspects of auditory mismatch negativity predict response to auditory training
More informationMusic-induced Emotions and Musical Regulation and Emotion Improvement Based on EEG Technology
Music-induced Emotions and Musical Regulation and Emotion Improvement Based on EEG Technology Xiaoling Wu 1*, Guodong Sun 2 ABSTRACT Musical stimulation can induce emotions as well as adjust and improve
More informationMismatch Responses to Pitch Changes in Early Infancy
Mismatch Responses to Pitch Changes in Early Infancy Chao He, Lisa Hotson, and Laurel J. Trainor Abstract & We investigated the emergence of discriminative responses to pitch by recording 2-, 3-, and 4-month-old
More informationAn investigation of the effect of preparation on response execution and inhibition in the go/nogo task
University of Wollongong Research Online University of Wollongong Thesis Collection 1954-2016 University of Wollongong Thesis Collections 2005 An investigation of the effect of preparation on response
More informationEvent-related brain potentials reveal covert distractibility in closed head injuries
Neurophysiology 10, 2125±2129 (1999) EVENT-RELATED brain potentials (ERPs) to auditory stimuli were recorded from 11 closed head injured (CHI) and 10 age-matched healthy adults. Auditory stimuli consisted
More informationTracking pattern learning with single-trial event-related potentials
Clinical Neurophysiology 117 (2006) 1957 1973 www.elsevier.com/locate/clinph Tracking pattern learning with single-trial event-related potentials Marijtje L.A. Jongsma a, *, Tom Eichele b, Clementina M.
More informationThe Role of Large-Scale Memory Organization in the Mismatch Negativity Event-Related Brain Potential
The Role of Large-Scale Memory Organization in the Mismatch Negativity Event-Related Brain Potential IstvaÂn Winkler 1,2, Erich SchroÈger 3, and Nelson Cowan 4 Abstract & The mismatch negativity (MMN)
More informationHow high-frequency do children hear?
How high-frequency do children hear? Mari UEDA 1 ; Kaoru ASHIHARA 2 ; Hironobu TAKAHASHI 2 1 Kyushu University, Japan 2 National Institute of Advanced Industrial Science and Technology, Japan ABSTRACT
More information