The role of low frequency components in median plane localization

Acoust. Sci. & Tech. 24, 2 (2003) PAPER

Masayuki Morimoto 1, Motoki Yairi 1, Kazuhiro Iida 2 and Motokuni Itoh 1
1 Environmental Acoustics Laboratory, Faculty of Engineering, Kobe University, Rokko, Nada, Kobe, 657-8501 Japan
2 Multimedia Solution Laboratories, Matsushita Communication Industrial Co., Ltd., Saedo, Tsuzuki, Yokohama, 224-8539 Japan

(Received 3 April 2002, Accepted for publication 4 September 2002)

Abstract: The high frequency components of an auditory stimulus are considered to be the primary contribution to median plane localization. However, a number of studies have demonstrated that the low frequency components of a stimulus are also important in median plane localization. Asano et al. concluded that important cues for front-back discrimination seem to exist in the range below 2 kHz. In the present paper, localization tests were performed in order to examine the contribution of low frequency components to median plane localization. In these tests, the higher (above 4,800 Hz) and lower (below 4,800 Hz) components of a wide-band white noise were presented simultaneously from different directions, so that the individual components provided different directional information. The results of these tests reveal that: (1) when the source is a wide-band signal, the higher frequency components are dominant in median plane localization, whereas the lower frequency components do not contribute significantly to the localization, and (2) important cues for front-back discrimination do not exist in the low frequency range.

Keywords: Median plane localization, Spectral cues, Low frequency components, Head-related transfer function

PACS number: 43.66.Qp, 43.66.Pn

1. INTRODUCTION

It is generally known that the amplitude spectrum of the head-related transfer function (HRTF) provides cues, called spectral cues, for median plane localization [1]. However, the mechanism by which spectral cues are extracted has not yet been revealed.
(e-mail: mrmt@kobe-u.ac.jp)

Generally speaking, two assumptions have been discussed concerning this mechanism. The first assumption is that specific spectral features, such as peaks and dips, provide the cues. Blauert [2] demonstrated that the direction of the sound image for a 1/3 octave band noise was a function of its center frequency only and did not depend on the source elevation angle. He referred to the band by which the direction of the sound image is determined as the directional band. Moreover, he inferred that the direction of the sound image for a broadband stimulus is perceived according to the directional band which corresponds to the components boosted in the amplitude spectrum of the HRTF. Hebrank and Wright [3] compared the results of localization tests performed using various band-pass stimuli to HRTF spectra, and speculated that some peaks and dips in the spectrum act as spectral cues. Similar results were reported by Butler and Belendiuk [4] and Bloom [5]. However, these results are not consistent with those of Asahi and Matsuoka [6], who reported that although the perceived direction of the sound image for a dip-filtered stimulus changed as the dip frequency changed, the direction of the sound image did not coincide with the direction of a sound source whose HRTF included the same dip as the stimulus. The second assumption is that the entire spectrum of the stimulus contributes to the localization of the sound image. Morimoto [7] demonstrated that localization accuracy increased as the bandwidth of a stimulus with a center frequency of approximately 8 kHz increased. In other words, Morimoto suggested that neither specific peaks nor dips act as spectral cues, and that the amount of the cue is proportional to the bandwidth of the stimulus. Middlebrooks [8] proposed the vertical angle perception model, in which the sound image is localized in the direction in which the correlation coefficient between the

M. MORIMOTO et al.: ROLE OF LOW FREQUENCY COMPONENTS IN SOUND LOCALIZATION

spectrum of the input signal to the ears and the spectrum of the subject's HRTF is a maximum. Meanwhile, it is assumed that the high frequency components of a stimulus are the primary contribution to median plane localization. Mehrgardt and Mellert [9] showed that the HRTF spectrum changes systematically above 5 kHz as the elevation angle changes. Morimoto [7] measured localization accuracy using several high- or low-pass filtered stimuli and concluded that when the stimulus contains components of 4,800–9,600 Hz, the localization error is small, and that when the stimulus does not contain components above 4,800 Hz, the sound image always appears on the horizontal plane. Moreover, when the components above 2,400 Hz are removed, front-back confusion occurs. On the other hand, a number of studies have demonstrated that the low frequency components of a stimulus also contribute to median plane localization. Asano et al. [10] conducted virtual localization tests using HRTFs smoothed by applying an ARMA (Auto-Regressive Moving-Average) model. They concluded that important cues for front-back discrimination seem to exist in the range below 2 kHz. Algazi et al. [11] showed that subjects could estimate the elevation angle of a stimulus low-pass filtered to 3 kHz when the source was located away from the median plane. The contribution of low frequency components can also be found in Morimoto's results [7], which reveal that front-back discrimination can be achieved when the stimulus contains components of 2,400–4,800 Hz. However, the contribution of low frequency components to median plane localization has not yet been clarified. The importance of low frequency components is not clear from the results of Morimoto [7] and Algazi et al. [11], and the results reported by Asano et al.
[10] are not sufficient to discuss the importance of low frequency components, since the smoothed HRTFs used in their experiments can be regarded as providing different information than that provided by the true HRTFs. In the present paper, localization tests were performed in order to examine the contribution of low frequency components to median plane localization. In these tests, the higher and lower components of a wide-band white noise were presented simultaneously from different directions, so that each component could provide different directional information.

2. EXPERIMENTAL METHOD

2.1. Subjects
Subjects were three male students (SM, TM, TU) with normal hearing sensitivity.

2.2. Apparatus
The localization tests were conducted in an anechoic chamber. Seven cylindrical loudspeakers (diameter: 18 mm, length: 35 mm) were located at every 30° in the upper median plane, as shown in Fig. 1, at a radius of 1.5 m relative to the center of the subject's head. The characteristics of the seven loudspeakers were flattened to within 3 dB over the range of the stimulus by an equalizer (Technics SH-8065).

Fig. 1 Loudspeaker arrangement in the experiment (seven loudspeakers from front (0°) through above to back (180°), radius 1.5 m).

2.3. Stimuli
The source signal was a wide-band white noise from 280 Hz to 11.2 kHz, which was divided into two components at the cutoff frequency of 4,800 Hz, as indicated in Fig. 2. Two localization tests were performed. In the first test, the two components were presented simultaneously from different loudspeakers; in the second test, they were presented simultaneously from the same loudspeaker. Hence the number of stimuli was 49 (seven directions for the high-pass components × seven directions for the low-pass components), of which seven had the lower and higher components presented from the same direction. The stimuli were delivered at 5 :2 dBA for 1,682 ms, including 341 ms onset and offset ramps, followed by an interval of 5,000 ms.
Fig. 2 Schematic diagram of the characteristics of the stimuli (lower components: 280–4,800 Hz; higher components: 4,800–11,200 Hz; 48 dB/oct. slopes).
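The band-splitting shown in Fig. 2 can be sketched in a few lines. This is a minimal illustration under assumed parameters (sample rate, filter family), not the authors' actual signal chain; an 8th-order Butterworth filter is used because it rolls off at the 48 dB/oct stated for the stimuli:

```python
import numpy as np
from scipy.signal import butter, sosfilt

FS = 48_000                     # sample rate in Hz (an assumption, not stated in the paper)
F_LO, F_HI = 280.0, 11_200.0    # band limits of the source white noise
F_CUT = 4_800.0                 # crossover between lower and higher components

def make_components(duration_s=1.682, seed=0):
    """White noise band-limited to 280 Hz-11.2 kHz, split at 4,800 Hz.

    8th-order Butterworth filters give 48 dB/oct slopes, matching Fig. 2.
    """
    rng = np.random.default_rng(seed)
    n = int(FS * duration_s)
    noise = rng.standard_normal(n)

    # Band-limit the source signal to 280-11,200 Hz.
    sos_band = butter(8, [F_LO, F_HI], btype="bandpass", fs=FS, output="sos")
    src = sosfilt(sos_band, noise)

    # Split into lower (<4,800 Hz) and higher (>4,800 Hz) components.
    sos_lp = butter(8, F_CUT, btype="lowpass", fs=FS, output="sos")
    sos_hp = butter(8, F_CUT, btype="highpass", fs=FS, output="sos")
    return sosfilt(sos_lp, src), sosfilt(sos_hp, src)

lower, higher = make_components()
```

Because the two components come from the same filtered noise token, presenting them from the same loudspeaker reconstructs (to within the crossover region) the original wide-band stimulus, which is what the second test relies on.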

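The stimulus set just described, and its division into sessions (detailed in Sect. 2.4 below), can be enumerated directly. A sketch of the design, assuming the seven 30°-spaced loudspeaker positions as direction labels:

```python
import itertools
import random

DIRECTIONS = [0, 30, 60, 90, 120, 150, 180]  # loudspeaker elevations (degrees)
REPEATS = 10

# All 49 (higher, lower) direction pairs; 7 of them are same-direction pairs.
pairs = list(itertools.product(DIRECTIONS, DIRECTIONS))
same = [(h, l) for h, l in pairs if h == l]        # second test: 7 kinds of stimuli
different = [(h, l) for h, l in pairs if h != l]   # first test: 42 kinds of stimuli

# First test: 42 kinds x 10 repetitions = 420 presentations in random order,
# split over five sessions (84 presentations each, an inferred split).
rng = random.Random(0)
trials = different * REPEATS
rng.shuffle(trials)
sessions = [trials[i * 84:(i + 1) * 84] for i in range(5)]

assert len(same) == 7 and len(different) == 42
assert sum(len(s) for s in sessions) == 420
```

Together with the 7 × 10 = 70 same-direction presentations of the second test, this yields 490 judgments per subject.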
2.4. Procedure
Each subject was tested individually while seated, with his head fixed in a stationary position, in a partially darkened anechoic chamber. Recording sheets, each showing a circle intersected by perpendicular lines, were supplied to the subjects. The circle indicates the median plane, and the four intersections of the circle with the cross lines indicate 0°, 90°, 180°, and 270° relative to the front. The subject's task was to mark the perceived directions of all sound images for each stimulus presentation. The 5,000 ms inter-stimulus interval allowed the subject to pick up the next recording sheet. The only light in the chamber was placed such that it provided just enough illumination for the subject to see and use the recording sheets. Each stimulus was repeated ten times in random order. In the first test, in which the lower and higher components were presented from different directions, the 420 stimuli (42 kinds of stimuli × 10 times) were separated into five sessions. Each session was completed in approximately 9.5 minutes. In the second test, in which both components were presented from the same direction, the 70 stimuli were presented in one session. This test was completed in approximately 8 minutes. Each subject made a total of 490 judgments for the entire task.

3. EXPERIMENTAL RESULTS AND DISCUSSION

3.1. Distribution of Subjects' Responses
The direction marked by the subject was read with a protractor to an accuracy of one degree. Figure 3(a) shows the responses to the stimuli whose lower and higher components were presented from the same direction. The diameter of each plotted circle is proportional to the number of responses. The ordinate of each panel is the perceived direction, and the abscissa is the source direction. Figure 3(b) shows the results obtained in the preparatory tests, in which normal wide-band noise stimuli (280–11,200 Hz) were presented from loudspeakers in the median plane.
Comparing panels (a) and (b), no significant difference exists in the distributions of responses. This means that, even if a source signal is divided into lower and higher components, all listeners can localize the sound images as accurately as for the normal wide-band stimulus when both components are presented from the same direction.

Figures 4–6 show the responses of each subject to the stimuli whose lower and higher components were presented from different directions. Results are grouped by the direction of the lower components. The abscissa in each panel is the direction of the higher components, and the dot-dashed line indicates the direction of the lower components.

Subject SM (Fig. 4) perceived a fused sound image for each stimulus, even when the lower and higher components of the stimulus were presented from different directions. When the lower components were presented from 0° (panel (a)) and the higher components were presented from 90°, 120°, and 150°, sound images were localized between the directions of the two components. When the lower components were presented from 30°, 60°, and 90° (panels (b)–(d)) and the higher components were presented from 120° and 150°, sound images were localized between the directions of the two components or near the direction of the lower components. When the lower components were presented from 120° (panel (e)) and the higher components were presented from 150°, sound images were localized around the direction of the lower components. When the lower components were presented from 180° (panel (g)) and the higher components were presented from 60° and 90°, sound images were localized between the

Fig. 3 Distribution of responses. (a) shows the responses to the stimuli whose lower (280–4,800 Hz) and higher (4,800–11,200 Hz) components were presented from the same direction in the median plane.
(b) shows the responses to normal wide-band (280–11,200 Hz) noise stimuli in the median plane.

directions of the two components. In all other cases, the perceived directions coincided approximately with the directions of the higher components.

Subject TM (Fig. 5) also perceived a fused sound image for each stimulus. When the lower components were presented from 0° and 30° (panels (a) and (b)) and the higher components were presented from 150°, the perceived directions were distributed over all directions. When the lower components were presented from 60° and 90° (panels (c) and (d)) and the higher components were presented from 30°, sound images were localized around the direction of the lower components. When the lower components were presented from 150° and 180° (panels (f) and (g)) and the higher components were presented from 30°, 60°, and 90°, sound images were localized between the directions of the two components. In all other cases, the perceived directions coincided approximately with the directions of the higher components.

Subject TU (Fig. 6) sometimes marked two directions on a recording sheet for the stimuli whose lower components were presented from 150° and 180° (panels (f) and (g)) and whose higher components were presented from 0°. According to his report, he perceived a sound image that extended straight from the front to the rear, and so marked two directions. Nevertheless, the images were localized primarily in the front. Therefore, the responses near 180° (triangles in Fig. 6) are omitted from the examination.

Fig. 4 Responses to the stimuli whose higher and lower components were presented from different directions in the median plane for subject SM.

Fig. 5 As Fig. 4 for subject TM.
Most of the perceived directions coincided approximately with the directions of the higher components. Summarizing these results, although a few sound images were perceived as being shifted toward the

directions of the lower components, most of the perceived directions agreed with the directions of the higher components. In other words, when the source is a broadband signal, the higher frequency components are dominant in median plane localization and the contribution of the lower frequency components is slight.

Fig. 6 As Fig. 4 for subject TU.

3.2. Average Error
Table 1 shows the localization error E [12] for each subject, defined as:

E = (1/J)(1/K) Σ_{j=1}^{J} Σ_{k=1}^{K} |S_{jk} − R_{jk}|   (1)

where S_{jk} indicates the presented direction, R_{jk} is the perceived direction, each j corresponds to a direction, and each k corresponds to a stimulus for that direction.

Table 1 Averaged localization error E (degrees).

Subject   E_both   E_high   E_low
SM        11       22       64
TM        17       28       66
TU        19       16       74

E_both in Table 1 is the error when both components were presented from the same direction (J = 7, K = 10), and E_high and E_low are the errors relative to the direction of the higher and lower components, respectively, when the two components were presented from different directions (J = 7, K = 60). Clearly, for each subject, E_low is much larger than E_both or E_high, which are approximately equal. A two-factor analysis of variance indicates that both factors, type of error and subject, are significant (p < 0.01). Moreover, the results of a t-test indicate significant differences (p < 0.01) both between E_low and E_both and between E_low and E_high. This means that the higher frequency components are dominant in median plane localization, and that the contribution of the lower frequency components is slight.

In order to examine the experimental results in detail, the localization error for each presented direction was calculated as follows:

E_j = (1/K) Σ_{k=1}^{K} |S_{jk} − R_{jk}|   (2)

where j is the presented direction.
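The averaged error of Eq. (1) and the per-direction error of Eq. (2) are straightforward to compute from a table of presented and perceived angles; a minimal sketch (variable names are mine, not the paper's):

```python
import numpy as np

def localization_error(S, R):
    """Eq. (1): mean absolute error |S_jk - R_jk| over J directions x K responses.

    S, R: arrays of shape (J, K) holding presented and perceived
    directions in degrees.
    """
    S, R = np.asarray(S, dtype=float), np.asarray(R, dtype=float)
    return np.mean(np.abs(S - R))

def localization_error_per_direction(S, R):
    """Eq. (2): mean absolute error for each presented direction j."""
    S, R = np.asarray(S, dtype=float), np.asarray(R, dtype=float)
    return np.mean(np.abs(S - R), axis=1)

# Toy check: responses 10 degrees off at every presentation give E = 10.
S = np.tile(np.array([[0], [90], [180]]), (1, 4))   # J = 3 directions, K = 4
R = S + 10
print(localization_error(S, R))                     # -> 10.0
```

For E_high and E_low, S holds the direction of the higher or lower components, respectively, while R holds the same perceived directions; only the reference direction changes between the two error measures.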
As described earlier, E_both,j is the error when both components were presented from the same direction j, and E_high,j and E_low,j are the errors relative to the direction j of the higher and lower components, respectively, when the two components were presented from different directions. T-tests were performed in order to determine whether the difference between E_both,j and E_high,j or that between E_both,j and E_low,j was significant (p < 0.01). The results of these tests were classified into four cases.

Case A: The difference between E_both,j and E_low,j was significant and the difference between E_high,j and E_both,j was not significant. The sound images were assumed to be localized in the direction of the higher components.

Case B: The difference between E_both,j and E_high,j was significant and the difference between E_low,j and E_both,j was not significant. The sound images were assumed to be localized in the direction of the lower components.

Case C: Both differences were significant. Sound images were assumed to be localized in neither the direction of the higher components nor that of the lower components.

Case D: Neither difference was significant. In this case, either component can be regarded as dominant in

sound localization; however, determination of the dominant component was not possible. This case often occurred when the directions of the two loudspeakers were close to each other.

These results are summarized in Table 2. All of the results for subject TU are classified into either case A or case D, which means that this subject always localized sound images in the direction of the higher components. On the other hand, a few of the results for subject SM are classified into case B or case C when the higher components were presented from 60°, 90°, 120°, or 150°. A similar tendency is observed for subject TM when the higher components were presented from 30°, 60°, or 90°. For these directions, the responses scatter more widely than for the other directions, even for normal wide-band stimuli, as shown in Fig. 3(b). Hence sound images might have easily shifted to the direction of the lower components because the spectral cues provided by the higher components were not sufficiently effective. Nevertheless, a large number of results are classified into case A; that is, the higher frequency components can be regarded as having contributed to median plane localization more effectively in most cases.

Table 2 Classification of significant differences between the localization errors E_low,j, E_high,j and E_both,j. Rows: direction of higher components; columns: direction of lower components.

Subject SM
         0°   30°   60°   90°  120°  150°  180°
  0°      -    A     A     A     A     A     A
 30°      A    -     A     A     A     A     A
 60°      A    D     -     A     A     A     C
 90°      C    A     D     -     A     A     C
120°      C    C     B     D     -     A     A
150°      C    C     B     B     D     -     A
180°      A    A     A     A     A     D     -

Subject TM
         0°   30°   60°   90°  120°  150°  180°
  0°      -    D     D     A     A     A     A
 30°      A    -     D     B     A     C     C
 60°      A    A     -     B     D     C     C
 90°      A    A     A     -     D     A     C
120°      A    A     A     A     -     A     A
150°      C    A     A     A     A     -     A
180°      A    A     A     A     A     D     -

Subject TU
         0°   30°   60°   90°  120°  150°  180°
  0°      -    D     A     A     A     A     A
 30°      A    -     D     A     A     A     A
 60°      A    A     -     D     A     A     A
 90°      A    A     D     -     D     A     A
120°      A    A     A     D     -     D     A
150°      A    A     A     A     A     -     A
180°      A    A     A     A     A     D     -

3.3. Front-back Discrimination
Asano et al. [10] concluded that important cues for front-back discrimination exist in the range below 2 kHz. Here, the contribution of the lower frequency components to front-back discrimination is examined. Morimoto [7] and Kurosawa et al.
[13] demonstrated that just noticeable differences in median plane localization are large when a white noise signal is presented from the directions above the head, from 60° to 120°; that is, front-back confusion frequently occurs for these directions. Therefore, in the present paper, front-back discrimination is examined only when either the lower or the higher components were presented from 0° or 30°, and the other components were presented from 150° or 180°. Note that a sound image perceived at less than 90° is regarded to be in the front, and vice versa. For subject SM, when the higher components were presented from 150° and the lower components were presented from either 0° or 30° (panels (a) and (b) in Fig. 4), most of the responses indicated the front. For subject TM, when the higher components were presented from 30° and the lower components were presented from 150° or 180° (panels (f) and (g) in Fig. 5), most of the responses indicated the rear. Moreover, when the higher components were presented from 150° and the lower components were presented from 0° or 30° (panels (a) and (b) in Fig. 5), half of all responses indicated the front. However, for both subjects, when the higher components were presented from 0° or 180°, all responses coincided with the directions of the higher components. Moreover, for subject TU, the responses always coincided with the directions of the higher components.
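The front-back scoring rule used above (a perceived direction below 90° counts as front, above 90° as rear) reduces to a one-line classifier; sketched here with hypothetical response data, not the paper's:

```python
def front_back(perceived_deg):
    """Classify a perceived median-plane direction as 'front' or 'rear'.

    Follows the paper's rule: angles below 90 degrees are front, above are
    rear (0 = front, 90 = directly above, 180 = back).
    """
    if perceived_deg < 90:
        return "front"
    if perceived_deg > 90:
        return "rear"
    return "above"  # exactly overhead: neither front nor rear

# Hypothetical responses to a stimulus whose higher components came from 0 degrees:
responses = [5, 12, 0, 170, 8, 15]
counts = {"front": 0, "rear": 0, "above": 0}
for r in responses:
    counts[front_back(r)] += 1
print(counts)  # -> {'front': 5, 'rear': 1, 'above': 0}
```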

In summary, the lower frequency components contribute to front-back discrimination only slightly compared to the higher frequency components. Hence, important cues for front-back discrimination cannot be assumed to exist in the lower frequency components. This is not inconsistent with the results reported by Morimoto [7] and Algazi et al. [11], because those studies do not address the importance of low frequency components. However, this conclusion is obviously inconsistent with the conclusion of Asano et al. [10]. This inconsistency can be explained as follows. In the localization tests performed by Asano et al., the subjects might have perceived the sound images inside their heads, because the stimuli were presented through headphones. Laws [14] demonstrated that sound images are localized inside the head when stimuli are presented through headphones without accurate compensation for the transfer function from the headphones to the ears. It is possible that the smoothed HRTFs used by Asano et al. correspond to providing stimuli without such compensation. When a subject perceives sound images inside his or her head, the sound images are usually perceived in the back part of the head. Indeed, the subjects in the tests performed by Asano et al. perceived most of the sound images in the rear when the low frequency components were smoothed (see Figs. 5 and 8 in [10]).

4. CONCLUSIONS

Localization tests were performed in order to examine the contribution of low frequency components to sound localization in the median plane. In these tests, the higher (above 4,800 Hz) and lower (below 4,800 Hz) components of stimuli were presented simultaneously from different directions. The results indicate that: (1) when the source is a wide-band signal, the higher frequency components are dominant in median plane localization, whereas the lower frequency components do not contribute significantly to the localization, and (2) important cues for front-back discrimination do not exist in the low frequency range.

REFERENCES

[1] J.
Blauert, Spatial Hearing: The Psychophysics of Human Sound Localization (MIT Press, Cambridge, MA, 1997).
[2] J. Blauert, "Sound localization in the median plane," Acustica, 22, 205–213 (1969/70).
[3] J. Hebrank and D. Wright, "Spectral cues used in the localization of sound sources on the median plane," J. Acoust. Soc. Am., 56, 1829–1834 (1974).
[4] R. A. Butler and K. Belendiuk, "Spectral cues utilized in the localization of sound in the median sagittal plane," J. Acoust. Soc. Am., 61, 1264–1269 (1977).
[5] P. J. Bloom, "Creating source elevation illusions by spectral manipulation," J. Audio Eng. Soc., 25, 560–565 (1977).
[6] N. Asahi and S. Matsuoka, "Effect of the sound anti-resonance by pinna on median plane localization: Localization of sound signal passed through dip filter," Tech. Rep. Hear. Acoust. Soc. Jpn., H-4-1 (1977).
[7] M. Morimoto, On sound localization, Dissertation, Tokyo University (1982).
[8] J. C. Middlebrooks, "Narrow-band sound localization related to external ear acoustics," J. Acoust. Soc. Am., 92, 2607–2624 (1992).
[9] S. Mehrgardt and V. Mellert, "Transformation characteristics of the external human ear," J. Acoust. Soc. Am., 61, 1567–1576 (1977).
[10] F. Asano, Y. Suzuki and T. Sone, "Role of spectral cues in median plane localization," J. Acoust. Soc. Am., 88, 159–168 (1990).
[11] V. R. Algazi, C. Avendano and R. O. Duda, "Elevation localization and head-related transfer function analysis at low frequencies," J. Acoust. Soc. Am., 109, 1110–1122 (2001).
[12] M. Morimoto and Y. Ando, "On the simulation of sound localization," J. Acoust. Soc. Jpn. (E), 1, 167–174 (1980).
[13] A. Kurosawa, T. Takagi and Z. Yamaguchi, "On transfer function of human ear and auditory localization," J. Acoust. Soc. Jpn. (J), 38, 145–151 (1982).
[14] P. Laws, On the problem of distance hearing and the localization of auditory events inside the head, Dissertation, Technische Hochschule Aachen (1972).