A Tutorial Essay on Spatial Filtering and Spatial Vision


Tim S Meese
Neurosciences, School of Life and Health Sciences, Aston University

Feb 2009: minor stylistic modifications and technical corrections.
March 2009: referencing tidied up.
To follow: final round-up sections and detailed references added.

1. Introduction

To the layperson, vision is all about seeing: we open our eyes and effortlessly we see the world out there. We can recognize objects and people, we can interact with them in sensible and useful ways, and we can navigate our way about the environment, rarely bumping into things. Of course, the only information that we are really able to act upon is that which is encoded as neural activity by our nervous system. These visual codes, the essence of all of our perceptions, are constructed from information arriving from each of two directions. Image data arrive bottom-up from our sensory apparatus, and knowledge-based rules and inferences arrive top-down from memory. These two routes to visual perception achieve very different jobs. The first route is well suited to providing a descriptive account of the retinal image, while the second route allows these descriptions to be elaborated and interpreted. This chapter focuses primarily on one aspect of the first of these routes: the processing and encoding of spatial information in the two-dimensional retinal image.

One of the major success stories in understanding the human brain has been the exploration of the bottom-up processes used in vision, sometimes referred to as early vision. But how do vision scientists get in to explore it? Our visual world is a very private one and not available for external scrutiny by vision scientists in any obvious way. Fortunately, this is not necessary and, according to some, perhaps not even helpful to think of vision this way. The point is made by David Marr's consideration of the lowly housefly (Marr, 1982). Although we cannot know what it is like to be a housefly, we don't have to suppose that a housefly has an explicit visual representation of the world around it for us to study its visual system. Perhaps the housefly just computes a few simple but immediately useful visually guided parameters.
For instance, when the computation of rate of image expansion reaches a certain value, this means that a surface is approaching and the "undercarriage down, prepare to land" signal should be triggered. On this view, it makes little sense to ask how the visual world might look to a housefly. However, by carefully manipulating artificial visual stimuli that mimic the fly's normal visual environment it is possible to investigate its visual apparatus by observing its behavioural responses (e.g. whether it prepares to land).

A similar philosophy has been applied to understanding early vision in humans. The central tenet is that the visual system is a signal processor. It is treated as a black box, with a two-dimensional spatial signal as input and a neural image (an organized distribution of neural activity) as output (see Figure 1), upon which behavioural decisions can be made. To learn about signal processing in the early visual system we need to know about the neural image. Visual psychophysics attempts to do this in the laboratory (see Figure 2) by assessing the behavioural responses to visual stimuli (e.g. pressing one of two buttons), usually made by trained observers. With this technique the system is necessarily treated as a whole, though as we shall see, careful thought and experimentation can give us insight into the nuts and bolts of the visual system's inner workings. In neurophysiology, single- and multiple-cell recordings (see Figure 3) give us more direct access to the neural response, but allow us to look only at isolated fragments of the system at any one time.

Fig 1. The black box approach to vision: a two-dimensional input image (grey levels) enters the black box of early vision, which delivers an output image (neural activity).

Spatial Filters/Meese/Nov 2000

Together, neurophysiology and psychophysics have converged on the view that early spatial vision consists of a set of spatial filters. The evidence for this view, and the reasons why vision might work this way, form the basis of the second part of the chapter, but first we need some suitable building blocks. We begin by exploring some formal concepts of spatial filtering.

Fig 2. A psychophysics laboratory.

Fig 3. Direct recordings of visual neurons.

2. Filtering in the Fourier Domain

In the most basic terms, a filter is a device that receives something as input and passes on some of its input as output. For example, a sieve might receive various grades of gravel as input, retaining the largest stones and passing on only the small chippings as output. In image processing, the input and output are both images, but what aspect of the image might be selectively passed on or filtered out? One very useful way of approaching this is from the Fourier domain. Essentially, a Fourier component can be thought of as a sine-wave grating[1] of some particular orientation, spatial frequency, amplitude and spatial phase. It turns out, perhaps astonishingly, that all images can be thought of as a set of Fourier components that, when added together, recreate the original image. An example of an image and the amplitude spectrum of its Fourier transform (FT) are shown in Figure 4. In Figure 4b, the grey levels indicate the amplitudes of Fourier components (sine-wave gratings) within a map called Fourier space. This space is most conveniently expressed in terms of polar coordinates,

[1] A sine-wave grating is a stimulus in which luminance is modulated sinusoidally, producing an image that looks like a set of blurred stripes.
The experimenter can control the width of the stripes (spatial frequency), their orientation, their spatial phase (how they line up with the centre of the display screen) and their contrast (the light levels of the light and dark bars relative to mean luminance). Sine-wave gratings and related stimuli are widely used in vision science (partly because they are the fundamental building blocks of all images), and examples can be seen by looking ahead to Figures 12 and

where the argument (angle) indicates the orientation of a Fourier component and the modulus (distance from the origin) indicates its spatial frequency. Unfortunately, the gentle gradients of the grey levels in Figure 4b make it difficult to appreciate its structure, so a second method of plotting the Fourier transform is shown in Figure 4c. Here, quantization and colour enhancement reveal contours representing Fourier components of similar amplitudes. In this case, those with the greater amplitudes (coloured red and yellow) are those closer to the origin (centre of the image). But the point here is that images contain Fourier energy distributed across Fourier space (i.e. they have many Fourier components with different orientations and spatial frequencies) and that spatial filters are selective for different regions of Fourier space, passing some Fourier components and stopping (filtering out) others, just like the sieve. We will consider why it might be a good idea to do this later on (in Section 6), but for now, let us just consider the different types of filter that we might construct.

Fig 4. Image and its Fourier transform. a) The original image. b) Amplitude spectrum of the Fourier transform. c) Colour-enhanced version of (b). The colours represent high to low amplitudes as follows: red, orange, yellow, green, blue, purple, black.

A filter that is selective for the circular region of Fourier space shown in Figure 5a (known as the filter's pass-band) would pass only low spatial frequencies (i.e. those close to the origin) but would not care about their orientations. Such a filter is known as a low-pass isotropic filter, and an example of how it transforms an image is shown in Figure 6a. (For now, you can ignore the small insets in the upper left corner of Figure 6; these will be explained later in Section 3.) The output image contains only the Fourier components from the input image that are within the filter's pass-band.
In other words, it contains low spatial frequencies but no high spatial frequencies at any orientation: we say they have been filtered out. Other interesting filters are those that pass only a band of spatial frequencies at any orientation (band-pass isotropic filters; Figure 5b) and bands of spatial frequencies at only specific orientations (oriented band-pass filters; Figure 5c)[2]. Note that for all three of the filters in Figure 5, the Fourier components for which the filter is most responsive are those shaded red. So, for the filter in Figure 5b for example, the red ring represents a single spatial frequency at any orientation. The results of applying these filters to the image in Figure 4a are shown in Figure 6. (See ahead to Fig 13 for another example of oriented band-pass filtering.)

With the aid of a computer and appropriate image-processing software, performing filtering operations in the Fourier domain is very straightforward. All we have to do is multiply the Fourier transform of the input image with the Fourier representation of the filter (sometimes called the filter's modulation transfer function [MTF]), the details for which are described in Box 1. The three MTFs in Figure 5 have been multiplied by the Fourier transform of the image in Figure 4 to produce Fourier transforms (Figure 7) of three different output images. The final step, generating the output images, simply involves computing their inverse Fourier transforms. This is done by adding up spatial representations of all the many Fourier components (sine-wave gratings) in the Fourier transform. This is how the three filtered images in Figure 6 were generated.

[2] For mathematical reasons each Fourier component appears twice in the Fourier domain, one being a rotation of the other through 180°. Intuitively, this makes good sense when you realize that if you rotate a cosine-phase sine-wave grating through 180°, you get back to where you started.
So, at first sight, in Fig 5c it looks as though the filter is sensitive to two different regions of Fourier space, but one of them is in fact just a copy of the other rotated through 180°.
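The multiply-then-invert recipe just described is easy to try numerically. The sketch below is not from the chapter: the 64 × 64 toy image (the sum of two vertical gratings, 3 and 20 cycles per image) and the hard 8-cycles-per-image cutoff are assumptions chosen so that the action of a low-pass isotropic filter is obvious.

```python
import numpy as np

# Toy input image (assumed for illustration): the sum of a coarse and a
# fine vertical sine-wave grating, 3 and 20 cycles per image.
N = 64
x = np.arange(N)
img = np.tile(np.sin(2 * np.pi * 3 * x / N) + np.sin(2 * np.pi * 20 * x / N),
              (N, 1))

# Low-pass isotropic MTF: pass every Fourier component whose radial
# frequency is below a cutoff, regardless of orientation.
fy = np.fft.fftfreq(N)[:, None] * N          # vertical frequency, cycles/image
fx = np.fft.fftfreq(N)[None, :] * N          # horizontal frequency
mtf = (np.sqrt(fx ** 2 + fy ** 2) <= 8.0).astype(float)

# Filtering in the Fourier domain: multiply the image's Fourier transform
# by the MTF, then compute the inverse transform to get the output image.
out = np.real(np.fft.ifft2(np.fft.fft2(img) * mtf))
```

A band-pass or oriented filter is obtained simply by changing the region of Fourier space where `mtf` is non-zero; here the 20-cycle grating falls outside the pass-band and is filtered out, leaving only the 3-cycle component.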

Fig 5. Modulation transfer functions of three filters. a) Low-pass isotropic filter (e.g. the eye's optics). b) Band-pass isotropic filter (e.g. retina and LGN). c) Band-pass vertical filter (e.g. cortex). The colours represent high to low sensitivities as follows: red, orange, yellow, green, blue, purple, black.

Fig 6. a,b,c. Effects of applying the three filters in Fig 5 to the image in Fig 4a.

Fig 7. a,b,c. Fourier transforms of the filtered images in Fig 6. Note that the amplitude scale in (c) has been amplified a little to help reveal the spectrum's structure.

BOX 1: Multiplication of two images

Filtering can be achieved by multiplying a filter's MTF with the Fourier transform of the input image to generate the Fourier transform of the output image. The first step is to think of the amplitude spectrum of the image's Fourier transform (Figures 4 & 7) and the filter's MTF as images: they just contain regions of different shades of grey on a dark background. (Though in most of the figures in this chapter, colour has been used to achieve visual enhancement.) Now, to multiply any two images together, all that is needed is to convert the images' grey levels to numbers. It is convenient to associate black with zero, white with one, and intermediate grey levels (or colours) with intermediate values. We can now think of each image as a two-dimensional array of neighbouring points, with each entry represented by a number (a grey level). To calculate the product of the two images, all we do is work out the product of the two numbers at each of the corresponding points in the two images. Note that because the numbers representing the MTF are never greater than one, the numbers in the output can never be greater than the corresponding numbers in the input. For this reason, filters are often said to attenuate the input image.

Preferred stimuli and bandwidths

Fig 8. Filter bandwidths. a) MTF of a band-pass horizontal filter (preferred orientation = 90°, preferred spatial frequency = 10 c.deg⁻¹). b) Orientation bandwidth (full width at half height = 40°). c) Spatial frequency bandwidth (log2(14/5.7) ≈ 1.3 octaves). Note that (a) and (b) are plotted on linear scales, whereas in (c) a log-axis is used.

A convenient way of summarizing a filter's characteristic (its MTF) is in terms of i) the spatial frequency and orientation that produce the maximum response (i.e.
the greatest output), referred to as the preferred orientation and preferred spatial frequency, and ii) the range of Fourier components to which it responds, referred to as its bandwidth. There are several different conventions for describing bandwidth, but the one considered here is the filter's full-width at half-height. Figure 8a is the MTF of an oriented band-pass filter (note that the orientation and spatial frequency are different from those in Figure 5c). The white circle in Figure 8a has its centre at the origin of Fourier space. This means that all of the points on this circle have the same spatial frequency but different orientations. The circle also passes through the point in the MTF that represents the preferred spatial frequency and orientation for this filter (the red spot). This is the Fourier component that would be least affected (least attenuated) by the filter. Sometimes we say that the filter is tuned to this spatial frequency and orientation.

In Figure 8b a one-dimensional plot of the filter's MTF is shown. The ordinate represents the amplitude of the MTF, and when presented this way it is often referred to as sensitivity (or, sometimes, gain). The abscissa represents the contour in Fourier space mapped out by an arc from the circle in Figure 8a, indicating orientation. The horizontal line in Figure 8b shows the width of the plot at half of the filter's maximum sensitivity, indicating that this filter has an orientation bandwidth of 40°. (This is also shown by the wedge shape in Figure 8a.) Figure 8c illustrates a similar idea for spatial frequency. Here the abscissa is the contour through the MTF given by a straight line that passes through the origin and the filter's preferred spatial frequency (see Figure 8a). It is common to plot spatial frequency on a log-axis, and this is what is done in Fig 8c (i.e. equal intervals on the graph represent equal multiplicative steps).
Furthermore, it is also common to report spatial frequency bandwidth in terms of octaves (an octave is a relative dimension; see Box 2). Spatial frequency bandwidth is determined in the same way as for orientation, and Figure 8c shows that for our filter it is 1.3 octaves.
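The full-width-at-half-height calculation, and its conversion to octaves via log2(H/L), can be sketched as follows. The log-Gaussian tuning curve below is an assumption (its width was chosen to give roughly the 1.3-octave bandwidth quoted above); in practice a measured cross-section through the MTF would be used instead.

```python
import numpy as np

# Hypothetical spatial-frequency tuning curve: a log-Gaussian sensitivity
# profile peaking at 10 c/deg (all parameter values are assumptions).
freqs = np.logspace(0.0, 2.0, 2001)            # 1 to 100 c/deg, log-spaced
peak, sigma_octaves = 10.0, 0.55
sens = np.exp(-0.5 * (np.log2(freqs / peak) / sigma_octaves) ** 2)

# Full width at half height: find the lowest (L) and highest (H)
# frequencies at which sensitivity is still at least half the maximum,
# then express the bandwidth in octaves as log2(H/L).
above_half = freqs[sens >= 0.5 * sens.max()]
L, H = above_half[0], above_half[-1]
bandwidth_octaves = np.log2(H / L)
```

Because an octave is a relative measure, the same code gives the same answer whether the filter prefers 1 c/deg or 30 c/deg, provided the shape of the curve on the log-axis is unchanged.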

Fig 9. a,b,c. Spatial filter-elements for the spatial filters shown in Fig 5. For these filters, the spatial filter-element is also the filter's pointspread function. Note that in all cases, the figures have been enlarged for clarity.

BOX 2: Octaves

In the frequency domain, an octave is simply a multiple of two (or a half, depending upon whether you are moving up or down the scale). For example, as we move up a piano keyboard in octave steps we are doubling the pitch of the notes. For Fourier components, a spatial frequency of 8 c.deg⁻¹ is two octaves higher than one of 2 c.deg⁻¹ because we double once to get from 2 to 4, and then again to get from 4 to 8. Formally, the difference between two frequencies in octaves is given by log2(H/L), where H is the higher frequency and L is the lower frequency. (And, as every high-school child knows, log2(x) = log10(x)/log10(2).)

Spatial phase

We have seen that spatial filters are selective for particular orientations and spatial frequencies. Another dimension along which spatial filters are selective is spatial phase. The phase selectivity of a spatial filter is represented by the phase spectrum (not shown) of the Fourier representation of the filter. This is a little more difficult to think about and the details are beyond the scope of this chapter. However, in brief, a cosine-phase filter will impose no phase shift on its output, whereas a sine-phase filter will shift the phase of the Fourier components within its pass-band by a phase angle of 90°. As all of the filters that we consider here are cosine-phase filters, we can safely ignore the phase spectrum for our present purposes.

The eye's optics as a filter

The eye's optics are not perfect but degrade the image by blurring it (amongst other things). This process can be characterized in terms of the MTF of the optics and can be measured experimentally.
Recall from above that OP_amp = MTF × IP_amp, where OP_amp and IP_amp are the amplitude spectra of the output and input images respectively, and the MTF is the Fourier representation of the filter. This rearranges to give:

MTF = OP_amp / IP_amp.   (1)

This means that with a carefully chosen input image we should be able to calculate the MTF (the amount by which each and every Fourier component is attenuated) for the eye's optics. Specifically, because the MTF describes the attenuating effects of the filter on all Fourier components, we need a stimulus that contains all Fourier components, so that OP_amp/IP_amp is a complete description of the filter. It is perhaps not immediately obvious what such an image might look like, so for now we will consider something slightly different but which is exactly equivalent for the filters considered in this chapter. Instead of presenting all of the Fourier components (sine-wave gratings) at the same time, we will consider presenting them sequentially. Of course, to do this for all Fourier components would take

forever, but a good approximation can be achieved by selecting a restricted set of test gratings, sampling at, say, every 10° in orientation and every 0.5 octaves in spatial frequency. So, we need lots of sine-wave gratings at different orientations and spatial frequencies, each of unit amplitude, and we need to measure the attenuating effects of the eye's optics for each of these gratings. We can do this by using a modified ophthalmoscope (a device for looking into people's eyes) to inspect the retinal image that is produced for each of the test gratings. With the help of a photodiode it is possible to measure the light levels at the peaks and troughs of the gratings that appear in the retinal images and deduce the amplitude of each grating's image. Of course, we must remember to adjust our calculations to allow for the fact that the light has passed through the eye's optics twice (once on the way in, to produce the retinal image, and once on the way out, to produce the image that is being measured by our photodiode).

Now, because the amplitude spectrum of each grating in IP_amp was unity, it follows from Equation 1 that an estimate of the MTF is given directly by our measurements of OP_amp. Specifically, to generate our estimate we first draw some axes that represent Fourier space. We then create a picture of the MTF by using the spatial frequency and orientation of each grating from our experiment to index Fourier space, and write into the picture at that point an intensity (grey level) that represents the amplitude of the grating in the retinal image. For a typical observer without astigmatism, this would produce a picture (after colour enhancement) that looks something like that in Figure 5a. (In fact, the level of optical blur has been exaggerated considerably here for the purpose of illustration and, in practice, the picture would be much more grainy than shown because of the practical limits imposed on the sampling of Fourier space described above.)
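The grating-by-grating measurement just described can be simulated. In the sketch below the "optics" are a Gaussian MTF (an assumption standing in for the real eye), and the amplitude of each output grating is read off its Fourier coefficient rather than from photodiode peak-and-trough readings; the recovered values should simply reproduce the simulated MTF at each sampled frequency.

```python
import numpy as np

# Simulated optics: attenuate each frequency by a Gaussian MTF.
# (The 15 cycles/image constant is an arbitrary assumption.)
def optics_mtf(f):
    return np.exp(-(f / 15.0) ** 2)

N = 128
x = np.arange(N)
f_axis = np.abs(np.fft.fftfreq(N) * N)       # frequency of each FFT bin

measured = {}
for f in range(1, 31, 2):                    # sampled test frequencies
    grating = np.sin(2 * np.pi * f * x / N)  # unit-amplitude input grating
    # Pass the grating through the simulated optics (Fourier filtering).
    retinal = np.real(np.fft.ifft(np.fft.fft(grating) * optics_mtf(f_axis)))
    # Output amplitude / input amplitude (= 1) estimates the MTF at f.
    measured[f] = 2.0 * np.abs(np.fft.fft(retinal)[f]) / N
```

Plotting `measured` against frequency traces out the low-pass MTF; with a real eye one would also need the double-pass correction described above, which the simulation omits.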
All this means that the eye's optics can be thought of as an isotropic low-pass filter. Note that what we have achieved here is a characterization of the eye's optics (the MTF) that allows us to deduce how any image would be filtered, just by observing the way in which a set of sine-wave gratings is attenuated. Actually, as we shall see in Section 3, there is a second and more straightforward method that we can use to achieve exactly the same thing.

Anti-aliasing

We have seen above that the eye's optics filter out high spatial frequencies from the input. At first this might seem like bad news, because it follows that lots of spatial information out there in the real world never makes it into our neural representations. (It has been claimed that this is where the fairies live, which is why you never see them!) However, it turns out that the fidelity of the optics is very well matched to the fidelity of subsequent neural processing. It is well known to engineers that if an image is under-sampled, spurious low-frequency Fourier components called aliases will be introduced (e.g. see Reference 7 in the reading list). Aliased components are undesirable because they do not convey useful information about the outside world; they just create distracting clutter. In vision, the sampling density is determined by the spacing of receptors in the retinal mosaic, which are positioned such that spatial frequencies higher than about 50 or 60 c.deg⁻¹ would result in aliasing (this cut-off point is referred to as the Nyquist frequency). Fortunately, the MTF of the eye's optics ensures that spatial frequencies this high never reach the retina. This means the eye's optics can be thought of as an anti-aliasing filter. This is a valuable operation and very similar to the sort of filtering that is performed by sound engineers when making digital recordings. In this case, the analogue input signal (e.g.
music) might contain audio frequencies that are higher than half the sampling frequency used for making CDs (and also too high to be heard). However, unless they are filtered out using a low-pass anti-aliasing filter, these high frequencies would produce low-frequency aliases, which would find their way onto the CD, producing unpleasant auditory interference. See Meese (2002) for further details.

3.0 Filtering in the spatial domain

In the discussion above we have considered filtering from a Fourier perspective by characterizing a filter in terms of its MTF. An alternative and equally useful characterization is possible in the spatial domain. Such representations go by several names, including the pointspread function, the spatial filter-element, the receptive field, the convolution kernel, the convolution mask, the impulse response and the weighting-function. The preferred term depends upon several things, including the derivation, the application, and the background of the author! The terms used in this chapter will be the first three, in the respective contexts of i) empirical estimates from point-source stimuli, ii) applications in image processing and iii) their presence in biological visual systems. Inevitably, however, there will be some overlap between the contexts and therefore the terminology.

Spatial domain representations of the three filters from Figure 5 are shown in Figure 9. In the case of the eye's optics (Figure 9a) it is very straightforward to get an empirical estimate of this function. Suppose that we repeat the experiment described previously using the modified ophthalmoscope, but this time we use just a single point of bright light as the input image and record the retinal image that it produces. If we now adjust our record of this image

to allow for the fact that the light has passed through the eye's optics twice, we have a representation of what is known as the pointspread function (Figure 9a). This is a description of how a filter (in this case the eye's optics) distributes a point of light received as input. Because all images can be thought of as just a set of spatially distributed points of light, we can mimic the effects of our spatial filter by replacing each light point in the input image with a copy of the pointspread function multiplied by a number (between zero and one) that represents the light intensity of the image at that point. Summing together all of the weighted and overlapping pointspread functions then creates the output image. Although very different in approach, this is exactly equivalent to calculating the filter output through the Fourier route as described earlier. Note then, that this time we have achieved a characterization of the eye's optics (the pointspread function) that allows us to deduce how any image would be filtered, just by observing the way in which a single point of light is blurred.

In fact, it turns out that this method is very closely related to the earlier method using gratings. The problem that we sidestepped before was the creation of a stimulus containing all Fourier components. Perhaps surprisingly, however, our single point of light is just such a stimulus. It is the image that you get when you inverse Fourier transform an amplitude spectrum in which the whole of Fourier space is set to unit amplitude (and cosine phase). This leads us to an important insight. Because IP_amp in Equation 1 is unity for all Fourier components in our single point of light, it follows that for this stimulus the MTF is equal to the amplitude spectrum of the output image. In other words, the MTF is equal to the Fourier transform of the filter's pointspread function.
In fact, the MTFs in Figure 5 were generated by calculating the Fourier transforms of the pointspread functions in Figure 9.

Another important spatial domain process that, for our purposes, is equivalent to the one described above, is convolution. In this case, spatial filtering is implemented by a set of identical spatial filter-elements (such as those shown in Figure 9) centred over each spatial location in the input image. (More commonly, in image processing applications, the process is implemented serially with a single filter-element that is moved sequentially over the entire image.) To create the output image, the response of each filter-element is calculated as follows. First, we need to realize that a spatial filter-element can itself be thought of as an image; it too is just represented by a set of grey levels. In this case, however, it is common to assign a value of 1 to white, -1 to black, and values between these extremes to intermediate grey levels, with zero being mid-grey. Filter-elements are usually only a small fraction of the size of the image with which they are being convolved, and the filter-elements in Figure 9 have been enlarged for clarity; the actual sizes of the filter-elements that were used in the convolutions are shown as the insets (upper left corners) in Figure 6. The next stage is to multiply the small image patch that describes each filter-element with the patch of input image that lies beneath it (recall that multiplication of images is described in Box 1). Next, we add up all of the values that resulted from the multiplications to give a single number for each filter-element[3]. This number is the response of the filter-element and is written into the output image at a position corresponding with the centre position of the filter-element. (The entire process is summarized in Figure 10.)
The output image that is produced by this process is sometimes called a convolved image and is identical to that which would be generated using either of the other methods of filtering described earlier (for cosine-phase filters).

Fig 10. The process of convolution for just a single point in the image: the filter-element (here, for a band-pass vertical filter) is multiplied with the patch of input image beneath it and summed to produce a single grey level in the output image. By repeating the process for different positions of the filter-element, an entire output image can be generated. The figure shows the process partially complete.

[3] For technical reasons, this number is often scaled by a constant that depends upon the size of the filter-element.
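The multiply-and-sum procedure of Figure 10 can be written directly as a double loop. The tiny 3 × 3 centre-surround filter-element below is an assumption for illustration, with weights chosen to sum to zero like the band-pass elements of Figure 9:

```python
import numpy as np

def filter_response(image, element):
    """Multiply-and-sum the element with the image patch beneath every
    position (valid positions only, so the output is slightly smaller)."""
    eh, ew = element.shape
    out_h = image.shape[0] - eh + 1
    out_w = image.shape[1] - ew + 1
    out = np.zeros((out_h, out_w))
    for r in range(out_h):
        for c in range(out_w):
            patch = image[r:r + eh, c:c + ew]
            out[r, c] = np.sum(patch * element)   # multiply, then sum
    return out

# Hypothetical band-pass filter-element: positive centre, negative
# surround, weights summing to zero.
element = np.full((3, 3), -1.0 / 8.0)
element[1, 1] = 1.0

# A uniform image (the zero-spatial-frequency limit): centre and
# surround contributions cancel, so every response is zero.
uniform = np.full((8, 8), 0.5)
uniform_resp = filter_response(uniform, element)

# A single point of light: the response map reproduces the element
# reflected through its centre, which leaves this symmetric element
# unchanged -- the pointspread-function connection.
point = np.zeros((5, 5))
point[2, 2] = 1.0
point_resp = filter_response(point, element)
```

Strictly, multiply-and-sum computes a cross-correlation; for the symmetric cosine-phase elements considered in this chapter it coincides with convolution.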

3.1 Relation between the pointspread function and the filter-element

As we have seen, it is most straightforward to think of optical systems performing filtering in terms of the pointspread function. However, as we shall see, at later stages of the visual system, where filtering is performed by visual neurons, it is much more natural to think in terms of convolution and the filter-element. Fortunately, there is a very close formal relation between a filter's pointspread function and its filter-element. It turns out that the one can be generated from the other by reflections across the x-axis and the y-axis (see Reference 5 in the reading list for a good illustration of why this is so). More fortunate still, for the cosine-phase filters considered in this chapter these reflections have no effect (this is revealed from inspection of Figure 9), and so the pointspread function and the spatial filter-element are exactly the same.

3.2 Positive and negative lobes

In Figure 9a, the spatial filter-element contains only a positive lobe, denoted by the circle in Figure 11. In effect, when it is used to perform convolutions it is performing a local spatial averaging of light in the image, so it is perhaps not surprising that this results in a blurring of the image (Figure 6a). Thought of this way, it is easy to see why this filter will not pass high spatial frequencies. When the light and dark bars of a grating are very fine they are blurred together by the filter. For example, in Figure 11, it doesn't matter where the filter-element is placed, its response (output) will be very much the same: an average mid-grey level. Note also that changing the orientation of the grating won't make any difference to the response of the filter because of the circular shape of the filter-element. Because of the shape of this filter-element, isotropic filters are sometimes called circular filters.

Fig 11. Low-pass filter-element (positive lobe, marked +) and high spatial frequency grating.

The spatial structures of the other two filter-elements in Figure 9 are more complicated and include negative lobes (denoted by grey levels darker than mid-grey) as well as positive lobes. As we have just seen, the positive lobes result in the attenuation of high spatial frequencies, but if a (cosine-phase) filter is to attenuate low spatial frequencies (as do the band-pass filters of Figures 6b & c) then its filter-element must have negative lobes. The reason for this is outlined intuitively in Figure 12, where we consider the response of a filter-element placed over a light bar of a grating at three different spatial frequencies. In Figure 12b, the spatial frequency of the grating is close to optimal: a light bar falls in the positive lobe and dark bars fall in the negative surround. In this case, the centre region produces a positive response and the absence of light in the negative region results in very little negative contribution, so the net response is positive and strong. In Figure 12c the response is weak because both the centre and the surround blur together the fine bars of the stimulus. In Figure 12a, the response is also weak because even though the positive lobe responds to the light region, the light also stimulates the negative surround and the two contributions cancel each other out. Thus, we have seen that this filter-element does not respond to low or high spatial frequencies, but does respond to mid spatial frequencies. This confirms the band-pass characteristic that we have seen revealed already by its MTF (Figure 5b). Again, because the filter-element is circular, changing the orientation of a grating has no effect on its response, just as we would expect for an isotropic filter.

Fig 12. Isotropic band-pass filter-element and gratings. a) Low spatial frequency grating. b) Mid spatial frequency grating. c) High spatial frequency grating.
3.3 Preferred stimuli and bandwidths in the spatial domain

A filter's preferred stimulus and bandwidth are given most directly in the Fourier domain representation, as we saw in Figure 8, because, essentially, these terms are Fourier descriptions. However, manipulating these parameters has predictable consequences for the filter-element. The most obvious one is orientation: a filter's preferred orientation is given by the orientation of the spatial structure of its filter-element. In Fig 9c, for example, this is vertical. A filter's preferred spatial frequency depends upon the widths of the positive and negative lobes. To a first approximation, the widths of these lobes indicate the widths of the light and dark bars contained in the sine-wave grating that is preferred by the filter. Consequently, halving all the spatial dimensions of a filter-element will double the filter's preferred spatial frequency.
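The halving-doubles rule can be verified numerically. The sketch assumes Python with numpy and again uses a hypothetical difference-of-Gaussians element; the preferred spatial frequency is read off as the peak of the element's amplitude spectrum.

```python
import numpy as np

def dog_element(x, sigma_centre, sigma_surround):
    """A hypothetical cosine-phase filter-element (centre minus surround)."""
    gauss = lambda s: np.exp(-x**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))
    return gauss(sigma_centre) - gauss(sigma_surround)

def preferred_frequency(sigma_centre, sigma_surround, n=1024):
    """Frequency (cycles per image) at which the element's amplitude
    spectrum peaks -- the filter's preferred spatial frequency."""
    x = np.arange(n) - n // 2
    spectrum = np.abs(np.fft.rfft(dog_element(x, sigma_centre, sigma_surround)))
    return int(np.argmax(spectrum))

f_original = preferred_frequency(4.0, 8.0)   # original lobe widths
f_halved = preferred_frequency(2.0, 4.0)     # all spatial dimensions halved
# Halving every spatial dimension roughly doubles the preferred spatial
# frequency (up to the resolution of the sampling grid).
```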

Bandwidths are a little trickier to think about, but in brief: decreasing orientation bandwidth increases the height of the spatial filter-element, and decreasing the spatial frequency bandwidth increases the number of positive and negative lobes in the filter-element. In essence, to decrease a filter's bandwidth all you have to do is more closely match the spatial structure of the filter-element to the spatial structure of the filter's preferred sine-wave grating. Thus, in the limit, a filter with infinitesimally narrow orientation and spatial frequency bandwidths would have a filter-element that was exactly like a sine-wave grating and of infinite spatial extent.

4. Relation between filtering in the spatial and Fourier domains

In the previous sections we have seen how filtering can be understood from the perspectives of both the Fourier domain and the spatial domain. The beauty of the interrelation between these approaches through the Fourier transform is summarized in Figure 13. The bottom row illustrates the Fourier route and shows that we can generate the Fourier transform of a filtered (output) image by multiplying the filter's MTF with the Fourier transform of the original (input) image. Alternatively, as shown by the top route, we can filter an image by convolving it with the filter's filter-element. And for the filters considered here (see Section 3.1) this is given by the inverse Fourier transform of the filter's MTF.

Fig 13. Relation between filtering in the Fourier domain and the spatial domain. In the spatial domain (top route), the input image convolved with the filter-element produces the output image; in the Fourier domain (bottom route), the FT of the input image multiplied by the modulation transfer function produces the FT of the output image. The Fourier transform and its inverse link the two routes.
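The equivalence of the two routes in Figure 13 is the convolution theorem, and it can be demonstrated directly. This sketch assumes Python with numpy and uses small random arrays as stand-ins for the image and filter-element; the spatial route is computed by explicit (circular) convolution and the Fourier route by multiplying transforms.

```python
import numpy as np

def circular_convolve2d(image, element):
    """Spatial route: direct circular convolution of image and filter-element."""
    n, m = image.shape
    out = np.zeros((n, m))
    for i in range(n):
        for j in range(m):
            for k in range(n):
                for l in range(m):
                    out[i, j] += image[k, l] * element[(i - k) % n, (j - l) % m]
    return out

rng = np.random.default_rng(0)
image = rng.standard_normal((8, 8))     # stand-in input image
element = rng.standard_normal((8, 8))   # stand-in filter-element

spatial_route = circular_convolve2d(image, element)

# Fourier route: multiply the image's transform by the filter's transfer
# function, then inverse-transform the product.
fourier_route = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(element)))
```

The two routes agree to machine precision, which is what licenses moving freely between the top and bottom rows of Figure 13.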
Note that strictly speaking, the inverse Fourier transform of the MTF is the pointspread function, but for the cosine-phase filters considered in this tutorial the pointspread function is identical to the filter-element.

5. The contrast sensitivity function (CSF)

In Section 2.3 we saw how experiments have allowed us to characterize the first stage of filtering performed by the optics of the eye. We shall now turn to the overall filtering properties of the whole visual system. This has been characterized psychophysically by the contrast sensitivity function (CSF), which provides an estimate of the MTF for human vision⁴. The technique involves measuring contrast detection thresholds for sine-wave gratings over a range of spatial frequencies. The contrast detection threshold is the lowest contrast at which a stimulus can be just detected by an observer, and can be measured using what is known as a two-interval forced-choice (2IFC) technique. In this technique a single experimental trial consists of two temporal intervals (two brief stimulus presentations separated in time), each signaled by an auditory beep. One of the intervals, chosen at random, contains a test grating and the other contains no stimulus, just a blank display with the same mean luminance as that in the test interval. The observer has to decide which interval contained the test stimulus and indicate their response by pressing one of two buttons. If the observer were able to see the stimulus then their response would be correct, whereas if they were not, they would have to guess and would be correct with a probability of 0.5. By performing many trials at a range of stimulus contrasts it is possible to generate a psychometric function such as that shown in Fig 14. This is a plot of the percentage of correct responses as a function of stimulus contrast and can be used to estimate the contrast detection threshold. This is often treated as the contrast level at which observers were correct on 75% of trials, estimated by fitting a smooth curve through the data. Thus, for the example shown in Fig 14, the detection threshold is a contrast of 1%, sometimes written c_t = 0.01. As we shall see, when plotting the CSF the results are typically expressed in terms of sensitivity, which is reciprocally related to the contrast detection threshold: sensitivity = 1/c_t. Thus, for our example, sensitivity equals 100.

Fig 14. Psychometric function (percent correct versus stimulus contrast; guess rate = 50%) for detecting a sine-wave grating in a psychophysical 2IFC experiment.

Fig 15. The contrast sensitivity function (CSF): contrast sensitivity plotted against spatial frequency (c.deg⁻¹).

By measuring psychometric functions for vertical sine-wave gratings at different spatial frequencies, sensitivities can be plotted as a function of spatial frequency. In our present context, Campbell and Robson (1968) were the first to do this, and the sort of results that they found are shown in Fig 15.

⁴ This involves several assumptions, the details of which are beyond the scope of this chapter.
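The route from a smooth psychometric curve to a threshold and a sensitivity can be sketched numerically. The Weibull shape and the parameter values below are illustrative assumptions (the essay specifies no particular curve), assuming Python with numpy.

```python
import numpy as np

def percent_correct_2ifc(contrast, alpha, beta=3.0):
    """An assumed Weibull-shaped psychometric function for 2IFC:
    performance climbs from the 50% guess rate towards 100% correct."""
    return 0.5 + 0.5 * (1.0 - np.exp(-(contrast / alpha) ** beta))

def detection_threshold(alpha, beta=3.0):
    """Contrast at which the smooth curve passes through 75% correct."""
    contrasts = np.linspace(1e-4, 0.1, 10001)
    p = percent_correct_2ifc(contrasts, alpha, beta)
    return float(contrasts[np.argmin(np.abs(p - 0.75))])

c_t = detection_threshold(alpha=0.0113)  # alpha chosen so c_t lands near 1%
sensitivity = 1.0 / c_t
# A threshold of about 1% contrast (c_t = 0.01) gives a sensitivity of about 100.
```

Repeating this calculation at each spatial frequency, and plotting sensitivity against frequency, is exactly the construction of the CSF described in the text.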
Note that the axes are similar to those that were introduced earlier in our discussion of spatial filters (see Figure 8c), though here the ordinate is also shown as a log axis. The filter characterization is not as complete as those shown in Figure 5 because the effects of orientation have not been considered, but for a single orientation the CSF provides a good illustration of the spatial frequency tuning of the human visual system. For example, at high spatial frequencies a grating cannot be detected (it just looks like a uniform grey field) no matter how high its contrast is. We should not be surprised by this attenuation

at high spatial frequencies because of what we have learned already about the filtering performed by the optics. However, why should there be attenuation of low spatial frequencies? The optics doesn't do this! As we know from Section 3.2, in order for a filter to have this band-pass characteristic the filter must have a filter-element with at least one negative lobe. How might the human visual system implement this filtering, and where within the nervous system might it take place?

6. The retina and lateral geniculate nucleus

Before vision scientists had even begun to think of early vision in terms of spatial filters, valuable physiological evidence had already begun to accrue. The evidence came from the single-cell recording technique used by physiologists. In brief, this surgical technique involves inserting a micro-electrode into (or close to) a visual neuron and observing the cell's response to conventional visual stimulation. In the 1950s, Kuffler recorded from retinal ganglion cells in cat and found that these cells were responsive to stimulation in approximately circular regions on the retina, different cells preferring stimulation from different regions (Kuffler, 1953). This region is known as the cell's receptive field. Crucially, however, Kuffler's receptive fields were found to contain distinct excitatory and inhibitory subregions. So, for example, compared with the cell's response rate in the absence of stimulation, it might have increased its response if a bright spot were placed in the centre of its receptive field, but decreased its response if a bright spot were placed towards the boundary of its receptive field. For obvious reasons, such cells are often referred to as on-centre cells. It is possible to plot a detailed map of a cell's receptive field by using grey levels to indicate whether stimulation has caused its response rate to increase (between mid grey and white) or decrease (between mid grey and black).
When this is done the results look remarkably similar to the spatial filter-element shown in Fig 9b. In other words, the receptive fields of retinal ganglion cells can be thought of as the spatial filter-elements of an isotropic band-pass filter that cover the retina, with neighbouring cells having overlapping receptive fields centred on adjacent locations in the retina. Recordings made in the lateral geniculate nucleus (LGN) prompt similar conclusions about the organization and character of the visual neurons found there. A more detailed account of receptive fields and their stimulation is provided in Box 3. Because the receptive fields of retinal ganglion cells and LGN cells can be thought of as filter-elements, it is an obvious next step to think of the filtered image in similar terms. Quite simply, the response rate of each cell represents the response of each filter-element. Put another way, the grey levels of neighbouring pixels in the filtered image are represented by the response rates of visual neurons whose receptive fields are centred over neighbouring points in the original image (the retinal image). Note, though, that unlike the retinal image, this neural image is not really an image at all but a spatially distributed set of neural responses. However, some care is needed over how we interpret the different grey levels. Recall from Section 3 that it is conventional to use negative numbers to represent the negative (e.g. black) regions in the filter-element. This makes sense in terms of neural receptive fields because these regions are in fact inhibitory. However, a moment's thought should reveal that the consequence of this is that regions of the output image could become negative. For example, if the image were entirely black in the positive region of a filter-element and entirely white in the negative region of the filter-element, then the response would be negative.
The regions whose grey level is darker than the mid-grey level represent these negative responses in Figures 6b and 6c. This poses something of a problem because neurons cannot fire negatively, and so there is a danger that information will be lost from the filtered image. This problem is handled in two different ways by biological vision. We have already met the solution used in the retina and LGN. Here, a constant response rate (spontaneous discharge) is added to the output so that negative numbers are represented by a decrease in response rate relative to the normal (unstimulated) rate of activity. This is very similar to the solution used by image processing software, where both positive and negative numbers are represented by a continuum of grey levels with mid-grey representing zero. However, cortical neurons (next section) are silent when unstimulated (they have no spontaneous discharge) and so vision must adopt another solution to the problem. In fact, the trick is very simple and requires only that we should think of filter-elements in terms of pairs of receptive fields having lobes of opposite polarity. So, for example, the filter-element in Figure 9b might be represented by a neuron whose receptive field looks exactly like that in Figure 9b, plus a second neuron whose receptive field is the same shape but contains a central inhibitory region flanked by neighbouring excitatory regions (i.e. all of the lobes have opposite signs). Although both neurons can only respond positively, the second neuron is actually carrying information about the negative response of the filter. In other words, these neurons would be the ones that respond in the dark regions of Figures 6b and 6c. So long as the visual system is able to keep track of this, by knowing the polarity of its neurons' receptive fields for example, filtered information will not be lost.
This solution is also employed in the retina and LGN, providing something of a belt-and-braces approach at this level of the visual system.
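The opposite-polarity-pair trick can be made concrete with a short sketch (assuming Python with numpy; the function name is hypothetical). Splitting a signed filter response between two rectified units shows that nothing is lost even though neither unit ever responds negatively.

```python
import numpy as np

def on_off_pair(filter_response):
    """Split a signed filter response across two rectified units: an ON unit
    carries the positive part, and an OFF unit (the same receptive field with
    all lobes sign-flipped) carries the negative part."""
    on = np.maximum(filter_response, 0.0)
    off = np.maximum(-filter_response, 0.0)
    return on, off

signed = np.array([0.8, -0.3, 0.0, -1.2, 0.5])  # signed filter-element outputs
on, off = on_off_pair(signed)
recovered = on - off
# Neither unit fires "negatively", yet between them no filtered information
# is lost: on - off reconstructs the signed response exactly.
```

The subtraction at the end stands in for the visual system "keeping track" of receptive-field polarity, as described in the text.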

BOX 3 Effects of stimulating visual neurons with small spots of light.

Consider a retinal ganglion cell's receptive field containing an excitatory (positive) centre and an inhibitory (negative) surround, and stimuli made from light and dark spots placed on a mid-grey background. In the presence of the mid-grey background alone, the cell will respond at a background level known as spontaneous discharge. If a light spot is placed in the excitatory region, then the cell is excited and the response increases, whereas if a light spot is placed in the inhibitory surround, then the cell is antagonized and the response decreases. On the other hand, if a dark spot is placed in the excitatory centre, then this is equivalent to removing some light (the mid-grey level that the dark spot now covers) from the centre. Consequently, the cell will be less excited and the response of the cell will decrease. Finally, if a dark spot is placed in the surround, this is equivalent to removing some light from the surround. Light in the surround normally has an inhibitory effect, so its removal will cause the cell's response to increase. These manipulations are summarized in Table 1, where, from left to right, + indicates an excitatory region, an increase in light and an increase in response, and - indicates an inhibitory region, a decrease in light and a decrease in response. Note that the contents of this table can be readily constructed using the rules of multiplication. For example, in the second row, moving from left to right, a positive number multiplied by a negative number gives a negative number.

Table 1
Region  Light  Response
+       +      +
+       -      -
-       +      -
-       -      +

7. Primary visual cortex

The story so far is this. The eye's optics is a low-pass filter, but the CSF indicates that the MTF of the visual system is band-pass. This is consistent with the band-pass characteristics of the isotropic spatial filters that we find implemented by visual neurons in the retina and LGN.
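The rules-of-multiplication shortcut described in Box 3 for building Table 1 can be written out in a few lines (plain Python; the function name is a hypothetical label, not the essay's).

```python
def predicted_response(region, light):
    """Sign rule behind Table 1 of Box 3: region sign (+1 excitatory,
    -1 inhibitory) multiplied by light change (+1 light spot added,
    -1 light removed, i.e. a dark spot) gives the sign of the
    response change."""
    return region * light

# All four rows of Table 1, built by multiplication:
table = [(region, light, predicted_response(region, light))
         for region in (+1, -1) for light in (+1, -1)]
```

For instance, a dark spot (light = -1) in the inhibitory surround (region = -1) multiplies to +1, matching the response increase described in Box 3.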
We will now learn that in primary visual cortex the filtering process becomes even more elaborate. For example, the CSF does not represent just a single band-pass filter with a broad bandwidth, but encompasses many spatial filters tuned to different spatial frequencies and different orientations.

7.1 Adaptation

The first evidence for us to consider is psychophysical and involves a procedure known as adaptation. In brief, prolonged visual stimulation is thought to fatigue visual neurons, reducing their sensitivity and making them less responsive for a short period of time after the adapting stimulus is removed. Because the post-adaptation fatigue causes the visual system to behave slightly differently from normal, it results in what are known as adaptation aftereffects. Blakemore and Campbell (1969) used this technique to learn about the details of filtering in human vision. They adapted to a high contrast sine-wave grating for several minutes and then measured the CSF. Their results were similar to those of Campbell and Robson, except that they found that a notch had been cut out of the CSF around the spatial frequency of the grating to which they had adapted. In other words, as shown in Figure 16, adaptation to a grating of 8 c.deg⁻¹ increased the detection thresholds (reduced sensitivity) for gratings at and around this spatial frequency but had no effect on the detectability of much lower and much higher spatial frequencies. Adapting to different spatial frequencies resulted in notches being cut out of different locations in the CSF, but always close to the spatial frequency to which the observer had adapted. The strong implication is that vision contains filters tuned for different spatial frequencies and that adaptation desensitizes the filters that are most responsive to the adapting stimulus. Thus, the idea emerged that the CSF represents the combined sensitivities of filters tuned to a range of different spatial frequencies (see Figure 17).
In subsequent experiments, spatial frequency has been kept constant and sensitivity has been measured as a function of orientation. At moderate spatial frequencies this produces a flat function: vision is no more or less sensitive to any one orientation than to any other. Adaptation aftereffects have then been measured in a similar way to that above. In this case, adapting to a vertical grating reduces sensitivity to gratings with orientations close to vertical, but

has no effect, for example, on the detectability of a horizontal grating. The implication here is that the spatial filters revealed by this technique are also tuned for orientation. This also puts a constraint on the locus of the adaptation aftereffect. Recall from above that we know from physiology and anatomy that up to the stage of the LGN, filters are isotropic, so these adaptation results cannot be reflecting the properties of the visual neurons encountered up to that stage. The oriented filters must be at a later stage, the earliest one possible being primary visual cortex. Finally, the breadths of the adaptation aftereffects measured in these kinds of experiments have been used to estimate the bandwidths of the underlying spatial filters. Estimates vary a little, but something close to 1.5 octaves and ±20° is probably about right.

Fig 16. Adaptation and the CSF. The curve shows the unadapted CSF. The open symbols show the shape of the CSF after adapting to a sine-wave grating with a spatial frequency of 8 c.deg⁻¹. Replotted from Blakemore and Campbell (1969).

Fig 17. The CSF and the MTFs of multiple spatial filters.
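The picture in Figures 16 and 17 (the CSF as the envelope of several tuned channels, with adaptation cutting a notch in it) can be sketched as a toy model. The log-Gaussian channel shape, the channel peaks, and the post-adaptation gain are all illustrative assumptions, and Python with numpy is assumed.

```python
import numpy as np

def channel_sensitivity(f, peak, bandwidth_octaves=1.5):
    """Log-Gaussian spatial-frequency channel. The shape and the 1.5-octave
    full bandwidth at half height are illustrative assumptions."""
    sigma = bandwidth_octaves / 2.355   # full width at half height -> sigma
    return np.exp(-0.5 * (np.log2(f / peak) / sigma) ** 2)

freqs = np.logspace(-1, 5, 200, base=2.0)          # 0.5 to 32 c/deg
gains = {peak: 1.0 for peak in (1, 2, 4, 8, 16)}   # assumed channel peaks

def csf(gains):
    """Overall sensitivity as the envelope (pointwise max) of the channels."""
    return np.max([g * channel_sensitivity(freqs, peak)
                   for peak, g in gains.items()], axis=0)

before = csf(gains)
gains[8] = 0.3   # adapting to an 8 c/deg grating fatigues that channel
after = csf(gains)
# A notch appears near 8 c/deg; remote spatial frequencies are unaffected,
# just as in Blakemore and Campbell's data.
```

Taking the maximum across channels is only one simple combination rule; the essential point is that desensitizing one channel dents the envelope only locally.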

7.2 Summation

We first encountered the idea of a detection threshold in Section 5; it is the lowest stimulus contrast that can be reliably detected by an observer, typically defined as the 75% correct point on a psychometric function measured using 2IFC (see Figure 14). We have seen that filtering can attenuate the contrast of a test stimulus (e.g. at high spatial frequencies), making it difficult to detect, and that adaptation can desensitize spatial filters, making stimuli even more difficult to detect. But what is it that actually does the detecting? Central to the summation paradigm is the idea that a stimulus is detected when the filtered contrast of the stimulus exceeds the detection threshold of an individual filter-element. The idea is sketched in Figure 18, which shows a grating stimulus presented as input to a filtering stage. In the figure, just a single filter-element is shown, though we might conceive of different regions of the image being processed by additional filter-elements. Crucially, the output of the filter-element passes through a second stage. Here, the response must be greater than some criterion level (the detection threshold) before it can be passed on to subsequent processing stages. This sequence of a filter-element followed by a detection threshold is an example of what is sometimes referred to as a detecting mechanism. If no stimulus information is passed on because the response of the filter-element was less than the detection threshold, then the stimulus cannot be detected. Or, put another way, for a stimulus to be detected, the detecting mechanism must respond. This is known as high threshold theory, and while sophisticated psychophysical experimentation has shown that it is wrong in detail, for our present purposes it is a very useful aid to thinking.
Fig 18. A detection mechanism. A stimulus (input image) drives stage 1 (the filter-element), whose response passes through stage 2 (the detection threshold) to stage 3 (the decision): if the thresholded output is greater than zero the stimulus is detected. If the filter-element has a similar orientation and spatial frequency to the grating stimulus then it will respond in proportion to the stimulus contrast (stage 1). However, an output nonlinearity (stage 2) means that this response is only passed on to later processing stages if it is greater than some internal level, often referred to as the detection threshold. If this happens the stimulus is detected (stage 3). The scheme is based on high-threshold theory, which is known to be wrong in detail but serves as a useful tool for thinking.

Let's now consider the details of the summation paradigm. In this psychophysical technique, which was widely used by Graham and others in the 1970s and 1980s (see Graham, 1989), sensitivity is measured first for one stimulus (let's call it component A), then for a second stimulus (let's call it component B), and finally for a compound stimulus made from a combination of the two components. If the two components (typically small patches of sine-wave grating) both stimulate the same detecting mechanism then it should be easier to detect the compound stimulus than either of its components in isolation, because both components will help the detecting mechanism reach its detection threshold. If, on the other hand, the stimulus components stimulate completely different mechanisms, then summation could not occur within the detecting mechanism and sensitivity to the compound stimulus (when expressed in terms of its component contrasts) should be similar to that for its components. For our present purposes there are two important results, the first of which provides further evidence for the view that vision uses oriented filters.
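The summation logic can be sketched in a few lines of plain Python. The contrast units and the threshold value are illustrative, and the sketch deliberately adopts high-threshold theory, which the text has already flagged as wrong in detail but useful for thinking.

```python
def detected(filtered_contrast, threshold=0.9):
    """High-threshold detecting mechanism: the stimulus is seen only if the
    filtered contrast reaching the mechanism exceeds its threshold.
    (Units and threshold value are illustrative assumptions.)"""
    return filtered_contrast > threshold

# Each component presented at half its own detection threshold:
component_a = 0.5
component_b = 0.5

# Same mechanism sees both components: their contrasts pool, so the
# compound is detected even though neither component alone would be.
same_mechanism = detected(component_a + component_b)

# Components fall in completely different mechanisms: no pooling occurs,
# and neither mechanism reaches its threshold.
different_mechanisms = detected(component_a) or detected(component_b)
```

Comparing `same_mechanism` with `different_mechanisms` is the crux of the paradigm: whether sub-threshold components summate to detection tells us whether they share a detecting mechanism.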
Consider a case where component A and component B are vertical and horizontal gratings respectively, and have the same spatial frequency. First let's suppose that the detection threshold for each of the two components when presented in isolation is a contrast of 1%. Now let's halve the contrast of each of these components (to 0.5%) so that each is below its own detection threshold. If we add these components together we will produce a compound stimulus with vertical and horizontal components and an overall contrast of 1% (because the contrasts of the two components add arithmetically). The isotropic filters in the retina and LGN would respond to both components in the compound stimulus because these filters are not fussy about the orientations of the stimulus components. Consequently, they would see a stimulus with a contrast of 1%. Now, if observers are able to access visual


More information

PHY3111 Mid-Semester Test Study. Lecture 2: The hierarchical organisation of vision

PHY3111 Mid-Semester Test Study. Lecture 2: The hierarchical organisation of vision PHY3111 Mid-Semester Test Study Lecture 2: The hierarchical organisation of vision 1. Explain what a hierarchically organised neural system is, in terms of physiological response properties of its neurones.

More information

The lowest level of stimulation that a person can detect. absolute threshold. Adapting one's current understandings to incorporate new information.

The lowest level of stimulation that a person can detect. absolute threshold. Adapting one's current understandings to incorporate new information. absolute threshold The lowest level of stimulation that a person can detect accommodation Adapting one's current understandings to incorporate new information. acuity Sharp perception or vision audition

More information

LEA Color Vision Testing

LEA Color Vision Testing To The Tester Quantitative measurement of color vision is an important diagnostic test used to define the degree of hereditary color vision defects found in screening with pseudoisochromatic tests and

More information

Cambridge CB2 3EG (Received 8 November 1972)

Cambridge CB2 3EG (Received 8 November 1972) J. Physiol. (1973), 232, pp. 149-162 149 With 8 text-figures Printed in Great Britain PSYCHOPHYSICAL EVIDENCE FOR SUSTAINED AND TRANSIENT DETECTORS IN HUMAN VISION BY J. J. KULIKOWSKI AND D. J. TOLHURST*

More information

Definition Slides. Sensation. Perception. Bottom-up processing. Selective attention. Top-down processing 11/3/2013

Definition Slides. Sensation. Perception. Bottom-up processing. Selective attention. Top-down processing 11/3/2013 Definition Slides Sensation = the process by which our sensory receptors and nervous system receive and represent stimulus energies from our environment. Perception = the process of organizing and interpreting

More information

Technical Discussion HUSHCORE Acoustical Products & Systems

Technical Discussion HUSHCORE Acoustical Products & Systems What Is Noise? Noise is unwanted sound which may be hazardous to health, interfere with speech and verbal communications or is otherwise disturbing, irritating or annoying. What Is Sound? Sound is defined

More information

ID# Exam 1 PS 325, Fall 2001

ID# Exam 1 PS 325, Fall 2001 ID# Exam 1 PS 325, Fall 2001 As always, the Skidmore Honor Code is in effect, so keep your eyes foveated on your own exam. I tend to think of a point as a minute, so be sure to spend the appropriate amount

More information

USING AUDITORY SALIENCY TO UNDERSTAND COMPLEX AUDITORY SCENES

USING AUDITORY SALIENCY TO UNDERSTAND COMPLEX AUDITORY SCENES USING AUDITORY SALIENCY TO UNDERSTAND COMPLEX AUDITORY SCENES Varinthira Duangudom and David V Anderson School of Electrical and Computer Engineering, Georgia Institute of Technology Atlanta, GA 30332

More information

Vision and Action. 10/3/12 Percep,on Ac,on 1

Vision and Action. 10/3/12 Percep,on Ac,on 1 Vision and Action Our ability to move thru our environment is closely tied to visual perception. Simple examples include standing one one foot. It is easier to maintain balance with the eyes open than

More information

Are Faces Special? A Visual Object Recognition Study: Faces vs. Letters. Qiong Wu St. Bayside, NY Stuyvesant High School

Are Faces Special? A Visual Object Recognition Study: Faces vs. Letters. Qiong Wu St. Bayside, NY Stuyvesant High School Are Faces Special? A Visual Object Recognition Study: Faces vs. Letters Qiong Wu 58-11 205 St. Bayside, NY 11364 Stuyvesant High School 345 Chambers St. New York, NY 10282 Q. Wu (2001) Are faces special?

More information

Binaural Hearing. Steve Colburn Boston University

Binaural Hearing. Steve Colburn Boston University Binaural Hearing Steve Colburn Boston University Outline Why do we (and many other animals) have two ears? What are the major advantages? What is the observed behavior? How do we accomplish this physiologically?

More information

Research Note. Orientation Selectivity in the Cat's Striate Cortex is Invariant with Stimulus Contrast*

Research Note. Orientation Selectivity in the Cat's Striate Cortex is Invariant with Stimulus Contrast* Exp Brain Res (1982) 46:457-461 9 Springer-Verlag 1982 Research Note Orientation Selectivity in the Cat's Striate Cortex is Invariant with Stimulus Contrast* G. Sclar and R.D. Freeman School of Optometry,

More information

SOLUTIONS Homework #3. Introduction to Engineering in Medicine and Biology ECEN 1001 Due Tues. 9/30/03

SOLUTIONS Homework #3. Introduction to Engineering in Medicine and Biology ECEN 1001 Due Tues. 9/30/03 SOLUTIONS Homework #3 Introduction to Engineering in Medicine and Biology ECEN 1001 Due Tues. 9/30/03 Problem 1: a) Where in the cochlea would you say the process of "fourier decomposition" of the incoming

More information

Spectro-temporal response fields in the inferior colliculus of awake monkey

Spectro-temporal response fields in the inferior colliculus of awake monkey 3.6.QH Spectro-temporal response fields in the inferior colliculus of awake monkey Versnel, Huib; Zwiers, Marcel; Van Opstal, John Department of Biophysics University of Nijmegen Geert Grooteplein 655

More information

Discrimination and Generalization in Pattern Categorization: A Case for Elemental Associative Learning

Discrimination and Generalization in Pattern Categorization: A Case for Elemental Associative Learning Discrimination and Generalization in Pattern Categorization: A Case for Elemental Associative Learning E. J. Livesey (el253@cam.ac.uk) P. J. C. Broadhurst (pjcb3@cam.ac.uk) I. P. L. McLaren (iplm2@cam.ac.uk)

More information

7. Sharp perception or vision 8. The process of transferring genetic material from one cell to another by a plasmid or bacteriophage

7. Sharp perception or vision 8. The process of transferring genetic material from one cell to another by a plasmid or bacteriophage 1. A particular shade of a given color 2. How many wave peaks pass a certain point per given time 3. Process in which the sense organs' receptor cells are stimulated and relay initial information to higher

More information

VISUAL PERCEPTION & COGNITIVE PROCESSES

VISUAL PERCEPTION & COGNITIVE PROCESSES VISUAL PERCEPTION & COGNITIVE PROCESSES Prof. Rahul C. Basole CS4460 > March 31, 2016 How Are Graphics Used? Larkin & Simon (1987) investigated usefulness of graphical displays Graphical visualization

More information

Neuronal responses to plaids

Neuronal responses to plaids Vision Research 39 (1999) 2151 2156 Neuronal responses to plaids Bernt Christian Skottun * Skottun Research, 273 Mather Street, Piedmont, CA 94611-5154, USA Received 30 June 1998; received in revised form

More information

Linguistic Phonetics. Basic Audition. Diagram of the inner ear removed due to copyright restrictions.

Linguistic Phonetics. Basic Audition. Diagram of the inner ear removed due to copyright restrictions. 24.963 Linguistic Phonetics Basic Audition Diagram of the inner ear removed due to copyright restrictions. 1 Reading: Keating 1985 24.963 also read Flemming 2001 Assignment 1 - basic acoustics. Due 9/22.

More information

Chapter 7: Descriptive Statistics

Chapter 7: Descriptive Statistics Chapter Overview Chapter 7 provides an introduction to basic strategies for describing groups statistically. Statistical concepts around normal distributions are discussed. The statistical procedures of

More information

CS294-6 (Fall 2004) Recognizing People, Objects and Actions Lecture: January 27, 2004 Human Visual System

CS294-6 (Fall 2004) Recognizing People, Objects and Actions Lecture: January 27, 2004 Human Visual System CS294-6 (Fall 2004) Recognizing People, Objects and Actions Lecture: January 27, 2004 Human Visual System Lecturer: Jitendra Malik Scribe: Ryan White (Slide: layout of the brain) Facts about the brain:

More information

Ch 5. Perception and Encoding

Ch 5. Perception and Encoding Ch 5. Perception and Encoding Cognitive Neuroscience: The Biology of the Mind, 2 nd Ed., M. S. Gazzaniga, R. B. Ivry, and G. R. Mangun, Norton, 2002. Summarized by Y.-J. Park, M.-H. Kim, and B.-T. Zhang

More information

the human 1 of 3 Lecture 6 chapter 1 Remember to start on your paper prototyping

the human 1 of 3 Lecture 6 chapter 1 Remember to start on your paper prototyping Lecture 6 chapter 1 the human 1 of 3 Remember to start on your paper prototyping Use the Tutorials Bring coloured pencil, felts etc Scissor, cello tape, glue Imagination Lecture 6 the human 1 1 Lecture

More information

COGS 101A: Sensation and Perception

COGS 101A: Sensation and Perception COGS 101A: Sensation and Perception 1 Virginia R. de Sa Department of Cognitive Science UCSD Lecture 5: LGN and V1: Magno and Parvo streams Chapter 3 Course Information 2 Class web page: http://cogsci.ucsd.edu/

More information

Announcements. Perceptual Grouping. Quiz: Fourier Transform. What you should know for quiz. What you should know for quiz

Announcements. Perceptual Grouping. Quiz: Fourier Transform. What you should know for quiz. What you should know for quiz Announcements Quiz on Tuesday, March 10. Material covered (Union not Intersection) All lectures before today (March 3). Forsyth and Ponce Readings: Chapters 1.1, 4, 5.1, 5.2, 5.3, 7,8, 9.1, 9.2, 9.3, 6.5.2,

More information

A. Acuity B. Adaptation C. Awareness D. Reception E. Overload

A. Acuity B. Adaptation C. Awareness D. Reception E. Overload Unit 4 Review #1 The longer an individual is exposed to a strong odor, the less aware of the odor the individual becomes. This phenomenon is known as sensory A. Acuity B. Adaptation C. Awareness D. Reception

More information

THE VISUAL WORLD! Visual (Electromagnetic) Stimulus

THE VISUAL WORLD! Visual (Electromagnetic) Stimulus THE VISUAL WORLD! Visual (Electromagnetic) Stimulus Perceived color of light is determined by 3 characteristics (properties of electromagnetic energy): 1. Hue: the spectrum (wavelength) of light (color)

More information

Lecturer: Rob van der Willigen 11/9/08

Lecturer: Rob van der Willigen 11/9/08 Auditory Perception - Detection versus Discrimination - Localization versus Discrimination - - Electrophysiological Measurements Psychophysical Measurements Three Approaches to Researching Audition physiology

More information

Practice Test Questions

Practice Test Questions Practice Test Questions Multiple Choice 1. Which term is most descriptive of the process of sensation? a. transportation c. selection b. interpretation d. transduction 2. Which terms are most descriptive

More information

Competing Frameworks in Perception

Competing Frameworks in Perception Competing Frameworks in Perception Lesson II: Perception module 08 Perception.08. 1 Views on perception Perception as a cascade of information processing stages From sensation to percept Template vs. feature

More information

Competing Frameworks in Perception

Competing Frameworks in Perception Competing Frameworks in Perception Lesson II: Perception module 08 Perception.08. 1 Views on perception Perception as a cascade of information processing stages From sensation to percept Template vs. feature

More information

Ch 5. Perception and Encoding

Ch 5. Perception and Encoding Ch 5. Perception and Encoding Cognitive Neuroscience: The Biology of the Mind, 2 nd Ed., M. S. Gazzaniga,, R. B. Ivry,, and G. R. Mangun,, Norton, 2002. Summarized by Y.-J. Park, M.-H. Kim, and B.-T. Zhang

More information

Spontaneous Cortical Activity Reveals Hallmarks of an Optimal Internal Model of the Environment. Berkes, Orban, Lengyel, Fiser.

Spontaneous Cortical Activity Reveals Hallmarks of an Optimal Internal Model of the Environment. Berkes, Orban, Lengyel, Fiser. Statistically optimal perception and learning: from behavior to neural representations. Fiser, Berkes, Orban & Lengyel Trends in Cognitive Sciences (2010) Spontaneous Cortical Activity Reveals Hallmarks

More information

Lecturer: Rob van der Willigen 11/9/08

Lecturer: Rob van der Willigen 11/9/08 Auditory Perception - Detection versus Discrimination - Localization versus Discrimination - Electrophysiological Measurements - Psychophysical Measurements 1 Three Approaches to Researching Audition physiology

More information

compared to electrophysiological studies on X (sustained) and Y (transient) flickering stimuli (lines and gratings) on the contrast threshold

compared to electrophysiological studies on X (sustained) and Y (transient) flickering stimuli (lines and gratings) on the contrast threshold J. Phyeiol. (1975), 249, pp. 519-548 519 With 9 text-ftgure8 Printed in Great Britain PATTERN AND FLICKER DETECTION ANALYSED BY SUBTHRESHOLD SUMMATION By P. E. KING-SMITH AND J. J. KULIKOWSKI From the

More information

alternate-form reliability The degree to which two or more versions of the same test correlate with one another. In clinical studies in which a given function is going to be tested more than once over

More information

Chapter 5 Test Review. Try the practice questions in the Study Guide and on line

Chapter 5 Test Review. Try the practice questions in the Study Guide and on line Chapter 5 Test Review Try the practice questions in the Study Guide and on line Printing game plan Put six slides on a page Select pure black and white as the printing option Okay, now wade into the answers>>>>

More information

Temporal Feature of S-cone Pathway Described by Impulse Response Function

Temporal Feature of S-cone Pathway Described by Impulse Response Function VISION Vol. 20, No. 2, 67 71, 2008 Temporal Feature of S-cone Pathway Described by Impulse Response Function Keizo SHINOMORI Department of Information Systems Engineering, Kochi University of Technology

More information

Lecture overview. What hypothesis to test in the fly? Quantitative data collection Visual physiology conventions ( Methods )

Lecture overview. What hypothesis to test in the fly? Quantitative data collection Visual physiology conventions ( Methods ) Lecture overview What hypothesis to test in the fly? Quantitative data collection Visual physiology conventions ( Methods ) 1 Lecture overview What hypothesis to test in the fly? Quantitative data collection

More information

The effect of stimulus duration on the persistence of gratings

The effect of stimulus duration on the persistence of gratings Perception & Psychophysics 1980,27 (6),574-578 The effect of stimulus duration on the persistence of gratings ALISON BOWLING and WILLIAM LOVEGROVE University oftasmania, Hobart, Tasmania, Australia 700/

More information

Sensation and Perception. Chapter 6

Sensation and Perception. Chapter 6 Sensation and Perception Chapter 6 1 Sensation & Perception How do we construct our representations of the external world? Text To represent the world, we must detect physical energy (a stimulus) from

More information

Motion direction signals in the primary visual cortex of cat and monkey

Motion direction signals in the primary visual cortex of cat and monkey Visual Neuroscience (2001), 18, 501 516. Printed in the USA. Copyright 2001 Cambridge University Press 0952-5238001 $12.50 DOI: 10.1017.S0952523801184014 Motion direction signals in the primary visual

More information

Cognitive Neuroscience History of Neural Networks in Artificial Intelligence The concept of neural network in artificial intelligence

Cognitive Neuroscience History of Neural Networks in Artificial Intelligence The concept of neural network in artificial intelligence Cognitive Neuroscience History of Neural Networks in Artificial Intelligence The concept of neural network in artificial intelligence To understand the network paradigm also requires examining the history

More information

Cerebral Cortex. Edmund T. Rolls. Principles of Operation. Presubiculum. Subiculum F S D. Neocortex. PHG & Perirhinal. CA1 Fornix CA3 S D

Cerebral Cortex. Edmund T. Rolls. Principles of Operation. Presubiculum. Subiculum F S D. Neocortex. PHG & Perirhinal. CA1 Fornix CA3 S D Cerebral Cortex Principles of Operation Edmund T. Rolls F S D Neocortex S D PHG & Perirhinal 2 3 5 pp Ento rhinal DG Subiculum Presubiculum mf CA3 CA1 Fornix Appendix 4 Simulation software for neuronal

More information

Two Visual Contrast Processes: One New, One Old

Two Visual Contrast Processes: One New, One Old 1 Two Visual Contrast Processes: One New, One Old Norma Graham and S. Sabina Wolfson In everyday life, we occasionally look at blank, untextured regions of the world around us a blue unclouded sky, for

More information

Theta sequences are essential for internally generated hippocampal firing fields.

Theta sequences are essential for internally generated hippocampal firing fields. Theta sequences are essential for internally generated hippocampal firing fields. Yingxue Wang, Sandro Romani, Brian Lustig, Anthony Leonardo, Eva Pastalkova Supplementary Materials Supplementary Modeling

More information

Introduction to Physiological Psychology

Introduction to Physiological Psychology Introduction to Physiological Psychology Vision ksweeney@cogsci.ucsd.edu cogsci.ucsd.edu/~ksweeney/psy260.html This class n Sensation vs. Perception n How light is translated into what we see n Structure

More information

Chapter 5: Perceiving Objects and Scenes

Chapter 5: Perceiving Objects and Scenes PSY382-Hande Kaynak, PhD 2/13/17 Chapter 5: Perceiving Objects and Scenes 1 2 Figure 5-1 p96 3 Figure 5-2 p96 4 Figure 5-4 p97 1 Why Is It So Difficult to Design a Perceiving Machine? The stimulus on the

More information

A THEORY OF MCCOLLOUGH EFFECT AND. CHUN CHIANG Institute of Physics, Academia Sinica

A THEORY OF MCCOLLOUGH EFFECT AND. CHUN CHIANG Institute of Physics, Academia Sinica A THEORY OF MCCOLLOUGH EFFECT AND CONTINGENT AFTER-EFFECT CHUN CHIANG Institute of Physics, Academia Sinica A model is advanced to explain the McCollough effect and the contingent motion after-effect.

More information

THE VISUAL WORLD! Visual (Electromagnetic) Stimulus

THE VISUAL WORLD! Visual (Electromagnetic) Stimulus THE VISUAL WORLD! Visual (Electromagnetic) Stimulus Perceived color of light is determined by 3 characteristics (properties of electromagnetic energy): 1. : the spectrum (wavelength) of light (color) 2.

More information

Fundamentals of Psychophysics

Fundamentals of Psychophysics Fundamentals of Psychophysics John Greenwood Department of Experimental Psychology!! NEUR3045! Contact: john.greenwood@ucl.ac.uk 1 Visual neuroscience physiology stimulus How do we see the world? neuroimaging

More information

l3;~~?~~~,'0~'~~t~t:~:~~~~~~~~~~!,1

l3;~~?~~~,'0~'~~t~t:~:~~~~~~~~~~!,1 112 Sensation and Perception Line A should look longer, even though both lines are actually the same length. People who come from noncarpentered cultures that do not use right angles and corners often

More information

Supplementary materials for: Executive control processes underlying multi- item working memory

Supplementary materials for: Executive control processes underlying multi- item working memory Supplementary materials for: Executive control processes underlying multi- item working memory Antonio H. Lara & Jonathan D. Wallis Supplementary Figure 1 Supplementary Figure 1. Behavioral measures of

More information

d). Draw the following neural circuits (using the notation taught in class) and then say what would happen if they were stimulated as specified.

d). Draw the following neural circuits (using the notation taught in class) and then say what would happen if they were stimulated as specified. 1. The neuropsychology of perception. a). Describe the process in which a neural impulse travel down one axon, making sure to specify chemical substances involved and how that affects the charge within

More information

A general error-based spike-timing dependent learning rule for the Neural Engineering Framework

A general error-based spike-timing dependent learning rule for the Neural Engineering Framework A general error-based spike-timing dependent learning rule for the Neural Engineering Framework Trevor Bekolay Monday, May 17, 2010 Abstract Previous attempts at integrating spike-timing dependent plasticity

More information

Visual Physiology. Perception and Attention. Graham Hole. Problems confronting the visual system: Solutions: The primary visual pathways: The eye:

Visual Physiology. Perception and Attention. Graham Hole. Problems confronting the visual system: Solutions: The primary visual pathways: The eye: Problems confronting the visual system: Visual Physiology image contains a huge amount of information which must be processed quickly. image is dim, blurry and distorted. Light levels vary enormously.

More information

Sightech Vision Systems, Inc. Real World Objects

Sightech Vision Systems, Inc. Real World Objects Sightech Vision Systems, Inc. Real World Objects Animals See the World in Terms of Objects not Pixels Animals, especially the more compact ones, must make good use of the neural matter that they have been

More information

Paper Airplanes & Scientific Methods

Paper Airplanes & Scientific Methods Paper Airplanes & Scientific Methods Scientific Inquiry refers to the many different ways in which scientists investigate the world. Scientific investigations are done to answer questions and solve problems.

More information

Vision Seeing is in the mind

Vision Seeing is in the mind 1 Vision Seeing is in the mind Stimulus: Light 2 Light Characteristics 1. Wavelength (hue) 2. Intensity (brightness) 3. Saturation (purity) 3 4 Hue (color): dimension of color determined by wavelength

More information

PSY 214 Lecture 5 (09/19/2010) (Vision) Dr. Achtman PSY 214. Lecture 5 Topic: Introduction to Vision Chapter 3, pages 55-71

PSY 214 Lecture 5 (09/19/2010) (Vision) Dr. Achtman PSY 214. Lecture 5 Topic: Introduction to Vision Chapter 3, pages 55-71 Corrections: No corrections needed Announcements: After the completion of chapter 4 a movie will be shown First test is October 3, 2011 Dr. Achtman is available during her office hours The test will include

More information

Neural Networks: Tracing Cellular Pathways. Lauren Berryman Sunfest 2000

Neural Networks: Tracing Cellular Pathways. Lauren Berryman Sunfest 2000 Neural Networks: Tracing Cellular Pathways Lauren Berryman Sunfest 000 Neural Networks: Tracing Cellular Pathways Research Objective Background Methodology and Experimental Approach Results and Conclusions

More information

How is the stimulus represented in the nervous system?

How is the stimulus represented in the nervous system? How is the stimulus represented in the nervous system? Eric Young F Rieke et al Spikes MIT Press (1997) Especially chapter 2 I Nelken et al Encoding stimulus information by spike numbers and mean response

More information

Comment by Delgutte and Anna. A. Dreyer (Eaton-Peabody Laboratory, Massachusetts Eye and Ear Infirmary, Boston, MA)

Comment by Delgutte and Anna. A. Dreyer (Eaton-Peabody Laboratory, Massachusetts Eye and Ear Infirmary, Boston, MA) Comments Comment by Delgutte and Anna. A. Dreyer (Eaton-Peabody Laboratory, Massachusetts Eye and Ear Infirmary, Boston, MA) Is phase locking to transposed stimuli as good as phase locking to low-frequency

More information

Carlson (7e) PowerPoint Lecture Outline Chapter 6: Vision

Carlson (7e) PowerPoint Lecture Outline Chapter 6: Vision Carlson (7e) PowerPoint Lecture Outline Chapter 6: Vision This multimedia product and its contents are protected under copyright law. The following are prohibited by law: any public performance or display,

More information

Auditory System & Hearing

Auditory System & Hearing Auditory System & Hearing Chapters 9 part II Lecture 16 Jonathan Pillow Sensation & Perception (PSY 345 / NEU 325) Spring 2019 1 Phase locking: Firing locked to period of a sound wave example of a temporal

More information

CHAPTER ONE CORRELATION

CHAPTER ONE CORRELATION CHAPTER ONE CORRELATION 1.0 Introduction The first chapter focuses on the nature of statistical data of correlation. The aim of the series of exercises is to ensure the students are able to use SPSS to

More information

IJREAS Volume 2, Issue 2 (February 2012) ISSN: LUNG CANCER DETECTION USING DIGITAL IMAGE PROCESSING ABSTRACT

IJREAS Volume 2, Issue 2 (February 2012) ISSN: LUNG CANCER DETECTION USING DIGITAL IMAGE PROCESSING ABSTRACT LUNG CANCER DETECTION USING DIGITAL IMAGE PROCESSING Anita Chaudhary* Sonit Sukhraj Singh* ABSTRACT In recent years the image processing mechanisms are used widely in several medical areas for improving

More information