

1 Networks: Inhibition
Controls activity (bidirectional excitation).
Inhibition = thermostat-controlled air conditioner: inhibitory neurons sample excitatory activity (like a thermostat samples the temperature), and more excitatory activity produces more inhibition, keeping the network from getting too hot (too active): set-point behavior.
Outline: 1. Biology: Feedforward and Feedback. 2. Critical Parameters. 3. kwta Simplification.

2 Types of Inhibition
a) Feedforward: anticipates excitation; like having the thermostat outside of your house.
b) Feedback: reacts to excitation; like a normal (indoor) AC thermostat.
(Diagrams: Hidden and Inhib layers connected by feedforward vs. feedback projections.)

3 Critical Parameters (inhib.proj)
Inhib conductance into hidden units (g_bar_i.hidden)
Inhib conductance into inhib units (g_bar_i.inhib)
Strength of feedforward weights to inhib (scale.ff)
Strength of feedback weights to inhib (scale.fb)

4 Simulations: [amp_top_down_dist.proj; inhib.proj]

5 k-winners-take-all (kwta)

6 The function of inhibition is to keep excitatory activity at a rough set point. We can approximate this function by enforcing a maximum activity level in each layer. kwta: instead of simulating inhibitory neurons, we choose an inhibitory current (g_i) value for each layer such that the specified number k of excitatory neurons are above threshold. Advantages: much less computationally expensive; avoids oscillations.
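The kwta rule on slide 6 can be made concrete with a short sketch. This is a minimal illustration, not the emergent/Leabra implementation: it assumes each unit's net input can stand in for the inhibitory conductance that would place that unit exactly at threshold, and the function name kwta_inhibition and the placement parameter q are made up for this example.

```python
import numpy as np

def kwta_inhibition(net_input, k, q=0.25):
    """Toy k-winners-take-all: pick one layer-wide inhibitory value g_i
    so that only the k most strongly driven units stay above threshold.

    net_input -- excitatory net input of each unit in the layer
    k         -- number of units allowed to be active
    q         -- where g_i sits in the gap between the k-th and (k+1)-th
                 strongest units (0 = at the (k+1)-th, 1 = at the k-th)
    """
    g_thr = np.sort(net_input)[::-1]              # strongest first
    # Place the shared inhibition between the k-th and (k+1)-th unit,
    # so exactly k units end up above it (assuming no exact ties).
    g_i = g_thr[k] + q * (g_thr[k - 1] - g_thr[k])
    return g_i, net_input > g_i

# Example: a 10-unit layer in which only the 3 best-driven units may win.
rng = np.random.default_rng(0)
net = rng.uniform(0.0, 1.0, size=10)
g_i, active = kwta_inhibition(net, k=3)
print(f"g_i = {g_i:.3f}, active units: {np.flatnonzero(active)}")
```

Because a single g_i is computed per layer rather than per inhibitory neuron, there is no excitatory-inhibitory loop to oscillate, which is the computational advantage the slide points to.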



23 kwta: Summary
Simple shortcut we use instead of actual inhibitory interneurons.
Captures the basic idea that inhibition maintains activity at a set point for a given layer.
Specify an inhibition value for a layer such that k units are active.
k is a parameter: percent activity levels vary across different brain regions!
kwta still allows for some wiggle room in overall activation.

24 Benefits of Inhibition
Controls activity (bidirectional excitation).
Inhibition forces units to compete to represent the input: only the most appropriate (best-fitting) units survive the competition.

25 Simulations: [inhib.proj]
1. kwta approximates set-point behavior.
2. Allows for faster updating.

26 Networks
1. Biology: The cortex
2. Excitation: Unidirectional (transformations); Bidirectional (top-down processing, pattern completion, amplification)
3. Inhibition: Controls bidirectional excitation (feedforward, feedback, set point, kwta approximation)
4. Constraint Satisfaction: Putting it all together.

27 Constraint Satisfaction
The process of trying to satisfy various constraints (from the environment, connection weights, activations). Bidirectional excitation and inhibition form part of this larger computational goal.
1. Energy/harmony. 2. Attractor Dynamics. 3. Noise.

28 Harmony
Harmony = extent to which unit activations are consistent with the weights:
H = \frac{1}{2} \sum_{ij} a_i w_{ij} a_j
Harmony is high when units with strong (positive) weights are co-active.

29-30 Harmony
John Hopfield showed that harmony tends to increase monotonically as the network settles. Network settling = moving to a more harmonious state.
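To make the harmony formula and Hopfield's settling result concrete, here is a minimal sketch. It assumes binary +/-1 units with a symmetric, zero-diagonal weight matrix and asynchronous updates, which is Hopfield's original setting rather than the graded, rate-coded units used in the course simulations; the harmony helper and network size are illustrative.

```python
import numpy as np

def harmony(a, W):
    # H = 1/2 * sum_ij a_i * w_ij * a_j
    return 0.5 * a @ W @ a

# A tiny symmetric weight matrix with zero diagonal, as Hopfield assumed.
rng = np.random.default_rng(1)
n = 8
W = rng.normal(size=(n, n))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)

# Start from a random binary state and settle with asynchronous updates:
# each unit adopts whichever state its weighted inputs favor.
a = rng.choice([-1.0, 1.0], size=n)
history = [harmony(a, W)]
for sweep in range(5):
    for i in rng.permutation(n):
        a[i] = 1.0 if W[i] @ a >= 0 else -1.0
        history.append(harmony(a, W))

# With symmetric weights, harmony never decreases as the network settles.
print(np.round(history, 3))
assert all(h2 >= h1 - 1e-9 for h1, h2 in zip(history, history[1:]))
```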

31 Attractor Dynamics
A network will settle into a stable state over time: the attractor. Maximize harmony given the inputs and weights.
(Figure: energy landscape with an attractor basin.)

32 Circle Attractor
Attractor = a stable set of states reached from many starting points.
(Figure: trajectories in state space, axes state x and state y, converging on the attractor state.)

33 [catsanddogs.proj]

34 Constraint Satisfaction
Knowledge is in the weights! (built in here)
Harmony as a function of constraints.
Bidirectional networks like cats-and-dogs constitute a content-addressable memory: you can access any part of a memory from any other part. Who likes to eat grass and play with feathers? In contrast, simple unidirectional nets are more like an address book: you can look up an address from someone's last name, but you can't get names from addresses.

35-36 Constraint Satisfaction
Spontaneous generalization: the network can extract regularities from its base of knowledge. What are dogs like? What are medium dogs like?
1. Attractor Dynamics. 2. Energy/harmony. 3. Noise.
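The content-addressable-memory point can be illustrated with a small sketch. This uses a classic Hopfield-style associative memory as a stand-in for the bidirectional cats-and-dogs network; the stored patterns, their size, and the amount of the cue that is blanked out are random placeholders rather than the actual cats-and-dogs features.

```python
import numpy as np

# Store two binary "memories" with a Hebbian outer-product rule, then
# recall one from a partial cue: settling into the attractor typically
# pulls the corrupted state back to the stored pattern.
rng = np.random.default_rng(2)
n = 32
patterns = rng.choice([-1.0, 1.0], size=(2, n))

W = sum(np.outer(p, p) for p in patterns) / n
np.fill_diagonal(W, 0.0)

# Cue: pattern 0 with a third of its units zeroed out ("part of a memory").
cue = patterns[0].copy()
cue[: n // 3] = 0.0

a = cue.copy()
for sweep in range(5):                      # settle into the attractor
    for i in rng.permutation(n):
        a[i] = 1.0 if W[i] @ a >= 0 else -1.0

overlap = (a == patterns[0]).mean()
print(f"overlap with stored pattern after settling: {overlap:.2f}")
```

A feedforward ("address book") mapping trained only from names to addresses could not be run in the reverse direction like this; the bidirectional weights are what let any fragment retrieve the rest.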

37 The Necker Cube
Two different interpretations; we can't perceive both at once; we alternate between the two perceptions: bistability.
(Figure: panels a, b, c showing the ambiguous cube and its two interpretations.)

38 The Role of Noise
How might noise be useful in your brain?
(Figure: energy landscape with a local minimum, which is a local maximum for harmony, and the global minimum.)

39 [necker_cube.proj]

40 Networks
Role of noise; accommodation.
1. Biology: The cortex
2. Excitation: Unidirectional (transformations); Bidirectional (top-down processing, pattern completion, amplification)
3. Inhibition: Controls bidirectional excitation (feedforward, feedback, set point, kwta approximation)
4. Constraint Satisfaction: Putting it all together.
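A small sketch of slide 38's point that noise can knock the network out of a local minimum. The one-dimensional "energy" function, its parameters, and the settle helper are all made up for illustration; they stand in for harmony with the sign flipped (low energy = high harmony), not for the actual Necker-cube network.

```python
import numpy as np

# Made-up 1-D energy landscape: a shallow local minimum near x = -1.59,
# a deeper global minimum near x = +1.85, and a barrier near x = -0.26.
def energy(x):
    return 0.05 * x**4 - 0.3 * x**2 - 0.15 * x

def grad(x):
    return 0.2 * x**3 - 0.6 * x - 0.15

def settle(noise_sd, steps=3000, lr=0.05, n_runs=200, seed=5):
    """Noisy gradient descent on the energy, starting in the shallow basin.
    Returns the fraction of runs that end up past the barrier, i.e. in
    the deeper (global) basin."""
    rng = np.random.default_rng(seed)
    x = np.full(n_runs, -1.6)                 # start at the local minimum
    for _ in range(steps):
        x -= lr * grad(x) + rng.normal(0.0, noise_sd, size=n_runs)
    return np.mean(x > -0.26)

# Without noise every run stays stuck in the local minimum;
# with noise many runs hop the barrier and reach the global minimum.
print(f"fraction reaching global minimum, no noise:   {settle(0.0):.2f}")
print(f"fraction reaching global minimum, with noise: {settle(0.2):.2f}")
```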