Computing with Spikes in Recurrent Neural Networks

Computing with Spikes in Recurrent Neural Networks. Dezhe Jin, Department of Physics, The Pennsylvania State University. Presented at ICS Seminar Course, Penn State, Jan 9, 2006.

Outline. Introduction. Neurons, neural networks, and neural computations with dynamical attractors. Spike sequence attractors: they exist for a large class of neural networks, converge fast, and have rich structures. Summary.

Introduction

Brain & local neural networks. Human brain: ~10^11 neurons. Hierarchical, modular, interacting structures: cortical areas, built from local neural networks.

Neuron: membrane potential & spikes. A neuron is like a battery: a charged, leaky capacitor with dendrites, a cell body, and an axon. The membrane carries a leak conductance and voltage-dependent conductances; excitatory neurons drive an excitatory conductance and inhibitory neurons an inhibitory conductance. The membrane potential is V ~ -70 mV relative to the outside (0 mV). When an input drives V to threshold, the neuron emits a spike (width ~1 msec), the potential is reset, and the spike is transmitted to other neurons.

Local networks: lateral excitation & global inhibition. Composition: excitatory neurons (~80%, output to other networks) and inhibitory neurons (~20%, no output to other networks). Coupling between the excitatory neurons: lateral excitation, plus global inhibition via the inhibitory neurons (inter-neurons). The network receives inputs from lower-area neurons and sends outputs to other local networks.

Computing with dynamical attractors. [Figure: spike rasters and membrane potentials of neurons 1-4 in a local neural network; different inputs drive the network into different dynamical attractors, e.g. a "tiger" attractor versus a "cow" attractor.]

Characterizing the attractors. Encoding capability: is the convergence fast? Is the number of attractors large enough to encode a large number of external input patterns? Spatial or spatiotemporal? Spatial: only spiking rates are important (Hopfield, PNAS, 1984).

Spatiotemporal patterns of spikes. Neurons of the local networks in the locust antennal lobe responding to odor presentation. [Figure: membrane potentials of neurons 1 and 2 in two trials during presentation of the odor; scale bars 200 msec and 40 mV. Stopfer & Laurent (Nature, 1999).]

Spatiotemporal spike attractors. For a large class of neural networks, spatiotemporal spike patterns with precise timings are the dynamical attractors: fast convergence with a few transient spikes, and rich spatiotemporal structures. Simplifications: simple models of the neurons and of the coupling between them; no inter-neurons, allowing direct excitation and inhibition between neurons; no noise, spike transmission delay, ... Roadmap: a special case (winner-take-all computation), then the general case.

Winner-take-all computation

The structure of the network. Inhibitory connections (global inhibition), excitatory connections (self-excitation), and external inputs; no inhibitory inter-neurons. Identical neurons, identical excitatory connection strength, and identical inhibitory connection strength. External inputs constant in time but varying spatially.

Neuron model: leaky integrate-and-fire neuron. The membrane potential V obeys τ dV/dt = E_R − V + I, where τ is the leak time constant, E_R the resting membrane potential, and I the external input. If the membrane potential reaches the spike threshold V_th (< 0 mV), the neuron sends out a spike (the spike shape itself is not modeled) and the membrane potential is reset to V_r < V_th.
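The leaky integrate-and-fire dynamics above can be checked with a short simulation. This is a minimal sketch, not code from the talk; the parameter values (τ = 40 ms, E_R = V_r = −70 mV, V_th = −54 mV, I = 20 mV) are illustrative assumptions.

```python
import numpy as np

# Minimal leaky integrate-and-fire neuron (illustrative parameters):
#   tau * dV/dt = E_R - V + I, spike when V >= V_th, then reset to V_r.
def simulate_lif(I, tau=40.0, E_R=-70.0, V_th=-54.0, V_r=-70.0,
                 dt=0.01, t_max=500.0):
    """Forward-Euler integration; returns the spike times in ms."""
    V, t, spikes = V_r, 0.0, []
    while t < t_max:
        V += dt / tau * (E_R - V + I)  # relax toward E_R + I
        if V >= V_th:                  # threshold crossing -> spike
            spikes.append(t)
            V = V_r                    # reset
        t += dt
    return spikes

# For constant suprathreshold input the interspike interval has a
# closed form: T = tau * log(1 + (V_th - V_r)/(I - I_th)), I_th = V_th - E_R.
I = 20.0
I_th = -54.0 - (-70.0)                      # threshold current, 16 mV
T_theory = 40.0 * np.log(1.0 + 16.0 / (I - I_th))
spikes = simulate_lif(I)
T_sim = spikes[1] - spikes[0]               # simulated interspike interval
```

With I = 20 the simulated interval agrees with the closed form (40 log 5 ≈ 64.4 ms) to within the Euler discretization error.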

δ-pulse coupling. τ dV/dt = E_R − V + I − G_E δ(t − t'_spike) V + G_I δ(t − t_spike)(E_I − V). Here G_E is the strength of the excitatory connection (excitatory reversal potential 0 mV), G_I is the strength of the inhibitory connection, E_I is the inhibitory reversal potential (−75 mV), and t'_spike and t_spike are the times of spike reception. Each received spike delivers a δ-pulse of conductance.

The winner-take-all attractor. [Figure: membrane potentials over time; the neuron with the maximum external input spikes periodically, while all other neurons emit no spikes.] The attractor: only the neuron with the maximum input spikes, and it spikes periodically.

Fast winner-take-all computation. Computation: maximum input selection, i.e. peak detection in the external inputs. Fast convergence: the computation is done as soon as the neuron with the maximum input spikes once; very few transient spikes are needed. (Simulation.) Jin & Seung (PRE, 2002).
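The winner-take-all behavior can be sketched numerically. The following is an illustrative toy, not the talk's simulation: all parameter values are made up, and each δ-pulse is implemented as an instantaneous jump of V a fraction G of the way toward the corresponding reversal potential.

```python
import numpy as np

# Toy winner-take-all network: N leaky integrate-and-fire neurons with
# self-excitation (reversal 0 mV) and all-to-all inhibition (reversal E_I).
# All parameter values are illustrative assumptions.
tau, E_R, V_th, V_r, E_I = 40.0, -70.0, -54.0, -70.0, -75.0
G_E, G_I = 0.2, 0.8                  # strong global inhibition
dt, t_max = 0.01, 500.0

rng = np.random.default_rng(0)
N = 10
I = rng.uniform(16.5, 18.5, N)       # suprathreshold inputs (I_th = 16 mV)
I[3] = 20.0                          # neuron 3 receives the maximum input

V = np.full(N, V_r)
spike_count = np.zeros(N, dtype=int)
t = 0.0
while t < t_max:
    V += dt / tau * (E_R - V + I)    # leaky integration between spikes
    for i in np.flatnonzero(V >= V_th):
        spike_count[i] += 1
        V[i] = V_r
        V[i] += G_E * (0.0 - V[i])   # self-excitation pulse after reset
        others = np.arange(N) != i
        V[others] += G_I * (E_I - V[others])  # global inhibition pulse
    t += dt
```

After the first spike of neuron 3 (the maximum input), every inhibitory pulse knocks the other neurons far below threshold before they can fire, so only the winner spikes, and it spikes periodically.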

Intuitive picture. Two-stage dynamics: between spikes and at a spike. Between spikes: a race to spike. At a spike: the membrane potentials jump. With strong inhibition, spikes from the winner suppress spiking of all other neurons.

A mapping technique

The Γ-mapping. The spike time of neuron j in the absence of interactions, computed just after the nth spike of the network (fired by neuron k(n)), is T_{j,k(n)} = τ log[1 + (V_th − V_{j,k(n)})/(I_j − I_th)] ≡ τ log(Γ_{j,k(n)}), where I_th is the threshold current and Γ_{j,k(n)} is a pseudo-spike time. The neuron firing the next spike satisfies Γ_{k(n+1),k(n)} = min_{j=1...N} Γ_{j,k(n)}, and the pseudo-spike times relative to the next spike update as Γ_{j,k(n+1)} = ψ_j + ε_j Γ_{j,k(n)} / Γ_{k(n+1),k(n)}, where ψ_j and ε_j are constants depending on the external inputs and the connection strengths.
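As a quick numerical reading of the mapping, one can compute Γ for a few neurons and pick the next spiker. The membrane potentials and inputs below are made-up illustrative values, not numbers from the talk.

```python
import numpy as np

# Who spikes next? Between spikes, neuron j's time-to-threshold is
# T_j = tau * log(Gamma_j) with Gamma_j = 1 + (V_th - V_j)/(I_j - I_th).
# Illustrative values (not from the talk):
tau, V_th, E_R = 40.0, -54.0, -70.0
I_th = V_th - E_R                      # threshold current, 16 mV
V = np.array([-70.0, -65.0, -60.0])    # current membrane potentials (mV)
I = np.array([20.0, 18.0, 17.0])       # external inputs, all above I_th

Gamma = 1.0 + (V_th - V) / (I - I_th)  # pseudo-spike times
T = tau * np.log(Gamma)                # actual times-to-threshold (ms)
winner = int(np.argmin(Gamma))         # smallest Gamma spikes first
```

Here Gamma = [5.0, 6.5, 7.0], so neuron 0 (the largest input, despite starting at the lowest potential) fires the next spike.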

Condition for winner-take-all: I_i − I_th > η(G_E, G_I)(I_j − I_th) for all j ≠ i, equivalently Γ_{i,k(n)=i} < Γ_{j,k(n)=i} for all j ≠ i. After neuron i spikes once, no other neuron can spike. Maximum input selection corresponds to η(G_E, G_I) = 1.

Spatiotemporal spike attractors

A class of neural networks. Network structure: strong global inhibition; an arbitrary number of spiking neurons; arbitrary connectivity of excitatory and inhibitory connections; arbitrary patterns of the external inputs; heterogeneity in neuron properties. Simplifications: no inter-neurons; leaky integrate-and-fire neuron model; δ-pulse synaptic coupling; no noise, no spike transmission delay; external inputs constant in time but distributed spatially.

Spike sequence attractors. [Figure: an example spike sequence of neuron IDs converging onto a periodic sequence.] All spike sequences flow into spike sequence attractors. The timings of the spikes in the attractor are precise. The convergence is fast when the inhibition is strong. (Simulation.) Jin (PRL, 2002).

Description of the dynamics. In between spikes: a race to spike. When one neuron spikes, all membrane potentials jump discontinuously.

The Γ-mapping. The neuron ID of the next spike satisfies Γ_{k(n+1),k(n)} = min_{j=1...N} Γ_{j,k(n)}, where k(n) is the neuron ID of the nth spike of the network, and the pseudo-spike times relative to the next spike update as Γ_{j,k(n+1)} = ψ_{j,k(n+1)} + ε_{j,k(n+1)} Γ_{j,k(n)} / Γ_{k(n+1),k(n)}, where ψ and ε are constants depending on the external inputs and the connection strengths.

Stability of the mapping: exponential damping of small perturbations. [Figure: perturbed versus unperturbed Γ values of the neurons over the 1st, 2nd, and 3rd spikes.] Define Δ_n ≡ max_{l=1...N} |Γ_{l,k(n)} − Γ'_{l,k(n)}|. Then Δ_n < λ^n 2D Δ_1, where λ < 1 and λ ~ exp(−minimum connection strength).
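The contraction can be illustrated by iterating the Γ-mapping with made-up constants. The ψ_j and ε_j below are hypothetical stand-ins (in the talk they depend on the external inputs and connection strengths; strong inhibition corresponds to small ε), so this is only a sketch of the damping mechanism, not the talk's calculation.

```python
import numpy as np

# Toy Gamma-mapping: Gamma_j <- psi_j + eps_j * Gamma_j / min_k(Gamma_k).
# psi and eps are hypothetical constants; eps is kept small so that the
# mapping is contracting.
rng = np.random.default_rng(1)
N = 5
psi = rng.uniform(1.5, 3.0, N)
eps = rng.uniform(0.05, 0.2, N)

def step(gamma):
    return psi + eps * gamma / gamma.min()

g = rng.uniform(2.0, 4.0, N)            # reference trajectory
g_pert = g + rng.normal(0.0, 0.01, N)   # small initial perturbation
gaps = []
for _ in range(30):
    g, g_pert = step(g), step(g_pert)
    gaps.append(np.abs(g - g_pert).max())  # analogue of Delta_n
```

The recorded gaps shrink roughly geometrically, mirroring the bound Δ_n < λ^n 2D Δ_1.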

Trapping of spike sequences. Consider two spike sequences S1 = (..., i_1, i_2, ..., i_P, i_{P+1}, ...) and S2 = (..., j_1, j_2, ..., j_P, j_{P+1}, ...) with i_n = j_n for n = 1, ..., P. There exists a finite P* such that if P > P*, then i_n = j_n for all n > P. Moreover, the spike-timing difference decreases exponentially with P. Here P* ~ 1/|log λ|.

Spike sequence attractors. All spike sequences will be trapped in periodic patterns (spike sequence attractors): in an infinite sequence generated by a finite number of neurons, subsequences of any finite length must reappear, and by the trapping result the sequence then repeats forever. For N = 2 and P* = 4: S = (1,1,1,1,2,2,1,1, 2,1,2,2,1,2,2,2, 2,1,2,2,1,2,2,2, ...), where the repeating block (2,1,2,2,1,2,2,2) is the spike sequence attractor.
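The eventual periodicity can be detected mechanically from a recorded spike sequence. This is a sketch: find_attractor and its window argument are illustrative names, and the window here is set to the block length (any value ≥ P* works, by the trapping result).

```python
def find_attractor(seq, window):
    """Locate the periodic tail of an eventually periodic spike sequence.

    Once `window` (>= P*) consecutive neuron IDs repeat, the deterministic
    dynamics must repeat forever after. Returns (first_occurrence, period),
    or None if no window of that length recurs within `seq`.
    """
    seen = {}
    for n in range(len(seq) - window + 1):
        key = tuple(seq[n:n + window])
        if key in seen:
            return seen[key], n - seen[key]
        seen[key] = n
    return None

# The N = 2 slide example: a transient followed by the repeating
# block (2,1,2,2,1,2,2,2).
S = [1, 1, 1, 1, 2, 2, 1, 1] + [2, 1, 2, 2, 1, 2, 2, 2] * 4
```

Here find_attractor(S, 8) reports that the cycle begins at spike 8 with period 8.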

An example: N = 1000; inhibition strength between 0.4 and 0.6; excitation strength between 0 and 0.05; τ = 40 msec; random inputs.

Fast convergence: statistics. [Figure: histogram of the number of transient spikes, and number of transient spikes versus length of the attractor sequence.] Simulation: 2000 runs; for each run the connections and the external inputs are randomly set, with the maximum of the external inputs and the range of the connection strengths held fixed. Results: the number of transient spikes follows a Poisson distribution, and there is no relationship between the length of the spike sequence attractor and the number of transient spikes.

Rich structures: statistics. [Figure: number of spike sequence attractors and of spatial pattern attractors versus the number of neurons N.] Simulation: averaged over 20 random networks; 10N sets of randomly selected inputs with fixed maximum for each network; 10 random initial conditions for each network and each set of inputs. Results: the number of spike sequence attractors grows exponentially with the network size, with on average one attractor per set of external inputs.

Summary Spike sequence attractors are the dynamical attractors for a large class of neural networks. These attractors have two favorable characteristics for neural computation: fast convergence and rich structures.