Efficient Emulation of Large-Scale Neuronal Networks


BENG/BGGN 260 Neurodynamics
University of California, San Diego
Week 6

Reading Material

W. Gerstner and W. Kistler, Spiking Neuron Models: Single Neurons, Populations, Plasticity, Cambridge Univ. Press, 2002, Ch. 4, pp. 93-145.
C. Koch, Biophysics of Computation, Oxford Univ. Press, 1999, Ch. 14, pp. 330-349.
P. Dayan and L. Abbott, Theoretical Neuroscience, MIT Press, 2001, Ch. 7.2, pp. 231-240.
M. Rudolph and A. Destexhe, Analytical Integrate-and-Fire Neuron Models with Conductance-Based Dynamics for Event-Driven Simulation Strategies, Neural Computation, vol. 18, pp. 2146-2210, 2006.
E. Ros, R. Carrillo, E. Ortigosa, B. Barbour, and R. Agis, Event-Driven Simulation Scheme for Spiking Neural Networks Using Lookup Tables to Characterize Neuronal Dynamics, Neural Computation, vol. 18, pp. 2959-2993, 2006.

Efficient Event-Driven Emulation of Spiking Networks

[Block diagram: input spikes, spike heap, neuron list, interconnection list, network definition, characterization tables (Vm, Ge, Gi, Tf), simulation engine, output spikes]

Figure 1: Main structures of the ED-LUT simulator. Input spikes are stored in an input queue and are sequentially inserted into the spike heap. The network definition process produces a neuron list and an interconnection list, which are consulted by the simulation engine. Event processing is done by accessing the neuron characterization tables to retrieve updated neuronal states and to forecast spike firing times. (Ros et al., 2006)
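The structures in Figure 1 map naturally onto ordinary containers. A minimal Python sketch of one possible layout follows; the type names and field choices (SpikeEvent, Connection, NeuronState) are illustrative assumptions, not the ED-LUT code:

```python
import heapq
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass(order=True)
class SpikeEvent:
    time: float                                              # event time; the heap is ordered by this
    neuron_id: int = field(compare=False)
    is_firing: bool = field(compare=False, default=False)    # firing event vs. propagated event

@dataclass
class Connection:
    target_id: int
    delay: float          # axonal/synaptic delay (ms)
    weight: float         # conductance increment (G_exc,i or G_inh,i)
    excitatory: bool

@dataclass
class NeuronState:
    v_m: float            # membrane potential
    g_exc: float          # excitatory synaptic conductance
    g_inh: float          # inhibitory synaptic conductance
    t_label: float        # time of last state update ("time label")

# Network definition: neuron list and interconnection list.
neurons: Dict[int, NeuronState] = {0: NeuronState(-70.0, 0.0, 0.0, 0.0)}
connections: Dict[int, List[Connection]] = {0: [Connection(1, 2.0, 0.01, True)]}

# Spike heap: a priority queue of pending events ordered by time.
spike_heap: List[SpikeEvent] = []
heapq.heappush(spike_heap, SpikeEvent(time=1.0, neuron_id=0))
next_event = heapq.heappop(spike_heap)
```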

Efficient Event-Driven Emulation of Spiking Networks

Figure 2: Simulation algorithm. This pseudocode describes the simulation engine; it processes all the events in the spike heap in chronological order.

While t_sim < t_end {
    Extract the event with the shortest latency from the spike heap.
    If it is a firing event:
        If it is still a valid event and the neuron is not in its refractory period:
            Update the neuron state (Vm, g_exc, g_inh) to the post-firing state.
            Prevent this neuron from firing during the refractory period
            (once this is done, update the neuron time label to t_sim + t_refrac).
            Predict whether the source neuron will fire again with the current neuron state.
            If the neuron will fire: insert a new firing event into the spike heap.
            Insert the propagated event with the shortest latency
            (looking at the output connection list).
    If it is a propagated event:
        Update the target neuron state (Vm, g_exc, g_inh), looking at the
        characterization tables, before the event is computed.
        Modify the conductances (g_exc, g_inh) using the connection weight
        (G_exc,i, G_inh,i) for the new spike.
        Update the neuron time label to t_sim.
        Predict whether the target neuron will fire.
        If it fires: insert the firing event into the spike heap with the predicted time.
        Insert only the next propagated event with the next shortest latency
        (looking at the output connection delay table).
}

Figure 3: Equivalent electrical circuit of the model neuron (C_m, V_m; g_exc, E_exc; g_inh, E_inh; g_rest, E_rest). g_exc and g_inh are the excitatory and inhibitory synaptic conductances, while g_rest is the resting conductance, which returns the membrane potential to its resting state (E_rest) in the absence of input stimuli.

(Ros et al., 2006)
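Read as a loop, Figure 2 is a standard priority-queue simulation. The sketch below is a minimal, self-contained Python version under simplifying assumptions: single-exponential conductances, no refractoriness or event-invalidation checks, all propagated events pushed eagerly rather than one delay at a time, and a placeholder in place of the characterization-table lookup.

```python
import heapq
import itertools
import math

TAU_EXC, TAU_INH = 5.0, 10.0          # synaptic time constants (ms), assumed values

def advance(state, t):
    """Decay the conductances from the neuron's time label to t; a real simulator
    would also update Vm here via the characterization tables."""
    dt = t - state["t_label"]
    state["g_exc"] *= math.exp(-dt / TAU_EXC)
    state["g_inh"] *= math.exp(-dt / TAU_INH)
    state["t_label"] = t

def predict_firing(state):
    """Placeholder for the table lookup that forecasts the time-to-firing from
    (Vm, g_exc, g_inh); None means 'not predicted to fire'."""
    return None

def simulate(t_end, input_spikes, connections, neurons):
    """input_spikes: (time, target_id, weight, excitatory) tuples.
    connections: source_id -> list of (target_id, delay, weight, excitatory).
    neurons: neuron_id -> dict with keys v, g_exc, g_inh, t_label."""
    order = itertools.count()          # tie-breaker so equal-time events never compare payloads
    heap = [(t, next(order), ("propagated", tgt, w, exc))
            for (t, tgt, w, exc) in input_spikes]
    heapq.heapify(heap)
    output_spikes = []

    while heap and heap[0][0] < t_end:
        t, _, event = heapq.heappop(heap)
        if event[0] == "firing":
            _, src = event
            output_spikes.append((t, src))
            for (tgt, delay, w, exc) in connections.get(src, []):
                heapq.heappush(heap, (t + delay, next(order),
                                      ("propagated", tgt, w, exc)))
        else:                                        # propagated event: spike reaches a synapse
            _, tgt, w, exc = event
            state = neurons[tgt]
            advance(state, t)                        # bring the target state up to time t
            state["g_exc" if exc else "g_inh"] += w  # Eq. (2): one-time conductance increment
            t_fire = predict_firing(state)           # table lookup (see slide 6)
            if t_fire is not None:
                heapq.heappush(heap, (t_fire, next(order), ("firing", tgt)))
    return output_spikes
```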

Exponential Postsynaptic Conductance Model

$$
g_{\mathrm{exc}}(t) = \begin{cases} 0, & t < t_0 \\ G_{\mathrm{exc}}\, e^{-(t - t_0)/\tau_{\mathrm{exc}}}, & t \ge t_0 \end{cases}
\qquad
g_{\mathrm{inh}}(t) = \begin{cases} 0, & t < t_0 \\ G_{\mathrm{inh}}\, e^{-(t - t_0)/\tau_{\mathrm{inh}}}, & t \ge t_0 \end{cases}
\tag{1}
$$

Figure 4: A postsynaptic neuron receives two consecutive input spikes. The evolution of the synaptic conductance is shown in the middle plot, and the two excitatory postsynaptic potentials (EPSPs) caused by the two input spikes are shown in the bottom plot. In the solid-line plots, the synaptic conductance transient is represented by a double-exponential expression (one exponential for the rising phase, one for the decay phase); in the dashed-line plot, it is approximated by a single-exponential expression. The EPSPs produced with the two conductance waveforms are almost identical: a single decaying exponential model of the synaptic conductance is adequate for accurate emulation of the postsynaptic membrane dynamics. (Ros et al., 2006)
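To see why the single-exponential approximation suffices, one can drive a passive membrane with both conductance waveforms and compare the resulting EPSPs. The sketch below does this with assumed parameter values and matches the two waveforms by total conductance charge; how the approximation is matched in Ros et al. may differ.

```python
import numpy as np

# Passive membrane driven by a double-exponential vs. a single-exponential
# excitatory conductance (Eq. 1). All parameter values are assumptions.
dt = 0.01                                   # ms
t = np.arange(0.0, 60.0, dt)
tau_rise, tau_decay = 0.5, 5.0              # ms
E_rest, E_exc = -70.0, 0.0                  # mV
C_m, g_rest = 1.0, 0.05                     # nF, uS  (membrane time constant = 20 ms)

g_single = 0.01 * np.exp(-t / tau_decay)                     # Eq. (1): decay only
g_double = np.exp(-t / tau_decay) - np.exp(-t / tau_rise)    # rise + decay
g_double *= g_single.sum() / g_double.sum()                  # match total conductance charge

def epsp(g):
    """Forward-Euler integration of C_m dV/dt = g_rest (E_rest - V) + g (E_exc - V)."""
    v = np.full_like(t, E_rest)
    for i in range(1, len(t)):
        dv = (g_rest * (E_rest - v[i - 1]) + g[i - 1] * (E_exc - v[i - 1])) / C_m
        v[i] = v[i - 1] + dt * dv
    return v

v_single, v_double = epsp(g_single), epsp(g_double)
print("peak EPSP, single exp: %.3f mV" % (v_single.max() - E_rest))
print("peak EPSP, double exp: %.3f mV" % (v_double.max() - E_rest))
```

Because the membrane time constant is much longer than the synaptic rise time, the membrane low-pass filters the conductance waveform and the missing rise phase has little effect on the EPSP.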

Event-Driven, Table-Based Estimation of Postsynaptic Action Potential Timing

For single-exponential conductance dynamics, the arrival of a new excitatory presynaptic spike event at time t amounts to a one-time increment of the conductance,

$$
g_{\mathrm{exc}}(t) = G_{\mathrm{exc},j} + e^{-(t - t_{\mathrm{previous\,spike}})/\tau_{\mathrm{exc}}}\, g_{\mathrm{exc}}(t_{\mathrm{previous\,spike}})
\tag{2}
$$

and similarly for an inhibitory event. The postsynaptic time-to-firing (in the absence of other events) is then tabulated for different values of the initial excitatory and inhibitory conductances by integrating

$$
C_m \frac{dV_m}{dt} = g_{\mathrm{exc}}(t_0)\, e^{-(t - t_0)/\tau_{\mathrm{exc}}}\,(E_{\mathrm{exc}} - V_m) + g_{\mathrm{inh}}(t_0)\, e^{-(t - t_0)/\tau_{\mathrm{inh}}}\,(E_{\mathrm{inh}} - V_m) + G_{\mathrm{rest}}\,(E_{\mathrm{rest}} - V_m)
\tag{3}
$$

If another event arrives earlier, the conductance values are simply updated again according to (2). (Ros et al., 2006)
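A rough Python sketch of that tabulation step: Eq. (3) is integrated offline over a grid of initial states (Vm, g_exc, g_inh), and the stored time-to-threshold is looked up at run time. The grid sizes, parameter values, threshold, and nearest-neighbor lookup are illustrative assumptions, not the ED-LUT implementation (real tables would be finer and interpolated).

```python
import numpy as np

C_M, G_REST = 1.0, 0.05                     # nF, uS
E_REST, E_EXC, E_INH = -70.0, 0.0, -80.0    # mV
TAU_EXC, TAU_INH = 5.0, 10.0                # ms
V_THRESH = -50.0                            # mV
DT, T_MAX = 0.1, 50.0                       # ms

def time_to_firing(v0, ge0, gi0):
    """Forward-Euler integration of Eq. (3); returns the threshold-crossing time
    relative to the last event, or np.inf if the neuron never fires within T_MAX."""
    v = v0
    for step in range(int(T_MAX / DT)):
        t = step * DT
        ge = ge0 * np.exp(-t / TAU_EXC)
        gi = gi0 * np.exp(-t / TAU_INH)
        dv = (ge * (E_EXC - v) + gi * (E_INH - v) + G_REST * (E_REST - v)) / C_M
        v += DT * dv
        if v >= V_THRESH:
            return (step + 1) * DT
    return np.inf

# Offline: characterization table over a grid of initial states.
v_grid = np.linspace(-70.0, -50.0, 11)
ge_grid = np.linspace(0.0, 1.0, 21)
gi_grid = np.linspace(0.0, 1.0, 21)
table = np.array([[[time_to_firing(v, ge, gi) for gi in gi_grid]
                   for ge in ge_grid]
                  for v in v_grid])

def predict_firing_time(t_now, v, g_exc, g_inh):
    """Run-time lookup: nearest grid point stands in for the interpolated table access."""
    iv = int(np.abs(v_grid - v).argmin())
    ie = int(np.abs(ge_grid - g_exc).argmin())
    ii = int(np.abs(gi_grid - g_inh).argmin())
    dt_fire = table[iv, ie, ii]
    return None if np.isinf(dt_fire) else t_now + dt_fire
```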

Accuracy of Event-Driven, Table-Based Emulation

Figure 9: Single-neuron simulation. Excitatory and inhibitory spikes are indicated in the upper plots, and the excitatory and inhibitory conductance transients are plotted in the middle plots. The bottom plot compares the neuron model simulated by iterative numerical calculation (continuous trace) with the event-driven scheme, in which the membrane potential is updated only when an input spike is received (marked with an x). (Ros et al., 2006)

Notes on the Event-Driven Approach

The table-based, event-driven approach extends to multiple excitatory and inhibitory synapse types with different synaptic strengths, time constants, and reversal potentials. Each additional distinct time constant requires an additional variable dimension in the lookup table, whereas additional synapse types whose time constant matches an existing one add no dimensions, regardless of their synaptic strength or reversal potential. Because the table size grows exponentially with the number of variable dimensions, the approach does not scale to large numbers of different time constants in the synaptic dynamics (see the back-of-the-envelope sketch below). In the special case of a single synaptic time constant, a simple and exact closed-form expression for the postsynaptic time-to-firing replaces the table lookup, yielding an efficient implementation [Rudolph and Destexhe, 2006].
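A quick calculation makes the scaling concrete; the 64-points-per-dimension resolution and 4-byte entries are arbitrary assumptions chosen only for illustration.

```python
# Table size versus number of variable dimensions (e.g. Vm plus one dimension
# per distinct synaptic time constant). Resolution and entry size are assumed.
points_per_dim = 64
bytes_per_entry = 4
for n_dims in range(2, 7):
    entries = points_per_dim ** n_dims
    print(f"{n_dims} dimensions: {entries:,} entries "
          f"({entries * bytes_per_entry / 2**20:.2f} MiB)")
```

Each added dimension multiplies the table size by the grid resolution, which is why only a handful of distinct synaptic time constants can be accommodated before the tables become impractically large.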