CAS Seminar - Spiking Neurons Network (SNN) Jakob Kemi (820622-0033) kemiolof@student.chalmers.se November 20, 2006

Introduction

Biological background

To be written, lots of good sources.

Background

First generation of neural networks

Introduced in 1943 by McCulloch and Pitts, this is a binary model in which the only states for a neuron are firing and not firing, with firing controlled by a threshold. In this model neurons can be connected in multiple layers and with varying configurations of input neurons etc., but every neuron acts simultaneously in a synchronized fashion. The model is good for many types of computation but is not closely related to actual biological behavior, due to the binary and synchronized simplifications. Hopfield model etc.

Second generation

This model encodes additional information in neuron outputs by allowing "analog" outputs from neurons, for example by using a sigmoid function (as described in [1]). The sigmoid output can, for example, represent the firing rate of biological neurons. This model is perhaps the most common; it can compute all sorts of analog functions and is studied in detail in, for example, [1].

Third generation

Studies of real biological neurons show that the firing rate of individual neurons hardly exceeds one hundred spikes per second. However, the reaction time of certain networks (for some visual tasks, e.g.) is far lower than the length of the combined neuron chains would indicate. This, among other things, led to the assumption that timing also plays a part in biological networks, which gave rise to a third type of network: the (Asynchronous) Spiking Neural Network, (A)SNN. It incorporates timing by letting individual neurons act in an asynchronous fashion, where the behavior of each neuron is determined by the precise time, and sometimes also the duration, of the presynaptic spikes. This can sometimes be combined with local oscillations, modeling the fact that signals can also arrive via the outer membrane.

The model almost always also utilizes some sort of decay of the inner energy of the individual neurons. This was modeled early on but didn't catch on, mostly because the calculations are costly on traditional sequential computers. One of the reasons this model has gained recent popularity is not only its closer resemblance to biology but also the fact that it is inherently parallel in nature. This makes it well suited to distributed implementations, using computer clusters or custom parallel hardware, which is gaining interest thanks to the ease and availability of modern FPGA chips etc. It has also been shown that SNNs are computationally stronger than the standard sigmoid model, and that any function expressible with sigmoid networks can be expressed with an SNN instead. [[2], more references needed for this section]
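To make the distinction between the first two generations concrete, the two unit types can be sketched as follows. This is a toy illustration; the function names and parameter values are mine, not taken from [1]:

```python
import math

def mcculloch_pitts(inputs, weights, threshold):
    """First-generation unit: output is binary, firing (1) exactly when
    the weighted input sum reaches the threshold."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

def sigmoid_unit(inputs, weights, bias=0.0):
    """Second-generation unit: an analog output in (0, 1), which can be
    read as something like a firing rate."""
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-s))

# A McCulloch-Pitts unit computing logical AND of two binary inputs:
print(mcculloch_pitts([1, 1], [1, 1], threshold=2))  # 1
print(mcculloch_pitts([1, 0], [1, 1], threshold=2))  # 0
```

Note that the binary unit loses all information about input magnitude beyond the threshold decision, whereas the sigmoid unit preserves it in graded form; neither carries any notion of spike timing.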

The artificial Spiking Neuron model

There exist some variations, but I choose to explain one that incorporates most of the elements common to the variants, as used e.g. in [4]. Each spiking neuron (SN) contains an inner variable describing its inner energy potential, u. This inner potential strives to reach a rest state (u = 0), which means that it decays with time if positive, and vice versa. The inputs of the SN consist of multiple synapses connected to the outputs of other SNs. Each synapse has two parameters: a transmission delay, d, and a weight, w, which can be positive or negative. The (positive) delay denotes the time it takes for the signal to travel through the synapse, and the weight determines the strength of the signal. A negative weight causes an inhibitory effect, whilst a positive weight causes an excitatory effect. Sometimes a third parameter, τ, is assigned to each synapse, denoting the duration and steepness of the spike. Every arriving signal is essentially multiplied by the synapse weight and added to the inner energy potential. Once this potential exceeds a certain threshold, the SN fires a spike in order to decrease its energy potential. The commonly used shape of the output spike is a sharp peak with a uniform negative response in the potential energy. Sometimes the output signal is allowed to "borrow" energy, leaving a negative recoil in the potential afterward.

[nice diagram over output signal and potential energy]
[include kernel functions etc and mathematical formulas]

Applications

Associative memory

Perhaps talk about findings from [8]

Visual pattern recognition

One rather interesting application of SNNs can be found in [9], where a network is used for visual face recognition. The motivation for the approach outlined below is the fact that studies on monkeys show that complex stimuli such as faces lead to selective responses at latencies as low as 80-100 ms. Considering that during this time information has to be processed by the retina and by at least four cortical areas, with at least two synaptic stages occurring in each such area, there is time for little more than one spike at each neuron in the signal path. The question is therefore whether it is possible to construct an artificial SNN that can process visual face input using only one spike per neuron.

The model used by [9] is as follows. The network is constructed out of four different layers, each performing a specific part of the recognition process. Each layer consists of multiple SNs, each allowed to spike only once. The delay from each neuron is then used as input to the next layer: a shorter delay indicates a stronger match, and a longer delay a weaker match. The layers are constructed as follows.

Layer one handles contrast recognition. It contains twice as many nodes as there are input pixels, two nodes for each pixel: one node signals if the pixel is ON focus (that is, surrounding pixels differ in contrast) and the other signals if it is OFF focus. In the second layer the input from the contrast layer is used to determine local orientation; eight symmetrically spread orientations are detected (45 degrees of separation). The outputs are then fed into the third layer, which detects facial features (eyes and mouth). The last layer is finally used for the actual face pattern matching.

[insert relevant, informative pictures and results. also describe training method]

Classification

...

Training of SNNs

Error-Backpropagation

discuss [6] and [7]... SpikeProp ([6]) only allows static synapse delays and consists of exactly three layers, namely I (input), H (hidden) and O (output). [picture?] The function that the network computes is formulated by passing in input spikes with different delays from some reference time, t_0. The network then allows a maximum of one spike per neuron and presents the output as zero or one spike on each of the output neurons, with varying time delays. Two different backpropagating synapse-weight modifiers are then used: one for the synapses between I and H, and one for those between H and O. Improved SpikeProp ([7]) improves on this by keeping the general model intact while also deriving analogous methods for updating the firing threshold, the delay time and the synaptic time constant (duration and steepness of the spike) for each neuron.

Evolutionary Algorithms

incorporate findings from [5] for example.

Implementation

My implementation

Implementation of smallish SNN using EA for training. Method, results etc.
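A minimal sketch of the spiking neuron model described earlier, in discrete time. All names, the time-stepping scheme and the parameter values are illustrative choices of mine, not taken from any of the referenced implementations:

```python
from collections import deque

class SpikingNeuron:
    """Minimal discrete-time spiking neuron: a leaky inner potential u that
    decays toward the rest state (u = 0), inputs carrying a per-synapse
    weight and delay, and a threshold that triggers a spike and a reset."""

    def __init__(self, threshold=1.0, decay=0.9):
        self.u = 0.0               # inner energy potential
        self.threshold = threshold
        self.decay = decay         # multiplicative leak applied each time step
        self.in_transit = deque()  # (arrival_time, weight) of travelling spikes

    def receive(self, t, weight, delay):
        """A presynaptic spike emitted at time t arrives after the synaptic
        delay; a negative weight is inhibitory, a positive one excitatory."""
        self.in_transit.append((t + delay, weight))

    def step(self, t):
        """Advance one time step; return True if the neuron fires."""
        self.u *= self.decay  # decay toward the rest state
        arrived = sum(w for (ta, w) in self.in_transit if ta == t)
        self.in_transit = deque((ta, w) for (ta, w) in self.in_transit if ta > t)
        self.u += arrived     # arriving signals are added to the potential
        if self.u >= self.threshold:
            self.u = 0.0      # fire and drop the potential back to rest
            return True
        return False

# Two excitatory spikes sent at t = 0 through synapses with delay 1
# together push u past the threshold at t = 1:
n = SpikingNeuron(threshold=1.0, decay=0.9)
n.receive(t=0, weight=0.6, delay=1)
n.receive(t=0, weight=0.6, delay=1)
print([n.step(t) for t in range(3)])  # [False, True, False]
```

The sketch omits the spike-shape parameter τ (inputs arrive as instantaneous weight increments) and uses a hard reset to u = 0 rather than a negative recoil; both could be added without changing the structure.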

GPU implementation

Deep Blue etc.

FPGA

Recent/interesting implementations

Conclusion

Bibliography

[1] Simon Haykin. Neural Networks (book).
[2] Sander M. Bohte and Joost N. Kok. Applications of Spiking Neural Networks (PhD thesis).
[3] Fabrice Bernhard and Renaud Keriven. Spiking Neurons on GPUs (paper).
[4] Olaf Booij. Temporal Pattern Classification using Spiking Neural Networks (paper).
[5] N.G. Pavlidis et al. Spiking Neural Network Training Using Evolutionary Algorithms (paper).
[6] S.M. Bohte, J.N. Kok and H. La Poutré. SpikeProp: Error-Backpropagation in Multi-Layer Networks of Spiking Neurons (paper).
[7] Benjamin Schrauwen and Jan Van Campenhout. Improving SpikeProp: Enhancements to an Error-Backpropagation Rule for Spiking Neural Networks (paper).
[8] Friedrich T. Sommer and Thomas Wennekers. Associative Memory in Networks of Spiking Neurons (paper).
[9] Rufin Van Rullen, Jacques Gautrais, Arnaud Delorme and Simon Thorpe. Face Processing Using One Spike per Neurone (paper).