
7. Associators and Synaptic Plasticity
Fundamentals of Computational Neuroscience, T. P. Trappenberg, 2002.
Lecture Notes on Brain and Computation

Byoung-Tak Zhang
Biointelligence Laboratory
School of Computer Science and Engineering
Graduate Programs in Cognitive Science, Brain Science and Bioinformatics
Brain-Mind-Behavior Concentration Program
Seoul National University
E-mail: [email protected]
This material is available online at http://bi.snu.ac.kr/

Outline
7.1 Associative memory and Hebbian learning
7.2 An example of learning associations
7.3 The biochemical basis of synaptic plasticity
7.4 The temporal structure of Hebbian plasticity: LTP and LTD
7.5 Mathematical formulation of Hebbian plasticity
7.6 Weight distributions
7.7 Neuronal response variability, gain control, and scaling
7.8 Features of associators and Hebbian learning

7.1 Associative memory and Hebbian learning
- Finding the general principles of brain development is one of the major scientific quests in neuroscience.
- Not all characteristics of the brain can be specified by a genetic code: the number of genes would certainly be too small to specify all the details of the brain's networks.
- It is also advantageous that not all brain functions are specified genetically, so that the organism can adapt to the particular circumstances of its environment.

- An important adaptation mechanism, which is thought to form the basis of building associations, is the adaptation of synaptic efficiencies (a learning algorithm).

7.1.1 Synaptic plasticity
- Synaptic plasticity is a major key to adaptive mechanisms in the brain.
- Artificial neural networks employ abstract forms of synaptic plasticity:
  - their learning rules are not biologically realistic, and they learn entirely from experience, as if genetic coding were of minimal importance in brain development
  - nevertheless, they capture the mechanisms by which a neural network achieves self-organization and associative abilities

7.1.2 Hebbian learning
- Donald O. Hebb, The Organization of Behavior:
  "When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased."
- Brain mechanisms and how they can be related to behavior: cell assemblies.
- The details of synaptic plasticity: experimental results and evidence for Hebbian learning.

7.1.3 Associations
- Computer memory: information is stored in magnetic or other physical form and recalled via a memory address. Natural systems cannot work with such demanding precision.
- Human memory:
  - recalls vivid memories of events from small details
  - learns associations and triggers memories from related information
  - partial information can be sufficient to recall a memory

- Associative memory is the basis for many cognitive functions.

7.1.4 The associative node
7.1.5 The associative network
Fig. 7.1 Associative node and network architecture. (A) A simplified neuron that receives a large number of inputs r_i^in. The synaptic efficiency is denoted by w_i; the output of the neuron, r^out, depends on the particular input stimulus. (B) A network of associative nodes. Each component of the input vector, r_i^in, is distributed to each neuron in the network. However, the effect of the input can be different for each neuron, as each individual synapse can have a different efficiency value w_ij, where j labels the neuron in the network.

7.2 An example of learning associations
The node sums its weighted inputs,
  $h = \sum_i w_i r_i^{\mathrm{in}}$,   (7.1)
and fires when h exceeds the firing threshold, here set to 1.5.

Fig. 7.2 Examples of an associative node that is trained on two feature vectors with a Hebbian-type learning algorithm that increases the synaptic strength by Δw = 0.1 each time a presynaptic spike occurs in the same temporal window as a postsynaptic spike.

7.2.1 Hebbian learning in the conditioning framework
- The mechanism of an associative neuron:
  - The first stimulus is effective in eliciting a response of the neuron even before learning, based on the random initial weight distribution: the unconditioned stimulus (UCS).
  - For the second input, the response of the neuron changes during learning: the conditioned stimulus (CS).
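As a concrete illustration, here is a minimal sketch (not from the book) of the node of eq. 7.1 trained with the Δw = 0.1 rule of Fig. 7.2; the pattern vectors, the initial weight range, and the number of training episodes are illustrative assumptions.

```python
import numpy as np

def node_output(w, r_in, theta=1.5):
    # eq. 7.1: h = sum_i w_i * r_i^in; the node fires when h exceeds the threshold
    h = np.dot(w, r_in)
    return 1 if h > theta else 0

def hebb_update(w, r_in, r_out, dw=0.1):
    # increase a weight by dw whenever pre- and postsynaptic activity coincide
    return w + dw * r_in * r_out

rng = np.random.default_rng(0)
w = rng.uniform(0, 0.2, size=10)                 # random initial weights (assumption)
patterns = [np.array([1, 1, 1, 1, 1, 0, 0, 0, 0, 0]),
            np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])]

for _ in range(5):                               # a few co-activation episodes
    for r_in in patterns:
        w = hebb_update(w, r_in, r_out=1)        # postsynaptic spike enforced by the UCS

print([node_output(w, r) for r in patterns])     # both patterns now elicit a response
```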

7.2.2 Alternative plasticity schemes
Fig. 7.3 Different models of associative nodes resembling the principal architectures found in biological nervous systems, such as (A) cortical neurons in mammalian cortex and (B) Purkinje cells in the cerebellum, which receive strong input from climbing fibers through many hundreds or thousands of synapses. In contrast, the model shown in (C), which utilizes specific input to a presynaptic terminal as is known to exist in invertebrate systems, would have to supply the UCS to all synapses simultaneously in order to achieve the same kind of result as in the previous two models. Such architectures are unlikely to play an important role in cortical processing.

7.2.3 Issues around synaptic plasticity
- Storing information with associative learning:
  - imprinting an event-response pattern
  - recalling the response from partial information about the event

- Synaptic plasticity is thought to be the underlying principle behind associative memory, but the learning rules have to be formulated more precisely.
- Synaptic potentiation alone would let the synaptic efficiencies grow so large that the response of the node becomes less specific to the input pattern; synaptic depression is therefore needed as well.

7.3 The biochemical basis of synaptic plasticity
- Activity-dependent synaptic plasticity requires the co-activation of pre- and postsynaptic neurons.
- Backfiring (back-propagating action potentials) forms the basis of signaling the postsynaptic state.
- NMDA receptors open when the postsynaptic membrane becomes depolarized and allow an influx of calcium ions; an excess of intracellular calcium can thus indicate the co-activation of pre- and postsynaptic activity.
- Long-lasting synaptic changes (lifelong memories) involve the phosphorylation of proteins.

7.4 The temporal structure of Hebbian plasticity: LTP and LTD
7.4.1 Experimental example of Hebbian plasticity
- Results of experiments with varying pre- and postsynaptic conditions (EPSC: EPSP-related current):
Fig. 7.4 (A) Relative EPSC amplitudes between glutamatergic neurons in hippocampal slices. A strong postsynaptic stimulation was introduced at t = 0 for 1 minute, which induced spiking of the postsynaptic neuron. The postsynaptic firing was induced in relation to the onset of an EPSC that resulted from the stimulation of a presynaptic neuron at 1 Hz. The squares mark the results when the postsynaptic firing times followed the onset of the EPSCs within a short time window of 5 ms; the enhancement of synaptic efficiencies demonstrates LTP. The circles mark the results when the postsynaptic neuron was fired 5 ms before the onset of the EPSC; the reduction of synaptic efficiencies demonstrates LTD.

7.4.2 LTP and LTD
- Long-term potentiation (LTP): an amplification of synaptic efficiency.

- Long-term depression (LTD): a reduction in synaptic efficiency.
- Whether such synaptic changes can persist for the lifetime of an organism is unknown.
- Such forms of synaptic plasticity support the basic model of association:
  - LTP can enforce an associative response to a presynaptic firing pattern that is temporally linked to postsynaptic firing.
  - LTD can facilitate the unlearning of presynaptic input that is not consistent with postsynaptic firing.
  - Together they form the basis of the mechanisms of associative memories.

7.4.3 Time window of Hebbian plasticity
- The crucial temporal relation between pre- and postsynaptic spikes was probed by varying the time between pre- and postsynaptic spikes.
Fig. 7.4 (B) The relative changes in EPSC amplitudes are shown for various time windows between the onset of an EPSC induced by presynaptic firing and the time of induction of spikes in the postsynaptic neuron.

7.4.4 Variation of temporal Hebbian plasticity
- Asymmetric and symmetric forms of Hebbian plasticity.
Fig. 7.5 Several examples of the schematic dependence of synaptic efficiencies on the temporal relations between pre- and postsynaptic spikes.

7.4.5 Dependence of synaptic changes on initial strength
- Does the size of a synaptic change depend on the strength of the synapse?
- The absolute change of synaptic efficiency in LTD is proportional to the initial synaptic efficiency.

- The relative changes of EPSC amplitudes for LTP are largest for small initial EPSC amplitudes.
  LTD: $\Delta A \propto A$, i.e. $\frac{\Delta A}{A} = \mathrm{const}$ : multiplicative   (7.2)
  LTP: $\Delta A \approx \mathrm{const}$, i.e. $\frac{\Delta A}{A} \propto \frac{1}{A}$ : additive   (7.3)
Fig. 7.6 Dependence of LTP and LTD on the magnitude of the EPSCs before synaptic plasticity is induced.
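A small numeric illustration of eqs. 7.2 and 7.3 (my own, with arbitrary constants): an LTD change proportional to the initial amplitude A gives a constant relative change, while a constant absolute LTP change gives a relative change that falls off as 1/A.

```python
import numpy as np

A = np.array([0.2, 0.5, 1.0, 2.0])    # hypothetical initial EPSC amplitudes

dA_ltd = -0.1 * A                     # eq. 7.2: change proportional to A (multiplicative)
print(dA_ltd / A)                     # relative LTD change is constant: [-0.1 -0.1 -0.1 -0.1]

dA_ltp = 0.1 * np.ones_like(A)        # eq. 7.3: roughly constant absolute change (additive)
print(dA_ltp / A)                     # relative LTP change falls off as 1/A
```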

7.5 Mathematical formulation of Hebbian plasticity
- Synaptic plasticity is modeled by a change of weight values: the weights are not static but change over time.
- The variation of the weight values after a time step Δt can be written in discrete fashion as
  $w_{ij}(t + \Delta t) = w_{ij}(t) + \Delta w_{ij}(t_i^f, t_j^f, t; w_{ij})$   (7.4)
- The weight changes depend on various factors:
  - activity-dependent synaptic plasticity depends on the firing times of the pre- and postsynaptic neurons
  - the strength of a synapse can only vary within some interval

7.5.1 Hebbian learning with spiking neurons

- The synaptic changes (LTP: +, LTD: -) can be written as
  $\Delta w_{ij} = f^{\pm} K(t^{\mathrm{post}} - t^{\mathrm{pre}}) - f^{\mathrm{decay}}$   (7.5)
- Kernel function in exponential form:
  $K(t^{\mathrm{post}} - t^{\mathrm{pre}}) = e^{-|t^{\mathrm{post}} - t^{\mathrm{pre}}|/\tau}\, \Theta(\pm[t^{\mathrm{post}} - t^{\mathrm{pre}}])$   (7.6)
  where $\Theta(x)$ is a threshold (step) function that restricts LTP and LTD to the correct domains.
- Amplitude factor $f^{\pm}$:
  1. Additive rule with absorbing boundaries:
     $f^{\pm} = \begin{cases} a^{\pm} & \text{for } w^{\min} \le w_{ij} \le w^{\max} \\ 0 & \text{otherwise} \end{cases}$   (7.7)
  2. Multiplicative rule with a more graded nonlinearity when approaching the boundaries:
     $f^{+} = a^{+}(w^{\max} - w_{ij})$   (7.8)
     $f^{-} = a^{-}(w_{ij} - w^{\min})$   (7.9)
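The following sketch implements the pair-based update of eqs. 7.5-7.9 under stated assumptions: an exponential window with time constant τ = 20 ms, illustrative amplitudes a±, and the decay term of eq. 7.5 omitted for brevity.

```python
import numpy as np

TAU = 20.0                 # assumed width of the plasticity window (ms)
W_MIN, W_MAX = 0.0, 1.0    # allowed weight interval

def stdp_update(w, dt, a_plus=0.005, a_minus=0.006, rule="additive"):
    """Return the new weight for a spike pair with dt = t_post - t_pre (ms).

    Uses the exponential kernel of eq. 7.6 and either the additive (eq. 7.7)
    or the multiplicative (eqs. 7.8/7.9) amplitude factor. Parameter values
    are illustrative assumptions, not taken from the source.
    """
    k = np.exp(-abs(dt) / TAU)                    # eq. 7.6
    if dt > 0:                                    # pre precedes post: LTP
        f = a_plus if rule == "additive" else a_plus * (W_MAX - w)
        w = w + f * k
    elif dt < 0:                                  # post precedes pre: LTD
        f = a_minus if rule == "additive" else a_minus * (w - W_MIN)
        w = w - f * k
    return min(max(w, W_MIN), W_MAX)              # absorbing boundaries

w = 0.5
print(stdp_update(w, +5.0))   # causal pairing: potentiation
print(stdp_update(w, -5.0))   # anti-causal pairing: depression
```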

7.5.2 Hebbian learning in rate models
- Rate models describe the average behavior of neurons or cell assemblies and cannot incorporate spike timing.
- Plasticity then depends on the average correlation of pre- and postsynaptic firing:
  $\Delta w_{ij} = f_1 (r_i - f_2)(r_j - f_3) - f_4 w_{ij}$ : Hebbian plasticity rule   (7.10)
  $\Delta w_{ij} = k (r_i - \langle r_i \rangle)(r_j - \langle r_j \rangle)$ : Hebbian rule without decay term   (7.11)
  $\langle \Delta w_{ij} \rangle = k \langle (r_i - \langle r_i \rangle)(r_j - \langle r_j \rangle) \rangle = k\, \mathrm{cov}(r_i, r_j)$ : covariance of $r_i$ and $r_j$   (7.12)
  where r_i is the firing rate of the postsynaptic node i, r_j the firing rate of the presynaptic node j, f_1 the learning rate, f_2 and f_3 plasticity thresholds, and f_4 a weight decay.
- The average change of the synaptic weights is thus proportional to the covariance of the pre- and postsynaptic firing (the cross-correlation function).
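A quick numeric check of eq. 7.12 (with made-up, correlated rate traces): averaging the per-pattern updates of rule 7.11 indeed gives k times the covariance of the rates.

```python
import numpy as np

rng = np.random.default_rng(1)
r_pre = rng.exponential(1.0, size=100_000)                       # presynaptic rates (assumed)
r_post = 0.7 * r_pre + 0.3 * rng.exponential(1.0, size=100_000)  # correlated postsynaptic rates

k = 0.01
dw = k * (r_post - r_post.mean()) * (r_pre - r_pre.mean())       # eq. 7.11, per pattern

# eq. 7.12: the mean weight change is k times the covariance of the rates
print(dw.mean())
print(k * np.cov(r_post, r_pre)[0, 1])                           # nearly identical
```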

7.6 Weight distributions
- Synaptic efficiencies change continuously as long as the learning rules are applied.
- Problem: rapid weight changes can lead to instabilities in the system; the neuron would have to adapt to rapid changes, whereas it should roughly maintain its mean firing rate.
- Solution: the overall weight distribution stays relatively constant.
- The behavior of Hebbian models depends on the form of the weight distribution.

Fig. 7.7 Distribution of fluorescence intensities of synapses from a spinal neuron that were labeled with fluorescent antibodies; these intensities can be regarded as an estimate of the synaptic efficiencies.

7.6.1 Example of a weight distribution in a rate model
- Rate models of recurrent networks trained with the Hebbian training rule on random patterns have weight components with a Gaussian distribution.
Fig. 7.8 Normalized histograms of weight values from simulations of a simplified neuron (sigma node) simulating average firing rates, after training with the basic Hebbian learning rule 7.11 on exponentially distributed random patterns. A fit of a Gaussian distribution to the data is shown as a solid line.
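A compact sketch of the kind of simulation described in Fig. 7.8; the network size, number of patterns, and learning rate are my assumptions. Summing rule-7.11 contributions over many random patterns drives the weight components toward a Gaussian, in line with the central limit theorem.

```python
import numpy as np

rng = np.random.default_rng(2)
n, n_patterns, k = 200, 500, 0.01

R = rng.exponential(1.0, size=(n_patterns, n))   # exponentially distributed random patterns
Rc = R - R.mean(axis=0)                          # centred rates, as in rule 7.11

W = np.zeros((n, n))
for r in Rc:
    W += k * np.outer(r, r)                      # Hebbian rule 7.11 for each pattern

w = W[~np.eye(n, dtype=bool)]                    # off-diagonal weights
print(w.mean(), w.std())                         # a histogram of w is close to Gaussian
```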

7.6.2 Change of synaptic characteristics
- Dale's principle: a neuron makes either excitatory or inhibitory synapses; the synapses of a presynaptic neuron cannot change this specific characteristic.
- In the simulations above we did not restrict the synapses to be either inhibitory or excitatory: weight values were allowed to cross the boundary between positive and negative values, which is physiologically unrealistic.
- However, simulations with such constraints produce similar results for the distribution of the weight matrix components.
- It is therefore common to relax this biological detail (Dale's principle) in simulations.

7.6.3 Examples with spiking neurons
- Asymmetric Hebbian rules for spiking neurons:
Fig. 7.9 (A) Average firing rate (decreasing curve) and C_v, the coefficient of variation (increasing and fluctuating curve), of an IF-neuron driven by 1000 excitatory Poisson spike trains while the synaptic efficiencies change according to an additive Hebbian rule with asymmetric Gaussian plasticity windows. (B) Distribution of weight values after 5 minutes of simulated training time (similar to the distribution after 3 minutes). The weights were limited to the range 0-0.015. The distribution has two maxima, one at each boundary of the allowed interval.

7.7 Neuronal response variability, gain control, and scaling
7.7.1 Variability and gain control

- The firing times of the IF-neuron are mainly determined by the average input current.
- This can be quantified with the cross-correlation function between pre- and postsynaptic spike trains:
  $C(n) = \langle s^{\mathrm{pre}}(t)\, s^{\mathrm{post}}(t + n\Delta t) \rangle - \langle s^{\mathrm{pre}} \rangle \langle s^{\mathrm{post}} \rangle$   (7.13)
Fig. 7.10 Average cross-correlation function between presynaptic Poisson spike trains and the postsynaptic spike train (averaged over all presynaptic spike trains) in simulations of an IF-neuron with 1000 input channels. The spike trains that lead to the results shown by stars were generated with each weight value fixed to 0.015. The cross-correlations are consistent with zero when considered within the variance indicated by the error bars. The squares represent the results from simulations of the IF-neuron driven by the same presynaptic spike trains as before, but with the weight matrix after the Hebbian learning shown in Fig. 7.9. Some presynaptic spike trains caused postsynaptic spiking, with a positive peak in the average cross-correlation function when the presynaptic spikes precede the postsynaptic spike. No error bars are shown for this curve, for clarity.
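A minimal estimator of eq. 7.13 for binned spike trains (synthetic data, not the simulation of Fig. 7.10): when the postsynaptic train is a shifted copy of the presynaptic one, the cross-correlation peaks at the corresponding lag.

```python
import numpy as np

def cross_correlation(s_pre, s_post, max_lag):
    # estimate of eq. 7.13 for binned 0/1 spike trains; boundary bins are dropped
    base = s_pre.mean() * s_post.mean()
    lags = np.arange(-max_lag, max_lag + 1)
    C = np.empty(len(lags))
    for i, n in enumerate(lags):
        if n >= 0:
            prod = s_pre[:len(s_pre) - n] * s_post[n:]
        else:
            prod = s_pre[-n:] * s_post[:len(s_post) + n]
        C[i] = prod.mean() - base
    return lags, C

rng = np.random.default_rng(3)
s_pre = (rng.random(100_000) < 0.02).astype(float)  # hypothetical Poisson-like spike train
s_post = np.roll(s_pre, 3)                          # postsynaptic spikes 3 bins after pre
lags, C = cross_correlation(s_pre, s_post, max_lag=5)
print(lags[np.argmax(C)])                           # peak at lag +3: pre drives post
```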

7.7.2 Synaptic scaling
- Scaling the overall synaptic efficiencies with the average postsynaptic firing rate is crucial to keep neurons in the regime of high variability, and hence sensitive for information processing in the nervous system.
- Many experiments have demonstrated that:
  - synaptic efficiencies are scaled by the average postsynaptic activity
  - the threshold at which LTP is induced can depend on the time-averaged recent activity of the neuron
- Related model mechanisms: weight normalization, weight decay.

7.7.3 Oja's rule and the principal component
- Weight normalization through heterosynaptic depression:
  $\Delta w_{ij} = r_i^{\mathrm{post}} r_j^{\mathrm{pre}} - (r_i^{\mathrm{post}})^2 w_{ij}$   (7.14)
Fig. 7.11 Simulation of a linear node trained with Oja's rule on training examples (indicated by the dots) drawn from a two-dimensional probability distribution with mean zero. The weight vector, with initial conditions indicated by the cross, converges to the weight vector (thick arrow) that has length |w| = 1 and points in the direction of the first principal component.
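A sketch of the simulation in Fig. 7.11, with an assumed learning rate η and a made-up two-dimensional input distribution: iterating eq. 7.14 normalizes the weight vector and aligns it with the first principal component.

```python
import numpy as np

rng = np.random.default_rng(4)
# zero-mean 2-D input cloud with one dominant direction (an assumption)
X = rng.normal(size=(5000, 2)) * np.array([2.0, 0.5])

w = 0.1 * rng.normal(size=2)        # initial weight vector
eta = 0.01                          # assumed learning rate
for x in X:
    y = w @ x                       # linear node output
    w += eta * (y * x - y**2 * w)   # Oja's rule (eq. 7.14 with learning rate eta)

print(np.linalg.norm(w))            # |w| converges to ~1
print(w / np.linalg.norm(w))        # aligned with the first principal component (~[±1, 0])
```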

7.7.4 Short-term synaptic plasticity and neuronal gain control
- Short-term synaptic plasticity: cortical neurons typically show a transient response, with a decreasing firing rate, to a constant input current.
- Short-term synaptic depression (STD) can have manifold computational consequences (see the sketch below):
  - it allows a neuron to respond strongly to input that has not influenced the neuron recently and therefore has a strong novelty value
  - rapid spike trains that would exhaust the neuron are weakened
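The source describes STD only qualitatively, so the sketch below uses a common phenomenological resource model (in the spirit of Tsodyks and Markram; the parameters U and tau_rec are my assumptions). It reproduces both consequences listed above.

```python
import numpy as np

def std_efficacies(spike_times, U=0.4, tau_rec=300.0):
    # each spike consumes a fraction U of the synaptic resource x, which
    # recovers toward 1 with time constant tau_rec (ms) between spikes
    x, t_last, eff = 1.0, None, []
    for t in spike_times:
        if t_last is not None:
            x = 1.0 - (1.0 - x) * np.exp(-(t - t_last) / tau_rec)
        eff.append(U * x)          # effective strength of this spike
        x -= U * x                 # resource consumed by the spike
        t_last = t
    return eff

print(std_efficacies([0, 10, 20, 30, 40]))   # a rapid train is progressively weakened
print(std_efficacies([0, 10, 20, 1500]))     # after a long pause the response recovers (novelty)
```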

7.8 Features of associators and Hebbian learning
- Pattern completion and generalization: recall from partial input; the output node responds to all patterns that have a certain similarity to the trained pattern.
- Prototypes and the extraction of central tendencies; noise reduction.
- Graceful degradation: the loss of some components of the system should not make the system fail completely (fault tolerance).

7.8.4 Biologically faithful learning rules
The associative Hebbian learning rules are biologically faithful models; they are:
1. Unsupervised: no specific learning signal is required (a self-organization rule, in contrast to reinforcement learning).
2. Local: only presynaptic and postsynaptic observables are required to change the synaptic weight values, so the rules benefit from true parallel distributed processing.
3. Online: the learning rule does not require storage of firing patterns or network parameters.

Conclusion
- What is associative memory? Biochemical mechanisms of synaptic plasticity; the Hebbian learning rule.
- Synaptic plasticity: the temporal structure of Hebbian plasticity; LTP and LTD.
- Weight distributions; gain control, synaptic scaling, Oja's rule and PCA.
- Associators and Hebbian learning: Hebbian learning is a biologically faithful learning rule (unsupervised, local, online).
