{"title": "Neuronal Regulation Implements Efficient Synaptic Pruning", "book": "Advances in Neural Information Processing Systems", "page_first": 97, "page_last": 103, "abstract": null, "full_text": "Neuronal Regulation Implements \n\nEfficient Synaptic Pruning \n\nGal Chechik and Isaac Meilijson \n\nSchool of Mathematical Sciences \n\nTel Aviv University, Tel Aviv 69978, Israel \nggal@math.tau.ac.il \nisaco@math.tau.ac.il \n\nSchools of Medicine and Mathematical Sciences \n\nTel Aviv University, Tel Aviv 69978, Israel \n\nEytan Ruppin \n\nruppin@math.tau.ac.il \n\nAbstract \n\nHuman and animal studies show that mammalian brain undergoes \nmassive synaptic pruning during childhood , removing about half of \nthe synapses until puberty. We have previously shown that main(cid:173)\ntaining network memory performance while synapses are deleted, \nrequires that synapses are properly modified and pruned, remov(cid:173)\ning the weaker synapses. We now show that neuronal regulation , a \nmechanism recently observed to maintain the average neuronal in(cid:173)\nput field , results in weight-dependent synaptic modification . Under \nthe correct range of the degradation dimension and synaptic up(cid:173)\nper bound, neuronal regulation removes the weaker synapses and \njudiciously modifies the remaining synapses . It implements near \noptimal synaptic modification, and maintains the memory perfor(cid:173)\nmance of a network undergoing massive synaptic pruning. Thus , \nthis paper shows that in addition to the known effects of Hebbian \nchanges, neuronal regulation may play an important role in the \nself-organization of brain networks during development. \n\n1 \n\nIntroduction \n\nThis paper studies one of the fundamental puzzles in brain development: the mas(cid:173)\nsive synaptic pruning observed in mammals during childhood , removing more than \nhalf of the synapses until puberty (see [1] for review) . 
This phenomenon is observed in various areas of the brain, in both animal and human studies. How can the brain function after such massive synaptic elimination? What could be the computational advantage of such a seemingly wasteful developmental strategy? In previous work [2], we have shown that synaptic overgrowth followed by judicious pruning along development improves the performance of an associative memory network with limited synaptic resources, thus suggesting a new computational explanation for synaptic pruning in childhood. The optimal pruning strategy was found to require that synapses be deleted according to their efficacy, removing the weaker synapses first. \n\nBut is there a mechanism that can implement these theoretically derived synaptic pruning strategies in a biologically plausible manner? To answer this question, we focus here on studying the role of neuronal regulation (NR), a mechanism operating to maintain the homeostasis of the neuron's membrane potential. NR has been recently identified experimentally by [3], who showed that neurons both up-regulate and down-regulate the efficacy of their incoming excitatory synapses in a multiplicative manner, maintaining their membrane potential around a baseline level. Independently, [4] have studied NR theoretically, showing that it can efficiently maintain the memory performance of networks undergoing synaptic degradation. Both [3] and [4] have hypothesized that NR may lead to synaptic pruning during development. \n\nIn this paper we show that this hypothesis is both computationally feasible and biologically plausible by studying the modification of synaptic values resulting from the operation of NR. Our work thus gives a possible account of the way brain networks maintain their performance while undergoing massive synaptic pruning. 
\n\n2 The Model \n\nNR-driven synaptic modification (NRSM) results from two concomitant processes: synaptic degradation (the inevitable consequence of synaptic turnover [5]) and neuronal regulation (NR), which operates to compensate for the degradation. We therefore model NRSM by a sequence of degradation-strengthening steps. At each time step, synaptic degradation stochastically reduces the synaptic strength W_t (W_t > 0) to W'_{t+1} by \n\nW'_{t+1} = W_t - (W_t)^\\alpha \\eta_t,  \\eta \\sim N(\\mu_\\eta, \\sigma_\\eta)  (1) \n\nwhere \\eta is a noise term with positive mean and the power \\alpha defines the degradation dimension parameter, chosen in the range [0, 1]. Neuronal regulation is modeled by letting the post-synaptic neuron multiplicatively strengthen all its synapses by a common factor so as to restore its original input field \n\nW_{t+1} = W'_{t+1} (I_i^0 / I_i^t)  (2) \n\nwhere I_i^t is the input field of neuron i at time t. The excitatory synaptic efficacies are assumed to have a viability lower bound B^- below which a synapse degenerates and vanishes, and a soft upper bound B^+ beyond which a synapse is strongly degraded, reflecting its maximal efficacy. To study the above process in a network, a model incorporating a segregation between inhibitory and excitatory neurons (i.e., obeying Dale's law) is required. To generate this essential segregation, we modify the standard low-activity associative memory model proposed by [6] by adding a small positive term to the synaptic learning rule. In this model, M memories are stored in an excitatory N-neuron network, forming attractors of the network dynamics. The synaptic efficacy W_{ij} between the jth (pre-synaptic) neuron and the ith (post-synaptic) neuron is \n\nW_{ij} = \\sum_{\\mu=1}^{M} [(\\xi_i^\\mu - p)(\\xi_j^\\mu - p) + a],  1 \\le i \\ne j \\le N  (3) \n\nwhere \\{\\xi^\\mu\\}_{\\mu=1}^{M} are \\{0, 1\\} memory patterns with coding level p (the fraction of firing neurons), and a is some positive constant (see footnote 1). 
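As a concrete illustration, the storage rule of Eq. 3 is a single outer-product sum in matrix form. The sketch below is ours, not the paper's code; the network size, memory load and value of a are illustrative settings. It also checks numerically that the positive constant a makes negative synapses rare, as the paper's footnote notes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative settings (not necessarily the paper's): N neurons, M memories,
# coding level p, small positive offset a in the learning rule of Eq. 3.
N, M, p, a = 400, 1000, 0.1, 0.01

# M {0,1} memory patterns with coding level p.
xi = (rng.random((M, N)) < p).astype(float)

# W_ij = sum_mu [(xi_i - p)(xi_j - p) + a], computed as one outer-product sum.
W = (xi - p).T @ (xi - p) + M * a
np.fill_diagonal(W, 0.0)   # 1 <= i != j <= N: no self-connections

# The offset shifts the weight distribution to mean M*a, so excitatory
# (positive) efficacies dominate, as required by Dale's law.
print(f"fraction of positive synapses: {(W > 0).mean():.4f}")
```

With M*a well above the O(sqrt(M)) standard deviation of the Hebbian sum, virtually all efficacies come out positive.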
The updating rule for the state X_i^t of the ith neuron at time t is \n\nX_i^{t+1} = \\theta(I_i^t),  I_i^t = \\frac{1}{N} \\sum_{j=1}^{N} g(W_{ij}) X_j^t - \\frac{I}{N} \\sum_{j=1}^{N} X_j^t - T,  \\theta(I) = \\frac{1 + \\mathrm{sign}(I)}{2}  (4) \n\nwhere T is the neuronal threshold and I is the inhibition strength. g is a general modification function over the excitatory synapses, which is either derived explicitly (see Section 4) or determined implicitly by the operation of NRSM. If g is linear and I = Ma, the model reduces to the original model described by [6]. The overlap m^\\mu (or similarity) between the network's activity pattern X and the memory \\xi^\\mu serves to measure memory performance (retrieval acuity), and is defined as m^\\mu = \\frac{1}{Np(1-p)} \\sum_{j=1}^{N} (\\xi_j^\\mu - p) X_j. \n\n3 Neuronally Regulated Synaptic Modification \n\nNRSM was studied by simulating the degradation-strengthening sequence in a network in which memory patterns were stored according to Eq. 3. Figure 1a plots a typical distribution of synaptic values as traced along a sequence of degradation-strengthening steps (Eqs. 1, 2). As evident, the synaptic values diverge: some of the weights are strengthened and lie close to the upper synaptic bound, while the other synapses degenerate and vanish. Using probabilistic considerations, it can be shown that the synaptic distribution converges to a meta-stable state where it remains for long waiting times. Figure 1b describes the meta-stable synaptic distribution as calculated for different \\alpha values. 
\n\nFigure 1: Distribution of synaptic strengths following a degradation-strengthening process. a) Synaptic distribution after 0, 200, 400, 1000 and 5000 degradation-strengthening steps of a 400-neuron network with 1000 stored memory patterns. \\alpha = 0.8, p = 0.1, B^- = 10^{-5}, B^+ = 18 and \\eta \\sim N(0.05, 0.05). Qualitatively similar results were obtained for a wide range of simulation parameters. b) The synaptic distribution of the remaining synapses at the meta-stable state (\\alpha = 0.0, 0.5, 0.9), calculated as the main eigenvector of the transition probability matrix. \n\nFootnote 1: As the weights are normally distributed with expectation Ma > 0 and standard deviation O(\\sqrt{M}), the probability of a negative synapse vanishes as M goes to infinity (and is negligible already for several dozens of memories in the parameter range used here). \n\nFigure 2: a) NRSM functions at the meta-stable state for different \\alpha values. Results were obtained in a 400-neuron network after performing 5000 degradation-strengthening steps. Parameter values are as in Figure 1, except B^+ = 12. b) Performance of NR modification and random deletion. 
The retrieval acuity of 200 memories stored in a network of 800 neurons is portrayed as a function of network connectivity, as the network undergoes continuous pruning until NR reaches the meta-stable state. \\alpha = 0, B^+ = 7.5, p = 0.1, m_0 = 0.80, a = 0.01, T = 0.35, B^- = 10^{-5} and \\eta \\sim N(0.01, 0.01). \n\nTo further investigate which synapses are strengthened and which are pruned, we study the resulting synaptic modification function. Figure 2a plots the value of synaptic efficacy at the meta-stable state as a function of the initial synaptic efficacy, for different values of the degradation dimension \\alpha. As observed, a sigmoidal dependency is obtained, where the slope of the sigmoid strongly depends on the degradation dimension. In the two limit cases, additive degradation (\\alpha = 0) results in a step function at the meta-stable state, while multiplicative degradation (\\alpha = 1) results in random diffusion of the synaptic weights toward a memoryless mean value. Different values of \\alpha and B^+ result in different levels of synaptic pruning: when the synaptic upper bound B^+ is high, the surviving synapses assume high values, leading to massive pruning to maintain the neuronal input field, which in turn reduces the network's performance. Low B^+ values lead to high connectivity, but limit synapses to a small set of possible values, again reducing memory performance. Our simulations show that optimal memory retrieval is obtained for B^+ values that lead to deletion levels of 40%-60%, at which NR indeed maintains the network performance. Figure 2b traces the average retrieval acuity of a network throughout the operation of NR, versus a network subject to random deletion at the same pruning levels. While the retrieval of a randomly pruned network collapses already at low deletion levels of about 20%, a network undergoing NR performs well even at high deletion levels. 
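The degradation-strengthening sequence of Eqs. 1-2 is straightforward to simulate for a single neuron's excitatory weights. The Python sketch below is an illustration of ours, not the paper's simulation code: the gamma-distributed initial efficacies and the use of the plain weight sum as the input field (i.e., uniform pre-synaptic activity) are assumptions, while the numeric parameters follow Figure 1a. It reproduces the qualitative divergence: weak synapses are pruned while survivors accumulate near the upper bound.

```python
import numpy as np

rng = np.random.default_rng(1)

# Parameters as in Figure 1a; initial weights and the uniform-activity
# input field (plain sum of weights) are illustrative assumptions.
alpha, B_minus, B_plus = 0.8, 1e-5, 18.0
mu_eta, sigma_eta = 0.05, 0.05

w = rng.gamma(2.0, 1.0, size=400)   # assumed initial positive efficacies
field0 = w.sum()                    # the neuron's original input field I^0

for _ in range(5000):
    eta = rng.normal(mu_eta, sigma_eta, size=w.shape)
    w = w - (w ** alpha) * eta      # stochastic degradation (Eq. 1)
    w[w < B_minus] = 0.0            # below the viability bound: synapse vanishes
    # NR multiplicatively restores the input field (Eq. 2), then the
    # soft upper bound is modeled here as a hard clip at B_plus.
    w = np.clip(w * (field0 / w.sum()), 0.0, B_plus)

print(f"fraction pruned: {(w == 0).mean():.2f}")
```

Because small weights lose a relatively larger fraction per step (w^alpha / w grows as w shrinks) while NR rescales everyone by the same factor, the dynamics are rich-get-richer, yielding the bimodal distribution of Figure 1.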
\n\n4 Optimal Modification in Excitatory-Inhibitory Networks \n\nTo obtain a comparative yardstick for evaluating the efficiency of NR as a selective pruning mechanism, we derive optimal modification functions maximizing memory performance in our excitatory-inhibitory model and compare them to the NRSM functions. \n\nWe study general synaptic modification functions, which prune some of the synapses and possibly modify the rest, while satisfying global constraints on synapses such as the number or total strength of the synapses. These constraints reflect the observation that synaptic activity is strongly correlated with energy consumption in the brain [7], and synaptic resources may hence be inherently limited in the adult brain. \n\nWe evaluate the impact of these functions on the network's retrieval performance by deriving their effect on the signal-to-noise ratio (S/N) of the neuron's input field (Eqs. 3, 4), known to be the primary determinant of retrieval capacity [8]. This analysis, conducted in a similar manner to [2], yields an expression (Eq. 5) for the S/N, where z \\sim N(0, 1) and g is the modification function of Eq. 4, now explicitly applied to the synapses. To derive optimal synaptic modification functions with limited synaptic resources, we consider g functions that zero all synapses except those in some set A, and keep the integral \n\n\\int_A g(z)^k \\phi(z) dz,  k = 0, 1, ...;  g(z) = 0 \\; \\forall z \\notin A  (6) \n\nlimited. We then maximize the S/N under this constraint using the Lagrange method. Our results show that without any synaptic constraints the optimal function is the identity function; that is, the original Hebbian rule is optimal. 
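The constrained solutions that follow are built from truncated-Gaussian moments of the standardized weight distribution over the surviving set A. A minimal helper of ours, assuming standardized weights z ~ N(0, 1) and a weak-synapse deletion set A = {z > c} (an illustrative parametrization), computes the three integrals in closed form:

```python
import math

def phi(z: float) -> float:
    """Standard normal density."""
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def Phi(z: float) -> float:
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def surviving_moments(c: float):
    """Closed-form moments of z ~ N(0,1) over A = {z > c}:
    mass = int_A phi, m1 = int_A z*phi, m2 = int_A z^2*phi
    (the last follows from integration by parts)."""
    mass = 1 - Phi(c)
    m1 = phi(c)
    m2 = mass + c * phi(c)
    return mass, m1, m2

# Deleting the weakest half of the synapses corresponds to c = 0.
mass, m1, m2 = surviving_moments(0.0)
print(mass, m1, m2)
```

These integrals are exactly the building blocks of the linear solution for the remaining synapses derived below (Eq. 7).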
When the number of synapses is restricted (k = 0), the optimal modification function is linear over all the remaining synapses, \n\ng(W) = a (W - E(W)) + b  (7) \n\nwhere, for any deletion set A, the constants a and b are determined by the truncated-Gaussian moments \\int_A z^2 \\phi(z) dz, \\int_A z \\phi(z) dz and \\int_A \\phi(z) dz, together with E(W) and V(W). To find the synapses that should be deleted, we have numerically searched for a deletion set maximizing the S/N while limiting g(W) to positive values (as required by the segregation between excitatory and inhibitory neurons). The results show that weak-synapse pruning, a modification strategy that removes the weakest synapses and modifies the rest according to Eq. 7, is optimal at deletion levels above 50%. For lower deletion levels, the above g function fails to satisfy the positivity constraint for any set A. When the positivity constraint is ignored, the S/N is maximized if the weights closest to the mean are deleted and the remaining synapses are modified according to Eq. 7. We name this strategy mean-synapse pruning. Figure 3 plots the memory capacity under weak-synapse pruning (compared with random deletion and mean-synapse pruning), showing that pruning the weak synapses performs at least near-optimally at lower deletion levels as well. Even more interesting, under the correct parameter values weak-synapse pruning results in a modification function that has a similar form to the NR-driven modification function studied in the previous section: both strategies remove the weakest synapses and linearly modify the remaining synapses in a similar manner. \n\nIn the case of limited overall synaptic strength (k > 0 in Eq. 6), the optimal g satisfies \n\nz - 2\\gamma_1 [g(z) - E(g(z))] - \\gamma_2 k g(z)^{k-1} = 0  (8) \n\nand thus for k = 1 and k = 2 the optimal modification function is again linear. For k > 2 a sublinear modification function is optimal, where g is a function of z^{1/(k-1)}; the optimal g is thus unbounded for all k. Therefore, in our model, bounds on the synaptic efficacies are not dictated by the optimization process. Their computational advantage arises from their effect on preserving memory capacity in the face of ongoing synaptic pruning. \n\nFigure 3: Comparison between the performance of different modification strategies (a: analysis results, b: simulation results) as a function of the deletion level (percentage of synapses pruned). Capacity is measured as the number of patterns that can be stored in the network (N = 2000) and recalled almost correctly (m > 0.95) from a degraded pattern (m_0 = 0.80). \n\n5 Discussion \n\nBy studying NR-driven synaptic modification in the framework of associative memory networks, we show that NR prunes the weaker synapses and modifies the remaining synapses in a sigmoidal manner. The critical variables that govern the pruning process are the degradation dimension and the upper synaptic bound. Our results show that in the correct range of these parameters, NR implements a near-optimal strategy, maximizing memory capacity at the sparse connectivity levels observed in the brain. \n\nA fundamental requirement of central nervous system development is that the system should function continuously while undergoing major structural and functional developmental changes. 
It has been proposed that a major functional role of neuronal down-regulation during early infancy is to maintain neuronal activity at its baseline levels in the face of a continuous increase in the number and efficacy of synapses [3]. Focusing on up-regulation, our work shows that NR has another important effect: modifying and pruning synapses in a continuously optimal manner. Neuronally regulated synaptic modification may play the same role in the peripheral nervous system: it was recently shown that in the neuromuscular junction the muscle regulates its incoming synapses in a way similar to NR [9]. Our analysis suggests this process may be the underlying cause of the finding that synapses in the neuromuscular junction are either strengthened or pruned according to their initial efficacy [10]. \n\nThe significance of our work goes beyond understanding synaptic organization and remodeling in the associative memory models studied in this paper. Our analysis bears relevance to two other fundamental paradigms, hetero-associative memory and self-organizing maps, which share the same basic synaptic structure of storing associations between sets of patterns via a Hebbian learning rule. \n\nCombining the investigation of a biologically identified mechanism with the analytic study of performance optimization in neural network models, this paper shows the biologically plausible and beneficial role of weight-dependent synaptic pruning. Thus, in addition to the known effects of Hebbian learning, neuronal regulation may play an important role in the self-organization of brain networks during development. \n\nReferences \n\n[1] G.M. Innocenti. Exuberant development of connections and its possible permissive role in cortical evolution. Trends Neurosci., 18:397-402, 1995. \n\n[2] G. Chechik, I. Meilijson, and E. Ruppin. Synaptic pruning during development: A computational account. Neural Computation, in press, 1998. \n\n[3] G.G. Turrigiano, K. Leslie, N. Desai, and S.B. Nelson. Activity-dependent scaling of quantal amplitude in neocortical pyramidal neurons. Nature, 391(6670):892-896, 1998. \n\n[4] D. Horn, N. Levy, and E. Ruppin. Synaptic maintenance via neuronal regulation. Neural Computation, 10(1):1-18, 1998. \n\n[5] J.R. Wolff, R. Laskawi, W.B. Spatz, and M. Missler. Structural dynamics of synapses and synaptic components. Behavioral Brain Research, 66(1-2):13-20, 1995. \n\n[6] M.V. Tsodyks and M. Feigel'man. Enhanced storage capacity in neural networks with low activity level. Europhys. Lett., 6:101-105, 1988. \n\n[7] Per E. Roland. Brain Activation. Wiley-Liss, 1993. \n\n[8] I. Meilijson and E. Ruppin. Optimal firing in sparsely-connected low-activity attractor networks. Biological Cybernetics, 74:479-485, 1996. \n\n[9] G.W. Davis and C.S. Goodman. Synapse-specific control of synaptic efficacy at the terminals of a single neuron. Nature, 392(6671):82-86, 1998. \n\n[10] H. Colman, J. Nabekura, and J.W. Lichtman. Alterations in synaptic strength preceding axon withdrawal. Science, 275(5298):356-361, 1997. ", "award": [], "sourceid": 1554, "authors": [{"given_name": "Gal", "family_name": "Chechik", "institution": null}, {"given_name": "Isaac", "family_name": "Meilijson", "institution": null}, {"given_name": "Eytan", "family_name": "Ruppin", "institution": null}]}