Title: Spontaneous Order of Self-organizing Systems (Nonlinear Analysis and Convex Analysis)

Author(s): Shih, Mau-Hsiang; Tsai, Feng-Sheng

Citation: 数理解析研究所講究録 (RIMS Kôkyûroku) (2007), 1544: 163-167

Issue Date: 2007-04

URL: http://hdl.handle.net/2433/80731

Type: Departmental Bulletin Paper

Textversion: publisher

KURENAI: Kyoto University Research Information Repository, Kyoto University

Spontaneous Order of Self-organizing Systems*

Mau-Hsiang Shih† and Feng-Sheng Tsai‡

As Paul Adams wrote in the Journal of Theoretical Biology in 1998 [1], "Two of the most influential books in the history of biology are Darwin's On the Origin of Species and Hebb's The Organization of Behavior." Donald Hebb's classic book [6] has had a profound impact upon how populations of neurons interact collectively to perform integrative activity underlying brain processes. He postulated that changes of synaptic strengths based on coincidence detection could sustain persistent reverberatory activity in cortical circuits. The circulating neural impulses between populations of neurons would continue to circulate, forming diffuse self-assembling structures called "cell assemblies" [6, 13, 15].

There is increasing empirical support for Hebb's contribution to neuropsychological theory [3, 5, 11], and it has also stimulated an intensive effort to build computer or network models of the brain based on Hebbian synaptic plasticity (the coincidence-detection rule) [2, 4, 7, 9, 14, 16]. But there is still no complete mathematical explanation, no rigorous proof, that confirms the relation between Hebbian synaptic plasticity and the cell-assembly postulate.

The main problem concerning the relation between Hebbian synaptic plasticity and the cell-assembly postulate can be formulated as follows:

The Consolidation Problem

Are there any organizing principles underlying the dynamical-combinatorial process of the evolutionary network that allow one to describe neural populations underlying plasticity and to probe their time- and activity-dependent interactions capturing the characteristic property of the entire ensembles of cell assemblies?

The model of the evolutionary network we are concerned with consists of a population of distinct integrate-and-fire processing units (McCulloch-Pitts neurons or neurons) [10, 12]; each constantly integrates all incoming signals transferred from synapses on its cell body and dendrites, and fires action potentials to send signals to other neurons when

*This work was supported by grants from the National Science Council of the Republic of China and the Faculty of Science of National Taiwan Normal University.

†Department of Mathematics, National Taiwan Normal University, 88 Sec. 4, Ting Chou Road, Taipei 116, Taiwan.

‡Department of Mathematics, National Taiwan Normal University, 88 Sec. 4, Ting Chou Road, Taipei 116, Taiwan.



the combined effort reaches its threshold. Name those neurons $1,\ldots,n$ and denote by the ordered pair $(i,j)$ the evolutionary coupling linking neuron $j$ to neuron $i$. To each neuron $i$ there is associated the threshold $b_i$ and the active state variable $x_i=0$ or $1$, and to each evolutionary coupling $(i,j)$ there is associated the coupling strength variable $a_{ij}$. The phase space of the evolutionary network of $n$ coupled neurons is denoted by $\{0,1\}^n$, the binary code consisting of all 01-strings $x_1x_2\cdots x_n$ of fixed length $n$. The function $hea$ is the Heaviside function: $hea(u)=1$ for $u\geq 0$, otherwise $0$. The nonlinear dynamical system of the $n$ coupled neurons is then modeled by the following nonlinear parametric equations:

$$x(t+1)=F_{s(t),t}(x(t)),\qquad t=0,1,\ldots,\qquad(1)$$

$$A(t+1)=A(t)+\mathcal{D}_{x(t)\to x(t+1)}A,\qquad t=0,1,\ldots,\qquad(2)$$

where $x(t)=(x_1(t),\ldots,x_n(t))$ is the neuronal active state at time $t$, $A(t)=(a_{ij}(t))_{n\times n}$ is the evolutionary coupling state at time $t$, $s(t)$ denotes the neurons that adjust their activity at time $t$, and $F_{s(t),t}(x)$ is the time- and state-varying function encoding the dynamics, whose $i$th component is defined by

$$[F_{s(t),t}(x)]_i=x_i\quad\text{if }i\notin s(t),\quad\text{otherwise}\quad[F_{s(t),t}(x)]_i=hea\Big(\sum_{j=1}^{n}a_{ij}(t)x_j-b_i\Big).$$

The $i$th row and $j$th column $\mathcal{D}_{x(t)\to x(t+1)}a_{ij}$ of the parametric matrix $\mathcal{D}_{x(t)\to x(t+1)}A$ is called the ultraderivative of the evolutionary coupling state at the evolutionary coupling $(i,j)$, which varies with respect to the neuronal active states changing from $x(t)$ to $x(t+1)$. The ultraderivatives quantify the plasticity of evolutionary couplings that allows the system as a whole to undergo spontaneous organization.

As the evolutionary network evolves, there are associated subsets $\Lambda(t)$, $\Omega(t)$ of the space $\{1,2,\ldots,n\}\times\{1,2,\ldots,n\}$ such that

$$\Lambda(t)=\{(i,j);\ \mathcal{D}_{x(t)\to x(t+1)}a_{ij}>0\},\qquad\Omega(t)=\{(i,j);\ \mathcal{D}_{x(t)\to x(t+1)}a_{ij}\geq 0\}.$$

That raises the search for an algorithm that dictates the existence of $T\geq 0$ and a subset $V$ of $\{1,\ldots,n\}$ such that the condition $1(x(t))=V$ holds true for all $t\geq T$, where $1(x(t))$ denotes the collection of all active neurons at time $t$, and that

$$\bigcap_{t\geq T}\Omega(t)\supset V\times V\supset\bigcup_{t\geq T}\Lambda(t).$$

In this case, we say that the neurons in the subset $V$ are synchronized with respect to $x(t)$, accompanying a spontaneously organized distribution of the positive and negative ultraderivatives to bind themselves into a single operational unit.
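The update rule (1) and the index sets $\Lambda(t)$, $\Omega(t)$ can be sketched in a few lines of Python. This is only an illustrative reading of the model, not the authors' implementation: the function and variable names are ours, and the ultraderivative matrix $D$ is taken as given.

```python
import numpy as np

def hea(u):
    """Heaviside function: hea(u) = 1 for u >= 0, otherwise 0."""
    return 1 if u >= 0 else 0

def F(x, A, b, s):
    """One iterate of equation (1): each neuron i in s recomputes its state
    as hea(sum_j a_ij * x_j - b_i); every other neuron keeps its state."""
    x_new = x.copy()
    for i in s:
        x_new[i] = hea(A[i] @ x - b[i])
    return x_new

def lambda_omega(D):
    """Lambda(t): couplings with positive ultraderivative;
    Omega(t): couplings with nonnegative ultraderivative."""
    n = D.shape[0]
    Lam = {(i, j) for i in range(n) for j in range(n) if D[i, j] > 0}
    Om = {(i, j) for i in range(n) for j in range(n) if D[i, j] >= 0}
    return Lam, Om

# Two mutually coupled neurons; neuron 1 updates and fires.
x = np.array([1, 0])
A = np.array([[0.0, 1.0], [1.0, 0.0]])
b = np.array([0.5, 0.5])
x_next = F(x, A, b, s={1})  # -> array([1, 1])
```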


The bewildering complexity with which the evolutionary network is identified is inherently associated with two simple quantities measuring the driving forces of the evolutionary network's dynamics. When a given neuron $j$ is active at time $t$, there are two classes emerging from the active evolutionary coupling strengths $a_{ij}(t)$ for $i=1,\ldots,n$. The first consists of those with neurons $i$ being quiescent at time $t$ but fired at $t+1$, while the second consists of those with neurons $i$ being active at time $t$ but quiescent at $t+1$. By classifying all the active evolutionary coupling strengths by the above fired-driven or unfired-driven character, we define the fired-driven strength $FS(t)$ to be the sum of all $a_{ij}(t)$'s in the first class where $j$ is taken over all active neurons, whereas the unfired-driven strength $US(t)$ is the sum of all $a_{ij}(t)$'s in the second class where $j$ is taken over all active neurons. The fired-driven strength $FS(t)$ is not necessarily greater than the unfired-driven strength $US(t)$ over time, but when the discrete flow $x(t)$ behaves in such a way that $x(t_*)=x(t^*)\neq x(\hat{t})$ with $t_*\leq\hat{t}<t^*$ (a feedback loop initiated by active neurons at time $t_*$), the fired-driven strength and the unfired-driven strength in the period of $t_*$ and $t^*$ exhibit the orderliness: $FS(t_*)+FS(t_*+1)+\cdots+FS(t^*-1)>US(t_*)+US(t_*+1)+\cdots+US(t^*-1)$. This gives a fundamental law of pulse dynamics that combines the dynamical and structural complexity of the evolutionary network.

We introduce what we call the "Hebbian evolving algorithm," a generalized learning rule analogous to the coincidence-detection rule of Hebbian synaptic plasticity that provides a representative for the choices of the ultraderivatives in the growth dynamics of the evolutionary network. They can be expressed in the formulas:

$$\mathcal{D}_{x(t)\to x(t+1)}a_{ij}\geq 0\quad\text{if neurons }i\text{ and }j\text{ are active simultaneously at time }t+1,\quad\text{otherwise}\quad\mathcal{D}_{x(t)\to x(t+1)}a_{ij}<0,$$

and further we require that

$$|\mathcal{D}_{x(t)\to x(t+1)}a_{ij}|\geq|\mathcal{D}_{x(t)\to x(t+1)}a_{ji}|\quad\text{if }\delta_{ij}(t+1)>\delta_{ji}(t+1),$$

where $\delta_{ij}(t+1)=0$ or $1$ is the indicator of whether the activity of neuron $j$ at time $t$ has been able to change the active state of neuron $i$ at $t+1$.

Define $E_U(t_*,t^*)$ to be $\sum_{i\in U}\min(\{a_{ii}(t);\ t=t_*,\ldots,t^*\})$, a quantity that measures the minimal total excitability within the group of neurons $U$ in the period of time $t_*$ to $t^*$ with $t_*\leq t^*$, where the coupling strength variable $a_{ii}$ is considered to be a measure of excitability with respect to the neuron $i$ [8] and, according to the working of the neuron $i$, the increased excitability has a tendency to decrease the threshold for generating action potentials. Let us say that the discrete flow $x(t)$ generated by (1) iterates asynchronously if $s(t)$ is a singleton for all $t=0,1,\ldots$.
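The two driving-force quantities, and one admissible ultraderivative choice under the Hebbian evolving algorithm, can be made concrete in code. The class-membership tests below encode our reading of the fired-driven/unfired-driven definitions above, and the uniform magnitude `eps` is an assumption of ours: any magnitudes satisfying the sign and symmetry constraints would do.

```python
import numpy as np

def fired_unfired_strengths(x_t, x_t1, A):
    """FS(t) and US(t). x_t, x_t1: 0/1 state vectors at times t and t+1;
    A: coupling matrix a_ij(t). FS sums a_ij over active neurons j and
    neurons i quiescent at t but fired at t+1; US sums a_ij over active
    neurons j and neurons i active at t but quiescent at t+1."""
    n = len(x_t)
    FS = sum(A[i, j] for i in range(n) for j in range(n)
             if x_t[j] == 1 and x_t[i] == 0 and x_t1[i] == 1)
    US = sum(A[i, j] for i in range(n) for j in range(n)
             if x_t[j] == 1 and x_t[i] == 1 and x_t1[i] == 0)
    return FS, US

def hebbian_ultraderivative(x_t1, eps=0.1):
    """One admissible ultraderivative matrix: +eps when i and j are both
    active at t+1 (coincidence detection), -eps otherwise. With equal
    magnitudes the constraint |D a_ij| >= |D a_ji| holds trivially."""
    n = len(x_t1)
    D = np.full((n, n), -eps)
    for i in range(n):
        for j in range(n):
            if x_t1[i] == 1 and x_t1[j] == 1:
                D[i, j] = eps
    return D
```

For example, with `x_t = [1, 0]` and `x_t1 = [1, 1]`, neuron 1 is newly fired, so only the coupling $a_{10}$ contributes to $FS(t)$ and $US(t)$ is empty.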

THEOREM. Consider the evolutionary network of $n$ coupled neurons that is subject to the dynamics (1) and obeys the Hebbian evolving algorithm. Given any initial neuronal active state $x(0)$ in the phase space $\{0,1\}^n$, if $x(t)$ iterates asynchronously and the


minimal total excitability satisfies the assembling coordination:

$$E_U(t_*,t^*)\geq\sum_{i,j\in U}\max(\{a_{ij}(t)-a_{ji}(t)\geq 0;\ t=t_*,\ldots,t^*\}\cup\{0\})\qquad(3)$$

for each non-empty subset $U$ of $\{1,2,\ldots,n\}$ and for each $t_*,t^*=0,1,\ldots$ with $t_*\leq t^*$, then there exist $T\geq 0$ and a subset $V$ of $\{1,2,\ldots,n\}$ such that $1(x(t))=V$ for all $t\geq T$ and that

$$\bigcap_{t\geq T}\Omega(t)\supset V\times V\supset\bigcup_{t\geq T}\Lambda(t).$$

Consequently, starting with an initial neuronal active state $x(0)$, the Hebbian evolving algorithm can be used to search for a group of synchronized firing neurons, and the synchronized activity leads to a spontaneously organized distribution of the positive and negative ultraderivatives that feeds back to reinforce the neurons to fire in synchrony, with positive feedbacks conspiring to produce a cascade of sync-dependent neural circuits, giving rise to the consolidation of neural circuitry.
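Putting the pieces together, a toy asynchronous run can illustrate the conclusion of the theorem: the set of active neurons $1(x(t))$ eventually freezes to a fixed $V$. All parameters below are invented for illustration, and the uniform $\pm\varepsilon$ ultraderivative is one admissible choice, not the authors' construction.

```python
import numpy as np

def hea(u):
    """Heaviside function: 1 for u >= 0, otherwise 0."""
    return 1 if u >= 0 else 0

def simulate(x0, A0, b, steps=200, eps=0.05, seed=0):
    """Iterate (1)-(2) with singleton s(t) (asynchronous flow) and a
    coincidence-detection ultraderivative: couplings between co-active
    neurons grow by eps, all others shrink by eps."""
    rng = np.random.default_rng(seed)
    x, A = x0.copy(), A0.astype(float).copy()
    history = []
    for t in range(steps):
        i = rng.integers(len(x))      # s(t) is a singleton: one neuron updates
        x_new = x.copy()
        x_new[i] = hea(A[i] @ x - b[i])
        co = np.outer(x_new, x_new)   # 1 where i and j are co-active at t+1
        A += eps * (2 * co - 1)       # +eps if co-active, -eps otherwise
        x = x_new
        history.append(frozenset(int(k) for k in np.flatnonzero(x)))
    return x, A, history

# Toy run: two mutually exciting neurons plus two silent ones.
x0 = np.array([1, 1, 0, 0])
A0 = np.array([[0, 1, 0, 0],
               [1, 0, 0, 0],
               [0, 0, 0, 0],
               [0, 0, 0, 0]])
b = np.array([0.5, 0.5, 5.0, 5.0])
x_final, A_final, hist = simulate(x0, A0, b)
```

In this run the active set settles to $V=\{0,1\}$: the couplings inside $V$ keep growing (nonnegative ultraderivatives on $V\times V$), while every coupling touching the silent neurons keeps shrinking, mirroring the inclusion $\bigcap_{t\geq T}\Omega(t)\supset V\times V\supset\bigcup_{t\geq T}\Lambda(t)$.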

The above theorem formulates a consolidation problem of evolutionary networks concerning the relation between Hebbian synaptic plasticity and the Hebbian cell-assembly postulate, and demonstrates a dominant theme of connectionism: unraveling a dynamical-combinatorial process in a huge, interconnected system, in which the ongoing changes of the nodal-and-coupling dynamics underlying plasticity are guaranteed to result in group synchrony and sync-dependent circuits. Our proof of the theorem shows a potential role of pulse dynamics in the stability of a nonlinear evolutionary networked system, concerning the "driving forces" derived from the evolutionary network's nodal and coupling activity, without invoking any physical "energy" to control the system dynamics.

References

[1] P. ADAMS, Hebb and Darwin, J. Theor. Biol., 195 (1998), pp. 419-438.

[2] J. A. ANDERSON, A simple neural network generating an interactive memory, Math. Biosci., 14 (1972), pp. 197-220.

[3] F. CRICK, Function of the thalamic reticular complex: the searchlight hypothesis, Proc. Natl. Acad. Sci. USA, 81 (1984), pp. 4586-4590.

[4] S. DEHAENE, M. KERSZBERG, AND J.-P. CHANGEUX, A neuronal model of a global workspace in effortful cognitive tasks, Proc. Natl. Acad. Sci. USA, 95 (1998), pp. 14529-14534.



[5] K. D. HARRIS, J. CSICSVARI, H. HIRASE, G. DRAGOI, AND G. BUZSÁKI, Organization of cell assemblies in the hippocampus, Nature, 424 (2003), pp. 552-556.

[6] D. O. HEBB, The Organization of Behavior, Wiley, New York, 1949.

[7] J. J. HOPFIELD, Neural networks and physical systems with emergent collective computational abilities, Proc. Natl. Acad. Sci. USA, 79 (1982), pp. 2554-2558.

[8] M. KLEIN, B. HOCHNER, AND E. R. KANDEL, Facilitatory transmitters and cAMP can modulate accommodation as well as transmitter release in Aplysia sensory neurons. Evidence for parallel processing in a single cell, Proc. Natl. Acad. Sci. USA, 83 (1986), pp. 7994-7998.

[9] J. LÜCKE, Hierarchical self-organization of minicolumnar receptive fields, Neural Netw., 17 (2004), pp. 1377-1389.

[10] W. S. MCCULLOCH AND W. PITTS, A logical calculus of the ideas immanent in nervous activity, Bull. Math. Biophys., 5 (1943), pp. 115-133.

[11] B. MILNER, L. R. SQUIRE, AND E. R. KANDEL, Cognitive neuroscience and the study of memory, Neuron, 20 (1998), pp. 445-468.

[12] M. MINSKY, Computation: Finite and Infinite Machines, Prentice-Hall, New York, 1967.

[13] M. A. L. NICOLELIS, E. E. FANSELOW, AND A. A. GHAZANFAR, Hebb's dream: the resurgence of cell assemblies, Neuron, 19 (1997), pp. 219-221.

[14] N. ROCHESTER, J. H. HOLLAND, L. H. HAIBT, AND W. L. DUDA, Tests on a cell assembly theory of the action of the brain, using a large digital computer, IRE Trans. Inf. Theory, 2 (1956), pp. 80-93.

[15] T. J. SEJNOWSKI, The book of Hebb, Neuron, 24 (1999), pp. 773-776.

[16] D. J. WILLSHAW, O. P. BUNEMAN, AND H. C. LONGUET-HIGGINS, Non-holographic associative memory, Nature, 222 (1969), pp. 960-962.
