# Stochastic synchronization of neuronal populations with intrinsic and extrinsic noise

- Paul C Bressloff^{1, 2}
- Yi Ming Lai^{1}

*The Journal of Mathematical Neuroscience* 2011, **1**:2

**DOI: **10.1186/2190-8567-1-2

© Bressloff, Lai; licensee Springer 2011

**Received: **12 November 2010

**Accepted: **3 May 2011

**Published: **3 May 2011

## Abstract

We extend the theory of noise-induced phase synchronization to the case of a neural master equation describing the stochastic dynamics of an ensemble of uncoupled neuronal population oscillators with intrinsic and extrinsic noise. The master equation formulation of stochastic neurodynamics represents the state of each population by the number of currently active neurons, and the state transitions are chosen so that deterministic Wilson-Cowan rate equations are recovered in the mean-field limit. We apply phase reduction and averaging methods to a corresponding Langevin approximation of the master equation in order to determine how intrinsic noise disrupts synchronization of the population oscillators driven by a common extrinsic noise source. We illustrate our analysis by considering one of the simplest networks known to generate limit cycle oscillations at the population level, namely, a pair of mutually coupled excitatory (*E*) and inhibitory (*I*) subpopulations. We show how the combination of intrinsic independent noise and extrinsic common noise can lead to clustering of the population oscillators due to the multiplicative nature of both noise sources under the Langevin approximation. Finally, we show how a similar analysis can be carried out for another simple population model that exhibits limit cycle oscillations in the deterministic limit, namely, a recurrent excitatory network with synaptic depression; inclusion of synaptic depression into the neural master equation now generates a stochastic hybrid system.

## 1 Introduction

Synchronous oscillations are prevalent in many areas of the brain including sensory cortices, thalamus and hippocampus [1]. Recordings of population activity based on the electroencephalogram (EEG) or the local field potential (LFP) often exhibit strong peaks in the power spectrum at certain characteristic frequencies. For example, in the visual system of mammals, cortical oscillations in the *γ* frequency band (20-70 Hz) are generated with a spatially distributed phase that is modulated by the nature of a visual stimulus. Stimulus-induced phase synchronization of different populations of neurons has been proposed as a potential solution to the binding problem, that is, how various components of a visual image are combined into a single coherently perceived object [2, 3]. An alternative suggestion is that such oscillations provide a mechanism for attentionally gating the flow of neural information [4, 5]. Neuronal oscillations may be generated by intrinsic properties of single cells or may arise through excitatory and inhibitory synaptic interactions within a local population of cells. Irrespective of the identity of the basic oscillating unit, synchronization can occur via mutual interactions between the oscillators or via entrainment to a common periodic stimulus in the absence of coupling.

From a dynamical systems perspective, self-sustained oscillations in biological, physical and chemical systems are often described in terms of limit cycle oscillators where the timing along each limit cycle is specified in terms of a single phase variable. The phase-reduction method can then be used to analyze synchronization of an ensemble of oscillators by approximating the high-dimensional limit cycle dynamics as a closed system of equations for the corresponding phase variables [6, 7]. Although the phase-reduction method has traditionally been applied to deterministic limit cycle oscillators, there is growing interest in extending the method to take into account the effects of noise, in particular, the phenomenon of noise-induced phase synchronization [8–15]. This concerns the counterintuitive idea that an ensemble of independent oscillators can be synchronized by a randomly fluctuating input applied globally to all of the oscillators. Evidence for such an effect has been found in experimental studies of oscillations in the olfactory bulb [11]. It is also suggested by the related phenomenon of spike-time reliability, in which the reproducibility of a single neuron’s output spike train across trials is greatly enhanced by a fluctuating input when compared to a constant input [16, 17].

In this paper we extend the theory of noise-induced phase synchronization to the case of a neural master equation describing the stochastic dynamics of an ensemble of uncoupled neuronal population oscillators with intrinsic and extrinsic noise. The master equation formulation of stochastic neurodynamics represents the state of each population by the number of currently active neurons, and the state transitions are chosen such that deterministic Wilson-Cowan rate equations [18, 19] are recovered in an appropriate mean-field limit (where statistical correlations can be neglected) [20–23]. We will consider the particular version of the neural master equation introduced by Bressloff [23], in which the state transition rates scale with the size *N* of each population in such a way that the Wilson-Cowan equations are obtained in the thermodynamic limit $N\to \infty $. Thus, for large but finite *N*, the network operates in a regime characterized by Gaussian-like fluctuations about attracting solutions (metastable states) of the mean-field equations (at least away from critical points), combined with rare transitions between different metastable states [24]. (In contrast, the master equation of Buice *et al.* assumes that the network operates in a Poisson-like regime at the population level [21, 22].) The Gaussian-like statistics can be captured by a corresponding neural Langevin equation that is obtained by carrying out a Kramers-Moyal expansion of the master equation [25]. One motivation for the neural master equation is that it represents an intrinsic noise source at the network level arising from finite size effects. That is, a number of studies of fully or sparsely connected integrate-and-fire networks have shown that under certain conditions, even though individual neurons exhibit Poisson-like statistics, the neurons fire asynchronously so that the total population activity evolves according to a mean-field rate equation [26–30]. 
However, formally speaking, the asynchronous state only exists in the thermodynamic limit $N\to \infty $, so that fluctuations about the asynchronous state arise for finite *N* [31–34]. (Finite-size effects in IF networks have also been studied using linear response theory [35].)

The structure of the paper is as follows. First, we introduce the basic master equation formulation of neuronal population dynamics. We reduce the master equation to a corresponding neural Langevin equation and show that both intrinsic and extrinsic noise sources lead to multiplicative white noise terms in the Langevin equation. We then consider an ensemble of uncoupled neuronal populations each of which evolves according to a neural master equation. We assume that each population supports a stable limit cycle in the deterministic or mean-field limit. We apply stochastic phase reduction and averaging methods to the corresponding system of neural Langevin equations, following along similar lines to Nakao *et al.* [12], and use this to determine how independent intrinsic noise disrupts synchronization due to a common extrinsic noise source. (Previous studies have mostly been motivated by single neuronal oscillator models, in which both the independent and common noise sources are extrinsic to the oscillator. In contrast, we consider a stochastic population model in which the independent noise sources are due to finite size effects intrinsic to each oscillator.) We then apply our analysis to one of the simplest networks known to generate limit cycle oscillations at the population level, namely, a pair of mutually coupled excitatory (*E*) and inhibitory (*I*) subpopulations [36]. A number of modeling studies of stimulus-induced oscillations and synchrony in primary visual cortex have taken the basic oscillatory unit to be an E-I network operating in a limit cycle regime [37, 38]. The E-I network represents a cortical column, which can synchronize with other cortical columns either via long-range synaptic coupling or via a common external drive. 
In the case of an E-I network, we show how the combination of intrinsic independent noise and extrinsic common noise can lead to clustering of limit cycle oscillators due to the multiplicative nature of both noise sources under the Langevin approximation. (Clustering would not occur in the case of additive noise.) Finally, we show how a similar analysis can be carried out for another important neuronal population model that exhibits limit cycle oscillations in the deterministic limit, namely, an excitatory recurrent network with synaptic depression; such a network forms the basis of various studies of spontaneous synchronous oscillations in cortex [39–43]. We also highlight how the inclusion of synaptic depression into the master equation formulation leads to a novel example of a stochastic hybrid system [44].

## 2 Neural Langevin equation

Consider a network of *M* homogeneous neuronal subpopulations labeled $i=1,\dots ,M$, each consisting of *N* neurons.^{1} Assume that all neurons of a given subpopulation are equivalent in the sense that the pairwise synaptic interaction between a neuron of subpopulation *i* and a neuron of subpopulation *j* only depends on *i* and *j*. Each neuron can be in either an active or quiescent state. Let ${N}_{i}(t)$ denote the number of active neurons of subpopulation *i* at time *t*. The state or configuration of the full system (network of subpopulations) is then specified by the vector $\mathbf{N}(t)=({N}_{1}(t),{N}_{2}(t),\dots ,{N}_{M}(t))$, where each ${N}_{i}(t)$ is treated as a discrete stochastic variable that evolves according to a one-step jump Markov process. Let $P(\mathbf{n},t)=Prob[\mathbf{N}(t)=\mathbf{n}]$ denote the probability that the full system has configuration $\mathbf{n}=({n}_{1},{n}_{2},\dots ,{n}_{M})$ at time *t*, $t>0$, given some initial distribution $P(\mathbf{n},0)$. The probability distribution is taken to evolve according to a master equation of the form [20–23]

$$\frac{dP(\mathbf{n},t)}{dt}=\sum_{k=1}^{M}\left[{\Omega}_{k,-1}(\mathbf{n}+{\mathbf{e}}_{k})P(\mathbf{n}+{\mathbf{e}}_{k},t)+{\Omega}_{k,1}(\mathbf{n}-{\mathbf{e}}_{k})P(\mathbf{n}-{\mathbf{e}}_{k},t)-\left({\Omega}_{k,1}(\mathbf{n})+{\Omega}_{k,-1}(\mathbf{n})\right)P(\mathbf{n},t)\right],\qquad (1)$$

where the transition ${n}_{i}\to {n}_{i}\pm 1$ corresponds to the activation or inactivation of a single neuron of subpopulation *i*. Here ${\mathbf{e}}_{k}$ denotes the unit vector whose *k*th component is equal to unity. The corresponding transition rates are chosen so that in the thermodynamic limit $N\to \infty $ one recovers the deterministic Wilson-Cowan equations [18, 19] (see below):

$${\Omega}_{k,-1}(\mathbf{n})={n}_{k},\qquad {\Omega}_{k,1}(\mathbf{n})=NF\left(\sum_{l=1}^{M}{w}_{kl}\frac{{n}_{l}}{N}+{I}_{k}\right),\qquad (2)$$

where ${w}_{kl}$ denotes the strength of the connections from the *l*th to the *k*th population, and ${I}_{k}$ are external inputs. The gain function *F* is taken to be the sigmoid function

$$F(u)=\frac{{F}_{0}}{1+{\mathrm{e}}^{-\gamma u}},\qquad (3)$$

with gain *γ* and maximum firing rate ${F}_{0}$. (Any threshold can be absorbed into the external inputs ${I}_{k}$.) Equation (1) preserves the normalization condition ${\sum}_{{n}_{1}\ge 0}\phantom{\rule{0.2em}{0ex}}\cdots {\sum}_{{n}_{M}\ge 0}P(\mathbf{n},t)=1$ for all $t\ge 0$. The master equation given by equations (1) and (2) is a phenomenological representation of stochastic neurodynamics [20, 23]. It is motivated by various studies of noisy spiking networks which show that under certain conditions, even though individual neurons exhibit Poisson-like statistics, the neurons fire asynchronously so that the population activity can be characterized by fluctuations around a mean rate evolving according to a deterministic mean-field equation [26–29]. On the other hand, if population activity is itself Poisson-like, then it is more appropriate to consider an *N*-independent version of the master equation, in which $NF\to F$ and $\mathbf{w}/N\to \mathbf{w}$ [21, 22]. The advantage of our choice of scaling from an analytical viewpoint is that one can treat ${N}^{-1}$ as a small parameter and use perturbation methods such as the Kramers-Moyal expansion to derive a corresponding neural Langevin equation [45].
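To make the dynamics concrete, the birth-death process defined by equations (1) and (2) can be simulated exactly with the Gillespie algorithm. The following sketch uses hypothetical values for the weight *w*, input *I* and system size *N* (and a simple linear deactivation rate, an assumption of this sketch); it illustrates how, for a single population ($M=1$), the activity $n/N$ fluctuates around the stable fixed point of the corresponding mean-field equation.

```python
import numpy as np

# Gillespie simulation of the neural master equation for a single population
# (M = 1). The birth rate N*F(w*n/N + I) and linear death rate n recover a
# Wilson-Cowan rate equation as N -> infinity. The weight w, input I and
# system size N are hypothetical values chosen for illustration.
rng = np.random.default_rng(0)

N = 1000
w, I = 2.0, -0.5
F0, gamma = 1.0, 1.0

def F(u):
    # sigmoid gain function with gain gamma and maximum rate F0
    return F0 / (1.0 + np.exp(-gamma * u))

n, t, t_end = 0, 0.0, 50.0
trace = []
while t < t_end:
    birth = N * F(w * n / N + I)   # activation: n -> n + 1
    death = float(n)               # deactivation: n -> n - 1
    total = birth + death
    t += rng.exponential(1.0 / total)
    if rng.uniform() < birth / total:
        n += 1
    else:
        n -= 1
    trace.append(n / N)

# After the transient, n/N fluctuates with O(N**-1/2) deviations around the
# mean-field fixed point x* = F(w*x* + I) (approximately 0.72 here).
x_mean = np.mean(trace[len(trace) // 2:])
```

For large *N* the relative size of the fluctuations shrinks as ${N}^{-1/2}$, which is the regime in which the Langevin approximation described next is valid.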

Strictly speaking, the mean-field description is only valid in the thermodynamic limit $N\to \infty $, and provided that this limit is taken before the limit $t\to \infty $ [24]. In this paper we are interested in the effects of intrinsic noise fluctuations arising from the fact that each neural subpopulation is finite.

Equation (12) is the neural analog of the well known chemical Langevin equation [46, 47]. (A rigorous analysis of the convergence of solutions of a chemical master equation to solutions of the corresponding Langevin equation in the mean-field limit has been carried out by Kurtz [48].) It is important to note that the Langevin equation (12) takes the form of an Ito rather than Stratonovich stochastic differential equation (SDE). This distinction will be important in our subsequent analysis.

The Langevin approximation thus holds when the number *N* of neurons in each sub-population is large but finite. It is also possible to extend the neural Langevin equation to incorporate the effects of a common extrinsic noise source. In particular, suppose that the external drive ${I}_{k}$ to the *k*th subpopulation can be decomposed into a deterministic part and a stochastic part, with the stochastic part consisting of a common white noise signal of amplitude *σ* distributed across the subpopulations by weights ${\chi}_{k}$ satisfying ${\sum}_{k=1}^{M}{\chi}_{k}=1$. Substituting for ${I}_{k}$ in equation (7) and assuming that *σ* is sufficiently small, we can Taylor expand ${\Omega}_{k,1}$ to first order in *σ* to give the extended Langevin equation (16), in which $dW(t)=\xi (t)\phantom{\rule{0.2em}{0ex}}dt$ is an additional independent Wiener process that is common to all subpopulations. We now have a combination of intrinsic noise terms that are treated in the sense of Ito, and an extrinsic noise term that is treated in the sense of Stratonovich. The latter reflects the physical assumption that external sources of noise have finite correlation times, so that we are considering the external noise to be the zero-correlation-time limit of a colored noise process.
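The Ito/Stratonovich distinction is not merely formal when the noise is multiplicative. As a toy illustration (unrelated to the paper's specific Langevin equations), consider the scalar SDE $dX=\sigma X\,dW$: under Ito the mean is conserved, whereas under Stratonovich it grows as ${\mathrm{e}}^{{\sigma}^{2}t/2}$. An Euler-Maruyama scheme converges to the former, an Euler-Heun predictor-corrector scheme to the latter.

```python
import numpy as np

# Toy comparison of Ito vs Stratonovich for multiplicative noise, using the
# scalar SDE dX = sigma * X dW (not one of the paper's equations). Exact
# means: E[X_Ito(t)] = X(0), E[X_Strat(t)] = X(0) * exp(sigma**2 * t / 2).
rng = np.random.default_rng(0)
sigma, dt, T, n_paths = 0.5, 1e-3, 1.0, 20000
steps = int(T / dt)

x_ito = np.full(n_paths, 1.0)
x_str = np.full(n_paths, 1.0)
for _ in range(steps):
    dW = np.sqrt(dt) * rng.standard_normal(n_paths)
    # Euler-Maruyama converges to the Ito solution
    x_ito = x_ito + sigma * x_ito * dW
    # Euler-Heun (predictor-corrector) converges to the Stratonovich solution
    pred = x_str + sigma * x_str * dW
    x_str = x_str + 0.5 * sigma * (x_str + pred) * dW

mean_ito = x_ito.mean()   # ~ 1.0
mean_str = x_str.mean()   # ~ exp(0.125) ~ 1.13
```

The extra Stratonovich drift ${\sigma}^{2}X/2$ visible in this toy example is precisely the kind of interpretation-dependent term that must be tracked when combining the Ito intrinsic noise with the Stratonovich extrinsic noise.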

## 3 Stochastic synchronization of an ensemble of population oscillators

Suppose that, in the deterministic limit, each population network supports a stable limit cycle with natural frequency $\omega =2\pi /T$ and phase $\theta =\omega t$, where *T* is the period of the oscillations. The Langevin equation (16) then describes a noise-driven population oscillator. Now consider an ensemble of $\mathcal{N}$ identical population oscillators, each of which consists of *M* interacting sub-populations evolving according to a Langevin equation of the form (16). We ignore any coupling between different population oscillators, but assume that all oscillators are driven by a common source of extrinsic noise. Introducing the ensemble label *μ*, $\mu =1,\dots ,\mathcal{N}$, we thus have the system of Langevin equations (18), in which each oscillator is subject to its own independent intrinsic noise but all share the common extrinsic drive.

Langevin equations of the form (18) have been the starting point for a number of recent studies of noise-induced synchronization of uncoupled limit cycle oscillators [9, 11–15]. The one major difference from our own work is that these studies have mostly been motivated by single neuron oscillator models, in which both the independent and common noise sources are extrinsic to the oscillator. In contrast, we consider a stochastic population model in which the independent noise sources are due to finite size effects intrinsic to each oscillator. The reduction of the neural master equation (1) to a corresponding Langevin equation (16) then leads to multiplicative rather than additive noise terms; this is true for both intrinsic and extrinsic noise sources. We will show that this has non-trivial consequences for the noise-induced synchronization of an ensemble of population oscillators. In order to proceed, we carry out a stochastic phase reduction of the full Langevin equations (18), following the approach of Nakao *et al.* [12] and Ly and Ermentrout [15]. We will only sketch the analysis here, since further details can be found in these references. We do highlight one subtle difference, however, associated with the fact that the intrinsic noise terms are Ito rather than Stratonovich.

### 3.1 Stochastic phase reduction

Let ${Z}_{k}(\theta )$ denote the *k*th component of the infinitesimal phase resetting curve (PRC), defined as the gradient of the asymptotic phase with respect to perturbations of the state evaluated on the limit cycle; equivalently, $\mathbf{Z}(\theta )$ is the 2*π*-periodic solution of the adjoint linear equation [7].

Equation (34) was previously derived by Nakao *et al.* [12] (see also [15]). Here, however, there is an additional contribution to the drift term ${\mathcal{A}}^{(\mu )}$ arising from the fact that the independent noise terms appearing in the full system of Langevin equations (18) are Ito rather than Stratonovich, reflecting the fact that they arise from finite size effects.

### 3.2 Steady-state distribution for a pair of oscillators

The steady-state distribution of phase differences can be obtained by carrying out the averaging procedure of Nakao *et al.* [12]. The basic idea is to introduce the slow phase variables $\mathit{\psi}=({\psi}^{(1)},\dots ,{\psi}^{(\mathcal{N})})$ according to ${\theta}^{(\mu )}=\omega t+{\psi}^{(\mu )}$ and set $Q(\mathit{\psi},t)=P(\{\omega t+{\psi}^{(\mu )}\},t)$. For sufficiently small *ϵ* and *σ*, *Q* is a slowly varying function of time, so that we can average the Fokker-Planck equation for *Q* over one cycle of length $T=2\pi /\omega $; this yields the averaged FP equation (35) for *Q* [12].

Following Nakao *et al.* [12] and Ly and Ermentrout [15], we can now investigate the role of finite size effects on the noise-induced synchronization of population oscillators by focussing on the phase difference between two oscillators. Setting $\mathcal{N}=2$ in equation (35) gives the stationary distribution ${\Phi}_{0}(\varphi )$, equation (39), for the phase difference $\varphi ={\psi}^{(1)}-{\psi}^{(2)}$, in which ${\Gamma}_{0}$ denotes a normalization constant.

Several general results follow for the stationary distribution ${\Phi}_{0}(\varphi )$ of the phase difference *φ* of two population oscillators. First, in the absence of a common extrinsic noise source ($\sigma =0$) and for *ϵ* $>0$, ${\Phi}_{0}(\varphi )$ is a uniform distribution, which means that the oscillators are completely desynchronized. On the other hand, in the thermodynamic limit $N\to \infty $ we have $\u03f5={N}^{-1/2}\to 0$, so that the independent noise source vanishes. The distribution ${\Phi}_{0}(\varphi )$ then diverges at $\varphi =0$ while remaining positive, since it can be shown that $g(0)\ge g(\varphi )$ [12]. Hence, the phase difference between any pair of oscillators accumulates at zero, resulting in complete noise-induced synchronization. For finite *N*, intrinsic noise broadens the distribution of phase differences. Taylor expanding $g(\varphi )$ to second order in *φ* shows that, in a neighbourhood of the maximum at $\varphi =0$, we can approximate ${\Phi}_{0}(\varphi )$ by a Cauchy distribution.

The second general result is that the functions $\alpha (\theta )$ and ${\beta}_{k}(\theta )$ that determine $g(\varphi )$ and $h(\varphi )$ according to equations (38) are nontrivial products of the phase resetting curves ${Z}_{k}(\theta )$ and terms ${a}_{k}(\theta )$, ${b}_{k}(\theta )$ that depend on the transition rates of the original master equation; see equations (17), (25) and (28). This reflects the fact that both intrinsic and extrinsic noise sources in the full neural Langevin equation (18) are multiplicative rather than additive. As previously highlighted by Nakao *et al.* [12] for a Fitzhugh-Nagumo model of a single neuron oscillator, multiplicative noise can lead to additional peaks in the function $g(\varphi )$, which can induce clustering behavior within an ensemble of noise-driven oscillators. In order to determine whether or not a similar phenomenon occurs in neural population models, it is necessary to consider specific examples. We will consider two canonical models of population oscillators, one based on interacting sub-populations of excitatory and inhibitory neurons and the other based on an excitatory network with synaptic depression.

## 4 Excitatory-inhibitory (E-I) network

### 4.1 Deterministic network

Suppose that the gain function *F* is the simple sigmoid $F(u)={(1+{\mathrm{e}}^{-u})}^{-1}$, that is, ${F}_{0}=1$ and $\gamma =1$ in equation (3). Using the fact that the sigmoid function then satisfies ${F}^{\prime}=F(1-F)$, the Jacobian **J** obtained by linearizing about a fixed point $({x}_{E}^{\ast},{x}_{I}^{\ast})$ takes a simple form, and the fixed point is stable provided that both eigenvalues ${\lambda}_{\pm}$ of **J** have negative real parts. For a given weight matrix **w**, we can then construct bifurcation curves in the $({x}_{E}^{\ast},{x}_{I}^{\ast})$-plane by imposing a constraint on the eigenvalues ${\lambda}_{\pm}$. For example, the constraint $\mathrm{Re}\phantom{\rule{0.2em}{0ex}}{\lambda}_{\pm}=0$ with $\mathrm{Im}\phantom{\rule{0.2em}{0ex}}{\lambda}_{\pm}\ne 0$ determines the Hopf bifurcation curves, whose location depends on the weight matrix **w**. An example phase diagram is shown in Figure 1(b).
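For concreteness, the mean-field E-I dynamics can be integrated numerically to exhibit the limit cycle regime. The weights and inputs below are hypothetical choices that place the fixed point at $(0.5,0.5)$ beyond the Hopf bifurcation (an unstable focus); they are not the parameter values used in the paper's figures.

```python
import numpy as np

# Numerical integration of deterministic Wilson-Cowan E-I equations
#   dx_E/dt = -x_E + F(w_EE x_E - w_EI x_I + I_E)
#   dx_I/dt = -x_I + F(w_IE x_E - w_II x_I + I_I)
# with the simple sigmoid F(u) = 1/(1 + exp(-u)). The weights and inputs are
# hypothetical values chosen so that the fixed point (0.5, 0.5) is an
# unstable focus (Tr J = 0.5 > 0, det J = 5 > 0), giving a limit cycle.
w_EE, w_EI, w_IE, w_II = 14.0, 10.0, 16.0, 4.0
I_E, I_I = -2.0, -6.0

def F(u):
    return 1.0 / (1.0 + np.exp(-u))

def rhs(x):
    xE, xI = x
    return np.array([
        -xE + F(w_EE * xE - w_EI * xI + I_E),
        -xI + F(w_IE * xE - w_II * xI + I_I),
    ])

# classical RK4 integration
dt, T = 0.01, 200.0
x = np.array([0.45, 0.45])
traj = []
for _ in range(int(T / dt)):
    k1 = rhs(x)
    k2 = rhs(x + 0.5 * dt * k1)
    k3 = rhs(x + 0.5 * dt * k2)
    k4 = rhs(x + dt * k3)
    x = x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    traj.append(x.copy())

traj = np.array(traj)
tail = traj[len(traj) // 2:]                     # discard the transient
amplitude = tail[:, 0].max() - tail[:, 0].min()  # > 0 for a limit cycle
```

Since the flow points inward on the boundary of the unit square and the only fixed point is unstable, trajectories settle onto a periodic orbit, which can serve as the starting point for computing the PRC numerically.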

### 4.2 Stochastic network and noise-induced synchronization

We now turn to the stochastic E-I network. (We assume throughout that the system size *N* is sufficiently large so that the master equation can be approximated by the corresponding Langevin equation; this was also checked explicitly in computer simulations.) Having numerically computed the phase resetting curve $({Z}_{E},{Z}_{I})$ and the solution on the limit cycle for the deterministic E-I network, we can then compute the functions $g(\varphi )$ and $h(\varphi )$ of the stationary phase distribution ${\Phi}_{0}(\varphi )$ according to equations (17), (25), (28) and (38). We plot these functions in Figure 4 for the parameter values of the limit cycle shown in Figure 2, assuming symmetric common noise to the excitatory and inhibitory populations, that is, ${\chi}_{E}={\chi}_{I}=1/2$ in equation (17). It can be seen that the periodic function *g* is unimodal with $g(0)\ge g(\varphi )$, so that ${\Phi}_{0}(\varphi )$ is also unimodal with a peak at $\varphi =0$.

The sharpness of this peak is determined by the relative strengths of the intrinsic noise *ϵ* and extrinsic noise *σ*. This is illustrated in Figure 5, where the amplitude *σ* of the common signal is kept fixed but the system size *N* is varied. Increasing *N* effectively increases the correlation of the inputs by reducing the uncorrelated intrinsic noise, which results in sharper peaks and stronger synchronization; see also Marella and Ermentrout [13]. We find that there is good agreement between our analytical calculations and numerical simulations of the phase-reduced Langevin equations, as illustrated in Figure 6. We simulated the phase oscillators using an Euler-Maruyama scheme on the Ito Langevin equation (29). A large number $\mathcal{M}\approx O({10}^{2})$ of oscillators were simulated up to a large time *T* (obtained by trial and error), by which time their pairwise phase differences had reached a steady state. Since we were comparing pairwise phase differences, each simulation gave us $\frac{1}{2}\mathcal{M}(\mathcal{M}-1)$ data points, and we averaged over many simulations to obtain 10^{6} data points for each diagram in Figure 6. These were then placed into 50 bins along $[-\pi ,\pi )$ and normalised. Also shown in Figure 6(b) are data points obtained from simulations of the full planar Langevin equations. Here computations were much slower, so we only averaged over relatively few trials and thus the data are noisier. Nevertheless, a reasonable fit with the analytical distribution can still be seen.
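The Monte Carlo procedure described above can be sketched as follows. The PRC-derived drift and sensitivity functions must be computed numerically for the E-I network, so the sketch replaces them with simple sinusoidal stand-ins (and omits the Ito drift correction discussed in the text); all parameter values are hypothetical. It nevertheless shows the basic pipeline: evolve the phases with Euler-Maruyama, collect pairwise phase differences, and bin them.

```python
import numpy as np

# Euler-Maruyama simulation of an ensemble of phase oscillators driven by a
# common multiplicative noise source plus independent intrinsic noise. The
# sensitivity functions alpha_fn and beta_fn are sinusoidal stand-ins for
# the PRC-derived quantities, and the Ito drift correction is omitted.
rng = np.random.default_rng(0)

omega = 1.0       # natural frequency
sigma = 0.4       # common (extrinsic) noise amplitude
eps = 0.05        # independent (intrinsic) noise amplitude, eps ~ N**-1/2
M_osc = 100       # number of oscillators in the ensemble
dt, steps = 1e-3, 100_000

def beta_fn(theta):
    # stand-in for the common-noise sensitivity
    return 1.0 + 0.5 * np.sin(theta)

def alpha_fn(theta):
    # stand-in for the intrinsic-noise sensitivity
    return 1.0 + 0.3 * np.cos(theta)

theta = rng.uniform(-np.pi, np.pi, M_osc)
for _ in range(steps):
    dW_common = np.sqrt(dt) * rng.standard_normal()      # shared by all
    dW_indep = np.sqrt(dt) * rng.standard_normal(M_osc)  # one per oscillator
    theta = theta + (omega * dt
                     + sigma * beta_fn(theta) * dW_common
                     + eps * alpha_fn(theta) * dW_indep)

# histogram of pairwise phase differences wrapped to [-pi, pi)
phi = (theta[:, None] - theta[None, :])[np.triu_indices(M_osc, k=1)]
phi = (phi + np.pi) % (2 * np.pi) - np.pi
hist, edges = np.histogram(phi, bins=50, range=(-np.pi, np.pi), density=True)
```

With the common noise dominating the independent noise, the binned phase differences concentrate near zero, mirroring the sharpening of ${\Phi}_{0}(\varphi )$ with increasing *N*.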

Nakao *et al.* have previously shown that in the case of Stuart-Landau or Fitzhugh-Nagumo limit cycle oscillators with both uncorrelated and correlated extrinsic noise sources, parameter regimes can be found where the periodic function *g* has multiple peaks [12]. This can occur when higher harmonics of the phase resetting curve become dominant or when the common noise source is multiplicative. The presence of multiple peaks in *g* results in an ensemble of oscillators forming clustered states. Moreover, there are intermittent transitions between the clustered states induced by the uncorrelated noise. In the case of stochastic E-I limit cycle oscillators, we were unable to find a parameter regime where *g* develops multiple peaks when the common extrinsic noise source is the same for both excitatory and inhibitory populations, that is, ${\chi}_{E}={\chi}_{I}=1/2$ in equations (14) and (17). However, multiple peaks can occur when there is an asymmetry between the excitatory and inhibitory stochastic drives, as illustrated in Figure 7. The corresponding stationary distribution ${\Phi}_{0}(\varphi )$ for the phase differences *φ* also develops additional peaks, see Figure 8. When the common stochastic input is mainly presented to the inhibitory population, we find a peak at $\varphi =0$ and smaller peaks at $\varphi =\pm 2\pi /3$. Consequently, the ensemble of oscillators tends to cluster in three regions around the limit cycle, as shown in the inset of Figure 8(a). On the other hand, when the stochastic drive is predominantly to the excitatory population, we find a much sharper peak at $\varphi =0$ (compared to the symmetric case) and a small peak at $\varphi =\pi $. However, the latter does not contribute significantly to the dynamics, so that the oscillators are strongly synchronized.

## 5 Excitatory network with synaptic depression

So far we have applied the stochastic phase reduction method to a two-population model consisting of mutually interacting excitatory and inhibitory populations. This E-I network is one of the simplest population models known to exhibit limit cycle oscillations in the deterministic limit, and forms the basic module in various studies of stimulus-induced oscillations and synchronization in visual cortex [37, 38]. An even simpler population model known to exhibit limit cycle oscillations is a recurrent excitatory network with synaptic depression. For example, Tabak *et al.* [39, 40] have analyzed Wilson-Cowan mean-field equations representing a recurrent excitatory network with both slow and fast forms of synaptic depression, and used this to model the dynamics of synchronized population bursts in developing chick spinal cord. These burst oscillations are more robust in the presence of an extrinsic noise source or some form of spatial heterogeneity within the network [50, 51]. An excitatory network with synaptic depression and extrinsic noise has also been used to model transitions between cortical Up and Down states [41–43]. Here we will show how our analysis of noise-induced synchronization of population oscillators based on a Langevin approximation of a neural master equation can be extended to take into account the effects of synaptic depression. In addition to the relevance of synaptic depression in the generation of neural oscillations, it is interesting from a mathematical perspective since the resulting master equation provides a novel example of a so-called stochastic hybrid system [44, 52].

### 5.1 Deterministic network

### 5.2 Stochastic network and noise-induced synchronization

In the stochastic version of the network, the population activity evolves according to a birth-death process along the lines of Section 2, whereas intrinsic fluctuations are neglected in the dynamics of the depression variable *q*. (Both the deterministic and stochastic models make a strong simplification by assuming that synaptic depression, which occurs at individual synapses, can be represented in terms of a single scalar variable *q*.)^{2} Let $N(t)$ denote the number of excitatory neurons active at time *t*, with $P(n,t)=Prob[N(t)=n]$ evolving according to the master equation (1) with $M=1$; this gives the one-population master equation (44), in which *I* is an external input and the birth rate ${T}_{+}$ depends on the depression variable $q(t)$, with $q(t)$ itself satisfying the deterministic equation (46).

The master equation (44) is non-autonomous due to the dependence of the birth rate ${T}_{+}$ on $q(t)$, with the latter itself coupled to the associated jump Markov process via the depletion rate ${k}_{-}X(t)$. Thus equation (46) is only defined between jumps, during which *q* evolves deterministically.

The system defined by equations (44)-(46) is an example of a so-called stochastic hybrid model based on a piecewise deterministic process. This type of model has recently been applied to genetic networks [55] and to excitable neuronal membranes [44, 52, 56]. In the latter case, the hybrid model provides a mathematical formulation of excitable membranes that incorporates the exact Markovian dynamics of single stochastic ion channels. Moreover, the limit theorems of Kurtz [48] can be adapted to prove convergence of solutions of the hybrid model to solutions of a corresponding Langevin approximation in the limit $N\to \infty $ and finite time, where *N* is the number of ion channels within the membrane [44, 52].
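A piecewise deterministic process of this type can be simulated by alternating exponentially distributed jumps of the birth-death variable with deterministic integration of the depression variable between jumps. The sketch below uses illustrative stand-in rate functions and hypothetical parameters (`nu`, `I0`, `tau_q`, `k_minus`), not the paper's, and draws each jump time from the rates frozen at the start of the interval, which is an approximation when the birth rate drifts with $q(t)$.

```python
import numpy as np

# Simulation sketch of the stochastic hybrid (piecewise deterministic) model:
# the activity n(t) evolves as a birth-death process whose birth rate depends
# on the depression variable q(t), while q(t) is integrated deterministically
# between jumps. Rates and parameters are illustrative stand-ins.
rng = np.random.default_rng(1)

N = 100
nu, I0 = 4.0, -1.0          # recurrent weight and external input (hypothetical)
tau_q, k_minus = 50.0, 0.1  # depression recovery time and depletion rate

def F(u):
    return 1.0 / (1.0 + np.exp(-u))

def birth(n, q):
    return N * F(nu * q * n / N + I0)   # activation rate, depressed by q

def death(n):
    return float(n)                     # deactivation rate

n, q, t, t_end, dt_max = 10, 1.0, 0.0, 20.0, 0.01
while t < t_end:
    # approximation: draw the next jump time using the rates at the start
    # of the interval, even though the birth rate drifts with q(t)
    t_jump = rng.exponential(1.0 / (birth(n, q) + death(n)))
    step = min(t_jump, t_end - t)
    # integrate dq/dt = (1 - q)/tau_q - k_minus * q * (n/N) over the interval
    n_sub = max(1, int(np.ceil(step / dt_max)))
    h = step / n_sub
    for _ in range(n_sub):
        q += h * ((1.0 - q) / tau_q - k_minus * q * n / N)
    t += step
    if step == t_jump and t < t_end:
        # choose birth or death with probability proportional to its rate
        b, d = birth(n, q), death(n)
        if rng.uniform() < b / (b + d):
            n += 1
        else:
            n -= 1
```

Exact schemes for such hybrid processes instead integrate the time-dependent rate along the deterministic flow (or use thinning), but the frozen-rate approximation above suffices to illustrate the alternation of jumps and deterministic drift.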

Treating $x=n/N$ as a continuous variable, we Taylor expand the master equation to second order in $1/N$ to obtain a Fokker-Planck equation, whose associated Langevin equation is equation (50). As in Section 2, a common extrinsic noise source can be incorporated by decomposing the external input *I* into the sum of a constant input *h* and a stochastic term $\sigma \stackrel{\u02c6}{\xi}(t)$, where $\stackrel{\u02c6}{\xi}(t)$ is a white noise term of strength *σ*. Substituting for *I* in equation (50) and assuming that *σ* is sufficiently small, we can Taylor expand to first order in *σ* to give an extended Langevin equation in which *W* is a second independent Wiener process that is common to all oscillators.

The stochastic phase reduction and averaging analysis now proceeds exactly as in Section 3, following Nakao *et al.* [12]. The final result of this analysis is the steady-state distribution ${\Phi}_{0}(\varphi )$ for the phase difference *φ* of any pair of oscillators, given by equation (39) with *α* and *β* given by equation (56). An example plot of the periodic functions $g(\psi )$ and $h(\psi )$ for an excitatory network with synaptic depression is given in Figure 11. In Figure 12 we plot an example of the distribution ${\Phi}_{0}$, illustrating how, as in the case of an E-I network, the synchronizing effects of a common extrinsic noise source are counteracted by the uncorrelated intrinsic noise arising from finite-size effects.

## 6 Discussion

In this paper we extended the theory of noise-induced synchronization to a stochastic Wilson-Cowan model of neural population dynamics formulated as a neural master equation. We considered two canonical network structures that are known to exhibit limit cycle oscillations in the deterministic limit: an E-I network of mutually interacting excitatory and inhibitory populations, and an excitatory network with synaptic depression. In both cases, we used phase reduction methods and averaging theory to explore the effects of intrinsic noise on the synchronization of uncoupled limit cycle oscillators driven by a common extrinsic noise source. We achieved this by first approximating the neural master equation by a corresponding neural Langevin equation. Such an approximation is reasonable for sufficiently large system size *N*, and provided that there do not exist other stable attractors of the deterministic system [24]. One important consequence of intrinsic noise is that it broadens the distribution of phase differences. The degree of broadening depends on the term ${N}^{-1}h(0)$, see equation (39), where *N* is the system size and $h(0)$ depends on the intrinsic dynamics of each uncoupled limit cycle oscillator. Another result of our study is that the reduction of the master equation generates multiplicative rather than additive terms in the associated Langevin equation for both intrinsic and extrinsic noise sources. Multiplicative noise can lead to clustering of limit cycle oscillators, as was demonstrated in the case of an ensemble of uncoupled E-I networks.

It is important to point out that the master equation formulation of stochastic neurodynamics developed here and elsewhere [21–24] is a phenomenological representation of stochasticity at the population level. It is not derived from a detailed microscopic model of synaptically coupled spiking neurons, and it is not yet clear under what circumstances such a microscopic model would yield population activity consistent with the master equation approach. Nevertheless, if one views the Wilson-Cowan rate equations [18, 19] as an appropriate description of large-scale neural activity in the deterministic limit, it is reasonable to explore ways of adding noise to such equations from a top-down perspective. One possibility is to consider a Langevin version of the Wilson-Cowan equations involving some form of extrinsic additive white noise [57, 58], whereas another is to view the Wilson-Cowan rate equations as the thermodynamic limit of an underlying master equation that describes the effects of intrinsic noise [20–23]. As we have highlighted in this paper, the latter leads to a multiplicative rather than additive form of noise.

There are a number of possible extensions of this work. First, one could consider more complicated network architectures that generate limit cycle oscillations at the population level. One particularly interesting example is a competitive network consisting of two excitatory populations with synaptic depression (or some other form of slow adaptation) that mutually inhibit each other. Such a network has recently been used to model noise-induced switching during binocular rivalry [59–64]. Binocular rivalry concerns the phenomenon whereby perception switches back and forth between different images presented to either eye [65, 66]. Experimentally, it has been found that the eye dominance time statistics may be fit to a gamma distribution, suggesting that binocular rivalry is driven by a stochastic process [67]. One possibility is that there is an extrinsic source of noise associated with the input stimuli. A number of recent models have examined dominance switching times due to additive noise in a competitive Wilson-Cowan model with additional slow adapting variables [61–63]. On the other hand, Laing and Chow [59] considered a deterministic spiking neuron model of binocular rivalry in which the statistics of the resulting dominance times appeared noisy due to the aperiodicity of the high-dimensional system’s trajectories. The latter is suggestive of an effective intrinsic noise source within a rate-based population model. A second extension of our work would be to introduce synaptic coupling between the limit cycle oscillators. For example, in the case of E-I networks such coupling could represent intracortical connections between columns in visual cortex [37, 38]. The effects of mutual coupling on noise-induced synchronization have been explored within the context of a pair of coupled conductance-based neurons [15].
Finally, the neural master equation has certain similarities to individual-based models in theoretical ecology, in particular, stochastic urn models of predator-prey systems [68, 69]. Given that predator-prey systems often exhibit limit cycle oscillations and receive extrinsic environmental signals, it would be interesting to extend our results on neuronal population oscillators to explore the effects of demographic noise on the stimulus-induced synchronization of an ensemble of ecological communities.
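The role of demographic noise can be illustrated with a standard Gillespie (stochastic simulation algorithm) realization of an individual-based predator-prey model; the reaction scheme and rate constants below are generic textbook choices rather than those of the urn models in [68, 69].

```python
import numpy as np

def gillespie_lv(A0=100, B0=50, b=1.0, p=0.01, d=1.0, T=20.0, seed=2):
    """Exact stochastic simulation of a predator-prey birth-death process.

    Reactions: prey birth A -> 2A (rate b*A), predation A + B -> 2B
    (rate p*A*B), predator death B -> 0 (rate d*B). Demographic noise
    arises from the discreteness of the populations; the mean-field
    limit recovers deterministic Lotka-Volterra-type oscillations.
    """
    rng = np.random.default_rng(seed)
    A, B, t = A0, B0, 0.0
    times, prey, pred = [0.0], [A0], [B0]
    while t < T:
        rates = np.array([b * A, p * A * B, d * B], dtype=float)
        total = rates.sum()
        if total == 0.0:          # both populations extinct
            break
        t += rng.exponential(1.0 / total)
        which = rng.choice(3, p=rates / total)
        if which == 0:
            A += 1                # prey birth
        elif which == 1:
            A -= 1; B += 1        # predation event
        else:
            B -= 1                # predator death
        times.append(t); prey.append(A); pred.append(B)
    return np.array(times), np.array(prey), np.array(pred)
```

Driving an ensemble of such simulations with a common external modulation of the rate constants would be the ecological analogue of the stimulus-induced synchronization studied here.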

## Footnotes

^{1}One could take the number of neurons in each sub-population to be different provided that they all scaled with *N*. For example, one could identify the system size parameter *N* with the mean number of synaptic connections into a neuron in a sparsely coupled network.

^{2}In order to relate the population depression variable *q* to what is happening at individual synapses, we label individual neurons within an excitatory network by the index $a=1,\dots ,N$ and assume that the neurons are globally coupled. Suppose that the firing rate ${r}_{a}$ of the *a*th neuron evolves according to

$$\tau \frac{d{r}_{a}}{dt}=-{r}_{a}+F\left(\frac{1}{N}\sum _{b=1}^{N}{q}_{ab}{r}_{b}\right),$$

where ${q}_{ab}$ is the depression variable of the synapse from neuron *b* to neuron *a*, which evolves as

$$\frac{d{q}_{ab}}{dt}=\frac{1-{q}_{ab}}{{\tau }_{q}}-\beta {q}_{ab}{r}_{b}.$$

Summing the latter equation over *b* and dividing through by *N* leads to the following equation for ${q}_{a}={N}^{-1}{\sum }_{b=1}^{N}{q}_{ab}$:

$$\frac{d{q}_{a}}{dt}=\frac{1-{q}_{a}}{{\tau }_{q}}-\frac{\beta }{N}\sum _{b=1}^{N}{q}_{ab}{r}_{b}.$$

If we make the mean-field approximation ${N}^{-1}{\sum }_{b}{q}_{ab}{r}_{b}\approx {q}_{a}r$ with $r={N}^{-1}{\sum }_{b}{r}_{b}$, average the rate equation over *a* and again impose the mean field approximation, we see that

$$\tau \frac{dr}{dt}=-r+F\left({q}_{a}r\right),\qquad \frac{d{q}_{a}}{dt}=\frac{1-{q}_{a}}{{\tau }_{q}}-\beta {q}_{a}r.$$

Finally, noting that ${q}_{a}(t)\to q(t)$ for sufficiently large *t* (after transients have disappeared), we recover equations (42). In constructing a stochastic version of the network, we will assume that the above mean-field approximation still holds even though the activity variables are now random. See [70] for a recent discussion of the validity of mean-field approximations in a stochastic network model with synaptic depression.
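The accuracy of such a mean-field closure can be checked numerically. The sketch below integrates a full set of per-synapse depression variables ${q}_{ab}$ alongside a closed mean-field equation and compares the population averages; the specific functional form (Tsodyks-Markram-type depression with recovery time ${\tau }_{q}$ and depletion rate *β*), the frozen heterogeneous rates and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def depression_check(N=100, tau_q=0.5, beta=1.0, T=5.0, dt=1e-3, seed=1):
    """Compare full per-synapse depression dynamics with mean field.

    Full model:   dq_ab/dt = (1 - q_ab)/tau_q - beta * q_ab * r_b
    Mean field:   dq/dt    = (1 - q)/tau_q    - beta * q * <r>
    with frozen heterogeneous presynaptic rates r_b. Returns the
    population average <q_ab> and the mean-field value q at time T.
    """
    rng = np.random.default_rng(seed)
    r = rng.uniform(0.5, 1.5, size=N)    # frozen presynaptic firing rates
    q_ab = np.ones((N, N))               # one depression variable per synapse
    q_mf = 1.0                           # closed mean-field variable
    for _ in range(int(round(T / dt))):
        q_ab += dt * ((1.0 - q_ab) / tau_q - beta * q_ab * r[None, :])
        q_mf += dt * ((1.0 - q_mf) / tau_q - beta * q_mf * r.mean())
    return q_ab.mean(), q_mf
```

For modest rate heterogeneity the two averages agree closely; the residual discrepancy reflects the error incurred by replacing ${N}^{-1}{\sum }_{b}{q}_{ab}{r}_{b}$ with the product of the separate averages.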

## Declarations

### Acknowledgements

This publication was based on work supported in part by the National Science Foundation (DMS-0813677) and by Award No KUK-C1-013-4 made by King Abdullah University of Science and Technology (KAUST). PCB was also partially supported by the Royal Society Wolfson Foundation.

## References

1. Buzsaki G: *Rhythms of the Brain*. Oxford University Press, Oxford; 2006.
2. Singer W, Gray CM: **Visual feature integration and the temporal correlation hypothesis.** *Annu. Rev. Neurosci.* 1995, **18:** 555–586.
3. Gray CM: **The temporal correlation hypothesis of visual feature integration: still alive and well.** *Neuron* 1999, **24:** 31–47.
4. Salinas E, Sejnowski TJ: **Correlated neuronal activity and the flow of neural information.** *Nat. Rev. Neurosci.* 2001, **4:** 539–550.
5. Sejnowski TJ, Paulsen O: **Network oscillations: emerging computational principles.** *J. Neurosci.* 2006, **26:** 1673–1676.
6. Kuramoto Y: *Chemical Oscillations, Waves, and Turbulence*. Springer, Berlin; 1984.
7. Ermentrout GB, Kopell N: **Multiple pulse interactions and averaging in coupled neural oscillators.** *J. Math. Biol.* 1991, **29:** 195–217.
8. Pikovsky AS: **Synchronization and stochastization of an ensemble of autogenerators by external noise.** *Radiophys. Quantum Electron.* 1984, **27:** 576–581.
9. Teramae JN, Tanaka D: **Robustness of the noise-induced phase synchronization in a general class of limit cycle oscillators.** *Phys. Rev. Lett.* 2004, **93.**
10. Goldobin DS, Pikovsky A: **Synchronization and desynchronization of self-sustained oscillators by common noise.** *Phys. Rev. E* 2005, **71.**
11. Galan RF, Ermentrout GB, Urban NN: **Correlation-induced synchronization of oscillations in olfactory bulb neurons.** *J. Neurosci.* 2006, **26:** 3646–3655.
12. Nakao H, Arai K, Kawamura Y: **Noise-induced synchronization and clustering in ensembles of uncoupled limit cycle oscillators.** *Phys. Rev. Lett.* 2007, **98.**
13. Marella S, Ermentrout GB: **Class-II neurons display a higher degree of stochastic synchronization than class-I neurons.** *Phys. Rev. E* 2008, **77.**
14. Teramae JN, Nakao H, Ermentrout GB: **Stochastic phase reduction for a general class of noisy limit cycle oscillators.** *Phys. Rev. Lett.* 2009, **102.**
15. Ly C, Ermentrout GB: **Synchronization of two coupled neural oscillators receiving shared and unshared noisy stimuli.** *J. Comput. Neurosci.* 2009, **26:** 425–443.
16. Mainen ZF, Sejnowski TJ: **Reliability of spike timing in neocortical neurons.** *Science* 1995, **268:** 1503–1506.
17. Galan RF, Ermentrout GB, Urban NN: **Optimal time scale for spike-time reliability: theory, simulations and experiments.** *J. Neurophysiol.* 2008, **99:** 277–283.
18. Wilson HR, Cowan JD: **Excitatory and inhibitory interactions in localized populations of model neurons.** *Biophys. J.* 1972, **12:** 1–23.
19. Wilson HR, Cowan JD: **A mathematical theory of the functional dynamics of cortical and thalamic nervous tissue.** *Kybernetik* 1973, **13:** 55–80.
20. Ohira T, Cowan JD: **Stochastic neurodynamics and the system size expansion.** In *Proceedings of the First International Conference on Mathematics of Neural Networks*. Edited by: Ellacott S, Anderson IJ. Academic Press, New York; 1997:290–294.
21. Buice M, Cowan JD: **Field-theoretic approach to fluctuation effects in neural networks.** *Phys. Rev. E* 2007, **75.**
22. Buice M, Cowan JD, Chow CC: **Systematic fluctuation expansion for neural network activity equations.** *Neural Comput.* 2010, **22:** 377–426.
23. Bressloff PC: **Stochastic neural field theory and the system-size expansion.** *SIAM J. Appl. Math.* 2009, **70:** 1488–1521.
24. Bressloff PC: **Metastable states and quasicycles in a stochastic Wilson-Cowan model of neuronal population dynamics.** *Phys. Rev. E* 2010, **82.**
25. Gardiner C: *Stochastic Methods*. Springer, Berlin; 2006.
26. Abbott LF, van Vreeswijk C: **Asynchronous states in networks of pulse-coupled oscillators.** *Phys. Rev. E* 1993, **48:** 1483–1490.
27. Gerstner W, van Hemmen JL: **Coherence and incoherence in a globally coupled ensemble of pulse-emitting units.** *Phys. Rev. Lett.* 1993, **71:** 312–315.
28. Brunel N, Hakim V: **Fast global oscillations in networks of integrate-and-fire neurons with low firing rates.** *Neural Comput.* 1999, **11:** 1621–1671.
29. Brunel N: **Dynamics of sparsely connected networks of excitatory and inhibitory spiking neurons.** *J. Comput. Neurosci.* 2000, **8:** 183–208.
30. Renart A, Brunel N, Wang XJ: **Mean-field theory of irregularly spiking neuronal populations and working memory in recurrent cortical networks.** In *Computational Neuroscience: A Comprehensive Approach*. Edited by: Feng J. CRC Press, Boca Raton, FL; 2004:431–490.
31. Meyer C, van Vreeswijk C: **Temporal correlations in stochastic networks of spiking neurons.** *Neural Comput.* 2002, **14:** 369–404.
32. Mattia M, Del Giudice P: **Population dynamics of interacting spiking neurons.** *Phys. Rev. E* 2002, **66.**
33. Soula H, Chow CC: **Stochastic dynamics of a finite-size spiking neural network.** *Neural Comput.* 2007, **19:** 3262–3292.
34. El Boustani S, Destexhe A: **A master equation formalism for macroscopic modeling of asynchronous irregular activity states.** *Neural Comput.* 2009, **21:** 46–100.
35. Doiron B, Lindner B, Longtin A, Maler L, Bastian J: **Oscillatory activity in electrosensory neurons increases with the spatial correlation of the stochastic input stimulus.** *Phys. Rev. Lett.* 2004, **93.**
36. Borisyuk R, Kirillov AB: **Bifurcation analysis of a neural network model.** *Biol. Cybern.* 1992, **66:** 319–325.
37. Schuster HG, Wagner P: **A model for neuronal oscillations in the visual cortex.** *Biol. Cybern.* 1990, **64:** 77–82.
38. Grannan ER, Kleinfeld D, Sompolinsky H: **Stimulus-dependent synchronization of neuronal assemblies.** *Neural Comput.* 1993, **5:** 550–569.
39. Tabak J, Senn W, O’Donovan MJ, Rinzel J: **Modeling of spontaneous activity in developing spinal cord using activity-dependent depression in an excitatory network.** *J. Neurosci.* 2000, **20:** 3041–3056.
40. Tabak J, O’Donovan MJ, Rinzel J: **Differential control of active and silent phases in relaxation models of neuronal rhythms.** *J. Comput. Neurosci.* 2006, **21:** 307–328.
41. Holcman D, Tsodyks M: **The emergence of Up and Down states in cortical networks.** *PLoS Comput. Biol.* 2006, **2:** 174–181.
42. Kilpatrick ZP, Bressloff PC: **Effects of synaptic depression and adaptation on spatiotemporal dynamics of an excitatory neuronal network.** *Physica D* 2010, **239:** 547–560.
43. Kilpatrick ZP, Bressloff PC: **Spatially structured oscillations in a two-dimensional excitatory neuronal network with synaptic depression.** *J. Comput. Neurosci.* 2010, **28:** 193–209.
44. Pakdaman K, Thieullen M, Wainrib G: **Fluid limit theorems for stochastic hybrid systems with application to neuron models.** *Adv. Appl. Probab.* 2010, **42:** 761–794.
45. Van Kampen NG: *Stochastic Processes in Physics and Chemistry*. North-Holland, Amsterdam; 1992.
46. Gillespie DT: **The chemical Langevin equation.** *J. Chem. Phys.* 2000, **113:** 297–306.
47. Lei J: **Stochasticity in single gene expression with both intrinsic noise and fluctuations in kinetic parameters.** *J. Theor. Biol.* 2009, **256:** 485–492.
48. Kurtz TG: **Limit theorems and diffusion approximations for density dependent Markov chains.** *Math. Program. Stud.* 1976, **5:** 67–78.
49. Ermentrout GB: **Noisy oscillators.** In *Stochastic Methods in Neuroscience*. Edited by: Laing CR, Lord GJ. Oxford University Press, Oxford; 2009.
50. Vladimirski B, Tabak J, O’Donovan MJ, Rinzel J: **Episodic activity in a heterogeneous excitatory network, from spiking neurons to mean field.** *J. Comput. Neurosci.* 2008, **25:** 39–63.
51. Nesse WH, Borisyuk A, Bressloff PC: **Fluctuation-driven rhythmogenesis in an excitatory neuronal network with slow adaptation.** *J. Comput. Neurosci.* 2008, **25:** 317–333.
52. Buckwar E, Riedler MG: **An exact stochastic hybrid model of excitable membranes including spatio-temporal evolution.** *J. Math. Biol.* 2011.
53. Abbott LF, Varela JA, Sen K, Nelson SB: **Synaptic depression and cortical gain control.** *Science* 1997, **275:** 220–224.
54. Tsodyks MS, Pawelzik K, Markram H: **Neural networks with dynamic synapses.** *Neural Comput.* 1998, **10:** 821–835.
55. Zeisler S, Franz U, Wittich O, Liebscher V: **Simulation of genetic networks modelled by piecewise deterministic Markov processes.** *IET Syst. Biol.* 2008, **2:** 113–135.
56. Smith GD: **Modeling the stochastic gating of ion channels.** In *Computational Cell Biology*. Edited by: Fall CP, Marland ES, Wagner JM, Tyson JJ. Springer, New York; 2002:289–323.
57. Hutt A, Longtin A, Schimansky-Geier L: **Additive noise-induced Turing transitions in spatial systems with application to neural fields and the Swift-Hohenberg equation.** *Physica D* 2008, **237:** 755–773.
58. Faugeras O, Touboul J, Cessac B: **A constructive mean-field analysis of multi-population neural networks with random synaptic weights and stochastic inputs.** *Front. Comput. Neurosci.* 2009, **3:** 1–28.
59. Laing CR, Chow CC: **A spiking neuron model for binocular rivalry.** *J. Comput. Neurosci.* 2002, **12:** 39–53.
60. Wilson HR: **Computational evidence for a rivalry hierarchy in vision.** *Proc. Natl. Acad. Sci. USA* 2003, **100:** 14499–14503.
61. Moreno-Bote R, Rinzel J, Rubin N: **Noise-induced alternations in an attractor network model of perceptual bistability.** *J. Neurophysiol.* 2007, **98:** 1125–1139.
62. Shpiro A, Curtu R, Rinzel J, Rubin N: **Dynamical characteristics common to neuronal competition models.** *J. Neurophysiol.* 2007, **97:** 462–473.
63. Shpiro A, Moreno-Bote R, Rubin N, Rinzel J: **Balance between noise and adaptation in competition models of perceptual bistability.** *J. Comput. Neurosci.* 2009, **27:** 37–54.
64. Kilpatrick ZP, Bressloff PC: **Binocular rivalry in a competitive neural network with synaptic depression.** *SIAM J. Appl. Dyn. Syst.* 2010, **9:** 1303–1347.
65. Blake R: **A primer on binocular rivalry, including current controversies.** *Brain and Mind* 2001, **2:** 5–38.
66. Blake R, Logothetis N: **Visual competition.** *Nat. Rev. Neurosci.* 2002, **3:** 13–23.
67. Logothetis NK, Leopold DA, Sheinberg DL: **What is rivalling during binocular rivalry?** *Nature* 1996, **380:** 621–624.
68. McKane AJ, Newman TJ: **Stochastic models in population biology and their deterministic analogs.** *Phys. Rev. E* 2004, **70.**
69. Tome T, de Oliveira MJ: **Role of noise in population dynamics cycles.** *Phys. Rev. E* 2009, **79.**
70. Igarashi Y, Oizumi M, Okada M: **Mean field analysis of stochastic neural network models with synaptic depression.** *J. Phys. Soc. Jpn.* 2010, **79.**

## Copyright

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.