Asymptotic description of stochastic neural networks. I. Existence of a large deviation principle

Olivier Faugeras, James Maclaurin

Research output: Contribution to journal › Article › peer-review


Abstract

We study the asymptotic law of a network of interacting neurons as the number of neurons becomes infinite. The dynamics of the neurons are described by a set of stochastic differential equations in discrete time. The neurons interact through synaptic weights that are correlated Gaussian random variables; unlike previous works, which made the biologically unrealistic assumption that the weights are i.i.d. random variables, we assume that they are correlated. We introduce the process-level empirical measure of the trajectories of the solutions to the equations of the finite network of neurons, and the averaged law (with respect to the synaptic weights) of these trajectories. Our main result is that the image law through the empirical measure satisfies a large deviation principle with a good rate function, and we provide an analytical expression of this rate function in terms of the spectral representation of certain Gaussian processes.
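
For readers less familiar with the terminology, a large deviation principle (LDP) of the type asserted here takes the following generic form. This is a sketch in notation of our own choosing: the symbols $\hat{\mu}_N$, $X^i$, $A$ and $H$ are not taken from the paper, whose precise state space, topology and rate function are given in the article itself.

\[
  \hat{\mu}_N \;=\; \frac{1}{N}\sum_{i=1}^{N} \delta_{X^i},
\]
\[
  -\inf_{\mu \in \operatorname{int} A} H(\mu)
  \;\le\; \liminf_{N\to\infty} \frac{1}{N}\,\log P\bigl(\hat{\mu}_N \in A\bigr)
  \;\le\; \limsup_{N\to\infty} \frac{1}{N}\,\log P\bigl(\hat{\mu}_N \in A\bigr)
  \;\le\; -\inf_{\mu \in \operatorname{cl} A} H(\mu),
\]

where $X^i$ denotes the trajectory of neuron $i$, $A$ is a measurable set of probability measures on path space, and the rate function $H$ is called good when its level sets $\{\mu : H(\mu) \le c\}$ are compact.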

Original language: English (US)
Pages (from-to): 841-846
Number of pages: 6
Journal: Comptes Rendus Mathematique
Volume: 352
Issue number: 10
State: Published - Oct 1 2014
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • General Mathematics
