Asymptotic description of stochastic neural networks. II. Characterization of the limit law

Olivier Faugeras, James Maclaurin

Research output: Contribution to journal › Article › peer-review


Abstract

We continue the development, started in [8], of the asymptotic description of certain stochastic neural networks. We use the Large Deviation Principle (LDP) and the good rate function H announced there to prove that H has a unique minimum μ_e, a stationary measure on the set of trajectories T^Z. We characterize this measure by its two marginals: at time 0, and from time 1 to T. The second marginal is a stationary Gaussian measure. With an eye on applications, we show that its mean and covariance operator can be inductively computed. Finally, we use the LDP to establish various convergence results, both averaged and quenched.
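The paper's recursion for the mean and covariance operator is not reproduced in this abstract. As a purely illustrative sketch of what "inductively computed" means for a stationary Gaussian sequence, consider a toy linear-Gaussian model (all parameters here are assumptions for illustration, not the paper's equations):

```python
# Toy model (NOT the paper's equations): a linear-Gaussian recursion
# X_{t+1} = a*X_t + b + W_t, with W_t ~ N(0, s2), started from its
# stationary marginal so every time-t marginal is the same Gaussian.
a, b, s2 = 0.5, 1.0, 0.25
T = 5

# Stationary marginal: mean m solves m = a*m + b,
# variance v solves v = a**2 * v + s2.
m = b / (1.0 - a)
v = s2 / (1.0 - a**2)

means, variances = [m], [v]
for t in range(T):
    # Inductive step: propagate mean and variance one time step.
    m = a * m + b
    v = a**2 * v + s2
    means.append(m)
    variances.append(v)

# Stationarity: the marginals do not change from step to step.
print(means)      # all entries equal b/(1-a) = 2.0
print(variances)  # all entries equal s2/(1-a**2) = 1/3
```

The design point mirrors the abstract's claim: once the time-0 marginal is known, each later marginal follows by a finite recursion, so the law of the whole trajectory is determined inductively.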

Original language: English (US)
Pages (from-to): 847-852
Number of pages: 6
Journal: Comptes Rendus Mathematique
Volume: 352
Issue number: 10
State: Published - Oct 1 2014
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • General Mathematics
