TY - JOUR
T1 - Asymptotic description of stochastic neural networks. II. Characterization of the limit law
AU - Faugeras, Olivier
AU - Maclaurin, James
N1 - Publisher Copyright:
© 2014 Académie des sciences.
PY - 2014/10/1
Y1 - 2014/10/1
N2 - We continue the development, started in [8], of the asymptotic description of certain stochastic neural networks. We use the Large Deviation Principle (LDP) and the good rate function H announced there to prove that H has a unique minimum μ_e, a stationary measure on the set of trajectories T^Z. We characterize this measure by its two marginals, at time 0, and from time 1 to T. The second marginal is a stationary Gaussian measure. With an eye on applications, we show that its mean and covariance operator can be computed inductively. Finally, we use the LDP to establish various convergence results, both averaged and quenched.
AB - We continue the development, started in [8], of the asymptotic description of certain stochastic neural networks. We use the Large Deviation Principle (LDP) and the good rate function H announced there to prove that H has a unique minimum μ_e, a stationary measure on the set of trajectories T^Z. We characterize this measure by its two marginals, at time 0, and from time 1 to T. The second marginal is a stationary Gaussian measure. With an eye on applications, we show that its mean and covariance operator can be computed inductively. Finally, we use the LDP to establish various convergence results, both averaged and quenched.
UR - http://www.scopus.com/inward/record.url?scp=84907983762&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84907983762&partnerID=8YFLogxK
U2 - 10.1016/j.crma.2014.08.017
DO - 10.1016/j.crma.2014.08.017
M3 - Article
AN - SCOPUS:84907983762
SN - 1631-073X
VL - 352
SP - 847
EP - 852
JO - Comptes Rendus Mathematique
JF - Comptes Rendus Mathematique
IS - 10
ER -