Periodic solutions in threshold-linear networks and their entrainment

Andrea Bel, Romina Cobiaga, Walter Reartes, Horacio G. Rotstein

Research output: Contribution to journal › Article › peer-review

4 Scopus citations


Threshold-linear networks (TLNs) are recurrent networks whose dynamics are threshold-linear (linearly rectified at zero). Mathematically, they consist of coupled nonsmooth ordinary differential equations. When the nodes in the network are assumed to be neurons or neuronal populations, TLNs represent firing rate models. We investigate the dynamics of a subclass of TLNs, referred to as competitive TLNs, in which all connections between different nodes are inhibitory. We prove the existence of periodic solutions in competitive TLNs with three nodes using a combination of mathematical analysis and numerical simulations. We calculate analytical expressions for the periodic solutions; we then reduce the problem to a system of transcendental equations and apply a convergence theorem of Kantorovich to demonstrate the existence of these solutions. We then analyze the attributes (frequency and amplitude) of these periodic solutions as the model parameters vary. Finally, we study the entrainment properties of competitive TLNs in the oscillatory regime by examining their response to external periodic inputs applied to one of the nodes in the network. We numerically determine the ranges of input amplitudes and frequencies for which competitive TLNs are able to follow the periodic input, both for three-node networks and for larger networks with cyclic symmetry.
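The class of models described above can be illustrated with a minimal simulation. The sketch below (not the authors' code) integrates a three-node competitive TLN of the form dx_i/dt = -x_i + [Σ_j W_ij x_j + b_i]_+, where [·]_+ = max(·, 0), using forward Euler. The cyclic inhibitory weight matrix and the specific parameter values (eps = 0.25, delta = 0.5, constant drive b = 1) are illustrative assumptions chosen so that the network exhibits a periodic solution; they are not taken from the paper.

```python
import numpy as np

def relu(z):
    """Threshold-linear (rectified) nonlinearity [z]_+ = max(z, 0)."""
    return np.maximum(z, 0.0)

def simulate_tln(W, b, x0, dt=1e-3, T=60.0):
    """Forward-Euler integration of dx/dt = -x + [W x + b]_+.

    Returns the full trajectory as an (n_steps, n_nodes) array.
    """
    n_steps = int(T / dt)
    x = np.array(x0, dtype=float)
    traj = np.empty((n_steps, x.size))
    for k in range(n_steps):
        x = x + dt * (-x + relu(W @ x + b))
        traj[k] = x
    return traj

# Competitive network: zero self-coupling, purely inhibitory cross-coupling.
# Directed 3-cycle 1 -> 2 -> 3 -> 1 with weaker inhibition along the cycle
# (-1 + eps) and stronger inhibition against it (-1 - delta); illustrative
# parameter choices, not the paper's.
eps, delta = 0.25, 0.5
W = np.array([[ 0.0,        -1 - delta, -1 + eps  ],
              [-1 + eps,     0.0,       -1 - delta],
              [-1 - delta,  -1 + eps,    0.0      ]])
b = np.ones(3)  # constant external drive to every node

traj = simulate_tln(W, b, x0=[0.4, 0.2, 0.1])
```

After an initial transient the three nodes fire in a repeating sequence, which can be checked by measuring the peak-to-trough amplitude of one coordinate over the tail of the trajectory. A periodic external input to one node (for the entrainment study) would simply add a time-dependent term to `b` inside the integration loop.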

Original language: English (US)
Pages (from-to): 1177-1208
Number of pages: 32
Journal: SIAM Journal on Applied Dynamical Systems
Issue number: 3
State: Published - 2021

All Science Journal Classification (ASJC) codes

  • Analysis
  • Modeling and Simulation


Keywords

  • Nonsmooth dynamical systems
  • Periodic solutions
  • Recurrent neural networks


