To whom it may concern,

I am writing about a problem with feeding noise into my topology network after it has been stimulated with a signal. The network oscillates under a noise generator, which is created as a single layer and provides noise to the other nodes (which themselves consist of layers). When I stimulate the network with a stronger signal from another, separate layer, the nodes stop reacting to the noise input after that simulation and remain stuck at the reset voltage for periods much longer than the refractory period.
The model is a derivative of the Hill-Tononi example network.
NEST v2.20.1 is running on Ubuntu 20.04 under Conda/Spyder.
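
One way to quantify how long the nodes stay stuck at the reset voltage is to record their membrane potential with a multimeter. The snippet below is only a rough sketch under NEST 2.20; affected_gids is a hypothetical tuple of a few affected GIDs, and the recording interval and simulation time are placeholders:

import nest

# Hypothetical GIDs of a few nodes that appear stuck (placeholder values)
affected_gids = (101, 102, 103)

# Record the membrane potential of the affected nodes
mm = nest.Create('multimeter', params={'interval': 0.1,
                                       'record_from': ['V_m']})
nest.Connect(mm, affected_gids)

nest.Simulate(2000.0)  # placeholder simulation time

# 'times', 'senders' and 'V_m' show how long each node sits at one level
# instead of following the noise input
events = nest.GetStatus(mm, 'events')[0]
print(events['times'], events['senders'], events['V_m'])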
To visualize the problem, I have attached two images that illustrate it.

Here is a snippet of the code with the connection and layer setup:
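
(The code below is only a minimal sketch of the kind of setup described above, written against NEST 2.20's topology module; the layer sizes, rates, weights and delays are illustrative placeholders rather than the exact values from my network.)

import nest
import nest.topology as topo

nest.ResetKernel()

# Background noise source and stronger stimulus (rates are placeholders)
nest.CopyModel('poisson_generator', 'bg_noise', {'rate': 10000.0})
nest.CopyModel('poisson_generator', 'stimulus', {'rate': 50000.0})

# Excitatory connections onto the AMPA receptor of ht_neuron,
# following the Hill-Tononi example's use of ht_synapse
ampa = nest.GetDefaults('ht_neuron')['receptor_types']['AMPA']
nest.CopyModel('ht_synapse', 'AMPA_syn', {'receptor_type': ampa})

# Main layer of Hill-Tononi neurons (size and extent are placeholders)
neurons = topo.CreateLayer({'rows': 20, 'columns': 20,
                            'extent': [1.0, 1.0],
                            'elements': 'ht_neuron'})

# Single-element layer acting as the background noise generator
noise = topo.CreateLayer({'rows': 1, 'columns': 1,
                          'extent': [1.0, 1.0],
                          'elements': 'bg_noise'})

# Separate single-element layer delivering the stronger stimulus
stim = topo.CreateLayer({'rows': 1, 'columns': 1,
                         'extent': [1.0, 1.0],
                         'elements': 'stimulus'})

# Both generators project divergently onto every neuron in the main layer
topo.ConnectLayers(noise, neurons, {'connection_type': 'divergent',
                                    'synapse_model': 'AMPA_syn',
                                    'weights': 1.0, 'delays': 1.0})
topo.ConnectLayers(stim, neurons, {'connection_type': 'divergent',
                                   'synapse_model': 'AMPA_syn',
                                   'weights': 5.0, 'delays': 1.0})

nest.Simulate(1000.0)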


Best regards,

Bc. Filip Blaštík