Question: in the following excerpt,

for j, x in enumerate(layer2):
    active = []
    if(x.t_rest < t):
        # integrate: add this timestep's weighted input spikes to the potential
        x.P = x.P + np.dot(synapse[j], train[:,t])
        if(x.P > par.Prest):
            # leak: subtract a constant D whenever the potential is above rest
            x.P -= par.D
        active_pot[j] = x.P
Dear researcher, sorry to bother you!
I have a doubt about the excerpted code above and would like to discuss it with you.
Regarding x.P -= par.D: I still can't figure out the meaning of subtracting par.D. I checked the value of par.D, which is 0.75 in your document, and I looked up related material, but I still don't clearly understand its meaning.
At your convenience, would you please help me with this question? My email is [email protected]. Thank you very much, and I look forward to hearing from you!
Hey,
It's how the "leaky integrate-and-fire" neuron model is defined: the neuron's potential decays every timestep. You can refer to the following book for a deeper understanding: https://neuronaldynamics.epfl.ch/online/Ch5.S2.html
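To make the decay concrete, here is a minimal standalone sketch of that leak step. The parameter values mirror the snippet above (par.D = 0.75, resting potential Prest); the starting potential and loop length are made up for illustration:

Prest = 0.0   # resting potential (par.Prest in the repo)
D = 0.75      # constant leak per timestep (par.D in the repo)

P = 5.0       # membrane potential after receiving some input (illustrative)
trace = [P]
for t in range(8):
    if P > Prest:
        P -= D        # the line in question: potential decays back toward rest
    trace.append(P)
print(trace)  # P steps down by 0.75 each timestep until it falls to (just below) rest

With no new input, the potential drains away linearly; with input arriving, the np.dot term pushes it up while the leak pulls it back down, which is exactly the "leaky" part of the model.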
A few days ago I read the book you linked about the characteristics of LIF neurons, and I also know some of the dynamic characteristics of SRM neurons. As I understand it, a neuron's membrane potential is lost during the transmission of excitation, much like the loss along an axon during transmission. But I have a problem: in the code I quoted in my previous question, I expected the loss to occur after the neuron discharges, and only then x.P -= par.D; yet the code never checks that a discharge has actually happened.
What I see in that code is: as soon as the membrane potential is greater than the resting potential, the loss occurs:

if(x.P > par.Prest):
    ........
    x.P -= par.D

I don't quite understand why par.D is subtracted at that point, based only on the comparison against par.Prest.
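For what it's worth, in a standard LIF formulation the leak and the post-spike reset are two separate mechanisms: the leak applies on every timestep that the potential sits above rest, while the reset applies only after the neuron fires. A minimal sketch contrasting the two is below; the threshold Pth and the input values are hypothetical, not parameters from this repo:

Prest = 0.0   # resting potential
Pth = 10.0    # firing threshold (hypothetical value)
D = 0.75      # per-timestep leak, as in par.D

P = Prest
inputs = [4.0, 4.0, 4.0, 4.0, 0.0]  # made-up input currents
for t, I in enumerate(inputs):
    P += I                 # integrate the input
    if P > Prest:
        P -= D             # leak: applied every timestep, whether or not a spike occurs
    if P >= Pth:
        print(f"t={t}: spike")
        P = Prest          # reset: applied only after a spike
    print(f"t={t}: P={P:.2f}")

So the x.P -= par.D line you quoted is the leak, not the post-discharge loss; it fires or not, the potential still decays.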