MultistepLIFNode surrogate backpropagation #384
-
Hello, I am currently using MultiStepLIFNode in my project and have recently reviewed the documentation of the surrogate gradient method on the Read the Docs page and the suggested paper. I understand that using a surrogate such as the sigmoid enables an easy gradient computation in the backward pass during optimization; however, I am confused about which parameters are adjusted using this information. I did not find a set of input weights, so I assume it must be the spiking threshold or the reset value. Is that correct? Unfortunately, I was unable to find the corresponding implementation, which is why I am asking here. Thank you very much for your help.
Replies: 2 comments 2 replies
-
The amplitude of the gradients is determined by the surrogate function. For example, if you use the atan surrogate function, you can set its alpha parameter to control the sharpness, and hence the amplitude, of the surrogate gradient.
The weighted inputs, the threshold, and the surrogate function all influence the gradients. The reset value has no effect in the current time-step, but it may influence future gradients.
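To make the surrogate's role concrete, here is a minimal plain-Python sketch of an atan-style surrogate. The function names and the exact formula are illustrative assumptions based on the commonly used arctangent surrogate, not SpikingJelly's actual implementation: the forward pass uses the non-differentiable Heaviside step, while the backward pass substitutes a smooth derivative whose peak height scales with alpha.

```python
import math

def heaviside(x):
    # Forward pass: the true (non-differentiable) spiking function.
    return 1.0 if x >= 0.0 else 0.0

def atan_surrogate_grad(x, alpha=2.0):
    # Backward pass: derivative of the smooth primitive
    # (1/pi) * arctan(pi/2 * alpha * x) + 1/2,
    # used in place of Heaviside's (zero/undefined) derivative.
    return alpha / (2.0 * (1.0 + (math.pi / 2.0 * alpha * x) ** 2))

# The peak of the surrogate gradient (at x = 0) grows with alpha:
print(atan_surrogate_grad(0.0, alpha=2.0))  # 1.0
print(atan_surrogate_grad(0.0, alpha=4.0))  # 2.0
```

During training, this surrogate gradient flows back through the neuron to the weights of the preceding layers; the surrogate itself has no trainable parameters unless you explicitly make alpha learnable.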
-
Yes, you need to set v_threshold as a learnable parameter. Here is an example: #371 (comment)
The input weights are the weights of the layers in front of the spiking neurons. You can refer to PyTorch's API docs for how to modify their weights.
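As a rough illustration of where the threshold and reset value enter the dynamics, here is a simplified sketch of a single LIF time-step with a hard reset. This is not SpikingJelly's actual code; tau, v_threshold, and v_reset are illustrative values:

```python
def lif_step(v, x, tau=2.0, v_threshold=1.0, v_reset=0.0):
    # Charge: leaky integration of the weighted input x.
    v = v + (x - (v - v_reset)) / tau
    # Fire: compare the membrane potential against the threshold.
    spike = 1.0 if v >= v_threshold else 0.0
    # Reset: hard reset back to v_reset after a spike.
    if spike:
        v = v_reset
    return spike, v

# Drive the neuron with a constant input for a few steps.
v = 0.0
spikes = []
for _ in range(5):
    s, v = lif_step(v, x=1.5)
    spikes.append(s)
print(spikes)  # [0.0, 1.0, 0.0, 1.0, 0.0]
```

In PyTorch, making v_threshold learnable amounts to storing it as a torch.nn.Parameter so the optimizer updates it alongside the layer weights, with the surrogate gradient providing the derivative of the firing decision with respect to the threshold.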