
Commit 4a71866

Author: Alexander Ororbia
Commit message: cleaned up syn modeling doc
1 parent bc713f6

File tree: 2 files changed (+22, -94 lines)


README.md

Lines changed: 3 additions & 3 deletions
@@ -37,7 +37,7 @@ ngc-learn requires:
 2) NumPy (>=1.26.0)
 3) SciPy (>=1.7.0)
 4) ngcsimlib (>=0.3.b4), (visit official page <a href="https://github.com/NACLab/ngc-sim-lib">here</a>)
-5) JAX (>= 0.4.18) (to enable GPU use, make sure to install one of the CUDA variants)
+5) JAX (>= 0.4.28) (to enable GPU use, make sure to install one of the CUDA variants)
 <!--
 5) scikit-learn (>=1.3.1) if using `ngclearn.utils.density`
 6) matplotlib (>=3.4.3) if using `ngclearn.utils.viz`
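
A quick, illustrative way to confirm that an installed JAX build satisfies the bumped requirement and can see a GPU (standard JAX calls only; nothing here is specific to ngc-learn):

```python
# Sanity check for the JAX (>= 0.4.28) requirement noted above.
import jax

print("jax version:", jax.__version__)   # should report 0.4.28 or newer
print("devices:", jax.devices())         # lists GPU devices only if a CUDA-enabled JAX build is installed
```
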
@@ -79,7 +79,7 @@ Python 3.11.4 (main, MONTH DAY YEAR, TIME) [GCC XX.X.X] on linux
 Type "help", "copyright", "credits" or "license" for more information.
 >>> import ngclearn
 >>> ngclearn.__version__
-'1.2b3'
+'2.0.0'
 ```

 <i>Note:</i> For access to the previous Tensorflow-2 version of ngc-learn (of
@@ -126,7 +126,7 @@ $ python install -e .
 </pre>

 **Version:**<br>
-1.2.3-Beta <!-- -Alpha -->
+2.0.0 <!--1.2.3-Beta--> <!-- -Alpha -->

 Author:
 Alexander G. Ororbia II<br>

docs/modeling/synapses.md

Lines changed: 19 additions & 91 deletions
@@ -1,6 +1,6 @@
 # Synapses

-The synapse is a key building blocks for connecting/wiring together the various
+The synapse is a key building block for connecting/wiring together the various
 component cells that one would use for characterizing a biomimetic neural system.
 These particular objects are meant to perform, per simulated time step, a
 specific type of transformation -- such as a linear transform or a
@@ -17,9 +17,7 @@ steps, or by integrating a differential equation, e.g., via eligibility traces.

 ### Static (Dense) Synapse

-This synapse performs a linear transform of its input signals.
-Note that this synaptic cable does not evolve and is meant to be
-used for fixed value (dense) synaptic connections.
+This synapse performs a linear transform of its input signals. Note that this synaptic cable does not evolve and is meant to be used for fixed value (dense) synaptic connections. All synapse components that inherit from the `DenseSynapse` class support sparsification (via the `p_conn` probability of existence argument) to produce sparsely-connected synaptic connectivity patterns.

 ```{eval-rst}
 .. autoclass:: ngclearn.components.StaticSynapse
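
A minimal, hypothetical sketch of building a fixed, sparsified dense synapse; the `Context` pattern and the argument names other than `p_conn` (e.g., `shape`) are assumptions and may differ from the current constructor signature:

```python
# Illustrative sketch only -- argument names besides p_conn are assumptions, not a spec.
from ngcsimlib.context import Context
from ngclearn.components import StaticSynapse

with Context("circuit") as circuit:
    # Fixed 64x32 dense synapse; roughly half of the connections are
    # instantiated (p_conn = 0.5), yielding a sparse connectivity pattern.
    W1 = StaticSynapse(name="W1", shape=(64, 32), p_conn=0.5)
```
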
@@ -48,8 +46,7 @@ Note that this synaptic cable does not evolve and is meant to be used for fixed

 ### Static Deconvolutional Synapse

-This synapse performs a deconvolutional transform of its input signals.
-Note that this synaptic cable does not evolve and is meant to be used for fixed value deconvolution/transposed convolution synaptic filters.
+This synapse performs a deconvolutional transform of its input signals. Note that this synaptic cable does not evolve and is meant to be used for fixed value deconvolution/transposed convolution synaptic filters.

 ```{eval-rst}
 .. autoclass:: ngclearn.components.DeconvSynapse
@@ -63,10 +60,9 @@ Note that this synaptic cable does not evolve and is meant to be used for fixed

 ## Dynamic Synapse Types

-### Short-Term Plasticity(Dense) Synapse
+### Short-Term Plasticity (Dense) Synapse

-This synapse performs a linear transform of its input signals. Note that this
-synapse is "dynamic" in the sense that it engages in short-term plasticity (STP), meaning that its efficacy values change as a function of its inputs (and simulated consumed resources), but it does not provide any long-term form of plasticity/adjustment.
+This synapse performs a linear transform of its input signals. Note that this synapse is "dynamic" in the sense that it engages in short-term plasticity (STP), meaning that its efficacy values change as a function of its inputs/time (and simulated consumed resources), but it does not provide any long-term form of plasticity/adjustment.

 ```{eval-rst}
 .. autoclass:: ngclearn.components.STPDenseSynapse
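
The resource-dependent dynamics described above are commonly formalized in a Tsodyks-Markram style; a hedged sketch of that standard form (not necessarily the exact equations implemented by `STPDenseSynapse`) is:

$$
\frac{du}{dt} = -\frac{u}{\tau_f} + U (1 - u)\, s_{\mathrm{pre}}(t), \qquad
\frac{dx}{dt} = \frac{1 - x}{\tau_d} - u\, x\, s_{\mathrm{pre}}(t), \qquad
W_{\mathrm{eff}}(t) \propto u(t)\, x(t)\, W
$$

where $u$ is a facilitation variable, $x$ is the fraction of available (simulated) resources, $s_{\mathrm{pre}}$ is the pre-synaptic spike train, and $\tau_f, \tau_d$ are the facilitation/depression time constants.
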
@@ -80,22 +76,11 @@ synapse is "dynamic" in the sense that it engages in short-term plasticity (STP)

 ## Multi-Factor Learning Synapse Types

-Hebbian rules operate in a local manner -- they generally use information more
-immediately available to synapses in both space and time -- and can come in a
-wide variety of flavors. One general way to categorize variants of Hebbian learning
-is to clarify what (neural) statistics they operate on, e.g, do they work with
-real-valued information or discrete spikes, and how many factors (or distinct
-terms) are involved in calculating the update to synaptic values by the
-relevant learning rule. <!--(Note that, in principle, all forms of plasticity in
-ngc-learn technically work like local, factor-based rules. )-->
+Hebbian rules operate in a local manner -- they generally use information more immediately available to synapses in both space and time -- and can come in a wide variety of flavors. One general way to categorize variants of Hebbian learning is to clarify what (neural) statistics/values they operate on, e.g., do they work with real-valued information or discrete spikes, and how many factors (or distinct terms) are involved in calculating the update to synaptic values by the relevant learning rule. <!--(Note that, in principle, all forms of plasticity in ngc-learn technically work like local, factor-based rules. )-->

 ### (Two-Factor) Hebbian Synapse

-This synapse performs a linear transform of its input signals and evolves
-according to a strictly two-factor update rule. In other words, the
-underlying synaptic efficacy matrix is changed according to a product between
-pre-synaptic compartment values (`pre`) and post-synaptic compartment (`post`)
-values, which can contain any type of vector/matrix statistics.
+This synapse performs a linear transform of its input signals and evolves according to a strictly two-factor/term update rule. In other words, the underlying synaptic efficacy matrix is changed according to a product between pre-synaptic compartment values (`pre`) and post-synaptic compartment (`post`) values, which can contain any type of vector/matrix statistics. This particular synapse further features some tools for advanced forms of Hebbian descent/ascent (such as applying the Hebbian update via adaptive learning rate schemes like adaptive moment estimation, i.e., Adam).

 ```{eval-rst}
 .. autoclass:: ngclearn.components.HebbianSynapse
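
In its simplest form, the two-factor update described above reduces to an outer product of the two compartments (with an assumed learning rate $\eta$; the Adam-style variants rescale this raw update):

$$
\Delta W \propto \eta \, \mathbf{pre}^{\top} \mathbf{post}
$$
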
@@ -111,13 +96,7 @@ values, which can contain any type of vector/matrix statistics.

 ### (Two-Factor) BCM Synapse

-This synapse performs a linear transform of its input signals and evolves
-according to a multi-factor, Bienenstock-Cooper-Munro (BCM) update rule. The
-underlying synaptic efficacy matrix is changed according to an evolved
-synaptic threshold parameter `theta` and a product between
-pre-synaptic compartment values (`pre`) and a nonlinear function of post-synaptic
-compartment (`post`) values, which can contain any type of vector/matrix
-statistics.
+This synapse performs a linear transform of its input signals and evolves according to a multi-factor, Bienenstock-Cooper-Munro (BCM) update rule. The underlying synaptic efficacy matrix is changed according to an evolved synaptic threshold parameter `theta` and a product between pre-synaptic compartment values (`pre`) and a nonlinear function of post-synaptic compartment (`post`) values, which can contain any type of vector/matrix statistics.

 ```{eval-rst}
 .. autoclass:: ngclearn.components.BCMSynapse
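
A hedged sketch of the classic BCM form of this rule, where $\theta$ plays the role of the sliding modification threshold (the exact nonlinearity and threshold dynamics used by `BCMSynapse` are configurable and may differ):

$$
\Delta W_{ij} \propto \mathrm{pre}_i \, \mathrm{post}_j \left(\mathrm{post}_j - \theta_j\right), \qquad
\tau_{\theta} \frac{d\theta_j}{dt} = \mathrm{post}_j^{2} - \theta_j
$$
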
@@ -133,10 +112,7 @@ statistics.

 ### (Two-Factor) Hebbian Convolutional Synapse

-This synapse performs a convolutional transform of its input signals and evolves
-according to a two-factor update rule. The underlying synaptic filters are
-changed according to products between pre-synaptic compartment values (`pre`)
-and post-synaptic compartment (`post`) feature map values.
+This synapse performs a convolutional transform of its input signals and evolves according to a two-factor update rule. The underlying synaptic filters are changed according to products between pre-synaptic compartment values (`pre`) and post-synaptic compartment (`post`) feature map values.

 ```{eval-rst}
 .. autoclass:: ngclearn.components.HebbianConvSynapse
@@ -152,11 +128,7 @@ and post-synaptic compartment (`post`) feature map values.

 ### (Two-Factor) Hebbian Deconvolutional Synapse

-This synapse performs a deconvolutional (transposed convolutional) transform of
-its input signals and evolves according to a two-factor update rule. The
-underlying synaptic filters are changed according to products between
-pre-synaptic compartment values (`pre`) and post-synaptic compartment (`post`)
-feature map values.
+This synapse performs a deconvolutional (transposed convolutional) transform of its input signals and evolves according to a two-factor update rule. The underlying synaptic filters are changed according to products between pre-synaptic compartment values (`pre`) and post-synaptic compartment (`post`) feature map values.

 ```{eval-rst}
 .. autoclass:: ngclearn.components.HebbianDeconvSynapse
@@ -172,31 +144,12 @@ feature map values.

 ## Spike-Timing-Dependent Plasticity (STDP) Synapse Types

-Synapses that evolve according to a spike-timing-dependent plasticity (STDP)
-process operate, at a high level, much like multi-factor Hebbian rules (given
-that STDP is a generalization of Hebbian adjustment to spike trains) and share
-many of their properties. Nevertheless, a distinguishing factor for STDP-based
-synapses is that they must involve action potential pulses (spikes) in their
-calculations and they typically compute synaptic change according to the
-relative timing of spikes. In principle, any of the synapses in this grouping
-of components adapt their efficacies according to rules that are at least special
-four-factor terms, i.e., a pre-synaptic spike (an "event"), a pre-synaptic delta
-timing (which can come in the form of a trace), a post-synaptic spike (or event),
-and a post-synaptic delta timing (also can be a trace). In addition, STDP rules
-in ngc-learn typically enforce soft/hard synaptic strength bounding, i.e., there
-is a maximum magnitude allowed for any single synaptic efficacy, and, by default,
-an STDP synapse enforces that its synaptic strengths are non-negative.
+Synapses that evolve according to a spike-timing-dependent plasticity (STDP) process operate, at a high level, much like multi-factor Hebbian rules (given that STDP is a generalization of Hebbian adjustment to spike trains) and share many of their properties. Nevertheless, a distinguishing factor for STDP-based synapses is that they must involve action potential pulses (spikes) in their calculations and they typically compute synaptic change according to the relative timing of spikes. In principle, any of the synapses in this grouping of components adapt their efficacies according to rules that involve at least four special terms, i.e., a pre-synaptic spike (an "event"), a pre-synaptic delta timing (which can come in the form of a trace), a post-synaptic spike (or event), and a post-synaptic delta timing (which can also be a trace). In addition, STDP rules in ngc-learn typically enforce soft/hard synaptic strength bounding, i.e., there is a maximum magnitude allowed for any single synaptic efficacy, and, by default, an STDP synapse enforces its synaptic strengths to be non-negative.
+Note: these rules are technically considered to be "two-factor" rules since they only operate on pre- and post-synaptic activity (despite each factor being represented by two or more terms).

 ### Trace-based STDP

-This is a four-factor STDP rule that adjusts the underlying synaptic strength
-matrix via a weighted combination of long-term depression (LTD) and long-term
-potentiation (LTP). For the LTP portion of the update, a pre-synaptic trace and
-a post-synaptic event/spike-trigger are used, and for the LTD portion of the
-update, a pre-synaptic event/spike-trigger and a post-synaptic trace are
-utilized. Note that this specific rule can be configured to use different forms
-of soft threshold bounding including a scheme that recovers a power-scaling
-form of STDP (via the hyper-parameter `mu`).
+This is a four-term STDP rule that adjusts the underlying synaptic strength matrix via a weighted combination of long-term depression (LTD) and long-term potentiation (LTP). For the LTP portion of the update, a pre-synaptic trace and a post-synaptic event/spike-trigger are used, and for the LTD portion of the update, a pre-synaptic event/spike-trigger and a post-synaptic trace are utilized. Note that this specific rule can be configured to use different forms of soft threshold bounding including a scheme that recovers a power-scaling form of STDP (via the hyper-parameter `mu`).

 ```{eval-rst}
 .. autoclass:: ngclearn.components.TraceSTDPSynapse
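
A hedged sketch of the weighted LTP/LTD combination described above, writing $x$ for traces, $s$ for spike/event indicators, and $A_+, A_-$ for the potentiation/depression scale factors (soft bounding and the power-scaling `mu` option are omitted):

$$
\Delta W \propto A_{+}\, \mathbf{x}_{\mathrm{pre}}^{\top} \mathbf{s}_{\mathrm{post}} \;-\; A_{-}\, \mathbf{s}_{\mathrm{pre}}^{\top} \mathbf{x}_{\mathrm{post}}
$$
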
@@ -212,10 +165,7 @@ form of STDP (via the hyper-parameter `mu`).

 ### Exponential STDP

-This is a four-factor STDP rule that directly incorporates a controllable
-exponential synaptic strength dependency into its dynamics. This synapse's LTP
-and LTD use traces and spike events in a manner similar to the trace-based STDP
-described above.
+This is a four-term STDP rule that directly incorporates a controllable exponential synaptic strength dependency into its dynamics. This synapse's LTP and LTD use traces and spike events in a manner similar to the trace-based STDP described above.

 ```{eval-rst}
 .. autoclass:: ngclearn.components.ExpSTDPSynapse
@@ -231,9 +181,7 @@ described above.

 ### Event-Driven Post-Synaptic STDP Synapse

-This is a synaptic evolved under a two-factor STDP rule that is driven by
-only spike events.
-
+This is a synapse evolved under a two-term STDP rule that is driven by only spike events and operates within a defined pre-synaptic "window" of time.

 ```{eval-rst}
 .. autoclass:: ngclearn.components.EventSTDPSynapse
@@ -249,12 +197,7 @@ only spike events.

 ### Trace-based STDP Convolutional Synapse

-This is a four-factor STDP rule for convolutional synapses that adjusts the
-underlying filters via a weighted combination of long-term depression (LTD) and
-long-term potentiation (LTP). For the LTP portion of the update, a pre-synaptic
-trace and a post-synaptic event/spike-trigger are used, and for the LTD portion
-of the update, a pre-synaptic event/spike-trigger and a post-synaptic trace are
-utilized.
+This is a four-term STDP rule for convolutional synapses/kernels that adjusts the underlying filters via a weighted combination of long-term depression (LTD) and long-term potentiation (LTP). For the LTP portion of the update, a pre-synaptic trace and a post-synaptic event/spike-trigger are used, and for the LTD portion of the update, a pre-synaptic event/spike-trigger and a post-synaptic trace are utilized.

 ```{eval-rst}
 .. autoclass:: ngclearn.components.TraceSTDPConvSynapse
@@ -270,12 +213,7 @@ utilized.

 ### Trace-based STDP Deconvolutional Synapse

-This is a four-factor STDP rule for deconvolutional (transposed convolutional)
-synapses that adjusts the underlying filters via a weighted combination of
-long-term depression (LTD) and long-term potentiation (LTP). For the LTP portion
-of the update, a pre-synaptic trace and a post-synaptic event/spike-trigger are
-used, and for the LTD portion of the update, a pre-synaptic event/spike-trigger
-and a post-synaptic trace are utilized.
+This is a four-term STDP rule for deconvolutional (transposed convolutional) synapses that adjusts the underlying filters via a weighted combination of long-term depression (LTD) and long-term potentiation (LTP). For the LTP portion of the update, a pre-synaptic trace and a post-synaptic event/spike-trigger are used, and for the LTD portion of the update, a pre-synaptic event/spike-trigger and a post-synaptic trace are utilized.

 ```{eval-rst}
 .. autoclass:: ngclearn.components.TraceSTDPDeconvSynapse
@@ -292,21 +230,11 @@ and a post-synaptic trace are utilized.
 ## Modulated Forms of Plasticity

 This family of synapses implemented within ngc-learn supports modulated, often
-at least three-factor, forms of synaptic adjustment. Modulators could include
-reward/dopamine values or scalar error signals, and are generally assumed to be
-administered to the synapse(s) externally (i.e., it is treated as another
-input provided by some other entity, e.g., another neural circuit).
+at least three-factor, forms of synaptic adjustment. Modulators could include reward/dopamine values or scalar error signals, and are generally assumed to be administered to the synapse(s) externally (i.e., it is treated as another input provided by some other entity, e.g., another neural circuit).

 ### Reward-Modulated Trace-based STDP (MSTDP-ET)

-This is a modulated STDP (MSTDP) rule that adjusts the underlying synaptic strength
-matrix via a weighted combination of long-term depression (LTD) and long-term
-potentiation (LTP), scaled by an external signal such as a reward/dopamine value.
-The STDP element of this form of plasticity inherits from trace-based STDP
-(i.e. `TraceSTDPSynapse`). This synapse component further supports a configuration
-for MSTDP-ET, MSTPD with eligibility traces; this means the synapse will treat its
-synapses as two elements -- a synaptic efficacy and a coupled synaptic trace that
-maintains the dynamics of STDP updates encountered over time.
+This is a three-factor learning rule, i.e., pre-synaptic activity, post-synaptic activity, and a modulatory signal, known as modulated STDP (MSTDP). MSTDP adjusts the underlying synaptic strength matrix via a weighted combination of long-term depression (LTD) and long-term potentiation (LTP), scaled by an external signal such as a reward/dopamine value. The STDP element of this form of plasticity inherits from trace-based STDP (i.e., `TraceSTDPSynapse`). Note that this synapse component further supports a configuration for MSTDP-ET, or MSTDP with eligibility traces; MSTDP-ET means that the component will treat its synapses as two elements -- a synaptic efficacy and a coupled synaptic trace that maintains the dynamics of STDP updates encountered over time.

 ```{eval-rst}
 .. autoclass:: ngclearn.components.MSTDPETSynapse
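
A hedged sketch of the modulated scheme described above: in MSTDP the raw STDP update is gated directly by the modulatory signal $r(t)$ (e.g., reward/dopamine), while in MSTDP-ET it is first accumulated into an eligibility trace $E$ that decays over time:

$$
\frac{dW}{dt} \propto r(t)\, \Delta W_{\mathrm{STDP}}(t) \quad \text{(MSTDP)}, \qquad
\tau_{e} \frac{dE}{dt} = -E + \Delta W_{\mathrm{STDP}}(t), \quad
\frac{dW}{dt} \propto r(t)\, E(t) \quad \text{(MSTDP-ET)}
$$
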
