This guide details the limitations that sPyNNaker imposes on users of PyNN, as well as some extensions to the PyNN language that are supported.
sPyNNaker8 implements a subset of the PyNN 0.9 API.
We recommend using PyNN 0.9 for new work.
sPyNNaker currently supports the following model types:
- IF_curr_exp: Current-based leaky integrate-and-fire, with 1 excitatory and 1 inhibitory exponentially decaying synaptic input per neuron
- IF_cond_exp: Conductance-based leaky integrate-and-fire, with 1 excitatory and 1 inhibitory exponentially decaying synaptic input per neuron
- IF_curr_alpha: Current-based leaky integrate-and-fire, with 1 excitatory and 1 inhibitory alpha-function-shaped synaptic input per neuron
- extra_models.IF_curr_dual_exp: Current-based leaky integrate-and-fire, with 2 excitatory and 1 inhibitory exponentially decaying synaptic inputs per neuron
- Izhikevich: Current-based Izhikevich, with 1 excitatory and 1 inhibitory exponentially decaying synaptic input per neuron
- extra_models.Izhikevich_cond (PyNN 0.8): Conductance-based Izhikevich, with 1 excitatory and 1 inhibitory exponentially decaying synaptic input per neuron
- extra_models.IFCurDelta: Current-based leaky integrate-and-fire, with 1 excitatory and 1 inhibitory delta synaptic input per neuron
- extra_models.IFCurrExpCa2Adaptive: Current-based leaky integrate-and-fire, with 1 excitatory and 1 inhibitory exponentially decaying, calcium-adaptive synaptic input per neuron
- extra_models.IFCondExpStoc: Conductance-based leaky integrate-and-fire with a stochastic Maass threshold
- extra_models.IF_curr_exp_sEMD: Current-based leaky integrate-and-fire, with 1 excitatory and 1 inhibitory exponentially decaying synaptic input per neuron, where the inhibitory input is scaled by a user-defined multiplicative factor
Note that there are further restrictions on what plasticity types are supported when used with the above models.
All of our neural models have a limitation of 255 neurons per core. Depending on which SpiNNaker board you are using, this will limit the number of neurons that can be supported in any simulation.
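Given the 255-neurons-per-core limit stated above, the minimum number of cores a population will occupy can be estimated with a simple ceiling division. This is a sketch of the arithmetic only; the real mapping process may allocate cores differently:

```python
import math

NEURONS_PER_CORE = 255  # per-core limit stated above for sPyNNaker neural models

def min_cores(n_neurons: int) -> int:
    """Minimum number of cores needed to hold n_neurons."""
    return math.ceil(n_neurons / NEURONS_PER_CORE)

print(min_cores(255))    # 1
print(min_cores(256))    # 2
print(min_cores(10000))  # 40
```

The total core count available on the board then bounds the largest population that can be simulated.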
sPyNNaker currently supports two models for injecting spikes into a PyNN model:
- SpikeSourceArray: Input of a predefined set of spikes. The spikes to be input can be changed between calls to run.
- SpikeSourcePoisson: Input of randomly generated spikes at a predefined mean rate, generated from a Poisson distribution.
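To illustrate what SpikeSourcePoisson produces, a homogeneous Poisson spike train can be sketched in plain Python by drawing exponentially distributed inter-spike intervals. This models the behaviour only; it is not sPyNNaker's actual implementation:

```python
import random

def poisson_spike_train(rate_hz, duration_ms, seed=42):
    """Generate sorted spike times (ms) with exponentially distributed
    inter-spike intervals, i.e. a Poisson process at rate_hz."""
    rng = random.Random(seed)
    mean_isi_ms = 1000.0 / rate_hz
    spikes, t = [], 0.0
    while True:
        t += rng.expovariate(1.0 / mean_isi_ms)  # next inter-spike interval
        if t >= duration_ms:
            break
        spikes.append(t)
    return spikes

spikes = poisson_spike_train(rate_hz=50.0, duration_ms=1000.0)
print(len(spikes))  # roughly 50 spikes expected in 1 s at 50 Hz
```

Such a precomputed list could equally be fed to a SpikeSourceArray.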
Currently, only the i_offset parameter of the neural models can be used to inject current directly; there is no support for noisy or step-based current input. Step-based current input can be achieved by updating i_offset between calls to run().
A third way of injecting current into a PyNN simulation executing on the hardware, not part of the standard PyNN interface, is through live injection from an external device (e.g., a robot). A description of how to use this functionality can be found here.
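The step-current workaround above can be sketched as a schedule of (duration, amplitude) segments applied between runs. The commented lines mark where the standard PyNN calls (pop.set(i_offset=...) and sim.run(...)) would go on real hardware; they are assumptions based on the text and are left inert so the sketch runs anywhere:

```python
# Step current via i_offset: apply each (duration_ms, amplitude_nA)
# segment in turn, updating i_offset before each run() call.
schedule = [(100.0, 0.0), (200.0, 1.5), (100.0, 0.0)]

total_ms = 0.0
for duration_ms, amplitude_na in schedule:
    # pop.set(i_offset=amplitude_na)  # hypothetical: update the offset current
    # sim.run(duration_ms)            # hypothetical: advance the simulation
    total_ms += duration_ms

print(total_ms)  # 400.0 ms simulated in total
```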
sPyNNaker currently supports the following connector types:
- AllToAllConnector: All neurons in the pre-population are connected to all neurons in the post-population.
- ArrayConnector: The connectivity is set by passing in an explicit boolean matrix of size (pre-population size, post-population size).
- CSAConnector: The connectivity is set using the Connection Set Algebra, as defined by Djurfeldt (2012). For more information on the Python implementation, see github.com/INCF/csa.
- DistanceDependentProbabilityConnector: The connectivity is defined by a probability that depends on the distance between the neurons in the pre- and post-populations.
- FixedNumberPreConnector: A fixed number of randomly selected neurons in the pre-population are connected to all neurons in the post-population.
- FixedNumberPostConnector: A fixed number of randomly selected neurons in the post-population are connected to all neurons in the pre-population.
- FixedProbabilityConnector: The connectivity is random, with a fixed probability of connection between any pair of neurons.
- FromFileConnector: The connectivity is explicitly specified in a file, including all weights and delays. Note that this connector results in slower operation of the tools.
- FromListConnector: The connectivity is explicitly specified in a list, including all weights and delays. Note that this connector results in slower operation of the tools.
- FixedTotalNumberConnector: A fixed total number of randomly selected connections is made.
- IndexDependentProbabilityConnector: The connectivity is defined by a probability that depends on the indices of the neurons in the pre- and post-populations.
- KernelConnector: The pre- and post-populations are treated as 2D arrays, and every post(row, col) neuron connects to many pre(row, col, kernel) neurons using a (kernel) set of weights and/or delays.
- OneToOneConnector: The neuron with index i in the pre-population is connected to the neuron with index i in the post-population.
- SmallWorldConnector: Connects cells so as to create a small-world network.
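The semantics of, for example, FixedProbabilityConnector can be sketched in plain Python: each (pre, post) pair is included independently with probability p. This illustrates the connector's meaning, not sPyNNaker's implementation:

```python
import random

def fixed_probability_connections(n_pre, n_post, p, seed=1):
    """Return a list of (pre, post) index pairs, where each of the
    n_pre * n_post possible pairs is included independently with
    probability p."""
    rng = random.Random(seed)
    return [(i, j)
            for i in range(n_pre)
            for j in range(n_post)
            if rng.random() < p]

conns = fixed_probability_connections(100, 100, 0.1)
print(len(conns))  # roughly 1000 of the 10000 possible pairs
```

A list built this way is exactly the kind of structure a FromListConnector accepts (plus per-connection weights and delays).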
sPyNNaker8 currently only supports plasticity described by an STDPMechanism, which is set as the synapse_dynamics property of a Projection.
sPyNNaker supports the following STDP timing dependence rules:
- SpikePairRule: The amount of potentiation or depression decays exponentially with the time between each pair of pre- and post-synaptic spikes.
- extra_models.SpikeNearestPair: Similar to SpikePairRule, but only the nearest pair of pre- and post-synaptic spikes is considered, i.e. the pre-spike that immediately follows a post-spike, or the post-spike that immediately follows a pre-spike.
and the following STDP weight dependence rules:
- AdditiveWeightDependence: The change in weight depends only on the timing between the spikes, as determined by the timing rule.
- MultiplicativeWeightDependence: The change in weight additionally depends on the difference between the current weight and the maximum / minimum allowed weight of the rule.
- extra_models.WeightDependenceAdditiveTriplet: As AdditiveWeightDependence, but allows the use of triplet rules.
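The combination of SpikePairRule timing with AdditiveWeightDependence can be sketched for a single spike pair. This is a plain-Python illustration of the update's shape under the usual STDP formulation; parameter names such as a_plus and tau_plus mirror PyNN's A_plus and tau_plus but this is not sPyNNaker's code:

```python
import math

def stdp_additive(w, dt_ms, w_min=0.0, w_max=5.0,
                  a_plus=0.5, a_minus=0.5,
                  tau_plus=20.0, tau_minus=20.0):
    """One spike-pair update, where dt_ms = t_post - t_pre.
    Positive dt (pre before post) potentiates, negative dt depresses,
    each decaying exponentially with |dt|; the additive weight
    dependence simply clips the result to [w_min, w_max]."""
    if dt_ms > 0:
        w += a_plus * math.exp(-dt_ms / tau_plus)
    elif dt_ms < 0:
        w -= a_minus * math.exp(dt_ms / tau_minus)
    return min(max(w, w_min), w_max)

print(stdp_additive(2.0, 10.0))   # potentiation: result > 2.0
print(stdp_additive(2.0, -10.0))  # depression: result < 2.0
print(stdp_additive(4.9, 1.0))    # clipped at w_max
```

A multiplicative weight dependence would instead scale the update by the distance to the nearest bound rather than clipping.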
sPyNNaker execution limitations
- sPyNNaker supports the ability to call run() multiple times, with different runtime values for each call.
- sPyNNaker supports the ability to call reset() multiple times within the script.
- sPyNNaker supports the addition of Populations and Projections into the application space between a call to reset() and the next call to run().
- sPyNNaker does not support the addition of Populations and Projections between multiple calls to run(); reset() must be called before a Population or Projection is added.
PyNN missing functionality
- sPyNNaker does not support PopulationView or accessing individual neurons in a Population.
- sPyNNaker does not support the changing of weights or delays between calls to run(); reset() must be called before changes to weights and delays are made.
All parameters and their ranges are under software control.
Weights are held as 16-bit integers, with their range determined at compile-time to suit the application; this limits the overall range of weights that can be represented, with the smallest representable weight being dependent on the largest weights specified.
Delays are limited to between 1 and 144 time steps (i.e., 1 - 144ms when using 1ms time steps, or 0.1 - 14.4ms when using 0.1ms time steps). Delays of more than 16 time steps require an additional “delay population” to be added; this is done automatically by the software when such delays are detected.
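The delay bookkeeping described above can be sketched as a small check: convert a delay in milliseconds to time steps, enforce the 1-144 tick range, and report whether the automatic "delay population" would be needed (the 144 and 16 tick figures are taken from the text; the rounding choice here is an assumption):

```python
def delay_ticks(delay_ms, timestep_ms):
    """Convert a delay to time steps, enforce the 1-144 tick limit,
    and report whether a 'delay population' is needed (> 16 ticks)."""
    ticks = round(delay_ms / timestep_ms)
    if not 1 <= ticks <= 144:
        raise ValueError(f"delay of {ticks} ticks is outside the 1-144 range")
    return ticks, ticks > 16

print(delay_ticks(14.4, 0.1))  # (144, True): at the limit, needs delay population
print(delay_ticks(10.0, 1.0))  # (10, False): handled without one
```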
Membrane voltages and other neuron parameters are generally held as 32-bit fixed point numbers in the s16.15 format. Membrane voltages are held in mV.
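The s16.15 format packs a value into a signed 32-bit integer with 15 fractional bits, giving a resolution of 2^-15 (about 3.05e-5) and a range of roughly ±65536. A plain-Python sketch of the encoding (an illustration of the format, not sPyNNaker's conversion code):

```python
def to_s16_15(x: float) -> int:
    """Encode x as a signed 32-bit s16.15 fixed-point raw integer."""
    raw = round(x * (1 << 15))  # shift 15 fractional bits into the integer
    if not -(1 << 31) <= raw < (1 << 31):
        raise OverflowError("value outside the s16.15 range")
    return raw

def from_s16_15(raw: int) -> float:
    """Decode an s16.15 raw integer back to a float."""
    return raw / (1 << 15)

v = -65.0  # a typical resting membrane voltage in mV
raw = to_s16_15(v)
print(raw)               # -2129920
print(from_s16_15(raw))  # -65.0
print(from_s16_15(1))    # smallest step: about 3.05e-05
```

Values that are not exact multiples of 2^-15 are rounded to the nearest representable step, which bounds the precision of recorded voltages.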
Synapse and neuron loss
Projection links between two sub-populations that were initially defined as connected are removed by the software if the number of connections between the two sub-populations is determined to be zero when the projection is realised during the software’s mapping process.
The SpiNNaker communication fabric can drop packets, so there is a chance that, during execution, spikes might not reach their destination (or might reach only some of their destinations). The software attempts to recover from such losses through a reinjection mechanism, but this only works if the overall spike rate is not high enough to overload the communication fabric in the first place.