2.5.0 release
Added
- Added an `n_neurons` property to `Network`, which gives the number of neurons in the network, including all subnetworks. (#435, #1186) (See the usage sketch after this list.)
- Added a new example showing how adjusting ensemble tuning curves can improve function approximation. (#1129)
- Added a minimum magnitude option to `UniformHypersphere`. (#799)
- Added documentation on RC settings. (#1130)
- Added documentation on improving performance. (#1119, #1130)
- Added a `LinearFilter.combine` method to combine two `LinearFilter` instances. (#1312)
- Added a method to all neuron types to compute ensemble `max_rates` and `intercepts` given `gain` and `bias`. (#1334)
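
A minimal sketch of how a few of these additions might look in use. The `min_magnitude` keyword and the `max_rates_intercepts` method name are assumptions (the entries above do not name them); the rest follows the entries directly.

```python
import numpy as np
import nengo

with nengo.Network() as net:
    a = nengo.Ensemble(50, dimensions=1)
    with nengo.Network():  # a subnetwork
        b = nengo.Ensemble(30, dimensions=1)

# Network.n_neurons counts neurons in the network and all subnetworks: 80.
print(net.n_neurons)

# UniformHypersphere with a minimum magnitude; `min_magnitude` is an assumed
# name for the new minimum magnitude option.
dist = nengo.dists.UniformHypersphere(min_magnitude=0.3)
samples = dist.sample(100, d=2)

# Combine two LinearFilter instances (Lowpass is a LinearFilter subclass)
# into a single equivalent filter.
combined = nengo.Lowpass(0.01).combine(nengo.Lowpass(0.005))

# Go from gain/bias back to max_rates/intercepts for a neuron type;
# `max_rates_intercepts` is an assumed name for the new method.
lif = nengo.LIF()
gain, bias = lif.gain_bias(max_rates=np.array([200.0]),
                           intercepts=np.array([0.1]))
max_rates, intercepts = lif.max_rates_intercepts(gain, bias)
```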
Changed
- Learning rules now have a `size_in` parameter and attribute, allowing both integers and strings to define the dimensionality of the learning rule. This replaces the `error_type` attribute. (#1307, #1310)
- `EnsembleArray.n_neurons` now gives the total number of neurons in all ensembles, including those in subnetworks. To get the number of neurons in each ensemble, use `EnsembleArray.n_neurons_per_ensemble`. (#1186) (See the sketch after this list.)
- The Nengo modelling API document now has summaries to help navigate the page. (#1304)
- The error raised when a `Connection` function returns `None` is now clearer. (#1319)
- We now raise an error when a `Connection` transform is set to `None`. (#1326)
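
A short sketch of the `EnsembleArray` change described above; the attribute names come from the entry itself, and the constructor arguments are the usual `nengo.networks.EnsembleArray` ones.

```python
import nengo

with nengo.Network():
    ea = nengo.networks.EnsembleArray(n_neurons=40, n_ensembles=4)

# Total neurons across all ensembles in the array: 4 * 40 = 160.
print(ea.n_neurons)

# Neurons in each individual ensemble: 40.
print(ea.n_neurons_per_ensemble)
```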
Fixed
- Probe cache is now cleared on simulator reset. (#1324)
- Neural gains are now always applied after the synapse model. Previously, this was the case for decoded connections but not neuron-to-neuron connections. (#1330)
- Fixed a crash when a lock cannot be acquired while shrinking the cache. (#1335, #1336)