# Connectivity, Dynamics, and Memory in Reservoir Computing with Binary
and Analog Neurons

**L. Buesing, B. Schrauwen, and R. Legenstein**

### Abstract:

Reservoir Computing (RC) systems are powerful models for online computations on
input sequences. They consist of a memoryless readout neuron which is trained
on top of a randomly connected recurrent neural network. RC systems are
commonly used in two flavors: with analog or binary (spiking) neurons in the
recurrent circuits. Previous work indicated a fundamental difference in the
behavior of these two implementations of the RC idea: the performance of an RC
system built from binary neurons seems to depend strongly on the network
connectivity structure, whereas no such clear dependence has been observed in
networks of analog neurons. In this article we address this apparent dichotomy by
investigating the influence of the network connectivity (parametrized by the
neuron in-degree) on a family of network models that interpolates between
analog and binary networks. Our analyses are based on a novel estimation of
the Lyapunov exponent of the network dynamics with the help of branching
process theory, rank measures that estimate the kernel quality and
generalization capabilities of recurrent networks, and a novel mean-field
predictor for computational performance. These analyses reveal that the phase
transition between ordered and chaotic network behavior of binary circuits
qualitatively differs from the one in analog circuits, leading to differences
in the integration of information over short and long time scales. This
explains the decreased computational performance observed in densely connected
binary circuits. The mean-field predictor is also used to bound
the memory function of recurrent circuits of binary neurons.
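The abstract's distinction between ordered and chaotic network regimes is commonly quantified by the largest Lyapunov exponent. The paper estimates it via branching process theory; as a rough numerical alternative, the sketch below (an illustration, not the authors' method) tracks the divergence of two nearby trajectories of a sparse random tanh network, where the in-degree `K` and weight scale `sigma` are hypothetical parameters chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200        # network size (example value)
K = 10         # neuron in-degree (example value)
sigma = 1.0    # weight scale (example value)

# Sparse random recurrent weights: each neuron receives exactly K inputs,
# with weights scaled by 1/sqrt(K) so the total input variance stays fixed.
W = np.zeros((N, N))
for i in range(N):
    idx = rng.choice(N, size=K, replace=False)
    W[i, idx] = rng.normal(0.0, sigma / np.sqrt(K), size=K)

def step(x, u):
    """One update of an analog (tanh) recurrent network with input u."""
    return np.tanh(W @ x + u)

# Estimate the largest Lyapunov exponent: follow a reference trajectory and
# a perturbed copy driven by the same input, measure how the perturbation
# grows per step, and renormalize it to size eps after every step.
T, eps = 500, 1e-8
x = rng.normal(0.0, 1.0, N)
y = x + eps * rng.normal(0.0, 1.0, N) / np.sqrt(N)
lam = 0.0
for t in range(T):
    u = rng.normal(0.0, 0.5, N)       # shared input stream
    x, y = step(x, u), step(y, u)
    d = np.linalg.norm(y - x)
    lam += np.log(d / eps)            # log of one-step expansion factor
    y = x + (eps / d) * (y - x)       # renormalize the perturbation
lam /= T
print(f"estimated Lyapunov exponent: {lam:.3f}")
```

A negative estimate indicates ordered (fading-memory) dynamics, a positive one chaotic dynamics; sweeping `K` or `sigma` moves the network across the order-chaos transition discussed in the abstract.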

**Reference:** L. Buesing, B. Schrauwen, and R. Legenstein.
Connectivity, dynamics, and memory in reservoir computing with binary and
analog neurons.
*Neural Computation*, 22(5):1272-1311, 2010.