Reservoir computing is a new approach to constructing neural networks that tries to combine several of the useful features of recurrent networks with a feedforward-like ease of training [3]. The typical network consists of a large recurrent network (the "reservoir"), which receives the input, and a smaller "readout" circuit, which receives connections from some or all of the neurons in the reservoir but doesn't connect back to them. The connection weights in the recurrent network are fixed when the network is constructed and don't change afterwards; only the readout connections are trained. Reservoir computing is based on a pair of insights, each of which is more or less relevant in different contexts.
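To make the architecture concrete, here is a minimal sketch of one common instantiation, an echo state network, written in Python with NumPy. Everything in it is an illustrative assumption rather than the source's own setup: the reservoir size, the random weight scaling, the toy sine-prediction task, and the choice of ridge regression for the readout. The point it demonstrates is the one described above: the recurrent weights `W` and input weights `W_in` are drawn once at random and never modified, and only the linear readout `W_out` is fit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random reservoir: these weights are set at construction and never trained.
n_inputs, n_reservoir = 1, 200
W_in = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_reservoir))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep the spectral radius below 1

def run_reservoir(u):
    """Drive the fixed recurrent reservoir with an input sequence u of shape (T, n_inputs)."""
    x = np.zeros(n_reservoir)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)  # recurrent update; weights stay fixed
        states.append(x.copy())
    return np.array(states)              # shape (T, n_reservoir)

# Hypothetical toy task: predict the next value of a sine wave.
t = np.arange(0, 20, 0.01)
u = np.sin(t)[:, None]
target = np.sin(t + 0.01)[:, None]

X = run_reservoir(u)
washout = 100                            # discard the initial transient
X_fit, y_fit = X[washout:], target[washout:]

# Train only the readout: ridge regression from reservoir states to targets.
ridge = 1e-6
W_out = np.linalg.solve(X_fit.T @ X_fit + ridge * np.eye(n_reservoir),
                        X_fit.T @ y_fit)

prediction = X @ W_out                   # the readout never feeds back into the reservoir
```

Because the readout is just a linear map from reservoir states to targets, training reduces to a single least-squares (here ridge) fit, which is what gives reservoir computing its feedforward-like ease of training.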