Reservoir Computing (RC) is a Recurrent Neural Network (RNN) framework frequently used for sequence learning and time-series prediction. An RC system consists of a random, fixed-weight RNN (the input-to-hidden reservoir layer) and a trainable classifier (the hidden-to-output readout layer). Here we focus on the sequence learning problem and explore a different approach to RC: we remove the non-linear neuron activation function and consider an orthogonal reservoir acting on normalized states on the unit hypersphere. Surprisingly, our numerical results show that the system's memory capacity exceeds the dimensionality of the reservoir, which is the upper bound for the standard RC approach based on Echo State Networks (ESNs). We also show how the proposed system can be applied to symmetric cryptography problems, and we include a numerical implementation.
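
To make the linear, normalization-based update concrete, the following is a minimal sketch of such a reservoir together with the standard memory-capacity estimate. The QR construction of the orthogonal matrix, the input weights `w_in`, and the ridge-regression readout are our own illustrative assumptions, not details taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50      # reservoir dimension
T = 2000    # length of the input sequence

# Random orthogonal reservoir matrix via QR decomposition
# (one possible construction; the paper may use another).
W, _ = np.linalg.qr(rng.standard_normal((N, N)))
w_in = rng.standard_normal(N)  # hypothetical fixed input weights

u = rng.uniform(-1.0, 1.0, T)  # random input sequence
X = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = W @ x + w_in * u[t]        # linear update, no activation function
    x /= np.linalg.norm(x)         # project the state onto the unit hypersphere
    X[t] = x

# Linear readout trained by ridge regression to reconstruct u[t - k];
# summing the squared correlations over delays k gives the usual
# memory-capacity estimate.
washout, reg = 100, 1e-8

def memory_capacity(max_delay=2 * N):
    mc = 0.0
    for k in range(1, max_delay + 1):
        Xk = X[washout:]
        yk = np.roll(u, k)[washout:]   # target: input delayed by k steps
        w = np.linalg.solve(Xk.T @ Xk + reg * np.eye(N), Xk.T @ yk)
        mc += np.corrcoef(Xk @ w, yk)[0, 1] ** 2
    return mc

print(f"Estimated memory capacity: {memory_capacity():.1f} (reservoir size N = {N})")
```

In this setup the only trained component is the linear readout; the orthogonal matrix `W` is fixed, so the sketch preserves the defining RC property that the reservoir itself is never adjusted.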