This work proposes a computational procedure that uses a quantum walk on a complete graph to train classical artificial neural networks. The idea is to apply the quantum walk to search for the values of the weight set. However, executing the quantum walk requires simulating a quantum machine. Therefore, to reduce the computational cost, the training methodology adjusts only the synaptic weights of the output layer, leaving the hidden-layer weights unchanged, inspired by the Extreme Learning Machine method. As a search algorithm, the quantum walk is quadratically faster than its classical analog: the standard deviation of the quantum walk grows as $O(t)$, while that of its classical analog grows as $O(\sqrt{t})$, where $t$ is the time or number of iterations. In addition to this computational gain, another advantage of the proposed procedure is that the number of iterations required to obtain a solution is known \textit{a priori}, unlike classical training algorithms based on gradient descent.
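To make the two ingredients concrete, the sketch below freezes a random hidden layer in the spirit of Extreme Learning Machines and searches a discretized set of output-weight vectors with an amplitude-level simulation of the walk. It exploits the well-known correspondence between the coined quantum walk on the complete graph and Grover's search, so the iteration count $\lfloor (\pi/4)\sqrt{N/M} \rfloor$ (for $N$ candidates and $M$ marked ones) is fixed before the run. This is a minimal illustration, not the exact construction of this work: the toy data, the grid resolution, and the error oracle are hypothetical choices made for the example.

\begin{verbatim}
import numpy as np
from itertools import product

rng = np.random.default_rng(42)

# ---- Toy regression task (hypothetical data, for illustration only) ----
X = rng.normal(size=(200, 3))
y = np.sin(X).sum(axis=1)

# ---- ELM-style network: the random hidden layer is never trained ----
n_hidden = 8
W_in = rng.normal(size=(3, n_hidden))    # fixed random input weights
b_in = rng.normal(size=n_hidden)         # fixed random biases
H = np.tanh(X @ W_in + b_in)             # hidden-layer activations

# ---- Finite search space: a coarse grid of output-weight vectors ----
levels = np.linspace(-1.0, 1.0, 3)       # hypothetical 3-level grid
candidates = np.array(list(product(levels, repeat=n_hidden)))  # (N, n_hidden)
errors = np.mean((H @ candidates.T - y[:, None]) ** 2, axis=0)
marked = errors <= errors.min() + 1e-12  # oracle marks the best grid point

# ---- Quantum-walk search over the N candidates ----
# The coined walk on the complete graph behaves like Grover's search,
# so we simulate it directly with real amplitudes over the candidates.
N = len(candidates)
amp = np.full(N, 1.0 / np.sqrt(N))       # uniform superposition
n_iter = int(np.pi / 4 * np.sqrt(N / marked.sum()))  # known a priori
for _ in range(n_iter):
    amp[marked] *= -1.0                  # oracle: phase-flip the solution
    amp = 2.0 * amp.mean() - amp         # diffusion: inversion about the mean

best = np.argmax(amp ** 2)               # "measurement": most probable item
W_out = candidates[best]
print("iterations:", n_iter, "MSE of selected weights:", errors[best])
\end{verbatim}

The essential point of the sketch is structural: because only the output weights are searched, the candidate set is finite and small, and the walk reaches the marked weight vector in $O(\sqrt{N})$ oracle calls with an iteration count fixed before training begins, in contrast to gradient descent, whose stopping time is not known in advance.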