Some Notes on Perceptron Learning
Marco Budinich
1993-01-01
Abstract
We extend the geometrical approach to the Perceptron and show that, given n examples, learning is of maximal difficulty when the number of inputs d is such that n = 5d. We then present a new Perceptron algorithm that takes advantage of the peculiarities of the cost function. In our tests it is more than two times faster than the standard algorithm. More importantly, it does not have fixed parameters, like the usual learning constant \eta, but adapts them to the cost function. We show that there exists an optimal choice for \beta, the steepness of the transfer function. We also present a brief systematic study of the parameters \eta and \beta of the standard Perceptron algorithm.
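For reference, the abstract's two parameters play their usual roles in the standard Perceptron trained by gradient descent: \eta is the learning rate and \beta the steepness of a smooth transfer function g(h) = tanh(\beta h). Below is a minimal sketch of that conventional setup, assuming a quadratic cost; the cost, update rule, and all names here are textbook conventions, not details taken from the paper itself.

import numpy as np

def train_perceptron(X, t, eta=0.1, beta=1.0, epochs=100, seed=0):
    """Gradient descent for a single unit with output g(h) = tanh(beta*h)
    on the quadratic cost E = 1/2 * sum_k (t_k - g(w . x_k))^2.
    eta (learning rate) and beta (transfer steepness) are the two fixed
    parameters the abstract discusses; this sketch is an assumed
    textbook formulation, not the paper's adaptive algorithm."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = rng.normal(scale=0.1, size=d)        # small random initial weights
    for _ in range(epochs):
        for x_k, t_k in zip(X, t):
            h = w @ x_k                      # net input
            y = np.tanh(beta * h)            # smooth output in (-1, 1)
            # delta rule: per-example gradient of the quadratic cost
            w += eta * (t_k - y) * beta * (1.0 - y**2) * x_k
    return w

# Demo in the regime the abstract flags as hardest: n = 5d examples.
d = 10
n = 5 * d
rng = np.random.default_rng(1)
X = rng.choice([-1.0, 1.0], size=(n, d))     # random binary inputs
t = np.sign(X @ rng.normal(size=d))          # labels from a random teacher
w = train_perceptron(X, t, eta=0.05, beta=1.0)
errors = np.sum(np.sign(X @ w) != t)
print(f"training errors after descent: {errors}/{n}")

The demo draws labels from a random linearly separable "teacher", so the fixed-parameter descent above can in principle reach zero errors; the paper's contribution, per the abstract, is adapting \eta rather than fixing it as done here.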