
Some Notes on Perceptron Learning

BUDINICH, MARCO
1993-01-01

Abstract

We extend the geometrical approach to the Perceptron and show that, given n examples, learning is of maximal difficulty when the number of inputs d is such that n = 5d. We then present a new Perceptron algorithm that takes advantage of the peculiarities of the cost function. In our tests it is more than two times faster than the standard algorithm. More importantly, it does not have fixed parameters, like the usual learning constant \eta, but adapts them to the cost function. We show that there exists an optimal choice for \beta, the steepness of the transfer function. We also present a brief systematic study of the parameters \eta and \beta of the standard Perceptron algorithm.
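The abstract contrasts the proposed adaptive method with the standard Perceptron algorithm, in which the learning constant \eta and the transfer-function steepness \beta appear as fixed parameters of the weight update. As a point of reference, here is a minimal sketch of such a standard gradient-descent Perceptron; the tanh transfer function, the quadratic cost, and all names are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def train_perceptron(X, t, eta=0.2, beta=1.0, epochs=1000):
    """Standard gradient-descent Perceptron (illustrative sketch).

    eta  -- learning constant, fixed throughout training
    beta -- steepness of the assumed transfer function g(a) = tanh(beta * a)
    """
    X = np.hstack([X, np.ones((X.shape[0], 1))])  # append a constant bias input
    w = np.zeros(X.shape[1])                      # initial weights
    for _ in range(epochs):
        a = X @ w                                 # activations
        y = np.tanh(beta * a)                     # transfer-function outputs
        # gradient of the quadratic cost E = 1/2 * sum_i (t_i - y_i)^2
        grad = -((t - y) * beta * (1.0 - y**2)) @ X
        w -= eta * grad                           # fixed-step descent
    return w

def predict(w, X, beta=1.0):
    """Classify by the sign of the transfer-function output."""
    X = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.sign(np.tanh(beta * (X @ w)))
```

For example, trained on the linearly separable AND problem (targets in {-1, +1}), this sketch recovers a correct separating weight vector; both eta and beta here must be chosen by hand, which is the limitation the paper's adaptive algorithm addresses.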

Use this identifier to cite or link to this document: https://hdl.handle.net/11368/2558394
