Author

Pero Smrzlic

Date of Award

6-1993

Degree Name

Master of Science

Department

Computer Science

Access Setting

Masters Thesis-Open Access

Abstract

In this study, we introduce the Adaptive Back Propagation (ABP) learning algorithm, which is computationally superior to standard Back Propagation. ABP is based on a new activation function with a corresponding adaptive learning parameter. By combining computer simulations with analysis in the domain of the activation function, the Method of One Hidden Layer was developed for the effective utilization of units in one-hidden-layer networks. A parallel version of ABP was designed and implemented on an nCUBE-2 supercomputer with 128 processors.
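
To make the idea concrete, below is a minimal sketch of back propagation for a one-hidden-layer network with a parameterized activation and a heuristically adapted learning rate. The sigmoid slope `beta` and the increase/decrease adaptation rule are illustrative assumptions only; the abstract does not specify the thesis's actual activation function or its adaptive learning parameter.

```python
# Sketch only: standard back propagation with an adaptive step size.
# "beta" and the lr-adaptation heuristic are assumptions, not the ABP rule.
import numpy as np

def sigmoid(x, beta=1.0):
    # Logistic activation; beta stands in for the activation parameter.
    return 1.0 / (1.0 + np.exp(-beta * x))

def train(X, T, n_hidden=4, epochs=2000, lr=0.5, beta=1.0, seed=0):
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], n_hidden))
    W2 = rng.normal(scale=0.5, size=(n_hidden, T.shape[1]))
    prev_err = np.inf
    for _ in range(epochs):
        # Forward pass through the single hidden layer.
        H = sigmoid(X @ W1, beta)
        Y = sigmoid(H @ W2, beta)
        err = np.mean((T - Y) ** 2)
        # Illustrative adaptation: grow the step while the error falls,
        # shrink it when the error rises.
        lr = lr * 1.05 if err < prev_err else lr * 0.7
        prev_err = err
        # Backward pass (gradient of the squared error).
        dY = (Y - T) * beta * Y * (1 - Y)
        dH = (dY @ W2.T) * beta * H * (1 - H)
        W2 -= lr * H.T @ dY
        W1 -= lr * X.T @ dH
    return W1, W2

# Example: a small one-hidden-layer network learning XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)
W1, W2 = train(X, T)
```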

The simulation results suggest a strong correlation between the frequency of signals and the role of hidden units. The way patterns are presented during training has a significant impact on the results obtained. The results also show that architectures with a perfect distribution of units can be almost perfectly parallelized.
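
The parallelization claim can be illustrated with a toy load-balancing check. The actual mapping of the network onto the 128-processor nCUBE-2 is not described in the abstract; the hypothetical `partition_units` helper below only shows why an even ("perfect") distribution of hidden units keeps every processor equally busy, while an uneven one leaves processors idle during each step.

```python
# Sketch only: block partitioning of hidden units across processors.
import numpy as np

def partition_units(n_hidden, n_proc):
    # Assign each processor a contiguous block of hidden units,
    # distributing any remainder one unit at a time.
    sizes = [n_hidden // n_proc + (1 if p < n_hidden % n_proc else 0)
             for p in range(n_proc)]
    bounds = np.cumsum([0] + sizes)
    return [range(bounds[p], bounds[p + 1]) for p in range(n_proc)]

# 128 units on 128 processors gives one unit each (balanced work);
# 130 units forces two processors to do double work while the rest wait.
for n_hidden in (128, 130):
    blocks = partition_units(n_hidden, 128)
    print(n_hidden, "units -> block sizes", sorted({len(b) for b in blocks}))
```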
