TUTCRIS - Tampere University of Technology

Generalized model of biological neural networks: Progressive operational perceptrons

Research output: peer-reviewed

Details

Original language: English
Title of host publication: 2017 International Joint Conference on Neural Networks, IJCNN 2017
Publisher: IEEE
Pages: 2477-2485
Number of pages: 9
ISBN (electronic): 9781509061815
DOI - permanent link
Status: Published - 30 June 2017
Publication type (OKM): A4 Article in conference proceedings
Event: INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS
Duration: 1 January 1900 → …

Publication series

Name
ISSN (electronic): 2161-4407

Conference

Conference: INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS
Period: 1/01/00 → …

Abstract

Traditional Artificial Neural Networks (ANNs) such as Multi-Layer Perceptrons (MLPs) and Radial Basis Functions (RBFs) were designed to simulate biological neural networks; however, they are only loosely based on biology and provide only a crude model. This in turn yields well-known limitations and drawbacks in performance and robustness. In this paper, we address these issues by introducing a novel feed-forward ANN model, Generalized Operational Perceptrons (GOPs), which consist of neurons with distinct (non-)linear operators, so as to achieve a generalized model of biological neurons and, ultimately, a superior diversity. We modify the conventional back-propagation (BP) algorithm to train GOPs and, furthermore, propose Progressive Operational Perceptrons (POPs) to achieve self-organized and depth-adaptive GOPs according to the learning problem. The most crucial property of POPs is their ability to simultaneously search for the optimal operator set and train each layer individually. The final POP is therefore formed layer by layer, and this ability enables POPs with minimal network depth to tackle the most challenging learning problems, which cannot be learned by conventional ANNs even with deeper and significantly more complex configurations.
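To make the abstract's idea of "neurons with distinct (non-)linear operators" concrete, the sketch below shows a minimal generalized neuron in Python/NumPy. It assumes the neuron is built from three interchangeable steps: a nodal operator applied elementwise to input and weight, a pooling operator that reduces the nodal outputs, and an activation operator. The operator names, the NODAL/POOL/ACT libraries and the gop_neuron function are illustrative assumptions for this sketch, not the paper's exact operator set or implementation.

# Minimal, illustrative sketch (see assumptions above), not the paper's implementation.
import numpy as np

# Candidate nodal operators: elementwise combination of an input x and a weight w.
NODAL = {
    "multiplication": lambda x, w: x * w,              # recovers the ordinary MLP neuron
    "exponential":    lambda x, w: np.exp(x * w) - 1.0,
    "sinusoid":       lambda x, w: np.sin(x * w),
}

# Candidate pooling operators: reduce the nodal outputs to one value per neuron.
POOL = {
    "summation": lambda z: z.sum(axis=-1),             # recovers the ordinary MLP neuron
    "median":    lambda z: np.median(z, axis=-1),
    "maximum":   lambda z: z.max(axis=-1),
}

# Candidate activation operators.
ACT = {
    "tanh":   np.tanh,
    "linear": lambda y: y,
}

def gop_neuron(x, w, b, nodal="multiplication", pool="summation", act="tanh"):
    """One generalized neuron: act(pool(nodal(x, w)) + b).

    With the defaults it reduces to the classic perceptron tanh(w . x + b);
    other operator choices give the neuron a different (non-)linear behaviour.
    """
    z = NODAL[nodal](x, w)      # elementwise nodal operation
    y = POOL[pool](z) + b       # pool across the inputs, add the bias
    return ACT[act](y)          # apply the activation operator

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x, w = rng.normal(size=8), rng.normal(size=8)
    print(gop_neuron(x, w, 0.1))                                    # MLP-like behaviour
    print(gop_neuron(x, w, 0.1, nodal="sinusoid", pool="median"))   # alternative operators

With the default choices (multiplication, summation, tanh) the neuron reduces to a classic perceptron; in the progressive scheme described in the abstract, the operator combination for each layer would be searched while that layer is trained.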

ASJC Scopus subject areas

Publication forum (Julkaisufoorumi) level