Variance Preserving Initialization for Training Deep Neuromorphic Photonic Networks with Sinusoidal Activations

Research output: peer-reviewed

Details

Original language: English
Title of host publication: 2019 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2019 - Proceedings
Publisher: IEEE
Pages: 1483-1487
Number of pages: 5
ISBN (electronic): 9781479981311
DOI - permanent links
Status: Published - 1 May 2019
Publication type (OKM): A4 Article in conference proceedings
Event: IEEE International Conference on Acoustics, Speech, and Signal Processing - Brighton, United Kingdom
Duration: 12 May 2019 - 17 May 2019

Publication series

Name
ISSN (electronic): 2379-190X

Conference

Conference: IEEE International Conference on Acoustics, Speech, and Signal Processing
Country: United Kingdom
City: Brighton
Period: 12/05/19 - 17/05/19

Abstract

Photonic neuromorphic hardware can provide significant performance benefits for Deep Learning (DL) applications by accelerating DL models and reducing their energy requirements. However, photonic neuromorphic architectures employ activation elements different from those traditionally used in DL, which slows down the convergence of training for such architectures. This paper proposes an initialization scheme for efficiently training deep photonic networks that employ quadratic sinusoidal activation functions. The proposed scheme overcomes these limitations, leading to faster and more stable training of deep photonic neural networks. Its ability to improve the convergence of the training process is experimentally demonstrated using two different DL architectures and two datasets.
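
The abstract does not spell out the initialization rule itself, so the sketch below is only a generic illustration of the variance-preservation idea it builds on, not the paper's method. It assumes a sin^2(x) ("quadratic sinusoidal") activation and scales zero-mean Gaussian weights by 1/sqrt(fan_in * E[x^2]) so that pre-activation variance stays roughly constant with depth; the second moment E[x^2] is estimated empirically here, and the helper names (sin2, variance_preserving_weights) are hypothetical. Keeping pre-activations in a bounded range is particularly relevant for periodic activations, since overly large inputs fold back onto the same output values.

```python
import numpy as np

def sin2(x):
    # Quadratic sinusoidal activation sin^2(x); assumed form of the photonic
    # neuron's transfer function for this sketch.
    return np.sin(x) ** 2

def variance_preserving_weights(fan_in, fan_out, input_second_moment, rng):
    # Zero-mean Gaussian weights scaled so the next layer's pre-activations
    # keep roughly unit variance:
    #   Var(Wx) = fan_in * Var(W) * E[x^2]  =>  Var(W) = 1 / (fan_in * E[x^2])
    std = np.sqrt(1.0 / (fan_in * input_second_moment))
    return rng.normal(0.0, std, size=(fan_in, fan_out))

rng = np.random.default_rng(0)
x = rng.normal(size=(4096, 512))              # unit-variance input batch
for layer in range(6):
    m2 = np.mean(x ** 2)                      # empirical E[x^2] of the layer input
    W = variance_preserving_weights(x.shape[1], 512, m2, rng)
    z = x @ W                                 # pre-activations
    print(f"layer {layer}: pre-activation variance = {z.var():.3f}")
    x = sin2(z)                               # apply the sinusoidal nonlinearity
```

Running the loop prints a pre-activation variance close to 1.0 at every layer; a naive 1/sqrt(fan_in) scale alone would instead let the variance shrink layer by layer once the sin^2 outputs no longer have unit second moment.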

Research areas

Publication forum (Julkaisufoorumi) level