Tampere University of Technology

TUTCRIS Research Portal

Variance Preserving Initialization for Training Deep Neuromorphic Photonic Networks with Sinusoidal Activations

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

Details

Original language: English
Title of host publication: 2019 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2019 - Proceedings
Publisher: IEEE
Pages: 1483-1487
Number of pages: 5
ISBN (Electronic): 9781479981311
DOIs
Publication status: Published - 1 May 2019
Publication type: A4 Article in a conference publication
Event: IEEE International Conference on Acoustics, Speech, and Signal Processing - Brighton, United Kingdom
Duration: 12 May 2019 - 17 May 2019

Publication series

ISSN (Electronic): 2379-190X

Conference

Conference: IEEE International Conference on Acoustics, Speech, and Signal Processing
Country: United Kingdom
City: Brighton
Period: 12/05/19 - 17/05/19

Abstract

Photonic neuromorphic hardware can provide significant performance benefits for Deep Learning (DL) applications by accelerating DL models and reducing their energy requirements. However, photonic neuromorphic architectures employ activation elements different from those traditionally used in DL, which slows the convergence of the training process for such architectures. This paper proposes an initialization scheme for efficiently training deep photonic networks that employ quadratic sinusoidal activation functions. The proposed scheme overcomes these limitations, leading to faster and more stable training of deep photonic neural networks. Its ability to improve the convergence of the training process is experimentally demonstrated using two different DL architectures and two datasets.
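The abstract's core idea, choosing the initial weight scale so that the signal variance is preserved through layers with sinusoidal activations, can be illustrated with a small NumPy sketch. This is not the paper's derivation: the activation form (sin², a common model of photonic nonlinearities) and the gain constants below are illustrative assumptions, used only to show how a naive fan-in scaling makes the activation variance collapse with depth while a larger, variance-aware gain keeps it stable.

```python
import numpy as np

rng = np.random.default_rng(0)

def sin2(x):
    # Quadratic sinusoidal activation (sin^2); an assumed stand-in for
    # the photonic activation element, not the paper's exact function.
    return np.sin(x) ** 2

def layer_variances(weight_std, depth=20, width=256, n_samples=4096):
    """Push random inputs through `depth` dense layers and record the
    empirical activation variance after each layer."""
    x = rng.standard_normal((n_samples, width))
    variances = []
    for _ in range(depth):
        W = rng.normal(0.0, weight_std(width), size=(width, width))
        x = sin2(x @ W)
        variances.append(float(x.var()))
    return variances

# Naive Glorot-style fan-in scaling: pre-activations shrink layer by
# layer, sin^2 behaves like z^2 near zero, and the variance collapses.
naive = layer_variances(lambda fan_in: (1.0 / fan_in) ** 0.5)

# A larger gain (16 here is an illustrative constant, not the paper's
# derived value) keeps pre-activations in a regime where the sin^2
# output variance settles at a stable nonzero fixed point.
tuned = layer_variances(lambda fan_in: (16.0 / fan_in) ** 0.5)

print(f"naive final variance: {naive[-1]:.2e}")
print(f"tuned final variance: {tuned[-1]:.3f}")
```

Running this shows the naive initialization driving the deep-layer variance toward zero, while the tuned gain holds it roughly constant, which is the failure mode a variance-preserving initialization is designed to avoid.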

Keywords

  • Neuromorphic Hardware, Photonic Neural Networks, Sinusoidal Activations
