Transfer learning and distillation techniques to improve the acoustic modeling of low resource languages
Published in: Proceedings of the Annual Conference of the International Speech Communication Association, INTERSPEECH
Year: 2017
Volume: 2017-August
Pages: 2158–2162
Abstract
Deep neural networks (DNNs) require large amounts of training data to build robust acoustic models for speech recognition tasks. Our work aims to improve a low-resource-language acoustic model so that it reaches performance comparable to a high-resource scenario, with the help of data and model parameters from other high-resource languages. We explore transfer learning and distillation methods, in which a complex high-resource model guides or supervises the training of the low-resource model. The techniques include (i) a multi-lingual framework that borrows data from a high-resource language while training the low-resource acoustic model, with KL-divergence-based constraints added to bias the model towards the low-resource language, and (ii) distilling knowledge from the complex high-resource model to improve the low-resource acoustic model. The experiments were performed on three Indian languages, namely Hindi, Tamil and Kannada. All the techniques gave improved performance, with the multi-lingual framework with KL-divergence regularization giving the best results. In all three languages, performance close to or better than the high-resource scenario was obtained. Copyright © 2017 ISCA.
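The distillation idea in the abstract, where a high-resource teacher model supervises a low-resource student via a KL-divergence term, can be sketched as a blended loss. This is a minimal illustrative NumPy sketch, not the paper's exact objective: the temperature, weighting `alpha`, and function names are assumptions for illustration.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax over the last axis.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, hard_targets,
                      temperature=2.0, alpha=0.5):
    """Blend of KL(teacher || student) soft loss and cross-entropy hard loss.

    `temperature` and `alpha` are illustrative hyperparameters; the paper's
    abstract does not specify the exact loss weighting.
    """
    eps = 1e-12
    p_teacher = softmax(teacher_logits, temperature)
    log_p_student = np.log(softmax(student_logits, temperature) + eps)
    # Soft targets: KL divergence from the teacher distribution to the student's.
    soft_loss = np.mean(np.sum(
        p_teacher * (np.log(p_teacher + eps) - log_p_student), axis=-1))
    # Hard targets: standard cross-entropy with the ground-truth labels.
    log_p = np.log(softmax(student_logits) + eps)
    hard_loss = -np.mean(log_p[np.arange(len(hard_targets)), hard_targets])
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

When the student matches the teacher exactly, the KL term vanishes and only the hard-label cross-entropy remains, which is the sense in which the teacher "guides" rather than fully dictates the student's training.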
About the journal
Journal: Proceedings of the Annual Conference of the International Speech Communication Association, INTERSPEECH
Publisher: International Speech Communication Association
ISSN: 2308-457X
Open Access: Yes
Concepts (15)
  •  Deep learning
  •  Deep neural networks
  •  Distillation
  •  Linguistics
  •  Modeling languages
  •  Speech communication
  •  Acoustic model
  •  Distillation method
  •  KL-divergence
  •  Large amounts
  •  Low resource languages
  •  Resources
  •  Resource model
  •  Transfer learning
  •  Speech recognition