Authors
Jui-Ting Huang, Jinyu Li, Dong Yu, Li Deng, Yifan Gong
Publication date
2013/5/26
Conference
2013 IEEE international conference on acoustics, speech and signal processing
Pages
7304-7308
Publisher
IEEE
Description
In a deep neural network (DNN), the hidden layers can be considered increasingly complex feature transformations and the final softmax layer a log-linear classifier operating on the most abstract features computed in the hidden layers. While the log-linear classifier should differ across languages, the feature transformations can be shared. In this paper we propose a shared-hidden-layer multilingual DNN (SHL-MDNN), in which the hidden layers are made common across many languages while the softmax layers are made language dependent. We demonstrate that the SHL-MDNN can reduce errors by 3-5%, relatively, for all the languages decodable with the SHL-MDNN, over monolingual DNNs trained using only language-specific data. Further, we show that the hidden layers learned by sharing across languages can be transferred to improve recognition accuracy of …
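The architecture in the abstract can be sketched as a single stack of hidden layers shared by all languages, with one softmax output head per language. The following is a minimal numpy illustration, not the paper's implementation: the layer sizes, language set, and senone counts (`n_states`) are invented for the example, and the (multilingual) training procedure is omitted entirely.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical dimensions for illustration only (not from the paper).
n_input, n_hidden, n_layers = 40, 512, 3
n_states = {"en": 100, "fr": 120}  # language-dependent output sizes (made up)

# Shared hidden layers: one weight stack used by every language.
shared = [
    (rng.standard_normal((n_input if i == 0 else n_hidden, n_hidden)) * 0.01,
     np.zeros(n_hidden))
    for i in range(n_layers)
]

# Language-dependent softmax layers: one output head per language.
heads = {
    lang: (rng.standard_normal((n_hidden, n)) * 0.01, np.zeros(n))
    for lang, n in n_states.items()
}

def forward(x, lang):
    h = x
    for W, b in shared:            # common feature transformation
        h = relu(h @ W + b)
    W_out, b_out = heads[lang]     # language-specific log-linear classifier
    return softmax(h @ W_out + b_out)

frame = rng.standard_normal(n_input)       # one acoustic feature frame
p_en = forward(frame, "en")                # posterior over English states
p_fr = forward(frame, "fr")                # posterior over French states
```

Because the hidden stack is shared, updating it with data from any language refines the feature transformation used by all of them, which is the transfer effect the abstract describes.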