Initial Explorations on Regularizing the SCRN Model


Date

2018-05

Authors

Kabdolov, Olzhas

Publisher

Nazarbayev University School of Science and Technology

Abstract

Recurrent neural networks are powerful sequence models that are also widely used for language modeling. With proper regularization, such as naive dropout, these models can achieve substantial improvements in performance. We regularize the Structurally Constrained Recurrent Network (SCRN) model and show that, despite its simplicity, it can achieve performance comparable to the ubiquitous LSTM model on the language modeling task while being smaller in size and up to 2x faster to train. Further analysis shows that regularizing both the context and hidden states of the SCRN is crucial.
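
For illustration, the sketch below shows one way an SCRN cell with naive dropout on both the context and hidden states could look in PyTorch. This is a minimal sketch based on the SCRN formulation of Mikolov et al. (2015); the class name, hyper-parameter values (alpha, dropout rate), and layer names are illustrative assumptions, not the exact configuration used in the thesis.

```python
import torch
import torch.nn as nn

class SCRNCell(nn.Module):
    """Sketch of a Structurally Constrained Recurrent Network cell with
    "naive" dropout applied to both the context and hidden states
    (illustrative only; not the thesis's exact implementation)."""

    def __init__(self, input_size, context_size, hidden_size,
                 alpha=0.95, dropout=0.5):
        super().__init__()
        self.alpha = alpha  # fixed decay of the slow context state
        self.B = nn.Linear(input_size, context_size, bias=False)   # input -> context
        self.A = nn.Linear(input_size, hidden_size, bias=False)    # input -> hidden
        self.P = nn.Linear(context_size, hidden_size, bias=False)  # context -> hidden
        self.R = nn.Linear(hidden_size, hidden_size, bias=False)   # hidden recurrence
        self.drop = nn.Dropout(dropout)  # naive dropout: non-recurrent connections only

    def forward(self, x, state):
        s_prev, h_prev = state
        # Slow context state: exponentially decayed summary of the input.
        s = (1.0 - self.alpha) * self.B(x) + self.alpha * s_prev
        # Fast hidden state conditioned on the input, context, and previous hidden state.
        h = torch.sigmoid(self.P(s) + self.A(x) + self.R(h_prev))
        # Drop BOTH states before they feed the output layer, mirroring the
        # abstract's finding that regularizing only one of them is not enough;
        # the undropped states are carried forward as the recurrent state.
        return self.drop(s), self.drop(h), (s, h)
```

The dropped context and hidden states would then be concatenated (or projected separately) into the softmax output layer, while the undropped states are passed to the next time step, which is the usual naive-dropout pattern of regularizing only non-recurrent connections.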

Keywords

recurrent neural networks, word embeddings, regularization, naive dropout