Initial Explorations on Regularizing the SCRN Model
Publisher: Nazarbayev University School of Science and Technology
Abstract
Recurrent neural networks are powerful sequence models and are widely used for language modeling. With proper regularization, such as naive dropout, these models can achieve substantial improvements in performance. We regularize the Structurally Constrained Recurrent Network (SCRN) model and show that, despite its simplicity, it can achieve performance comparable to the ubiquitous LSTM model on the language modeling task, while being smaller in size and up to 2x faster to train. Further analysis shows that regularizing both the context and hidden states of the SCRN is crucial.
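The abstract itself gives no equations; as a point of reference, below is a minimal sketch of an SCRN cell with naive dropout applied to both the context and hidden states, assuming the standard SCRN formulation of Mikolov et al. (2015): s_t = (1 - alpha) * B x_t + alpha * s_{t-1} and h_t = sigmoid(P s_t + A x_t + R h_{t-1}). The class name, layer names, and exact dropout placement are illustrative assumptions, not the authors' implementation.

    import torch
    import torch.nn as nn

    class SCRNCell(nn.Module):
        """SCRN cell with dropout on both the context and hidden states.

        A sketch, not the authors' code: hyperparameter defaults and the
        placement of dropout are assumptions based on the abstract.
        """

        def __init__(self, input_size, hidden_size, context_size,
                     alpha=0.95, dropout=0.5):
            super().__init__()
            self.alpha = alpha  # fixed slow-decay weight of the context layer
            self.B = nn.Linear(input_size, context_size, bias=False)
            self.A = nn.Linear(input_size, hidden_size, bias=False)
            self.P = nn.Linear(context_size, hidden_size, bias=False)
            self.R = nn.Linear(hidden_size, hidden_size, bias=False)
            self.drop = nn.Dropout(dropout)

        def forward(self, x_t, state):
            h_prev, s_prev = state
            # Slowly changing context: an exponential trace of the input.
            s_t = (1 - self.alpha) * self.B(x_t) + self.alpha * s_prev
            # Fast hidden state driven by input, context, and recurrence.
            h_t = torch.sigmoid(self.A(x_t) + self.P(s_t) + self.R(h_prev))
            # Naive dropout: mask the states fed onward to the output layer,
            # while the recurrent state path stays undropped.
            return (self.drop(h_t), self.drop(s_t)), (h_t, s_t)

    # Example step (shapes only; sizes are arbitrary):
    cell = SCRNCell(input_size=128, hidden_size=100, context_size=40)
    x = torch.randn(32, 128)
    state = (torch.zeros(32, 100), torch.zeros(32, 40))
    (h_out, s_out), state = cell(x, state)

In this sketch the dropped states feed only the output layer (e.g. a softmax over U h_t + V s_t) while the undropped states carry the recurrence, following the naive non-recurrent dropout scheme the abstract refers to; dropping both h_t and s_t reflects the paper's finding that regularizing both states is crucial.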
Creative Commons license
Except where otherwise noted, this item's license is described as Attribution-NonCommercial-ShareAlike 3.0 United States.
