Speeding Up Entmax

Publisher

Association for Computational Linguistics

Abstract

Softmax is the de facto standard for normalizing logits in modern neural networks for language processing. However, by producing a dense probability distribution, it gives each token in the vocabulary a nonzero chance of being selected at each generation step, which leads to a variety of reported problems in text generation. The α-entmax of Peters et al. (2019) solves this problem, but is unfortunately slower than softmax. In this paper, we propose an alternative to α-entmax which keeps its virtuous characteristics but is as fast as optimized softmax, and achieves on-par or better performance on the machine translation task.
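
A quick way to see the contrast the abstract draws between dense and sparse normalization: the sketch below (illustrative, not code from the paper) compares softmax with sparsemax, the α = 2 special case of α-entmax. The NumPy implementation and the example logits are assumptions made for this illustration only.

import numpy as np

def softmax(z):
    # Standard softmax: every entry of the output is strictly positive.
    e = np.exp(z - z.max())
    return e / e.sum()

def sparsemax(z):
    # Sparsemax (Martins & Astudillo, 2016), i.e. alpha-entmax with alpha = 2:
    # Euclidean projection of the logits onto the probability simplex,
    # so low-scoring entries come out exactly zero.
    z_sorted = np.sort(z)[::-1]            # logits in decreasing order
    cumsum = np.cumsum(z_sorted)
    k = np.arange(1, z.size + 1)
    support = 1 + k * z_sorted > cumsum    # condition defining the support
    k_z = k[support][-1]                   # support size
    tau = (cumsum[support][-1] - 1) / k_z  # threshold subtracted from logits
    return np.maximum(z - tau, 0.0)

logits = np.array([3.0, 1.5, 1.0, -0.5, -2.0])
print(softmax(logits))    # all five tokens get nonzero probability
print(sparsemax(logits))  # trailing entries are exactly 0.0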

Citation

Tezekbayev, Maxat; Nikoulina, Vassilina; Gallé, Matthias; Assylbekov, Zhenisbek. (2022). Speeding Up Entmax. Findings of the Association for Computational Linguistics: NAACL 2022. https://doi.org/10.18653/v1/2022.findings-naacl.86