Scheduled Sampling for Transformers

Tsvetomila Mihaylova, André F. T. Martins


Abstract
Scheduled sampling is a technique for avoiding one of the known problems in sequence-to-sequence generation: exposure bias. At training time, the model is fed a mix of teacher-forced (gold) embeddings and its own predictions from the previous step. The technique has been used to improve model performance with recurrent neural networks (RNNs). In the Transformer model, unlike the RNN, the generation of a new word attends to the full sentence generated so far, not only to the last word, so it is not straightforward to apply the scheduled sampling technique. We propose some structural changes that allow scheduled sampling to be applied to Transformer architectures via a two-pass decoding strategy. Experiments on two language pairs achieve performance close to a teacher-forcing baseline and show that this technique is promising for further exploration.
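The two-pass idea described in the abstract can be sketched in a few lines of PyTorch. The following is a minimal, hypothetical illustration (the module, dimensions, and hyperparameters are not taken from the paper's implementation): the decoder first runs with teacher forcing over the gold prefix, some gold tokens are then replaced by the first-pass predictions according to a sampling probability, and a second decoding pass over the mixed sequence produces the logits used for the loss.

import torch
import torch.nn as nn

# Toy dimensions for illustration only (not from the paper).
VOCAB, D_MODEL, MAX_LEN = 1000, 64, 64


class TwoPassDecoder(nn.Module):
    # Transformer decoder trained with a scheduled-sampling-style two-pass scheme:
    # pass 1 decodes the gold prefix (teacher forcing), pass 2 decodes a mix of
    # gold tokens and pass-1 predictions, and the loss is taken on pass-2 logits.
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, D_MODEL)
        self.pos = nn.Embedding(MAX_LEN, D_MODEL)
        layer = nn.TransformerDecoderLayer(D_MODEL, nhead=4, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers=2)
        self.out = nn.Linear(D_MODEL, VOCAB)

    def _embed(self, tokens):
        positions = torch.arange(tokens.size(1), device=tokens.device)
        return self.embed(tokens) + self.pos(positions)

    def _decode(self, tgt_emb, memory):
        # Causal mask so every position attends only to earlier positions.
        mask = nn.Transformer.generate_square_subsequent_mask(tgt_emb.size(1)).to(tgt_emb.device)
        return self.out(self.decoder(tgt_emb, memory, tgt_mask=mask))

    def forward(self, gold_in, memory, sample_prob):
        # Pass 1: plain teacher forcing over the shifted gold sequence.
        first_logits = self._decode(self._embed(gold_in), memory)
        # Greedy predictions from the first pass; detached so no gradient flows
        # through the sampling decision.
        predicted = first_logits.argmax(dim=-1).detach()
        # Position-wise coin flip: with probability sample_prob use the model's
        # own prediction instead of the gold token.
        use_model = torch.rand(gold_in.shape, device=gold_in.device) < sample_prob
        mixed_in = torch.where(use_model, predicted, gold_in)
        # Pass 2: decode again from the mixed inputs; these logits feed the loss.
        return self._decode(self._embed(mixed_in), memory)


# Usage sketch: `memory` stands in for encoder outputs, and sample_prob would
# follow a schedule that increases the share of model predictions over training.
model = TwoPassDecoder()
gold_in = torch.randint(0, VOCAB, (8, 16))    # shifted gold target tokens
gold_out = torch.randint(0, VOCAB, (8, 16))   # gold targets to predict
memory = torch.randn(8, 20, D_MODEL)          # placeholder encoder states
logits = model(gold_in, memory, sample_prob=0.25)
loss = nn.functional.cross_entropy(logits.reshape(-1, VOCAB), gold_out.reshape(-1))
loss.backward()

This sketch uses the simplest token-level mix (argmax predictions substituted for gold tokens); softer variants, such as mixing embeddings instead of discrete tokens, are possible but not shown here.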
Anthology ID:
P19-2049
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Fernando Alva-Manchego, Eunsol Choi, Daniel Khashabi
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
351–356
URL:
https://aclanthology.org/P19-2049
DOI:
10.18653/v1/P19-2049
Cite (ACL):
Tsvetomila Mihaylova and André F. T. Martins. 2019. Scheduled Sampling for Transformers. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop, pages 351–356, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Scheduled Sampling for Transformers (Mihaylova & Martins, ACL 2019)
PDF:
https://aclanthology.org/P19-2049.pdf