"Transformer-XL: Attentive Language Models beyond a Fixed-Length Context."

Zihang Dai et al. (2019)


DOI: 10.18653/V1/P19-1285

access: open

type: Conference or Workshop Paper

metadata version: 2021-08-06
