"BPEmb: Tokenization-free Pre-trained Subword Embeddings in 275 Languages."

Benjamin Heinzerling, Michael Strube (2017)

Details and statistics

access: open

type: Informal or Other Publication

metadata version: 2018-08-13
