"MiniLMv2: Multi-Head Self-Attention Relation Distillation for Compressing Pretrained Transformers"

Wenhui Wang et al. (2021)

DOI: 10.18653/V1/2021.FINDINGS-ACL.188

access: open

type: Conference or Workshop Paper

metadata version: 2024-04-19
