"Compressing Pre-trained Models of Code into 3 MB."

Jieke Shi et al. (2022)

DOI: 10.1145/3551349.3556964

access: closed

type: Conference or Workshop Paper

metadata version: 2023-09-30
