"Preserve Pre-trained Knowledge: Transfer Learning With Self-Distillation ..."

Yang Zhou et al. (2022)


DOI: 10.48550/ARXIV.2205.00506

access: open

type: Informal or Other Publication

metadata version: 2022-05-03
