"Reminding the incremental language model via data-free self-distillation."

Han Wang et al. (2023)

Details and statistics

DOI: 10.1007/s10489-022-03678-y

access: closed

type: Journal Article

metadata version: 2024-08-19