"Efficient Knowledge Distillation: Empowering Small Language Models with ..."

Mohamad Ballout et al. (2024)


DOI: 10.1007/978-3-031-70239-6_3

access: closed

type: Conference or Workshop Paper

metadata version: 2024-09-26