Decoder-based Sense Knowledge Distillation for LLMs
Decoder-based Sense Knowledge Distillation (DSKD) is a novel framework that integrates structured lexical resources into the training of decoder-style large language models (LLMs). The approach improves performance without requiring dictionary lookups at ...