Recent advances in deep learning have significantly improved performance on computer vision tasks. Previous image classification methods have primarily modified model architectures or added features, and ...
Abstract: Self-knowledge distillation has emerged as a powerful method that notably boosts the prediction accuracy of deep neural networks while remaining resource-efficient, setting it apart from ...
Abstract: Self-distillation, a knowledge distillation method that does not rely on a separate teacher network, enhances the overall performance of a neural network by enabling self-learning through its own ...
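As a rough illustration of the teacher-free idea in the abstract above, the following is a minimal sketch of a self-distillation training loss: the network's own earlier (softened) predictions act as the teacher signal, blended with the ordinary hard-label cross-entropy. The function names, the temperature `T`, and the mixing weight `alpha` are illustrative assumptions, not the formulation of any specific paper.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax (numerically stabilized)."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def self_distill_loss(logits, own_past_logits, label, T=2.0, alpha=0.5):
    """Hypothetical self-distillation objective: blend hard-label
    cross-entropy with a KL term that pulls the current prediction
    toward the network's OWN earlier softened prediction, so no
    separate teacher network is needed."""
    p = softmax(logits)                      # current class probabilities
    ce = -np.log(p[label])                   # hard-label cross-entropy
    q_teacher = softmax(own_past_logits, T)  # softened self-teacher target
    q_student = softmax(logits, T)           # softened current prediction
    # KL(teacher || student); the T**2 factor keeps gradient scales
    # comparable across temperatures, as is conventional in distillation.
    kl = float(np.sum(q_teacher * (np.log(q_teacher) - np.log(q_student))))
    return (1 - alpha) * ce + alpha * (T ** 2) * kl
```

When the current and past logits agree, the KL term vanishes and only the (down-weighted) cross-entropy remains; otherwise the KL term regularizes the network toward consistency with its own prior predictions.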
The code and datasets for "Can Large Models Teach Student Models to Solve Mathematical Problems Like Human Beings? A Reasoning Distillation Method via Multi-LoRA Interaction" [IJCAI 2025]. All ...