TY - JOUR
T1 - Toward Effective Knowledge Distillation for Fine-Grained Object Recognition in Remote Sensing
AU - Gao, Yangte
AU - Deng, Chenwei
AU - Chen, Liang
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
AB - With advancements in on-board computing devices deployed on remote sensing platforms, the demand for efficient processing of remote sensing imagery has become increasingly prominent. Knowledge distillation, an effective model compression technique, has been introduced into this domain. Intuitively, distilling from a larger teacher model should yield better performance. However, in our investigation of fine-grained object recognition in remote sensing imagery, we observed a counter-intuitive phenomenon: as the size of the teacher model increases, the performance of the student model first improves and then degrades. This capacity-gap issue hinders the effective utilization of stronger teacher models. To address it, we propose a novel distillation framework named BL-KD, which integrates two tailored components: a class-level learnable orthogonal projection (CLOP) module and an object rebalance (ORB) module. The two modules are jointly optimized to mitigate the negative impact of the capacity gap while adapting to the distributional patterns and challenges inherent in remote sensing imagery. Experiments on multiple fine-grained object recognition tasks in remote sensing demonstrate that our method consistently improves student performance, particularly under large teacher–student gaps, and outperforms several widely used distillation baselines.
KW - Fine-grained object recognition
KW - knowledge distillation
KW - on-orbit processing
KW - remote sensing
UR - http://www.scopus.com/pages/publications/105011762987
DO - 10.1109/LGRS.2025.3591045
M3 - Article
AN - SCOPUS:105011762987
SN - 1545-598X
VL - 22
JO - IEEE Geoscience and Remote Sensing Letters
JF - IEEE Geoscience and Remote Sensing Letters
M1 - 6011405
ER -