Abstract
To address the challenges of deploying deep learning models in resource-constrained environments and to balance diagnostic accuracy against compression ratio, we propose a lightweight multi-teacher knowledge distillation fault diagnosis (BFD-KD) method for rolling bearings. The method employs a dynamic weighting mechanism based on the cross-entropy loss between each teacher model's predictions and the true labels, while integrating intermediate-layer features from multiple teacher models, thereby effectively resolving the issues of knowledge competition and knowledge fusion. Experimental results on the publicly available Huazhong University of Science and Technology bearing dataset demonstrate that, under small-sample conditions, BFD-KD achieves a significantly higher compression ratio while maintaining superior diagnostic accuracy compared with existing multi-teacher knowledge distillation methods. Furthermore, its lightweight design and robustness are validated on the UORED-VAFCLS industrial dataset, confirming its effectiveness and practicality for real-world deployment on resource-limited devices.
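The abstract does not give the exact weighting formula. As a minimal sketch of one plausible instantiation, the snippet below weights each teacher by a softmax over the negative per-teacher cross-entropy against the true label, so teachers that predict the true class more confidently contribute more to the fused soft target; the function names and the temperature `tau` are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def ce_loss(probs, label):
    # Cross-entropy of one predicted distribution against the true class index.
    return -np.log(probs[label] + 1e-12)

def dynamic_teacher_weights(teacher_probs, label, tau=1.0):
    # Hypothetical dynamic weighting: softmax(-CE / tau), so a lower
    # cross-entropy loss (better teacher) yields a higher weight.
    losses = np.array([ce_loss(p, label) for p in teacher_probs])
    logits = -losses / tau
    w = np.exp(logits - logits.max())  # shift for numerical stability
    return w / w.sum()

def fused_soft_target(teacher_probs, weights):
    # Convex combination of the teachers' output distributions.
    return np.einsum("t,tc->c", weights, np.array(teacher_probs))

# Toy example: 3 teachers, 4 fault classes, true class index = 2.
teachers = [
    np.array([0.10, 0.10, 0.70, 0.10]),  # confident and correct
    np.array([0.25, 0.25, 0.30, 0.20]),  # weakly correct
    np.array([0.60, 0.20, 0.10, 0.10]),  # wrong
]
w = dynamic_teacher_weights(teachers, label=2)
target = fused_soft_target(teachers, w)
```

The fused `target` would then serve as the soft label for the student's distillation loss; intermediate-feature fusion, as described in the abstract, would be handled by a separate feature-alignment term.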