Research output: Contribution to journal › Article › Peer review
Lightweight bearing fault diagnosis via decoupled distillation and low rank adaptation. / Petrosian, O.; Li, P.; He, Y.; Liu, J.; Sun, Z.; Fu, G.; Meng, L.
In: Scientific Reports, Vol. 15, No. 1, 01.12.2025.
TY - JOUR
T1 - Lightweight bearing fault diagnosis via decoupled distillation and low rank adaptation
AU - Petrosian, O.
AU - Li, P.
AU - He, Y.
AU - Liu, J.
AU - Sun, Z.
AU - Fu, G.
AU - Meng, L.
N1 - Export Date: 01 November 2025; Cited By: 0; Correspondence Address: P. Li; St.Petersburg State University, St Petersburg, 7-9 Universitetskaya Embankment, 199034, Russian Federation; email: st112719@student.spbu.ru
PY - 2025/12/1
Y1 - 2025/12/1
N2 - Rolling bearing fault detection has developed rapidly and occupies an important position in fault diagnosis technology. Deep learning-based bearing fault diagnosis models have achieved significant success. At the same time, with the continuous improvement of signal processing techniques such as the Fourier transform, the wavelet transform and empirical mode decomposition, rolling bearing fault diagnosis has entered a new stage of research. However, most existing methods are limited to varying degrees in industrial settings, chiefly by the need for fast feature extraction and by computational complexity. This paper proposes a lightweight bearing fault diagnosis model, DKDL-Net, to address these challenges. The model is trained on the CWRU data set via decoupled knowledge distillation and low-rank adaptive fine-tuning. Specifically, we built and trained a teacher model based on a 6-layer neural network with 69,626 trainable parameters, and on this basis, using decoupled knowledge distillation (DKD) and Low-Rank Adaptation (LoRA) fine-tuning, we trained the student model DKDL-Net, which has only 6,838 parameters. Experiments show that DKDL-Net achieves 99.48% accuracy on the test set while maintaining low computational complexity, which is 0.58% higher than the state-of-the-art (SOTA) model, with fewer parameters. © 2025 Elsevier B.V., All rights reserved.
AB - Rolling bearing fault detection has developed rapidly and occupies an important position in fault diagnosis technology. Deep learning-based bearing fault diagnosis models have achieved significant success. At the same time, with the continuous improvement of signal processing techniques such as the Fourier transform, the wavelet transform and empirical mode decomposition, rolling bearing fault diagnosis has entered a new stage of research. However, most existing methods are limited to varying degrees in industrial settings, chiefly by the need for fast feature extraction and by computational complexity. This paper proposes a lightweight bearing fault diagnosis model, DKDL-Net, to address these challenges. The model is trained on the CWRU data set via decoupled knowledge distillation and low-rank adaptive fine-tuning. Specifically, we built and trained a teacher model based on a 6-layer neural network with 69,626 trainable parameters, and on this basis, using decoupled knowledge distillation (DKD) and Low-Rank Adaptation (LoRA) fine-tuning, we trained the student model DKDL-Net, which has only 6,838 parameters. Experiments show that DKDL-Net achieves 99.48% accuracy on the test set while maintaining low computational complexity, which is 0.58% higher than the state-of-the-art (SOTA) model, with fewer parameters. © 2025 Elsevier B.V., All rights reserved.
KW - Bearing fault diagnosis
KW - Convolutional neural network
KW - Low-Rank Adaptation
KW - Model compression
KW - adaptation
KW - article
KW - compression
KW - controlled study
KW - convolutional neural network
KW - deep learning
KW - diagnosis
KW - distillation
KW - empirical mode decomposition
KW - feature extraction
KW - Fourier transform
KW - human
KW - nerve cell network
KW - signal processing
KW - wavelet transform
UR - https://www.mendeley.com/catalogue/3692d6c9-0745-3860-bca3-1151ffe7413a/
U2 - 10.1038/s41598-025-06734-y
DO - 10.1038/s41598-025-06734-y
M3 - Article
VL - 15
JO - Scientific Reports
JF - Scientific Reports
SN - 2045-2322
IS - 1
ER -
ID: 143467466
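
The abstract above describes DKDL-Net as a small student network distilled from a larger teacher via decoupled knowledge distillation (DKD) and then fine-tuned with Low-Rank Adaptation (LoRA). The sketch below is a minimal, hedged illustration of those two building blocks in PyTorch; it is not the authors' code, and the function and class names, hyperparameters (alpha, beta, temperature, rank) and tensor shapes are illustrative assumptions.

# Hedged sketch, not the authors' implementation: a decoupled knowledge
# distillation (DKD) loss and a LoRA-wrapped linear layer in PyTorch.
# Names, hyperparameters (alpha, beta, T, rank) and shapes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


def dkd_loss(student_logits, teacher_logits, target, alpha=1.0, beta=8.0, T=4.0):
    """Decoupled KD: target-class KD (TCKD) plus non-target-class KD (NCKD)."""
    gt_mask = F.one_hot(target, student_logits.size(-1)).float()

    p_s = F.softmax(student_logits / T, dim=-1)
    p_t = F.softmax(teacher_logits / T, dim=-1)

    # TCKD: KL divergence between the binary (target vs. rest) distributions.
    p_s_bin = torch.stack([(p_s * gt_mask).sum(-1), (p_s * (1 - gt_mask)).sum(-1)], dim=-1)
    p_t_bin = torch.stack([(p_t * gt_mask).sum(-1), (p_t * (1 - gt_mask)).sum(-1)], dim=-1)
    tckd = F.kl_div(torch.log(p_s_bin + 1e-8), p_t_bin, reduction="batchmean")

    # NCKD: KL divergence over the non-target classes only; the target logit
    # is suppressed before the softmax so it carries (almost) no probability.
    masked_s = student_logits / T - 1000.0 * gt_mask
    masked_t = teacher_logits / T - 1000.0 * gt_mask
    nckd = F.kl_div(F.log_softmax(masked_s, dim=-1),
                    F.softmax(masked_t, dim=-1),
                    reduction="batchmean")

    return (alpha * tckd + beta * nckd) * (T ** 2)


class LoRALinear(nn.Module):
    """Frozen linear layer plus a trainable low-rank update B @ A (LoRA)."""

    def __init__(self, base: nn.Linear, r: int = 4, lora_alpha: float = 8.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():  # freeze the pretrained weights
            p.requires_grad_(False)
        self.lora_A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, r))  # starts as a no-op
        self.scaling = lora_alpha / r

    def forward(self, x):
        return self.base(x) + F.linear(x, self.lora_B @ self.lora_A) * self.scaling


if __name__ == "__main__":
    # Toy check on random data: 10 fault classes, batch of 8 samples.
    s_logits, t_logits = torch.randn(8, 10), torch.randn(8, 10)
    labels = torch.randint(0, 10, (8,))
    print("DKD loss:", dkd_loss(s_logits, t_logits, labels).item())

    layer = LoRALinear(nn.Linear(32, 10), r=4)
    print("LoRA output shape:", layer(torch.randn(8, 32)).shape)

In this sketch lora_B is initialised to zero, so the low-rank branch contributes nothing at the start of fine-tuning and the wrapped layer initially reproduces the frozen base layer, which is the standard LoRA initialisation.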