Description

Prostate cancer is the second most common cancer in Russian men, with an incidence rate of 12.3%. In China, its incidence has increased significantly, with an average annual growth rate of 12.07%. Recently, the incidence rate in cities such as Beijing and Shanghai has approached that in European countries.
Puncture biopsy is the gold standard for the definitive diagnosis of prostate cancer. However, manual puncture can hardly meet clinical needs in terms of accuracy and efficiency. A prostate puncture robot can improve surgical quality and reduce medical costs, and numerous prostate surgical robots have been developed to date. However, these robots lack perception and control of the puncture force between surgical instruments and soft tissues, which makes the operation unsafe and may lead to irreversible damage. Therefore, studying force perception and control methods for robot-assisted puncture surgery is significant for facilitating the safe application of surgical robots and ultimately improving people's well-being.
Puncture force sensing methods include sensor-based direct sensing and vision-based indirect sensing. The former involves issues such as biological and size compatibility, disinfection and sensor cost. The latter can hardly perceive the complicated puncture force accurately because it relies on 2D visual information or the reconstructed 3D surface information of tissues. On the other hand, existing force control methods such as active stiffness control, impedance control, admittance control, and implicit and explicit force control are difficult to apply to the complex puncture force model and cannot overcome the adverse effects of the patients' physiological motion. Real-time 3D ultrasound imaging can reflect the 3D structural information of soft tissues and their deformation and position changes in real time, thereby providing a solid foundation for addressing the above issues. Therefore, this project aims to study force perception and control methods based on real-time 3D ultrasound to ensure the effectiveness and safety of puncture operations.
Chinese researchers have conducted solid research on the development of a real-time 3D ultrasound imaging system, a prostate puncture robot and deep learning based 2D puncture force estimation. Russian researchers have conducted in-depth research on nonlinear system modeling and identification, stochastic adaptive control and optimal control theory. This preliminary work will ensure the smooth progress of the project.

The published papers related to this project on both sides include:
[1]Yibo Wang, Zhichao Ye, Mingwei Wen, Huageng Liang, Xuming Zhang*. TransVFS: A spatio-temporal local-global transformer for vision-based force sensing during ultrasound-guided prostate biopsy. Medical Image Analysis, 2024. https://doi.org/10.1016/j.media.2024.103130
[2]Quan Zhou, Bin Yu, Feng Xiao, Mingyue Ding, Zhiwei Wang*, Xuming Zhang*. Robust semi-supervised 3D medical image segmentation with diverse joint-task learning and decoupled inter-student learning. IEEE Transactions on Medical Imaging, 2024. https://ieeexplore.ieee.org/document/10422981
[3]Shaozhuang Ye, Tuo Wang, Mingyue Ding, Xuming Zhang*. F-DARTS: Foveated differentiable architecture search based multimodal medical image fusion. IEEE Transactions on Medical Imaging, 2023, 42(11): 3348-3361.
[4]Mingwei Wen, Quan Zhou, Bo Tao, Pavel Shcherbakov, Yang Xu, Xuming Zhang*. Short-term and long-term memory self-attention network for segmentation of tumours in 3D medical images. CAAI Transactions on Intelligence Technology, 2023, 8: 1524-1537.
[5]Quan Zhou, Zhiwen Huang, Mingyue Ding, Xuming Zhang*. Medical image classification using light-weight CNN with spiking cortical model based attention module. IEEE Journal of Biomedical and Health Informatics, 2023, 27(4): 1991-2002.
[6]Xingxing Zhu, Mingyue Ding, Xuming Zhang*. Free form deformation and symmetry constraint-based multi-modal brain image registration using generative adversarial nets. CAAI Transactions on Intelligence Technology, 2023, 8:1492-1506.
[7]Quan Zhou, Shaozhuang Ye, Mingwei Wen, Zhiwen Huang, Mingyue Ding, Xuming Zhang*. Multi-modal medical image fusion based on densely-connected high-resolution CNN and hybrid transformer. Neural Computing & Applications, 2022, 34(24): 21741-21761.
[8]Xingxing Zhu, Zhiwen Huang, Mingyue Ding, Xuming Zhang*. Non-rigid multi-modal brain image registration based on two-stage generative adversarial nets. Neurocomputing, 2022, 505: 44-57.
[9]Fei Zhu, Xingxing Zhu, Zhiwen Huang, Mingyue Ding, Qiang Li*, Xuming Zhang*. Deep learning based data-adaptive descriptor for non-rigid multi-modal medical image registration. Signal Processing, 2021, 183: 108023.
[10]Hongzhang Hong, Xiaojuan Qin, Shengwei Zhang, Feixiang Xiang, Yujie Xu, Haibing Xiao, Gallina Kazobinka, Wen Ju, Fuqing Zeng, Xiaoping Zhang, Mingyue Ding, Huageng Liang*, Xuming Zhang*. Usefulness of real-time three-dimensional ultrasound in percutaneous nephrostomy: an animal study. BJU International, 2018, 122(4): 1-5.
[11]Fei Zhu, Mingyue Ding, Xuming Zhang*. Self-similarity inspired local descriptor for non-rigid multi-modal image registration. Information Sciences, 2016, 372: 16-31.
[12]Feng Yang, Mingyue Ding, Xuming Zhang*, Wenguang Hou, Cheng Zhong. Non-rigid multi-modal medical image registration by combining L-BFGS-B with cat swarm optimization. Information Sciences, 2015, 316: 440-456.
[13]Oleg Granichin, Victoria Erofeeva, Yury Ivanskiy, Yuming Jiang. Simultaneous perturbation stochastic approximation-based consensus for tracking under unknown-but-bounded disturbances. IEEE Transactions on Automatic Control, 2021, 66(8): 3710-3717.
[16]Konstantin Amelin, Oleg Granichin, Natalia Kizhaeva, Zeev Volkovich. Patterning of writing style evolution by means of dynamic similarity pattern recognition. Pattern Recognition, 2018, 77: 45-64.
[17]Konstantin Amelin, Oleg Granichin*. Randomized control strategies under arbitrary external noise. IEEE Transactions on Automatic Control, 2016, 61(5): 1328-1333.

The planned activities:
The scheme of this project is shown in Fig. 1. In total, the project will involve the following research activities.
(1) Real-time estimation of prostate deformations in 3D ultrasound images using the unsupervised registration method (Chinese side)
(2) Real-time estimation of puncture force using the channel attention network (Chinese side)
(3) Modeling of resistance force and identification of model parameters using randomized control strategies (Russian side)
(4) Design of the force control method using the liquid neural network (Russian side)
(5) Experimental evaluation of force sensing and control methods (Both sides)

As for the research on force sensing, it involves a training phase and a testing phase. In the training phase, a deep learning (DL) model for image registration is trained on the collected ultrasound images and then used to generate prostate deformation samples. The obtained deformation samples and the corresponding puncture forces (i.e., labels) are used to train the force sensing DL model. In the testing phase, the test 3D ultrasound images are fed to the two trained DL models to realize real-time force estimation.
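A minimal sketch of this two-stage pipeline is given below, assuming PyTorch and toy 64x64x64 volumes. The class names (RegistrationNet, ChannelAttention, ForceNet) and the tiny architectures are illustrative placeholders, not the project's actual models.

    # Minimal sketch (assumptions: PyTorch, 1x64x64x64 volumes, 3-axis force output).
    # All class and variable names are illustrative, not the project's actual models.
    import torch
    import torch.nn as nn

    class RegistrationNet(nn.Module):
        """Unsupervised registration: predicts a dense 3D deformation field
        between a reference volume and the current intra-operative volume."""
        def __init__(self, ch=8):
            super().__init__()
            self.enc = nn.Sequential(
                nn.Conv3d(2, ch, 3, padding=1), nn.ReLU(),
                nn.Conv3d(ch, ch, 3, padding=1), nn.ReLU())
            self.flow = nn.Conv3d(ch, 3, 3, padding=1)   # 3-component displacement field

        def forward(self, ref, mov):
            x = torch.cat([ref, mov], dim=1)             # (B, 2, D, H, W)
            return self.flow(self.enc(x))                # (B, 3, D, H, W)

    class ChannelAttention(nn.Module):
        """Squeeze-and-excitation style channel attention."""
        def __init__(self, ch, r=4):
            super().__init__()
            self.fc = nn.Sequential(nn.Linear(ch, ch // r), nn.ReLU(),
                                    nn.Linear(ch // r, ch), nn.Sigmoid())

        def forward(self, x):
            w = self.fc(x.mean(dim=(2, 3, 4)))           # global average pooling
            return x * w[:, :, None, None, None]

    class ForceNet(nn.Module):
        """Maps the predicted deformation field to a 3-axis puncture force."""
        def __init__(self, ch=16):
            super().__init__()
            self.conv = nn.Sequential(nn.Conv3d(3, ch, 3, stride=2, padding=1), nn.ReLU())
            self.att = ChannelAttention(ch)
            self.head = nn.Linear(ch, 3)                 # (Fx, Fy, Fz)

        def forward(self, flow):
            f = self.att(self.conv(flow))
            return self.head(f.mean(dim=(2, 3, 4)))

    # Testing-phase use: both trained networks run on streamed 3D ultrasound frames.
    reg_net, force_net = RegistrationNet().eval(), ForceNet().eval()
    ref = torch.rand(1, 1, 64, 64, 64)                   # pre-puncture reference volume
    cur = torch.rand(1, 1, 64, 64, 64)                   # current real-time volume
    with torch.no_grad():
        force = force_net(reg_net(ref, cur))             # estimated (Fx, Fy, Fz)

In practice, the registration network would be trained with an unsupervised image similarity loss plus a smoothness regularizer on the displacement field, and the force network with a regression loss against the recorded puncture forces.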
As for the force control method, the resistance force model is first built using finite element analysis. Then the model parameters are identified using maximum likelihood estimation based on the 3D ultrasound images and the estimated force provided by the Chinese researchers. Finally, the control method is designed to realize precise force control using a liquid neural network, which will be trained with randomized stochastic approximation methods.
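As one illustration of the last step, the sketch below tunes a single liquid-time-constant neuron acting as a force controller with plain SPSA (simultaneous perturbation stochastic approximation), a representative randomized stochastic approximation method. The toy spring-plus-friction resistance model, the unit-mass needle dynamics and all parameter names are assumptions standing in for the FEA-based model and the project's actual scheme.

    # Minimal sketch (assumptions: NumPy; one liquid-time-constant neuron as controller;
    # a toy spring-plus-friction resistance model in place of the FEA-based model;
    # plain SPSA in place of the project's randomized stochastic approximation scheme).
    import numpy as np

    def ltc_controller(theta, e, h, dt=0.01):
        """One liquid-time-constant (LTC) step: force error e -> control output."""
        w_in, w_rec, b, tau, A, w_out = theta
        gate = 1.0 / (1.0 + np.exp(-(w_in * e + w_rec * h + b)))   # input-dependent gate
        h = (h + dt * gate * A) / (1.0 + dt * (1.0 / tau + gate))  # fused ODE update
        return w_out * h, h

    def rollout_cost(theta, f_ref=1.0, steps=200, dt=0.01, k=50.0, c=2.0):
        """Tracking cost on a toy resistance model f = k*x + c*v."""
        x = v = h = 0.0
        cost = 0.0
        for _ in range(steps):
            f = k * x + c * v                    # simulated resistance force
            u, h = ltc_controller(theta, f_ref - f, h, dt)
            v += dt * (u - f)                    # unit-mass needle dynamics
            x += dt * v
            cost += (f_ref - f) ** 2
        return cost / steps

    def spsa(theta, iters=300, a=0.05, c_pert=0.01, seed=0):
        """SPSA: two cost evaluations per iteration give a randomized gradient estimate."""
        rng = np.random.default_rng(seed)
        theta = np.asarray(theta, dtype=float)
        for it in range(1, iters + 1):
            ak, ck = a / it ** 0.602, c_pert / it ** 0.101
            delta = rng.choice([-1.0, 1.0], size=theta.size)       # Rademacher perturbation
            g = (rollout_cost(theta + ck * delta) -
                 rollout_cost(theta - ck * delta)) / (2 * ck * delta)
            theta -= ak * g
            theta[3] = max(theta[3], 1e-2)       # keep the time constant positive
        return theta

    theta0 = np.array([1.0, 0.1, 0.0, 0.5, 1.0, 1.0])   # w_in, w_rec, b, tau, A, w_out
    theta_star = spsa(theta0)
    print("cost before/after:", rollout_cost(theta0), rollout_cost(theta_star))

Only two cost evaluations per iteration are needed regardless of the number of controller parameters, which is what makes this family of randomized methods attractive when the resistance force model is expensive to evaluate.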
As for the evaluation of the force sensing and control methods, experiments on a prostate phantom will be conducted by both sides to assess the performance of the proposed force sensing and control methods.
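For the force sensing part, the evaluation could, for example, report per-axis errors between the vision-based estimates and a reference force signal logged during the phantom trials; the reference sensor and the synthetic data below are assumptions for illustration only.

    # Minimal sketch (assumption: a reference force signal, e.g. from an external sensor
    # used only for validation, is logged alongside the vision-based estimates).
    import numpy as np

    def force_errors(f_est, f_ref):
        """Per-axis MAE and RMSE between estimated and reference 3-axis force sequences."""
        f_est, f_ref = np.asarray(f_est), np.asarray(f_ref)   # shape (T, 3)
        err = f_est - f_ref
        mae = np.mean(np.abs(err), axis=0)
        rmse = np.sqrt(np.mean(err ** 2, axis=0))
        return mae, rmse

    # Example with synthetic data standing in for phantom measurements.
    t = np.linspace(0, 1, 500)[:, None]
    f_ref = np.hstack([np.sin(3 * t), 0.2 * t, np.cos(2 * t)])
    f_est = f_ref + 0.05 * np.random.default_rng(0).standard_normal(f_ref.shape)
    print(force_errors(f_est, f_ref))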


It is expected that this project will not only contribute to the development of mathematics, artificial intelligence, biomedical engineering and other disciplines, but also promote SPBU's Values & HUST Global Strategy 2030, thereby benefiting the advancement of science and technology and the leap-forward development of HUST and SPBU. Meanwhile, this project will help improve the safety, reliability and accuracy of robot-assisted prostate biopsy. In this way, the planned cooperation will ultimately benefit the well-being of the Chinese and Russian people by providing a more reliable prostate cancer diagnosis technology.

Through the cooperation between the Chinese and Russian researchers, we expect to construct new theory and methods of puncture force sensing and control during robot-assisted prostate biopsy. Meanwhile, the project will facilitate building an international innovative research team from both sides in the field of force sensing and control. The researchers from both sides will publish 1-2 SCI papers in international journals through the collaborative research.

For further implementation of the project and in order to attract additional funding, it is planned to submit a joint application for the 2024 RSF-NSFC competition “Conducting fundamental scientific research and exploratory scientific research by international scientific teams” (NSFC).
Acronym: JSF HUST 2024
Status: Not started

Research areas

  • Stochastic systems, Randomized algorithms, Adaptive control, System identification, Surgical navigation, Surgical robots

ID: 121071271