
Gaze-Guided Robotic Vascular Ultrasound Leveraging Human Intention Estimation


Abstract:

Medical ultrasound is widely used to examine vascular structures in modern clinical practice. However, traditional ultrasound examination often suffers from inter- and intra-operator variation. Robotic ultrasound systems (RUSS) are a potential solution to these challenges owing to their stability and reproducibility. Given the complex anatomy of human vasculature, multiple vessels often appear in an ultrasound image, or a single vessel bifurcates into branches, complicating the examination process. To tackle this challenge, this work presents a gaze-guided RUSS for vascular applications. A gaze tracker captures the eye movements of the operator, and the extracted gaze signal guides the RUSS to follow the correct vessel when it bifurcates. Additionally, a gaze-guided segmentation network is proposed to enhance segmentation robustness by exploiting gaze information. Because raw gaze signals are noisy, they must be interpreted to accurately discern the operator's true intention; to this end, this study proposes a stabilization module to process the raw gaze data. The inferred attention heatmap serves both as a region proposal to aid segmentation and as a trigger signal when the operator needs to adjust the scanning target, such as when a bifurcation appears. To ensure appropriate contact between the probe and the skin surface during scanning, an automatic ultrasound confidence-based orientation correction method is developed. In experiments, the efficiency of the proposed gaze-guided segmentation pipeline is demonstrated by comparison with other methods, and the complete gaze-guided RUSS is validated on a realistic arm phantom with an uneven surface.
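The abstract does not detail how the stabilization module turns noisy gaze samples into an attention heatmap. As a rough illustration only, a common pattern is to gate out saccadic jumps with a velocity threshold, smooth the remaining fixation samples, and render a Gaussian heatmap around the estimate. The function names, thresholds, and the sampling rate below are hypothetical assumptions, not the paper's actual module:

```python
import numpy as np

def stabilize_gaze(points, dt=1 / 60, sacc_vel=1000.0, alpha=0.3):
    """Filter raw 2-D gaze samples (pixels) into a smooth fixation estimate.

    Samples whose instantaneous velocity exceeds sacc_vel (px/s) are
    treated as saccades and skipped; the rest update an exponential
    moving average with weight alpha.
    """
    est = np.asarray(points[0], dtype=float)
    prev = est
    smoothed = []
    for p in points:
        p = np.asarray(p, dtype=float)
        vel = np.linalg.norm(p - prev) / dt  # px/s between consecutive samples
        prev = p
        if vel <= sacc_vel:                  # keep only fixation samples
            est = (1 - alpha) * est + alpha * p
        smoothed.append(est.copy())
    return np.array(smoothed)

def gaze_heatmap(center, shape=(64, 64), sigma=6.0):
    """Render an isotropic Gaussian attention heatmap around a gaze point."""
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    cx, cy = center
    hm = np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma ** 2))
    return hm / hm.max()
```

In such a scheme, the velocity gate discards saccadic jumps so the moving average tracks fixations only, and the resulting heatmap can be passed to a segmentation network as a soft region proposal.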
Published in: IEEE Robotics and Automation Letters ( Volume: 10, Issue: 4, April 2025)
Page(s): 3078 - 3085
Date of Publication: 07 February 2025


I. Introduction

Medical ultrasound, valued for its non-invasiveness, portability, real-time performance, and affordability, is widely used in clinical practice for screening and intra-operative guidance. However, traditional free-hand ultrasound suffers from inter- and intra-operator variation. Image quality depends on acquisition parameters such as contact force, angle, and probe positioning [1], [2], making it highly operator-dependent and reducing the reproducibility of results [3]. Robotic ultrasound systems (RUSS) offer a promising solution to these challenges [4], [5], [6].
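The abstract's confidence-based orientation correction builds on ultrasound confidence maps [18], [19]: when the probe is tilted or poorly coupled, confidence drops on one side of the image, and the probe orientation can be servoed to rebalance it. A hedged sketch of that general idea (not the authors' controller; function names and the gain are illustrative assumptions, and the confidence map itself is taken as given):

```python
import numpy as np

def confidence_barycenter(conf):
    """Lateral barycenter of a 2-D confidence map, mapped to [-1, 1].

    0 means confidence is laterally balanced under the probe; a nonzero
    value indicates one side of the image has better acoustic coupling.
    """
    cols = conf.sum(axis=0)            # per-column confidence mass
    w = cols / cols.sum()
    x = np.linspace(-1.0, 1.0, conf.shape[1])
    return float((w * x).sum())

def orientation_correction(conf, gain=5.0):
    """Proportional in-plane tilt command (degrees): rotate the probe
    toward the low-confidence side to restore even contact."""
    return -gain * confidence_barycenter(conf)
```

A proportional law like this is the simplest closed-loop choice; confidence-driven probe control along these lines is studied in detail in [18].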

References
1. J. Tan et al., "Autonomous trajectory planning for ultrasound-guided real-time tracking of suspicious breast tumor targets", IEEE Trans. Autom. Sci. Eng., vol. 21, no. 3, pp. 2478-2493, Jul. 2024.
2. Q. Huang, J. Lan and X. Li, "Robotic arm based automatic ultrasound scanning for three-dimensional imaging", IEEE Trans. Ind. Inform., vol. 15, no. 2, pp. 1173-1182, Feb. 2019.
3. J. Tan et al., "A flexible and fully autonomous breast ultrasound scanning system", IEEE Trans. Autom. Sci. Eng., vol. 20, no. 3, pp. 1920-1933, Jul. 2023.
4. Z. Jiang et al., "Robotic ultrasound imaging: State-of-the-art and future perspectives", Med. Image Anal., 2023.
5. M. Akbari et al., "Robot-assisted breast ultrasound scanning using geometrical analysis of the seroma and image segmentation", Proc. IEEE/RSJ Int. Conf. Intell. Robots Syst., pp. 3784-3791, 2021.
6. Y. Bi, Z. Jiang, F. Duelmer, D. Huang and N. Navab, "Machine learning in robotic ultrasound imaging: Challenges and perspectives", Annu. Rev. Control Robot. Auton. Syst., vol. 7, pp. 335-357, 2024.
7. K. Li, Y. Xu and M. Q.-H. Meng, "An overview of systems and techniques for autonomous robotic ultrasound acquisitions", IEEE Trans. Med. Robot. Bionics, vol. 3, no. 2, pp. 510-524, May 2021.
8. F. von Haxthausen et al., "Medical robotics for ultrasound imaging: Current systems and future trends", Curr. Robot. Rep., vol. 2, no. 1, pp. 55-71, 2021.
9. Q. Huang, J. Zhou and Z. Li, "Review of robot-assisted medical ultrasound imaging systems: Technology and clinical applications", Neurocomputing, vol. 559, 2023.
10. Z. Jiang et al., "Autonomous robotic screening of tubular structures based only on real-time ultrasound imaging feedback", IEEE Trans. Ind. Electron., vol. 69, no. 7, pp. 7064-7075, Jul. 2022.
11. Q. Huang, B. Gao and M. Wang, "Robot-assisted autonomous ultrasound imaging for carotid artery", IEEE Trans. Instrum. Meas., vol. 73, 2024.
12. W. Wahood, S. Ghozy, A. Al-Abdulghani and D. F. Kallmes, "Radial artery diameter: A comprehensive systematic review of anatomy", J. NeuroInterventional Surg., vol. 14, no. 12, pp. 1274-1278, 2022.
13. D. P. Noonan, G. P. Mylonas, A. Darzi and G.-Z. Yang, "Gaze contingent articulated robot control for robot assisted minimally invasive surgery", Proc. IEEE/RSJ Int. Conf. Intell. Robots Syst., pp. 1186-1191, 2008.
14. I. Tong, O. Mohareri, S. Tatasurya, C. Hennessey and S. Salcudean, "A retrofit eye gaze tracker for the da Vinci and its integration in task execution using the da Vinci Research Kit", Proc. IEEE/RSJ Int. Conf. Intell. Robots Syst., pp. 2043-2050, 2015.
15. J. Guo et al., "A novel robotic guidance system with eye-gaze tracking control for needle-based interventions", IEEE Trans. Cogn. Dev. Syst., vol. 13, no. 1, pp. 179-188, Mar. 2021.
16. F. Pierrot et al., "Hippocrate: A safe robot arm for medical applications with force feedback", Med. Image Anal., vol. 3, no. 3, pp. 285-300, 1999.
17. Z. Jiang, M. Grimm, M. Zhou, Y. Hu, J. Esteban and N. Navab, "Automatic force-based probe positioning for precise robotic ultrasound acquisition", IEEE Trans. Ind. Electron., vol. 68, no. 11, pp. 11200-11211, Nov. 2021.
18. P. Chatelain, A. Krupa and N. Navab, "Confidence-driven control of an ultrasound probe", IEEE Trans. Robot., vol. 33, no. 6, pp. 1410-1424, Dec. 2017.
19. A. Karamalis, W. Wein, T. Klein and N. Navab, "Ultrasound confidence maps using random walks", Med. Image Anal., vol. 16, no. 6, pp. 1101-1112, 2012.
20. Z. Jiang et al., "Precise repositioning of robotic ultrasound: Improving registration-based motion compensation using ultrasound confidence optimization", IEEE Trans. Instrum. Meas., vol. 71, 2022.
21. Y. Bi, Z. Jiang, Y. Gao, T. Wendler, A. Karlas and N. Navab, "VesNet-RL: Simulation-based reinforcement learning for real-world US probe navigation", IEEE Robot. Autom. Lett., vol. 7, no. 3, pp. 6638-6645, Jul. 2022.
22. R. Goel, F. Abhimanyu, K. Patel, J. Galeotti and H. Choset, "Autonomous ultrasound scanning using Bayesian optimization and hybrid force control", Proc. IEEE Int. Conf. Robot. Automat., pp. 8396-8402, 2022.
23. Y. Fu, W. Lin, X. Yu, J. J. Rodríguez-Andina and H. Gao, "Robot-assisted teleoperation ultrasound system based on fusion of augmented reality and predictive force", IEEE Trans. Ind. Electron., vol. 70, no. 7, pp. 7449-7456, Jul. 2023.
24. D. Black, Y. Oloumi, A. H. Yazdi, H. Hosseinabadi and S. Salcudean, "Human teleoperation - A haptically enabled mixed reality system for teleultrasound", Hum.-Comput. Interact., vol. 39, no. 5-6, pp. 529-552, 2024.
25. D. Huang, C. Yang, M. Zhou, A. Karlas, N. Navab and Z. Jiang, "Robot-assisted deep venous thrombosis ultrasound examination using virtual fixture", IEEE Trans. Autom. Sci. Eng., vol. 22, pp. 381-392, 2024.
26. D. P. Noonan, G. P. Mylonas, J. Shang, C. J. Payne, A. Darzi and G.-Z. Yang, "Gaze contingent control for an articulated mechatronic laparoscope", Proc. 3rd IEEE RAS EMBS Int. Conf. Biomed. Robot. Biomechatronics, pp. 759-764, 2010.
27. K. Fujii, G. Gras, A. Salerno and G.-Z. Yang, "Gaze gesture based human robot interaction for laparoscopic surgery", Med. Image Anal., vol. 44, pp. 196-214, 2018.
28. N. T. Clancy, G. P. Mylonas, G.-Z. Yang and D. S. Elson, "Gaze-contingent autofocus system for robotic-assisted minimally invasive surgery", Proc. Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., pp. 5396-5399, 2011.
29. G. P. Mylonas et al., "Gaze-contingent motor channelling haptic constraints and associated cognitive demand for robotic MIS", Med. Image Anal., vol. 16, no. 3, pp. 612-631, 2012.
30. K.-W. Kwok, L.-W. Sun, G. P. Mylonas, D. R. James, F. Orihuela-Espina and G.-Z. Yang, "Collaborative gaze channelling for improved cooperation during robotic assisted surgery", Ann. Biomed. Eng., vol. 40, pp. 2156-2167, 2012.