Ultrasound-guided needle tracking with deep learning: A novel approach with photoacoustic ground truth

dc.contributor.author Hui, Xie
dc.contributor.author Rajendran, Praveenbalaji
dc.contributor.author Ling, Tong
dc.contributor.author Dai, Xianjin
dc.contributor.author Xing, Lei
dc.contributor.author Pramanik, Manojit
dc.contributor.department Department of Electrical and Computer Engineering
dc.date.accessioned 2024-07-23T15:56:19Z
dc.date.available 2024-07-23T15:56:19Z
dc.date.issued 2023-12-02
dc.description.abstract Accurate needle guidance is crucial for safe and effective clinical diagnosis and treatment procedures. Conventional ultrasound (US)-guided needle insertion often suffers from inconsistent and imprecise needle visualization, necessitating reliable methods to track the needle. As a powerful tool in image processing, deep learning has shown promise for enhancing needle visibility in US images, although its dependence on manual annotation or simulated data as ground truth can introduce bias or hinder generalization to real US images. Photoacoustic (PA) imaging has demonstrated its capability for high-contrast needle visualization. In this study, we explore the potential of PA imaging as a reliable ground truth for deep learning network training without the need for expert annotation. Our network (UIU-Net), trained on ex vivo tissue image datasets, localizes needles in US images with remarkable precision. The evaluation of needle segmentation performance extends across previously unseen ex vivo data and in vivo human data (collected from an open-source data repository). Specifically, for human data, the Modified Hausdorff Distance (MHD) is approximately 3.73 and the targeting error is around 2.03, indicating strong similarity and small orientation deviation between the predicted and actual needle locations. A key advantage of our method is its applicability beyond US images captured from a specific imaging system, extending to images from other US imaging systems.
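
Note: a minimal sketch, not the authors' code, of how the Modified Hausdorff Distance cited in the abstract can be computed between a predicted needle segmentation and a PA-derived ground-truth mask. The NumPy-based implementation and function names are illustrative assumptions; the definition follows Dubuisson & Jain (1994).

# Modified Hausdorff Distance between two binary needle masks.
# Illustrative sketch only; assumes masks are 2D NumPy arrays with
# nonzero entries marking needle pixels.
import numpy as np

def directed_mhd(a: np.ndarray, b: np.ndarray) -> float:
    """Mean distance from each point in `a` to its nearest point in `b`."""
    # Pairwise Euclidean distances, shape (len(a), len(b)).
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return float(d.min(axis=1).mean())

def modified_hausdorff(mask_pred: np.ndarray, mask_gt: np.ndarray) -> float:
    """MHD = max of the two directed mean nearest-neighbor distances."""
    a = np.argwhere(mask_pred)  # (row, col) needle pixels in prediction
    b = np.argwhere(mask_gt)    # (row, col) needle pixels in ground truth
    return max(directed_mhd(a, b), directed_mhd(b, a))

A lower MHD indicates closer agreement between the predicted and actual needle; the value of ~3.73 reported for human data would be computed this way over the segmented needle pixels.
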
dc.description.comments This article is published as Hui, Xie, Praveenbalaji Rajendran, Tong Ling, Xianjin Dai, Lei Xing, and Manojit Pramanik. "Ultrasound-guided needle tracking with deep learning: A novel approach with photoacoustic ground truth." Photoacoustics 34 (2023): 100575. doi: https://doi.org/10.1016/j.pacs.2023.100575. © 2023 The Author(s). This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
dc.identifier.uri https://dr.lib.iastate.edu/handle/20.500.12876/dv6lM2xz
dc.language.iso en
dc.publisher Elsevier GmbH
dc.source.uri https://doi.org/10.1016/j.pacs.2023.100575
dc.subject.disciplines DegreeDisciplines::Medicine and Health Sciences::Nanotechnology
dc.subject.disciplines DegreeDisciplines::Medicine and Health Sciences::Analytical, Diagnostic and Therapeutic Techniques and Equipment
dc.subject.disciplines DegreeDisciplines::Engineering::Electrical and Computer Engineering
dc.subject.keywords Needle tracking
dc.subject.keywords Ultrasound imaging
dc.subject.keywords Photoacoustic imaging
dc.subject.keywords Deep learning
dc.title Ultrasound-guided needle tracking with deep learning: A novel approach with photoacoustic ground truth
dc.type article
dc.type.genre article
dspace.entity.type Publication
relation.isAuthorOfPublication 9d566b71-4eda-4bf6-9c9c-dbc2f9051c20
relation.isOrgUnitOfPublication a75a044c-d11e-44cd-af4f-dab1d83339ff
File (Original bundle)
Name: 2023-Pramanik-UltrasoundGuided.pdf
Size: 9.8 MB
Format: Adobe Portable Document Format