Academic Journal
Needle tracking in low-resolution ultrasound volumes using deep learning
| Title: | Needle tracking in low-resolution ultrasound volumes using deep learning |
|---|---|
| Authors: | Sarah Grube, Sarah Latus, Finn Behrendt, Oleksandra Riabova, Maximilian Neidhardt, Alexander Schlaefer |
| Source: | International Journal of Computer Assisted Radiology and Surgery (Int J Comput Assist Radiol Surg) |
| Publisher Information: | Springer Science and Business Media LLC, 2024. |
| Publication Year: | 2024 |
| Subject Terms: | 0206 medical engineering, Deep learning, 02 engineering and technology, Sparse feature learning, Volumetric ultrasound imaging, 03 medical and health sciences, Imaging, Three-Dimensional, 0302 clinical medicine, Liver, MLE@TUHH, Needles, Needle tip detection, Animals, Original Article, Real-time, Chickens [MeSH], Deep Learning [MeSH], Ultrasonography, Interventional/methods [MeSH], Animals [MeSH], Needles [MeSH], Imaging, Three-Dimensional/methods [MeSH], Liver/diagnostic imaging [MeSH] |
| Description: | **Purpose:** Clinical needle insertion into tissue, commonly assisted by 2D ultrasound imaging for real-time navigation, faces the challenge of precisely aligning the needle and the probe to avoid out-of-plane movement. Recent studies combine 3D ultrasound imaging with deep learning to overcome this problem, focusing on acquiring high-resolution images to create optimal conditions for needle tip detection. However, high resolution also requires considerable time for image acquisition and processing, which limits real-time capability. We therefore aim to maximize the US volume rate at the cost of low image resolution, and propose a deep learning approach to directly extract the 3D needle tip position from sparsely sampled US volumes. **Methods:** We design an experimental setup in which a robot inserts a needle into water and chicken liver tissue. Instead of relying on manual annotation, we derive the needle tip position from the known robot pose. During insertion, we acquire a large data set of low-resolution volumes using a 16 × 16 element matrix transducer at a volume rate of 4 Hz. We compare the performance of our deep learning approach with conventional needle segmentation. **Results:** Our experiments in water and liver show that deep learning outperforms the conventional approach while achieving sub-millimeter accuracy. The deep learning approach achieves mean position errors of 0.54 mm in water and 1.54 mm in liver. **Conclusion:** Our study underlines the strength of deep learning in predicting 3D needle positions from low-resolution ultrasound volumes. This is an important milestone for real-time needle navigation, simplifying the alignment of needle and ultrasound probe and enabling 3D motion analysis. |
| Document Type: | Article; Other literature type |
| Language: | English |
| ISSN: | 1861-6429 |
| DOI: | 10.1007/s11548-024-03234-8 |
| DOI (repository): | 10.15480/882.13574 |
| Access URL: | https://pubmed.ncbi.nlm.nih.gov/39002100; https://repository.publisso.de/resource/frl:6495691 |
| Rights: | CC BY |
| Accession Number: | edsair.doi.dedup.....f7fa58d7f4a8f4c28c3c3a74f3365a60 |
| Database: | OpenAIRE |