Title: Optical force estimation for interactions between tool and soft tissues.
Written by: M. Neidhardt, R. Mieling, M. Bengs and A. Schlaefer
in: <em>Scientific Reports</em> (2023).
Volume: <strong>13</strong>, Number: 1,
on pages: 506
DOI: 10.1038/s41598-022-27036-7
URL: https://doi.org/10.1038/s41598-022-27036-7
Abstract: Robotic assistance in minimally invasive surgery offers numerous advantages for both patient and surgeon. However, the lack of force feedback in robotic surgery is a major limitation, and accurately estimating tool-tissue interaction forces remains a challenge. Image-based force estimation offers a promising solution without the need to integrate sensors into surgical tools. In this indirect approach, interaction forces are derived from the observed deformation, with learning-based methods improving accuracy and real-time capability. However, the relationship between deformation and force is determined by the stiffness of the tissue. Consequently, both deformation and local tissue properties must be observed for an approach applicable to heterogeneous tissue. In this work, we use optical coherence tomography, which can combine the detection of tissue deformation with shear wave elastography in a single modality. We present a multi-input deep learning network for processing local elasticity estimates and volumetric image data. Our results demonstrate that accounting for elastic properties is critical for accurate image-based force estimation across different tissue types and properties. Joint processing of local elasticity information yields the best performance throughout our phantom study. Furthermore, we test our approach on soft tissue samples that were not present during training and show that generalization to other tissue properties is possible.
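
The abstract describes a multi-input network that fuses volumetric OCT data with local elasticity estimates to regress tool-tissue interaction force. The following is a minimal sketch of that general idea, not the authors' actual architecture: it assumes a PyTorch setup, a 3D-CNN branch for the OCT volume, an MLP branch for the elasticity estimate, and all layer sizes are illustrative assumptions.

```python
# Hypothetical multi-input force regression sketch (not the published model).
import torch
import torch.nn as nn

class MultiInputForceNet(nn.Module):
    def __init__(self, elasticity_dim: int = 1):
        super().__init__()
        # Volumetric branch: encodes tissue deformation visible in the OCT volume.
        self.volume_branch = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),  # -> (batch, 16)
        )
        # Elasticity branch: encodes the local stiffness estimate (e.g. from shear wave elastography).
        self.elasticity_branch = nn.Sequential(
            nn.Linear(elasticity_dim, 16), nn.ReLU(),
        )
        # Joint head: fuses both feature vectors and regresses a scalar force.
        self.head = nn.Sequential(
            nn.Linear(16 + 16, 32), nn.ReLU(),
            nn.Linear(32, 1),
        )

    def forward(self, volume: torch.Tensor, elasticity: torch.Tensor) -> torch.Tensor:
        feats = torch.cat(
            [self.volume_branch(volume), self.elasticity_branch(elasticity)], dim=1
        )
        return self.head(feats)

# Example: a batch of 2 single-channel 32^3 volumes, each with one elasticity value.
model = MultiInputForceNet()
force = model(torch.randn(2, 1, 32, 32, 32), torch.randn(2, 1))
print(force.shape)  # torch.Size([2, 1])
```

The sketch only illustrates why joint processing is possible in one network: the deformation-dependent image features and the stiffness estimate are concatenated before the regression head, so the force prediction can depend on both.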