Purpose: Length and width measurements of the kidneys aid in the detection and monitoring of structural abnormalities and organ disease. Manual measurement is complex, time-consuming, prone to error, and subject to intra- and inter-rater variability. We propose an automated, machine-learning-based approach for quantifying kidney dimensions from two-dimensional (2D) ultrasound images of both native and transplanted kidneys.
Approach: An nnU-Net model was trained on 514 images to segment the kidney capsule in standard longitudinal and transverse views. Two expert sonographers and three medical students manually measured the maximal kidney length and width in 132 ultrasound cines. The segmentation algorithm was then applied to the same cines, region fitting was performed, and the maximal kidney length and width were measured. Additionally, single-kidney volume was estimated for 16 patients using either manual or automatic measurements.
Results: Expert measurements yielded a length of 84.8 ± 26.4 mm [95% CI: 80.0, 89.6] and a width of 51.8 ± 10.5 mm [49.9, 53.7]. The algorithm yielded a length of 86.3 ± 24.4 mm [81.5, 91.1] and a width of 47.1 ± 12.8 mm [43.6, 50.6]. Measurements from experts, novices, and the algorithm did not differ significantly from one another (p > 0.05). Bland–Altman analysis showed a mean difference of 2.6 mm (SD = 1.2 mm) between the algorithm and the experts, compared with a mean difference of 3.7 mm (SD = 2.9 mm) for the novices. For volumes, the mean absolute difference was 47 mL (31%), consistent with an error of roughly 1 mm in each of the three dimensions.
Conclusions: This pilot study demonstrates the feasibility of an automatic tool for measuring in vivo kidney length, width, and volume from standard 2D ultrasound views with accuracy and reproducibility comparable to expert sonographers. Such a tool may enhance workflow efficiency, assist novices, and aid in tracking disease progression.
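As a rough illustration of the measurement step described above, the sketch below fits an ellipse to a binary kidney mask and applies the standard prolate-ellipsoid volume formula (π/6 · L · W · D). The abstract does not specify the exact region-fitting method or volume formula used; scikit-image's regionprops ellipse fit, the function names, and the pixel spacing are assumptions made here for illustration only.

```python
# Hypothetical sketch of kidney biometrics from a 2D segmentation mask.
# Assumes an isotropic pixel spacing (mm_per_px) and a single dominant
# connected component; not the paper's actual pipeline.
import numpy as np
from skimage.measure import label, regionprops

def kidney_biometrics(mask: np.ndarray, mm_per_px: float):
    """Return (length_mm, width_mm) of the largest segmented region."""
    regions = regionprops(label(mask.astype(np.uint8)))
    kidney = max(regions, key=lambda r: r.area)  # keep largest component
    length_mm = kidney.major_axis_length * mm_per_px  # fitted-ellipse major axis
    width_mm = kidney.minor_axis_length * mm_per_px   # fitted-ellipse minor axis
    return length_mm, width_mm

def ellipsoid_volume_ml(length_mm: float, width_mm: float, depth_mm: float) -> float:
    """Prolate-ellipsoid volume, pi/6 * L * W * D, converted mm^3 -> mL."""
    return (np.pi / 6.0) * length_mm * width_mm * depth_mm / 1000.0

# Example: an 85 x 52 x 45 mm kidney gives roughly 104 mL.
print(ellipsoid_volume_ml(85.0, 52.0, 45.0))
```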
The desire to improve patient safety and clinical precision has prompted research into real-time, single-operator, image-guided solutions for neuraxial anesthesia. Ultrasound is ideal for this application because it is real-time, non-ionizing, and, with recent advances, ultra-portable. Previous work investigated the use of 3D ultrasound and 2D in-plane imaging to track needle insertions but faced barriers to successful clinical translation. The EpiGuide 2D is a novel multi-channel out-of-plane needle guide that addresses deficiencies observed in prior designs. Specifically, it leverages beam thickness, an inherent imaging artifact, to provide needle visibility over a range of depths. The current work investigates the ability of the EpiGuide 2D to visualize out-of-plane needle insertions. Two needle types were explored at nine needle angles across five distinct imaging depths. Benchtop testing assessed the stability of the guide's open channels; subsequent water-bath testing established baseline visibility metrics across all angles; finally, testing was performed on an ex vivo porcine model. A total of n = 424 needle insertions were performed, and the visible range and contrast-to-noise ratio were measured for each insertion. As the needle angle approached parallel to the imaging plane, the visible range increased. Higher needle echogenicity also increased the visible range in the water-bath setting but did not have a statistically significant effect on visible range in the porcine model. The EpiGuide 2D accommodates needle visualization in tissue at depths of 21 mm to 53 mm. Further in vivo studies are warranted.
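For context on the visibility metric, below is a minimal sketch of one common contrast-to-noise ratio (CNR) definition applied to a needle region of interest against a background region. The study's exact CNR formula and ROI selection are not given in the abstract; the definition and names here are assumptions.

```python
# Illustrative CNR computation for a needle ROI vs. background ROI.
# Uses a common definition, |mu_n - mu_b| / sqrt(var_n + var_b); the
# study may define CNR differently.
import numpy as np

def cnr(image: np.ndarray, needle_mask: np.ndarray, bg_mask: np.ndarray) -> float:
    """CNR between pixels under two boolean masks of the same image."""
    needle = image[needle_mask].astype(np.float64)
    background = image[bg_mask].astype(np.float64)
    return abs(needle.mean() - background.mean()) / np.sqrt(
        needle.var() + background.var()
    )
```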
A projector-based augmented reality intracorporeal system (PARIS) is presented that comprises a miniature tracked projector, a tracked marker, and a laparoscopic ultrasound (LUS) transducer. PARIS was developed to improve the efficacy and safety of laparoscopic partial nephrectomy (LPN). In particular, it has been demonstrated to assist in identifying tumor boundaries during surgery and to improve the surgeon's understanding of the underlying anatomy. PARIS achieves this by displaying an orthographic projection of the cancerous tumor on the kidney's surface. Its performance was evaluated in a user study in which two surgeons performed 32 simulated robot-assisted partial nephrectomies: 16 with PARIS for guidance and 16 with only an LUS transducer for guidance. With PARIS, there was a significant reduction [30% (p < 0.05)] in the amount of healthy tissue excised and a trend toward more accurate dissection around the tumor and more negative margins. The combined point-tracking and reprojection root-mean-square error of PARIS was 0.8 mm. PARIS's demonstrated improvement of key LPN metrics, together with qualitative feedback from the surgeons, supports the hypothesis that it is an effective surgical navigation tool.
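The reported 0.8 mm figure is a root-mean-square error over tracked and reprojected points. Below is a minimal sketch of such a computation, assuming matched point sets; the study's actual point correspondence and tracking pipeline are not described in the abstract.

```python
# Minimal sketch of a point-tracking/reprojection RMS error, as reported
# for PARIS. The matched point sets here are illustrative placeholders.
import numpy as np

def rms_error(projected: np.ndarray, reference: np.ndarray) -> float:
    """RMS Euclidean distance between matched N x 2 (or N x 3) point sets."""
    distances = np.linalg.norm(projected - reference, axis=1)
    return float(np.sqrt(np.mean(distances ** 2)))

# Example with synthetic 2D points perturbed by ~1 mm noise.
reference = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 10.0], [0.0, 10.0]])
projected = reference + np.array([[0.5, -0.6], [0.8, 0.2], [-0.7, 0.4], [0.3, -0.9]])
print(rms_error(projected, reference))
```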