Fluorescence imaging is widely used during tumor surgery to help the surgeon better recognize the tumor's location and margins. However, traditional fluorescence-guided surgery (FGS) typically relies on steady-state fluorescence images, which provide only a flat view of the target and lack depth information. Here we use the time-of-flight (TOF) method to measure the distance between the sensor and the target, which allows us to distinguish objects surrounded by background fluorescence. By comparing the temporal profiles of fluorescent inclusions at different depths, we can colormap the fluorescence images to show differences in depth. Moreover, we propose a deep learning model that combines a CNN with a time-based model (LSTM) to capture more precise depth maps and 3D information. We trained and validated this new network using Monte-Carlo-based simulation datasets.
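The core TOF idea above can be sketched in a few lines: a fluorophore deeper in tissue returns light later, and the extra delay converts to depth via the speed of light slowed by the tissue refractive index. The following is a minimal illustrative sketch, not the authors' method; the refractive index value, pulse shapes, and the cross-correlation delay estimator are assumptions for illustration, and real tissue scattering is ignored.

```python
import numpy as np

C = 3e8          # speed of light in vacuum, m/s
N_TISSUE = 1.4   # assumed tissue refractive index (illustrative value)

def depth_from_delay(delta_t_s, n=N_TISSUE):
    """Convert a round-trip time-of-flight delay to depth.

    Light travels to the fluorophore and back, slowed by the tissue
    refractive index, hence the factor 2n in the denominator.
    """
    return C * delta_t_s / (2.0 * n)

def estimate_delay(profile_a, profile_b, dt_s):
    """Estimate the temporal shift between two fluorescence temporal
    profiles by cross-correlation (a toy stand-in for comparing
    measured temporal profiles)."""
    corr = np.correlate(profile_b, profile_a, mode="full")
    lag = np.argmax(corr) - (len(profile_a) - 1)
    return lag * dt_s

# Toy example: two Gaussian pulses separated by 100 ps
dt = 1e-12                         # 1 ps time bins
t = np.arange(0, 2e-9, dt)
pulse = lambda t0: np.exp(-((t - t0) ** 2) / (2 * (50e-12) ** 2))
shallow, deep = pulse(0.5e-9), pulse(0.6e-9)

delay = estimate_delay(shallow, deep, dt)   # ~100 ps
extra_depth = depth_from_delay(delay)       # ~1.07 cm of extra round-trip path
```

In practice the temporal profiles are broadened and distorted by scattering in deep tissue, which is why a learned model (the CNN+LSTM above) is used instead of a simple peak-delay estimate.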
Shiru Wang and Petr Bruza
"Time-of-flight fluorescence imaging in deep tissue: towards machine learning assisted depth sensing", Proc. SPIE PC12827, Multiscale Imaging and Spectroscopy V, PC128270F (13 March 2024); https://doi.org/10.1117/12.3003257