We recently proposed (BOE, July 2022) an interferometric approach called full-field optical transmission tomography (FFOTT), based on the Gouy phase shift that manifests at the focus of microscope objectives. Because forward scattering from cellular structures larger than 100 nm is much stronger than the backscattering used in OCT, the performance constraints on the imaging system are strongly relaxed, and setups built from inexpensive microscopes and a smartphone become possible. Note that good-quality 100X, NA = 1.25 objectives are available for less than $100.
We show images of cells and tissues through their morphological or metabolic contrasts as modified by chemical or biological environments.
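The Gouy phase shift invoked above is the standard Gaussian-beam result (a textbook relation, not taken from the cited paper): a beam focused by the objective accumulates an extra on-axis phase

```latex
\phi_G(z) = -\arctan\!\left(\frac{z}{z_R}\right), \qquad z_R = \frac{\pi w_0^2}{\lambda},
```

amounting to a total shift of $\pi$ across the focus. As a sketch of the contrast mechanism, interference between this phase-shifted illumination and the forward-scattered field from the sample is what the transmission geometry detects.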
Histochemical staining is traditionally performed using chemical labeling, which can be time-consuming and expensive, particularly when multiple stains are needed. We present a technique for virtually staining histological tissues using deep learning. Because the staining is performed computationally, multiple stains can be applied to each tissue section, allowing pathologists to extract more information from a single section. The stains can be generated from autofluorescence images of unlabeled tissue sections, or from scans of H&E-stained tissues, which fits into existing pathology workflows. These virtual stains have been validated in blind studies by board-certified pathologists.
Assessing the mechanical properties of tissue plays an important role in disease diagnosis and clinical examination. Here, we present a low-resource, cost-effective method that uses digital camera technology to map the mechanical properties of tissue, termed camera-based optical palpation. We applied this technique to breast cancer detection and burn scar assessment, validating its ability to generate high mechanical contrast between various tissue regions for clinical applications. We also implemented camera-based optical palpation on a smartphone, demonstrating its potential for telehealth applications in rural and remote areas and improving equity of access to optimal treatment for people around the world.
Advances in Automated Visual Evaluation in Cervical Applications
The focus of our research efforts is to improve the prevention and treatment of cancer, with an emphasis on women's cancers, which are disproportionately prevalent in low- and middle-income countries. We strive to accelerate the impact of technology and innovation to address global health inequities, with cervical cancer prevention serving as an exemplar. Specifically, we have created an imaging device called the Pocket colposcope, which transforms the complex, costly, and cumbersome clinical colposcope (for cervical cancer diagnosis) used in specialized settings into a simple, inexpensive, hand-held device that can be deployed in a local clinic. A companion deep learning algorithm that our team has developed can assist primary care providers, particularly midwives and nurses, where there is limited access to specialists. We are currently collaborating with partners in Peru and Kenya to integrate our technologies into community-based models of care delivery to provide timely screening, diagnosis, and treatment/referral. Our technology initiative ensures a continuum of care throughout the patient journey.
Background: Cervical cancer disproportionately harms women in low- and middle-income countries (LMICs). There is increasing interest in automated visual evaluation (AVE) – using artificial intelligence to analyze cervical images at the point of care (PoC) – for managing patients in LMICs. AVE has a diagnostic component (for pathology) and a quality component (to ensure image adequacy). The quality component must run on the imaging device at the PoC and is limited by the device's processors. Methods: A novel, multiple-module algorithm for assessing cervical image quality was developed in an Android application. One module locates the cervix and another detects objects obstructing the transformation zone. The cervix locator is an object detection model that determines the bounding box of the cervix; models trained on two architectures (YOLOv5 and EfficientDet-Lite2) with the same data were compared. For obstruction classification, a multi-task model was trained to detect 5 common obstructions (blood, SCJ inside the os, loose vaginal walls, mucus, blur/glare) as well as obstruction-free cervix images. The performance of the model's tasks was compared on 2 different imaging devices. Results and Discussion: The YOLOv5 cervix locator performed better and was faster, although the differences were minimal. In the obstructions classifier, 4 tasks (loose vaginal walls, blood, SCJ inside the os, and obstruction-free) performed satisfactorily. For all modules, the full computation time was <10 sec. Both modules met the desired performance thresholds for image adequacy assessment. The algorithm shown here is, to our knowledge, the first AVE quality classifier running on a mobile device.
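The two-module adequacy logic described above can be sketched as a simple gating pipeline. This is a hypothetical illustration only: the model calls are stubbed with fixed outputs, and all function names and thresholds are invented, standing in for the actual YOLOv5 detector and multi-task classifier.

```python
# Hypothetical sketch of the two-module image-adequacy pipeline: a cervix
# locator (object detector) gates an obstruction classifier. Model outputs
# are stubbed; names and thresholds are illustrative only.

OBSTRUCTION_TASKS = ["blood", "scj_inside_os", "loose_vaginal_walls",
                     "mucus", "blur_glare", "obstruction_free"]

def locate_cervix(image):
    """Stand-in for the detector: returns (bounding box, confidence)."""
    return (40, 40, 200, 200), 0.93  # fixed stub for illustration

def classify_obstructions(image, box):
    """Stand-in for the multi-task classifier: per-task probabilities."""
    return {"blood": 0.05, "scj_inside_os": 0.10, "loose_vaginal_walls": 0.08,
            "mucus": 0.04, "blur_glare": 0.12, "obstruction_free": 0.90}

def assess_adequacy(image, det_thresh=0.5, obs_thresh=0.5):
    box, conf = locate_cervix(image)
    if conf < det_thresh:
        return False, "cervix not found"
    probs = classify_obstructions(image, box)
    blocking = [t for t in OBSTRUCTION_TASKS[:-1] if probs[t] >= obs_thresh]
    if blocking:
        return False, "obstructed: " + ", ".join(blocking)
    return True, "adequate"

print(assess_adequacy(None))  # → (True, 'adequate')
```

On a real device each stub would be a quantized on-device model invocation, which is why the abstract emphasizes the <10 sec computation budget.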
Background: Cervical cancer is a significant burden in many health systems in low- and middle-income countries (LMICs). Recently, automated visual evaluation (AVE) – using artificial intelligence to analyze cervical images at the point of care (PoC) – has been gaining interest as a new diagnostic test in LMICs. Multiple studies have shown that blur (defocus) is the most common challenge to capturing cervical images that are adequate for evaluation by AVE. Methods to reduce blur in cervical images are therefore critical, yet auto-focus functionality degrades when an auxiliary lens is placed on a phone. Methods: A cervical image quality analysis algorithm that includes blur assessment was implemented in an Android application. The algorithm comprises an auto-focus module and a secondary blur assessment using deep learning (DL). The auto-focus module was evaluated by bench testing on static cervix images, and two DL approaches (supervised and self-supervised models) were compared on an external dataset. Results and Discussion: A frame-by-frame analysis on the Samsung J530 and A52, each imaging 3 static images, verified that the least blurry frame was selected. The average time for one auto-focus sweep was 8367 ± 630 ms and 7555 ± 146 ms for the J530 and A52, respectively. Within the obstructions detector, the self-supervised model performed better under high blur, with an area under the receiver operating characteristic (ROC) curve (AUC) as high as 0.888, while the supervised model performed better with less blur, with ROC AUC values reaching 0.735. To our knowledge, this is the first working targeted auto-focus for cervical imaging.
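The frame-selection step of an auto-focus sweep can be illustrated with the classic variance-of-Laplacian focus measure. This is a standard sharpness metric, shown here as a minimal sketch; it is not necessarily the metric or the deep learning model used in the work above.

```python
import numpy as np

def sharpness(frame):
    """Variance of the discrete Laplacian, a common focus metric.
    Higher values indicate a sharper (less blurry) frame."""
    f = frame.astype(float)
    lap = (-4 * f[1:-1, 1:-1]
           + f[:-2, 1:-1] + f[2:, 1:-1]
           + f[1:-1, :-2] + f[1:-1, 2:])
    return lap.var()

def least_blurry(frames):
    """Return the index of the frame with the highest Laplacian variance."""
    return max(range(len(frames)), key=lambda i: sharpness(frames[i]))

# Demo: a high-contrast checkerboard is sharp; a flat image (the limit of
# heavy blur) scores near zero.
sharp = (np.indices((64, 64)).sum(0) % 2) * 255.0
blurry = np.full((64, 64), sharp.mean())
print(least_blurry([blurry, sharp]))  # → 1
```

During a sweep, each preview frame would be scored this way and the lens position of the best-scoring frame retained.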
Antimicrobial resistance has emerged as a major global threat. Due to the lack of rapid antimicrobial susceptibility testing (AST), generic antibiotics are typically prescribed. Pathogens possess various metabolic biomarkers, such as NAD(P)H and flavins, which exhibit autofluorescence. We developed a rapid phenotypic point-of-care AST device that leverages the changes in autofluorescence when pathogens encounter antibiotic stress. The device integrates multiple wavelength sources to excite the various biomarkers, and a CMOS camera with optical filters to capture the emitted autofluorescence intensity. The results show that the device can determine antibiotic susceptibility, including the minimum inhibitory concentration, in under 5 hours, making it suitable for point-of-care testing.
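The decision logic of such a phenotypic readout can be sketched as follows. This is an illustrative stand-in, not the device's actual algorithm: growth of the autofluorescence signal in an untreated control versus its suppression under antibiotic implies susceptibility, and the MIC is the lowest concentration that inhibits growth. All numbers and the inhibition threshold are invented.

```python
# Illustrative sketch: antibiotic susceptibility from autofluorescence
# intensity time series (treated wells vs. untreated control).

def growth_ratio(series):
    """Fold change of autofluorescence intensity over the assay window."""
    return series[-1] / series[0]

def find_mic(control, by_concentration, inhibition=0.5):
    """Lowest concentration whose growth is < `inhibition` x control growth.
    `by_concentration`: {concentration_ug_per_ml: intensity series}."""
    ref = growth_ratio(control)
    for conc in sorted(by_concentration):
        if growth_ratio(by_concentration[conc]) < inhibition * ref:
            return conc
    return None  # resistant at all tested concentrations

control = [100, 180, 320, 560]            # untreated: intensity grows
treated = {0.5: [100, 170, 300, 520],     # still growing
           1.0: [100, 120, 130, 135],     # inhibited
           2.0: [100, 105, 102, 100]}     # inhibited
print(find_mic(control, treated))          # → 1.0
```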
We present a high-throughput and automated system for the early detection and classification of bacterial colony-forming units (CFUs) using a thin-film transistor (TFT) image sensor. A lens-free imager was built using the TFT sensor with a ~7 cm² field-of-view to collect the time-lapse images of bacterial colonies. Two trained neural networks were used to detect and classify the bacterial colonies based on their spatio-temporal features. Our system achieved an average CFU detection rate of 97.3% at 9 hours of incubation and an average CFU recovery rate of 91.6% at ~12 hours, saving ~12 hours compared to the EPA-approved method.
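The spatio-temporal idea behind the detection network can be caricatured with a simple rule: a candidate spot counts as a colony only if its segmented area grows steadily across time-lapse frames. This toy rule is an invented stand-in for the two trained neural networks described above.

```python
# Illustrative stand-in for spatio-temporal CFU detection: accept a
# candidate only if its area increases monotonically and substantially
# across time-lapse frames (static debris fails this test).

def is_growing_colony(areas_px, min_growth=1.2):
    """areas_px: candidate's segmented area (pixels) per time-lapse frame."""
    if len(areas_px) < 2 or areas_px[0] <= 0:
        return False
    monotonic = all(b >= a for a, b in zip(areas_px, areas_px[1:]))
    return monotonic and areas_px[-1] / areas_px[0] >= min_growth

print(is_growing_colony([12, 18, 27, 40]))   # → True  (steady growth)
print(is_growing_colony([30, 29, 31, 30]))   # → False (static debris)
```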
The complete blood count (CBC) is a foundational diagnostic test, but its accessibility is limited due to the blood draw, expensive laboratory equipment, and trained personnel required. Here, we present a cell phone microscope design for achieving phase contrast in high resolution capillary imaging, which allows individual blood cells to be imaged for a non-invasive CBC. The cell phone microscope uses a reversed lens as an objective to maintain a high resolution. Relay lenses create space for incorporation of an offset LED that can be critically imaged to produce oblique back illumination, resulting in phase contrast.
We demonstrate a computational paper-based vertical flow assay (VFA) for point-of-care serodiagnosis of Lyme disease (LD). We leveraged the multiplexed nature of the VFA and functionalized it using different antigen panels specific to LD. The paper-based VFA operation takes <20 min, after which a hand-held reader captures an image of the sensing membrane. A deep learning-based algorithm processes the signals from multiple immunoreactions to output a diagnostic decision (i.e., positive/negative). This cost-effective computational VFA platform achieved a sensitivity and a specificity of 90.5% and 87%, respectively, demonstrating its promising potential for point-of-care diagnosis of LD even in resource-limited settings.
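The final decision step can be sketched as a model mapping per-spot signals to a binary call. A logistic model stands in here for the paper's deep network; the antigen names, weights, and threshold are all invented for illustration.

```python
import math

# Hypothetical sketch: the reader extracts one signal per antigen spot on
# the sensing membrane, and a trained model maps the vector to a
# positive/negative call. Weights and spot names are illustrative only.

WEIGHTS = {"VlsE": 1.8, "OspC": 1.2, "P41": 0.9, "control": -0.5}
BIAS = -2.0

def diagnose(spot_signals, threshold=0.5):
    """spot_signals: {spot_name: normalized intensity in [0, 1]}."""
    z = BIAS + sum(WEIGHTS[a] * s for a, s in spot_signals.items())
    p = 1.0 / (1.0 + math.exp(-z))
    return ("positive" if p >= threshold else "negative", round(p, 3))

print(diagnose({"VlsE": 0.9, "OspC": 0.7, "P41": 0.6, "control": 0.1}))
```

The operating threshold is what trades sensitivity against specificity, which is how figures like 90.5%/87% arise from one trained model.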
We present a stain-free, rapid, and automated viral plaque assay using deep learning and holography, which needs significantly less sample incubation time than traditional plaque assays. A portable and cost-effective lens-free imaging prototype was built to record the spatio-temporal features of the plaque-forming units (PFUs) during their growth, without the need for staining. Our system detected the first cell lysing events as early as 5 hours of incubation and achieved >90% PFU detection rate with 100% specificity in <20 hours, saving >24 hours compared to the traditional viral plaque assays that take ≥48 hours.
We present a computational mobile imaging device that captures holograms of aerosols through a virtual impactor, a flow-based device designed to detect aerosols. A differential detection scheme localizes all the flowing particles in air, and their auto-focused holograms are used to classify them using a trained neural network without any labels/stains. To test this cost-effective mobile device, we aerosolized different types of pollen (Bermuda, Elm, Oak, Pine, Sycamore, and Wheat) and achieved a blind testing classification accuracy of 92.91%. This cost-effective mobile system can be used as a long-term air quality monitor to automatically count/sense particulate matter and various allergens.
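The differential detection scheme can be illustrated with simple frame differencing: subtracting consecutive holographic frames cancels the static background so only flowing particles remain, and thresholding the difference localizes them. This is a minimal sketch with invented parameters, not the device's full pipeline.

```python
import numpy as np

# Minimal sketch of differential detection: the difference of two frames
# suppresses the static background; thresholding localizes moving particles.

def localize_moving(prev_frame, frame, thresh=20):
    diff = np.abs(frame.astype(int) - prev_frame.astype(int))
    ys, xs = np.nonzero(diff > thresh)
    return list(zip(ys.tolist(), xs.tolist()))

background = np.full((8, 8), 50, dtype=np.uint8)
with_particle = background.copy()
with_particle[3, 5] = 200   # a particle entered the field of view
print(localize_moving(background, with_particle))  # → [(3, 5)]
```

In the full system, each localized position would then seed hologram auto-focusing and neural-network classification.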
Optical coherence tomography (OCT) is a standard-of-care technology for ophthalmic diagnostics and is widely used for managing expensive and complex therapies in age-related macular degeneration (AMD). Extending the technique for home use has tremendous utility: the high heterogeneity of the patient population requires optimization of expensive and complex treatment protocols, which can only be achieved via high-frequency monitoring at home. Extending a complex 3D imaging technology to the home requires innovation on multiple fronts. Technical and usability methods were developed to allow self-imaging by patients with poor vision, and machine learning-based techniques are used to manage the high throughput of data and extract the essential information for physicians. Patient compliance over an extended period is essential for the success of any home monitoring program, which requires a sophisticated support infrastructure. The presentation will discuss results of self-imaging using home OCT on over 600 eyes. Machine learning-based analysis of the retinal images, which quantifies fluid in diseased eyes to understand its temporal dynamics, will be demonstrated, and the performance of the technique in longitudinal studies demonstrating compliance using a home-monitoring-as-a-service infrastructure will be discussed. Home OCT development carries essential lessons for personalizing and decentralizing a number of complex technologies.
We present a miniaturized optical coherence tomography (OCT) setup based on photonic integrated circuits (PICs) for the 850 nm range. We designed a 512-channel arrayed waveguide grating (AWG) on a PIC for spectral-domain OCT (SD-OCT) that is co-integrated with PIN photodiodes and analog-to-digital converters on a single chip. This image sensor is combined with all the necessary electronics to act as a camera and is integrated into a fiber-based OCT system, achieving a sensitivity of >80 dB; various samples were imaged. This optoelectronic system will allow building small and cost-effective OCT systems for monitoring retinal diseases.
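The reconstruction step that such an AWG spectrometer feeds is standard SD-OCT processing: an A-scan is the inverse Fourier transform of the background-subtracted spectral interferogram. The sketch below simulates a single reflector over 512 channels; the wavenumber range and depth are illustrative, and linear-in-wavenumber sampling is assumed.

```python
import numpy as np

# Sketch of SD-OCT A-scan reconstruction from a 512-channel spectrum.
n_channels = 512
k = np.linspace(6.8e6, 8.0e6, n_channels)   # wavenumber (rad/m), ~850 nm band
depth = 120e-6                               # single reflector at 120 um
spectrum = 1.0 + 0.5 * np.cos(2 * k * depth) # interference fringes in k
a_scan = np.abs(np.fft.ifft(spectrum - spectrum.mean()))
peak_bin = int(np.argmax(a_scan[: n_channels // 2]))

# Fringe frequency in k maps to depth: z = m * pi / (N * dk) for bin m.
dk = k[1] - k[0]
z_axis = np.arange(n_channels) * np.pi / (n_channels * dk)
print(f"reflector found at ~{z_axis[peak_bin] * 1e6:.0f} um")
```

The spectrometer's channel count and wavenumber span set the depth range and sampling of the A-scan, which is why a 512-channel AWG is a meaningful design point.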
A low-cost swept-source OCT (SS-OCT) system for retinal imaging was realized, based on a thermally tuned vertical-cavity surface-emitting laser (VCSEL) whose center wavelength is tuned by adjusting the operating temperature through modulation of the injection current. Sweep rates of 50-100 kHz, a sensitivity of 97 dB, and an axial resolution of about 50 μm in air were achieved. We present in vivo imaging results of a human retina obtained with this thermally tuned VCSEL-based SS-OCT system. Based on our results, we believe this technology can serve as a cost-effective OCT alternative for point-of-care diagnostics.
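The ~50 μm axial resolution is consistent with the narrow sweep bandwidth of a thermally tuned VCSEL, via the standard Gaussian-spectrum relation δz = (2 ln 2 / π) · λ²/Δλ. In the check below, only the ~50 μm figure comes from the text; the 850 nm center wavelength and ~6.4 nm bandwidth are assumptions chosen to illustrate the relation.

```python
import math

# Back-of-the-envelope check of the reported axial resolution using
# dz = (2 ln 2 / pi) * lambda^2 / dlambda (Gaussian spectrum assumed).
# Center wavelength and bandwidth below are illustrative assumptions.

def axial_resolution(center_wavelength_m, bandwidth_m):
    return (2 * math.log(2) / math.pi) * center_wavelength_m**2 / bandwidth_m

dz = axial_resolution(850e-9, 6.4e-9)
print(f"{dz * 1e6:.0f} um")  # ~50 um, matching the reported value
```

A thermal sweep of only a few nanometers is orders of magnitude narrower than MEMS-tuned VCSELs, which is the price paid for the low cost.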
Photodynamic therapy (PDT) is a therapeutic modality that offers a minimally invasive alternative for the treatment of high-grade cervical intraepithelial neoplasia (CIN). This prospective, randomized, controlled clinical trial aims to compare two PDT protocols for the treatment of histopathologically similar high-grade CIN. The patients were followed 60 days and two years after treatment by cytology, hybrid capture, and pathology analysis after excision of the transformation zone (ETZ). Preliminary results after the two-year follow-up indicate that combining ectocervix and endocervix treatment leads to better cure rates for high-grade CIN.