Following resection of cancerous tissues, specimens are excised from the surgical margins and examined post-operatively for the presence of residual cancer cells. Hematoxylin and eosin (H&E) staining is the gold standard for histopathological assessment. Ultraviolet photoacoustic remote sensing (UV-PARS) microscopy, combined with scattering microscopy, provides virtual nuclear and cytoplasmic contrast similar to that of H&E staining. A generative adversarial network (GAN) deep learning approach, specifically a CycleGAN, was used to perform style transfer and improve the histological realism of UV-PARS-generated images. Post-CycleGAN images are easier for a pathologist to examine and can serve as input to existing machine learning pipelines developed for H&E-stained images.
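The key property that makes CycleGAN suitable here is that it learns the UV-PARS-to-H&E style mapping from *unpaired* images, enforced by a cycle-consistency loss: translating an image to the other domain and back should recover the original. A minimal sketch of that loss term, using toy linear "generators" in place of the paper's convolutional networks (all names and the linear simplification are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-channel "images" flattened to vectors (hypothetical stand-ins
# for UV-PARS patches; real inputs are 2-D image tiles).
x_pars = rng.random((4, 16))

# Toy generators as linear maps: G translates PARS -> virtual H&E,
# F translates H&E -> PARS. Here F is chosen as G's exact inverse so the
# cycle loss is near zero; in training, both are learned networks.
G = 0.1 * rng.random((16, 16)) + np.eye(16)
F = np.linalg.inv(G)

def cycle_consistency_loss(x, G, F):
    """L1 distance between x and F(G(x)) -- the term that lets CycleGAN
    train on unpaired PARS/H&E data by requiring round-trip fidelity."""
    x_cycled = x @ G @ F
    return float(np.mean(np.abs(x - x_cycled)))

loss = cycle_consistency_loss(x_pars, G, F)
print(loss)  # near zero, since F exactly inverts G in this toy setup
```

In the full CycleGAN objective this term is weighted and added to two adversarial losses (one discriminator per domain), which push translated images toward the target domain's appearance, here, the look of H&E-stained slides.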