H&E Staining Deep Learning Models May Preserve Prostate Cancer Tissue Samples


Researchers detailed methods and processes that may be useful for additional research and validation of computational hematoxylin and eosin staining deep learning models and the images generated by them.

A cross-sectional study published in JAMA Network Open detailed methods and processes that may be useful for additional research and validation of computational hematoxylin and eosin (H&E) staining deep learning models and the images generated by them.

Researchers suggested that the adoption of such systems may reduce the time and effort necessary for manual staining and slide preparation. More importantly, these systems may enable the preservation of precious tissue samples, which could be used in a targeted fashion for biomarker evaluation. 

“By describing explainable algorithms and quantitative methods that can consistently, rapidly, and accurately perform computational staining and destaining of prostate biopsy red, green, and blue (RGB) channel whole slide images (WSI), this study communicates a detailed method and process that may be useful to generate evidence for clinical and regulatory authentication of computationally H&E stained images,” the authors wrote. “However, greater numbers of virtually stained H&E images sourced from larger pools of patients are needed for prospective evaluation of such models.”

Researchers used hundreds of thousands of native non-stained RGB WSI patches of prostate core tissue biopsies, obtained from excess tissue material from biopsies performed in the course of routine clinical care at Brigham and Women’s Hospital in Boston, Massachusetts. The non-stained images were registered with their H&E dye-stained counterparts.
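Patch extraction from whole slide images can be illustrated with a short sketch. The example below assumes the openslide-python library, a 256-pixel tile size, and a simple brightness-based tissue filter, all of which are illustrative choices rather than details taken from the study; it tiles a hypothetical non-stained slide file into RGB patches.

```python
# A minimal sketch of extracting fixed-size RGB patches from a whole slide image
# (WSI). Paths, patch size, and the tissue threshold are illustrative assumptions.
import numpy as np
import openslide  # pip install openslide-python

PATCH_SIZE = 256          # assumed patch size in pixels
TISSUE_THRESHOLD = 0.5    # assumed minimum fraction of non-background pixels

def extract_patches(wsi_path, level=0):
    """Yield (x, y, patch) tuples for tiles that contain mostly tissue."""
    slide = openslide.OpenSlide(wsi_path)
    width, height = slide.level_dimensions[level]
    for y in range(0, height - PATCH_SIZE, PATCH_SIZE):
        for x in range(0, width - PATCH_SIZE, PATCH_SIZE):
            region = slide.read_region((x, y), level, (PATCH_SIZE, PATCH_SIZE))
            patch = np.array(region.convert("RGB"))
            # Treat near-white pixels as glass background; keep tissue-rich tiles.
            tissue_fraction = (patch.mean(axis=2) < 220).mean()
            if tissue_fraction >= TISSUE_THRESHOLD:
                yield x, y, patch

# Example with a hypothetical file name:
# patches = list(extract_patches("nonstained_core_biopsy.svs"))
```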

Thereafter, conditional generative adversarial neural networks (cGANs) were trained to automate conversion of native non-stained RGB WSI into computationally H&E-stained images. Moreover, deidentified prostate core biopsy WSI and medical record data were transferred to the Massachusetts Institute of Technology, Cambridge, for computational research, and the results were shared with physicians for clinical evaluation.
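For readers unfamiliar with this class of model, the sketch below shows a pix2pix-style conditional GAN for paired image-to-image translation in PyTorch. The layer sizes, loss weights, and training loop are illustrative assumptions and do not reproduce the architecture or hyperparameters used in the study.

```python
# A minimal PyTorch sketch of a conditional GAN that maps non-stained RGB patches
# to H&E-like patches. Architecture and hyperparameters are illustrative only.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Tiny encoder-decoder standing in for a U-Net style generator."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """PatchGAN-style discriminator conditioned on the non-stained input."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(128, 1, 4, padding=1),  # per-patch real/fake logits
        )

    def forward(self, nonstained, stained):
        return self.net(torch.cat([nonstained, stained], dim=1))

def training_step(gen, disc, opt_g, opt_d, nonstained, stained, l1_weight=100.0):
    """One adversarial + L1 update on a paired (non-stained, H&E) batch."""
    bce, l1 = nn.BCEWithLogitsLoss(), nn.L1Loss()

    # Discriminator: real pairs vs. generated pairs.
    fake = gen(nonstained).detach()
    d_real = disc(nonstained, stained)
    d_fake = disc(nonstained, fake)
    d_loss = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator: fool the discriminator while staying close to the real H&E patch.
    fake = gen(nonstained)
    d_fake = disc(nonstained, fake)
    g_loss = bce(d_fake, torch.ones_like(d_fake)) + l1_weight * l1(fake, stained)
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()
```

In the study’s framing, a second, separately trained cGAN of the same general form would perform the reverse mapping, from H&E-stained patches back to non-stained ones.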

Of the 38 patients who provided samples, single core biopsy images were extracted from each whole slide, resulting in 102 individual non-stained and H&E dye-stained image pairs that were compared with matched computationally stained and unstained images. Calculations found high similarities between computationally and H&E dye-stained images, with a mean (SD) structural similarity index (SSIM) of 0.902 (0.026), Pearson correlation coefficient (PCC) of 0.962 (0.096), and peak signal-to-noise ratio (PSNR) of 22.821 (1.232) dB.
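These three metrics can be computed with standard libraries. The following sketch, assuming scikit-image, NumPy, and hypothetical uint8 RGB patches of equal size, shows one way to calculate SSIM, PCC, and PSNR for a generated/reference image pair.

```python
# A minimal sketch of the three reported similarity metrics between a
# computationally stained patch and its H&E dye-stained counterpart.
# Array shapes and any file handling are illustrative assumptions.
import numpy as np
from skimage.metrics import structural_similarity, peak_signal_noise_ratio

def compare_images(generated, reference):
    """Return (SSIM, PCC, PSNR in dB) for two uint8 RGB images of equal shape."""
    ssim = structural_similarity(generated, reference, channel_axis=-1, data_range=255)
    pcc = np.corrcoef(generated.ravel().astype(float),
                      reference.ravel().astype(float))[0, 1]
    psnr = peak_signal_noise_ratio(reference, generated, data_range=255)
    return ssim, pcc, psnr

# Example with random stand-in patches:
# rng = np.random.default_rng(0)
# ref = rng.integers(0, 256, (256, 256, 3), dtype=np.uint8)
# gen = np.clip(ref.astype(int) + rng.integers(-10, 10, ref.shape), 0, 255).astype(np.uint8)
# print(compare_images(gen, ref))
```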

Furthermore, a second cGAN performed accurate computational destaining of H&E-stained images back to their original non-stained form, with a mean (SD) SSIM of 0.900 (0.030), PCC of 0.963 (0.011), and PSNR of 25.646 (1.943) dB compared with native non-stained images.

“Evaluation by trained pathologists showed tumorous and healthy tissues were morphologically well represented in most of the computationally stained images with high accuracy,” the authors wrote. “The glands and stroma of benign prostatic tissue and carcinoma were identifiable, showing preserved architectural features (ie, location and shape of the glands), defined gland/stromal interface, and cytological characteristics (ie, location and appearance of the nuclei and nucleoli, if present). Most of the differences in annotations were observed either on the tumor/nontumor interface or boundary or the biopsy boundary.”

A single-blind prospective study computed approximately 95% pixel-by-pixel overlap between prostate tumor annotations provided by 5 board-certified pathologists on computationally stained images and those on H&E dye-stained images. This study also presented the first visualization and explanation of neural network kernel activation maps during H&E staining and destaining of RGB images by cGANs. Importantly, high similarities between kernel activation maps of computationally and H&E-stained images (mean-squared errors < 0.0005) provided additional mathematical and mechanistic validation of the staining system.
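As a rough illustration of a pixel-by-pixel overlap measure, the sketch below computes the intersection-over-union of two binary tumor annotation masks, such as one drawn on a computationally stained image and one on its H&E dye-stained counterpart. The specific overlap definition here is an assumption; the study’s exact agreement calculation may differ.

```python
# A minimal sketch of pixel-by-pixel overlap between two binary tumor annotation
# masks. The IoU definition is an illustrative assumption.
import numpy as np

def pixel_overlap(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Fraction of annotated pixels shared by both masks (intersection over union)."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    union = np.logical_or(a, b).sum()
    if union == 0:
        return 1.0  # neither annotator marked tumor in this region
    return np.logical_and(a, b).sum() / union
```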

“Activation maps of our trained neural network models during computational staining or destaining of test images were highly similar to H&E dye–stained or native nonstained images,” the authors wrote. “Thus, by visualizing and comparing activation feature maps of kernels of trained models, this work also presents the first explainable deep neural network framework for computationally H&E staining or destaining of native RGB images, to our knowledge.”
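Activation maps like those described can be captured in PyTorch with forward hooks and compared by mean-squared error, as in the hedged sketch below; the model, layer selection, and comparison scheme are illustrative assumptions rather than the authors’ implementation.

```python
# A minimal sketch of capturing convolutional activation maps with forward hooks
# and comparing them by mean-squared error for two input images.
import torch
import torch.nn as nn

def capture_activations(model: nn.Module, image: torch.Tensor):
    """Run one forward pass and return {layer_name: activation tensor}."""
    activations, hooks = {}, []
    for name, module in model.named_modules():
        if isinstance(module, nn.Conv2d):
            hooks.append(module.register_forward_hook(
                lambda m, inp, out, name=name: activations.__setitem__(name, out.detach())))
    with torch.no_grad():
        model(image)
    for h in hooks:
        h.remove()
    return activations

def activation_mse(model, image_a, image_b):
    """Mean-squared error between activation maps produced by two input images."""
    acts_a = capture_activations(model, image_a)
    acts_b = capture_activations(model, image_b)
    return {name: torch.mean((acts_a[name] - acts_b[name]) ** 2).item()
            for name in acts_a}
```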

Notably, the researchers suggested that additional fine-grained image annotation tools are needed for precise validation of results generated by computational staining algorithms. Moreover, the amount of data and the number of patients included in the study were not sufficient for clinical trials or other regulatory evaluations.

“Greater numbers of virtually stained H&E images sourced from larger pools of patients are needed before prospective clinical evaluation of models described can begin,” the authors wrote. 

Reference:

Rana A, Lowe A, Lithgow M, et al. Use of Deep Learning to Develop and Analyze Computational Hematoxylin and Eosin Staining of Prostate Core Biopsy Images for Tumor Diagnosis. JAMA Network Open. doi:10.1001/jamanetworkopen.2020.5111. 
